﻿1
00:00:00,200 --> 00:00:02,435
♪♪

2
00:00:03,136 --> 00:00:04,971
-Welcome to City Inside/Out.

3
00:00:04,971 --> 00:00:07,307
I'm your host, 
Brian Callanan.

4
00:00:07,407 --> 00:00:08,742
State lawmakers are trying

5
00:00:08,742 --> 00:00:10,877
to regulate
artificial intelligence

6
00:00:10,877 --> 00:00:13,246
with measures aimed
at controlling chatbots

7
00:00:13,246 --> 00:00:14,914
and requiring disclosure

8
00:00:14,914 --> 00:00:17,350
when content
is created by AI.

9
00:00:17,350 --> 00:00:18,785
-By making it clear when AI

10
00:00:18,785 --> 00:00:19,986
generates media,

11
00:00:19,986 --> 00:00:21,521
Washingtonians
are better protected

12
00:00:21,521 --> 00:00:24,524
against confusion, deception
and misinformation.

13
00:00:25,358 --> 00:00:28,128
-The state says a new law
requiring AI

14
00:00:28,128 --> 00:00:31,131
detection tools will 
help prevent deepfakes.

15
00:00:31,331 --> 00:00:34,000
But tech industry leaders
aren't so sure.

16
00:00:34,000 --> 00:00:34,801
-And as of now, there's

17
00:00:34,801 --> 00:00:36,102
no foolproof method

18
00:00:36,102 --> 00:00:37,003
for determining
whether or not

19
00:00:37,003 --> 00:00:38,371
something was generated
entirely

20
00:00:38,371 --> 00:00:40,407
or in part
by artificial intelligence.

21
00:00:40,407 --> 00:00:42,942
-The state's also
trying to protect people,

22
00:00:42,942 --> 00:00:46,212
especially kids, from
the challenge of chatbots.

23
00:00:46,246 --> 00:00:46,913
-We're already seeing

24
00:00:46,913 --> 00:00:47,814
the synthetic friendships

25
00:00:47,814 --> 00:00:49,582
of chatbots normalize 
self-harm, promote

26
00:00:49,582 --> 00:00:51,117
suicide and violence.

27
00:00:51,117 --> 00:00:52,685
-How much regulation does

28
00:00:52,685 --> 00:00:53,620
AI need

29
00:00:53,620 --> 00:00:55,321
and how will these new laws

30
00:00:55,321 --> 00:00:56,423
affect one of the leading

31
00:00:56,423 --> 00:00:58,391
industries
in Washington State?

32
00:00:58,391 --> 00:01:00,493
-The companies want to hold
themselves accountable.

33
00:01:00,493 --> 00:01:02,662
-Our studio panel weighs in.

34
00:01:02,662 --> 00:01:05,665
-We're seeing some
pretty disturbing

35
00:01:05,832 --> 00:01:08,301
emotional attachments
that teens are forming.

36
00:01:08,301 --> 00:01:10,270
-This is not a human.

37
00:01:10,270 --> 00:01:12,338
This is technology.

38
00:01:12,338 --> 00:01:14,474
-Is Washington State
getting smarter

39
00:01:14,474 --> 00:01:17,544
or getting outsmarted
by artificial intelligence?

40
00:01:17,777 --> 00:01:19,813
That's next on City Inside/Out.

41
00:01:20,146 --> 00:01:36,262
♪♪

42
00:01:39,165 --> 00:01:40,800
And great to have you 
back here on City

43
00:01:40,800 --> 00:01:42,469
Inside/Out on the 
Seattle Channel.

44
00:01:42,469 --> 00:01:45,271
Brian Callanan here with you
and I am with three people

45
00:01:45,271 --> 00:01:46,773
who will help us
further understand

46
00:01:46,773 --> 00:01:48,942
these very important issues,
including State

47
00:01:48,942 --> 00:01:50,610
Senator Lisa Wellman,
the Democrat

48
00:01:50,610 --> 00:01:53,012
from the 41st District
out on Mercer Island.

49
00:01:53,012 --> 00:01:55,115
A key figure
talking about AI safety

50
00:01:55,115 --> 00:01:57,016
in this past session
for the legislature.

51
00:01:57,016 --> 00:01:59,185
Thanks a lot for joining us.
-Thank you for having me.

52
00:01:59,185 --> 00:02:01,054
-We also have with us
Amy Harris,

53
00:02:01,054 --> 00:02:03,223
director of government
affairs for the Washington

54
00:02:03,223 --> 00:02:05,125
Technology
Industry Association,

55
00:02:05,125 --> 00:02:06,392
which raised
a number of questions

56
00:02:06,392 --> 00:02:07,961
about these measures
during the session.

57
00:02:07,961 --> 00:02:09,762
Amy, appreciate
you being here, too.

58
00:02:09,762 --> 00:02:10,363
-Thank you.

59
00:02:10,363 --> 00:02:11,531
-Also with us, Dr.

60
00:02:11,531 --> 00:02:13,199
Katie Davis, UW professor

61
00:02:13,199 --> 00:02:15,135
and co-director
of the Center for Digital

62
00:02:15,135 --> 00:02:16,803
Youth, which has done
a lot of study

63
00:02:16,803 --> 00:02:19,305
on the impact of AI on teens
and kids. Dr.

64
00:02:19,305 --> 00:02:21,441
Davis, good to have you.
-Thank you. It's a pleasure.

65
00:02:21,441 --> 00:02:22,942
-Let me jump into it
and I'm going to get into

66
00:02:22,942 --> 00:02:24,878
some of the details
about these major AI

67
00:02:24,878 --> 00:02:26,146
bills that you were working on

68
00:02:26,146 --> 00:02:28,047
in this past session
in just a minute.

69
00:02:28,047 --> 00:02:29,182
But first,
I want to ask that

70
00:02:29,182 --> 00:02:30,984
overarching question of why.

71
00:02:30,984 --> 00:02:31,751
Why do you think

72
00:02:31,751 --> 00:02:33,820
AI needs to be regulated
in the first place?

73
00:02:33,820 --> 00:02:34,387
There's this

74
00:02:34,387 --> 00:02:36,156
balance between
managing risks

75
00:02:36,156 --> 00:02:38,625
and encouraging innovation
in the field of AI.

76
00:02:38,625 --> 00:02:41,161
Senator Wellman, you either
sponsored or were involved

77
00:02:41,161 --> 00:02:42,662
in the passage
of several pieces

78
00:02:42,662 --> 00:02:44,597
of this legislation
in this past session.

79
00:02:44,597 --> 00:02:45,865
How would you answer that?

80
00:02:45,865 --> 00:02:48,668
-Well, I do think that
we've created

81
00:02:48,668 --> 00:02:52,238
a digital playground
for our kids to learn in.

82
00:02:52,772 --> 00:02:55,241
I was very responsible
for making sure that we had

83
00:02:55,241 --> 00:02:56,376
computer science

84
00:02:56,376 --> 00:02:57,744
because we want to get kids
ready

85
00:02:57,744 --> 00:02:59,445
for the future
that they're going into.

86
00:02:59,445 --> 00:03:02,081
-And you have a background in
teaching and tech, too.

87
00:03:02,081 --> 00:03:07,487
-And so it's just making sure
that our kids are aware

88
00:03:07,720 --> 00:03:11,591
of the good parts
and the bad parts, making

89
00:03:11,591 --> 00:03:12,325
sure parents,

90
00:03:12,325 --> 00:03:13,860
I think this is
very important

91
00:03:13,860 --> 00:03:16,563
that parents are aware
of some challenges

92
00:03:16,563 --> 00:03:19,566
of being on screens
all the time

93
00:03:19,766 --> 00:03:22,068
and using
their phones or using

94
00:03:23,036 --> 00:03:23,937
what is happening.

95
00:03:23,937 --> 00:03:24,571
And we're

96
00:03:24,571 --> 00:03:27,574
reading things in the paper
that are very disturbing.

97
00:03:27,574 --> 00:03:28,308
You know, there are now

98
00:03:28,308 --> 00:03:31,744
three cases of suicide
where AI is involved.

99
00:03:32,345 --> 00:03:35,348
And so, in a responsible

100
00:03:35,515 --> 00:03:37,984
role as a legislator,

101
00:03:37,984 --> 00:03:39,152
you've got to be
looking at this

102
00:03:39,152 --> 00:03:42,021
and you've got to be
deciding what you can do

103
00:03:42,021 --> 00:03:44,190
to bring things forward
and make people aware.

104
00:03:44,190 --> 00:03:45,391
-Thank you very much
for that.

105
00:03:45,391 --> 00:03:47,360
Amy, let's
get your piece on this.

106
00:03:47,360 --> 00:03:49,262
What sort of regulation does
AI need?

107
00:03:49,262 --> 00:03:51,030
How does that regulation
impact

108
00:03:51,030 --> 00:03:53,299
the business of developing
AI in our state?

109
00:03:53,299 --> 00:03:55,001
-Yeah, we as an industry

110
00:03:55,001 --> 00:03:56,236
really wanted to make sure
we aren't

111
00:03:56,236 --> 00:03:58,037
regulating ourselves
out of the innovation

112
00:03:58,037 --> 00:04:00,740
and the good fortune
that we have here;

113
00:04:00,740 --> 00:04:02,709
being the hub
of all of these amazing

114
00:04:02,709 --> 00:04:03,710
world-class companies,

115
00:04:03,710 --> 00:04:05,778
research institutions,
and the good fortune

116
00:04:05,778 --> 00:04:07,513
that we have
of being Washington State,

117
00:04:07,513 --> 00:04:10,383
just not regulating
ourselves out of innovation.

118
00:04:10,383 --> 00:04:13,353
And the future of tech
startups we represent

119
00:04:13,453 --> 00:04:16,923
as WTIA, the 0-to-50,
0-to-10 startups,

120
00:04:16,923 --> 00:04:19,058
a lot of first
gen Americans.

121
00:04:19,058 --> 00:04:21,361
So that was our big concern
in these regulations.

122
00:04:21,361 --> 00:04:22,862
And anybody that
brought us into the room,

123
00:04:22,862 --> 00:04:25,031
we were happy to be a part
of those conversations.

124
00:04:25,031 --> 00:04:26,566
We had the experts that are,

125
00:04:26,566 --> 00:04:27,700
“Hey, this worked
in this state,

126
00:04:27,700 --> 00:04:29,702
this didn't work
in that state.”

127
00:04:29,702 --> 00:04:30,903
And we've seen the EU,

128
00:04:30,903 --> 00:04:32,872
UK come down with a lot
of regulations as well.

129
00:04:32,872 --> 00:04:34,207
And Senator
Wellman was happy

130
00:04:34,207 --> 00:04:35,708
to bring us
in on those conversations.

131
00:04:35,708 --> 00:04:37,977
And we have the subject
matter experts

132
00:04:37,977 --> 00:04:40,680
and I thought we landed
in really good places.

133
00:04:40,680 --> 00:04:41,581
-Okay, good.

134
00:04:41,581 --> 00:04:42,649
I know
there's a lot more to it.

135
00:04:42,649 --> 00:04:44,250
We're going to crack that
open in just a little bit.

136
00:04:44,250 --> 00:04:44,851
But Dr.

137
00:04:44,851 --> 00:04:46,486
Davis, let me get
your point of view here.

138
00:04:46,486 --> 00:04:48,755
Why do you think
AI needs to be regulated?

139
00:04:48,755 --> 00:04:50,223
What does your research
with kids

140
00:04:50,223 --> 00:04:52,191
and families show you? 
-Yeah.

141
00:04:52,191 --> 00:04:55,128
So I've been studying
the impact of various

142
00:04:55,128 --> 00:04:57,630
technologies,
particularly social media,

143
00:04:57,630 --> 00:05:00,700
on various
aspects of children's

144
00:05:00,700 --> 00:05:03,703
and teens development
for over 20 years now.

145
00:05:03,770 --> 00:05:06,339
And I think we've learned
a lot

146
00:05:06,339 --> 00:05:09,776
from the example of social
media and not regulating

147
00:05:09,776 --> 00:05:11,344
social media

148
00:05:11,344 --> 00:05:12,312
and how difficult

149
00:05:12,312 --> 00:05:14,814
that's been on teens
and their families,

150
00:05:14,814 --> 00:05:17,083
putting all the onus
on families

151
00:05:17,083 --> 00:05:18,451
to figure out

152
00:05:18,451 --> 00:05:20,086
how to regulate social media

153
00:05:20,086 --> 00:05:22,622
within the context
of the family.

154
00:05:22,622 --> 00:05:23,756
And I would really like

155
00:05:23,756 --> 00:05:26,759
to see us avoid
that when it comes to AI.

156
00:05:27,493 --> 00:05:28,361
In our research

157
00:05:28,361 --> 00:05:29,429
at the Center for Digital

158
00:05:29,429 --> 00:05:32,565
Youth, we're seeing some
pretty disturbing

159
00:05:32,965 --> 00:05:35,968
emotional attachments
that teens are forming to

160
00:05:36,002 --> 00:05:37,236
AI chatbots.

161
00:05:37,236 --> 00:05:40,340
In the most extreme
cases, instances of suicide.

162
00:05:41,007 --> 00:05:43,743
But even beyond that, just,

163
00:05:43,743 --> 00:05:46,612
you know, teens are forming
these emotional attachments

164
00:05:46,612 --> 00:05:47,547
at a time

165
00:05:47,547 --> 00:05:50,550
when they are particularly
emotionally vulnerable.

166
00:05:51,117 --> 00:05:53,052
And we just don't know yet

167
00:05:53,052 --> 00:05:55,221
what are the long term
consequences of that.

168
00:05:55,221 --> 00:05:58,224
And so I think we really
need to step in and regulate

169
00:05:58,591 --> 00:05:59,659
how they're developed.

170
00:05:59,659 --> 00:06:00,593
-I'm going to dive
a little bit

171
00:06:00,593 --> 00:06:02,161
more into chatbots
right now, if I could.

172
00:06:02,161 --> 00:06:02,995
Senator Wellman.

173
00:06:02,995 --> 00:06:04,197
Let's talk about this chatbot

174
00:06:04,197 --> 00:06:06,165
regulation
that passed this session,

175
00:06:06,165 --> 00:06:09,802
including 5984, Senate Bill
5984, where you were

176
00:06:09,802 --> 00:06:10,870
the prime sponsor.

177
00:06:10,870 --> 00:06:13,373
So this measure requires
an AI chatbot,

178
00:06:13,373 --> 00:06:14,407
that seems human,

179
00:06:14,407 --> 00:06:16,743
to give reminders
that it is not human.

180
00:06:16,743 --> 00:06:18,144
And it would also
require companies

181
00:06:18,144 --> 00:06:20,313
that provide chatbots
to have a protocol

182
00:06:20,313 --> 00:06:23,082
for detecting and addressing
suicidal ideation.

183
00:06:23,082 --> 00:06:24,784
Can you break down
a little bit further for us

184
00:06:24,784 --> 00:06:26,185
what this bill
is going to do

185
00:06:26,185 --> 00:06:28,621
when it goes into effect
this coming January?

186
00:06:28,621 --> 00:06:29,055
-Well,

187
00:06:30,123 --> 00:06:31,290
I think we've put the word

188
00:06:31,290 --> 00:06:34,293
out to the manufacturers
of these products about

189
00:06:34,327 --> 00:06:38,331
what we're looking for,
and also giving the right of

190
00:06:39,098 --> 00:06:41,234
litigation, which is

191
00:06:41,234 --> 00:06:43,803
an important piece of this.
Yeah, things happen.

192
00:06:43,803 --> 00:06:47,340
And I think
it's really important to set

193
00:06:47,340 --> 00:06:51,244
the stage,
as WTIA is very involved,

194
00:06:51,244 --> 00:06:52,712
and as you've heard,

195
00:06:52,712 --> 00:06:55,648
they're looking across
the entire country.

196
00:06:55,648 --> 00:06:56,983
We never just...

197
00:06:56,983 --> 00:06:59,719
if I'm going to do a bill on
anything, I'm going to say

198
00:06:59,719 --> 00:07:00,586
who's done it.

199
00:07:00,586 --> 00:07:03,122
Where else
have they addressed this?

200
00:07:03,122 --> 00:07:05,191
Because as far as
the industry is concerned,

201
00:07:05,191 --> 00:07:07,360
we'd like to have
consistency.

202
00:07:07,360 --> 00:07:08,561
It doesn't make any sense

203
00:07:08,561 --> 00:07:10,062
to have this happen
in this state.

204
00:07:10,062 --> 00:07:11,831
And then you have to do this...
-The patchwork idea, yeah.

205
00:07:11,831 --> 00:07:14,367
-We’re not interested
in doing that at all.

206
00:07:14,367 --> 00:07:17,370
So it was very important
for us to see what were

207
00:07:17,804 --> 00:07:21,040
the specific things,
establishing a relationship.

208
00:07:21,107 --> 00:07:22,475
We didn't want young
children

209
00:07:22,475 --> 00:07:24,210
establishing a relationship.

210
00:07:24,210 --> 00:07:26,212
This is not a human.

211
00:07:26,212 --> 00:07:28,181
This is technology.
-Right.

212
00:07:28,181 --> 00:07:30,983
-And making sure that that's part of that. 
-Okay.

213
00:07:30,983 --> 00:07:32,251
Amy, I wanted to

214
00:07:32,251 --> 00:07:33,853
get your perspective
here, too,

215
00:07:33,853 --> 00:07:36,189
because you testified
about this issue in Olympia

216
00:07:36,189 --> 00:07:38,090
and said that this chatbot
legislation

217
00:07:38,090 --> 00:07:39,826
might be too broad in scope.

218
00:07:39,826 --> 00:07:41,127
I'm thinking back
to January here

219
00:07:41,127 --> 00:07:42,295
when you were talking
about this

220
00:07:42,295 --> 00:07:43,162
and whether it might be

221
00:07:43,162 --> 00:07:44,931
making
a number of companies

222
00:07:44,931 --> 00:07:46,833
more vulnerable
to this litigation

223
00:07:46,833 --> 00:07:49,202
that the senator
was talking about here?

224
00:07:49,202 --> 00:07:51,137
You did support the
chatbot bill in the end.

225
00:07:51,137 --> 00:07:53,072
Do you still have concerns

226
00:07:53,072 --> 00:07:54,474
about what passed
or how would you describe it?

227
00:07:54,474 --> 00:07:56,142
-No, we landed
in a much better place

228
00:07:56,142 --> 00:07:56,876
because again,

229
00:07:56,876 --> 00:07:58,544
through
working through it again,

230
00:07:58,544 --> 00:07:59,011
much better

231
00:07:59,011 --> 00:08:00,646
conversations and again,

232
00:08:00,646 --> 00:08:03,049
like I just got back
from a conference in D.C.

233
00:08:03,049 --> 00:08:03,950
where

234
00:08:03,950 --> 00:08:06,385
one of the main
pillars of this was avoiding

235
00:08:06,385 --> 00:08:07,420
50-state patchwork.

236
00:08:07,420 --> 00:08:09,922
I don't think a child in
Iowa is less valuable

237
00:08:09,922 --> 00:08:11,257
than a child
in Washington state.

238
00:08:11,257 --> 00:08:15,294
So asking these companies
to hire 1000 more lawyers,

239
00:08:15,294 --> 00:08:17,363
which I don't think
is a good idea,

240
00:08:17,363 --> 00:08:19,999
that to me
would be unreasonable.

241
00:08:19,999 --> 00:08:21,100
I don't understand that.

242
00:08:21,100 --> 00:08:24,103
So asking
that is nonsensical.

243
00:08:24,103 --> 00:08:25,972
And if we've seen it not work
there, that to me

244
00:08:25,972 --> 00:08:27,306
means it's
not going to work here.

245
00:08:27,306 --> 00:08:28,774
So a blanket,

246
00:08:28,774 --> 00:08:31,644
better solution
would make more sense.

247
00:08:31,644 --> 00:08:33,212
-What happened
with that litigation piece

248
00:08:33,212 --> 00:08:35,314
that made more sense
to you in Washington state?

249
00:08:35,314 --> 00:08:36,916
-There must have been
a language change.

250
00:08:36,916 --> 00:08:37,917
And now I'm blanking

251
00:08:37,917 --> 00:08:40,253
as I’ve rinsed my mind

252
00:08:40,253 --> 00:08:43,456
from the 60-day
session, which was a haul.

253
00:08:43,556 --> 00:08:45,992
-Yeah.
-It must have been a part of it.

254
00:08:45,992 --> 00:08:46,459
Do you remember

255
00:08:46,459 --> 00:08:47,193
the language change

256
00:08:47,193 --> 00:08:48,928
that came through in it?
-Not specifically,

257
00:08:48,928 --> 00:08:50,429
but that's part
of the process

258
00:08:50,429 --> 00:08:52,198
that we're constantly,
when we put it out

259
00:08:52,198 --> 00:08:53,933
there,
looking at amendments.

260
00:08:53,933 --> 00:08:54,300
-Yeah.

261
00:08:54,300 --> 00:08:56,602
-There was a bill, actually,
the bill from the House

262
00:08:56,602 --> 00:08:58,771
is the one that
finally made it through.

263
00:08:58,771 --> 00:09:00,540
-There was, because you said
a bill number.

264
00:09:00,540 --> 00:09:02,108
And now I'm thinking
I remember it was House Bill

265
00:09:02,108 --> 00:09:04,110
2225. Yeah, yeah, yeah.

266
00:09:04,110 --> 00:09:05,745
I was going to say

267
00:09:05,745 --> 00:09:06,913
that's what I was thinking
of. Yeah.

268
00:09:06,913 --> 00:09:07,580
And I know

269
00:09:07,580 --> 00:09:10,616
there was an issue
with an ongoing relationship.

270
00:09:10,850 --> 00:09:12,018
So when I get in

271
00:09:12,018 --> 00:09:15,454
to ChatGPT, it knows
it'll say Amy's tone.

272
00:09:15,855 --> 00:09:17,456
And so it has to know me

273
00:09:17,456 --> 00:09:19,125
from the last time
I went on.

274
00:09:19,125 --> 00:09:20,793
And that was
one of the problems

275
00:09:20,793 --> 00:09:22,328
because it was like,
we don't want it to have

276
00:09:22,328 --> 00:09:23,229
an ongoing relationship.

277
00:09:23,229 --> 00:09:25,498
And it's like, well, that's
actually part of the point.

278
00:09:25,498 --> 00:09:28,501
So having that ongoing
relationship with a minor

279
00:09:28,668 --> 00:09:30,002
was an issue. 
-Okay.

280
00:09:30,002 --> 00:09:31,704
-So that like,
I understand that.

281
00:09:31,704 --> 00:09:32,038
And then

282
00:09:32,038 --> 00:09:35,041
there was another issue
with Alexa, so not right.

283
00:09:35,041 --> 00:09:37,476
So it's like, does
Alexa have to remind a minor

284
00:09:37,476 --> 00:09:38,911
that it's not a person?

285
00:09:38,911 --> 00:09:40,246
And it's like, well, it's

286
00:09:40,246 --> 00:09:42,381
not interfacing, it's a...
-Right.

287
00:09:42,381 --> 00:09:43,449
-So like, that was

288
00:09:43,449 --> 00:09:45,117
I know there was an issue
there as well.

289
00:09:45,117 --> 00:09:47,853
So I think those
are specified in language.

290
00:09:47,853 --> 00:09:48,187
-Okay.

291
00:09:48,187 --> 00:09:52,358
-And then other products
had specified specificities.

292
00:09:52,525 --> 00:09:53,326
-Right. Yeah.

293
00:09:53,326 --> 00:09:54,160
And it sounds like

294
00:09:54,160 --> 00:09:56,829
this is going to be
part of the ongoing

295
00:09:56,829 --> 00:09:58,364
work on this legislation
for sure.

296
00:09:58,364 --> 00:09:59,265
-And it's not...

297
00:09:59,265 --> 00:10:02,201
now that it's out there,
we may get other information

298
00:10:02,201 --> 00:10:05,137
and have to respond to that,
tweaking it this session.

299
00:10:05,137 --> 00:10:06,105
-Yeah. Dr.

300
00:10:06,105 --> 00:10:08,341
Davis, I guess I'll just ask
that general question.

301
00:10:08,341 --> 00:10:10,009
What impact
do you see coming

302
00:10:10,009 --> 00:10:11,711
out of these chatbot
regulations?

303
00:10:11,711 --> 00:10:13,579
-Yeah, so actually
the piece of the bill

304
00:10:13,579 --> 00:10:18,084
that I find most promising
is the piece that prohibits

305
00:10:18,284 --> 00:10:21,153
manipulative designs
that extend

306
00:10:21,153 --> 00:10:22,788
emotional interactions.

307
00:10:23,756 --> 00:10:24,156
Actually, the

308
00:10:24,156 --> 00:10:29,395
piece around reminding youth
that you're

309
00:10:29,595 --> 00:10:32,598
talking with a chatbot
rather than a human,

310
00:10:33,633 --> 00:10:34,667
I'm not so sure.

311
00:10:34,667 --> 00:10:36,969
I'm not convinced that
that's going to necessarily

312
00:10:36,969 --> 00:10:38,838
be helpful
because there's a difference

313
00:10:38,838 --> 00:10:41,307
between knowing something
intellectually.

314
00:10:41,307 --> 00:10:42,241
And, you know,

315
00:10:42,241 --> 00:10:45,244
my son has a robot at home,
and we both know

316
00:10:45,244 --> 00:10:47,013
that this is a robot.
-Right.

317
00:10:47,013 --> 00:10:48,948
-But it feels real.

318
00:10:48,948 --> 00:10:50,016
And so there's a difference

319
00:10:50,016 --> 00:10:51,584
between knowing and feeling.

320
00:10:51,584 --> 00:10:53,419
And we've even seen
in research,

321
00:10:53,419 --> 00:10:56,088
even when people know
and are reminded

322
00:10:56,088 --> 00:10:57,356
that something is not human,

323
00:10:57,356 --> 00:10:59,992
we are just
natural anthropomorphizers.

324
00:10:59,992 --> 00:11:01,827
And so I'm not sure about

325
00:11:01,827 --> 00:11:03,963
that piece,
but in our research

326
00:11:03,963 --> 00:11:06,932
there's
a lot of evidence that

327
00:11:07,767 --> 00:11:10,603
these chatbots,
the way they're designed,

328
00:11:10,603 --> 00:11:14,106
is specifically to keep
and prolong engagement

329
00:11:14,106 --> 00:11:18,411
and particularly to engage
users in an emotional way.

330
00:11:18,611 --> 00:11:21,647
And so I really like that
piece of the bill,

331
00:11:21,647 --> 00:11:22,682
and I think that's

332
00:11:22,682 --> 00:11:23,616
a lot of our research

333
00:11:23,616 --> 00:11:26,619
is documenting these types
of manipulative designs.

334
00:11:26,652 --> 00:11:26,919
Yeah.

335
00:11:26,919 --> 00:11:28,854
And I think if we can really

336
00:11:28,854 --> 00:11:30,790
rein
that in from companies,

337
00:11:30,790 --> 00:11:33,025
then we'll be doing it. 
-Okay. Thank you. Please.

338
00:11:34,126 --> 00:11:35,261
-Sorry, I was just going to say,

339
00:11:35,261 --> 00:11:37,630
I think that that's one
thing that I'm looking at,

340
00:11:37,630 --> 00:11:40,599
whether or not we say
you must end the session

341
00:11:40,599 --> 00:11:43,736
completely at 10 minutes
or 15 minutes.

342
00:11:43,836 --> 00:11:45,137
We'll be talking about that

343
00:11:45,137 --> 00:11:46,972
and get your recommendation.

344
00:11:46,972 --> 00:11:48,007
But that may be something

345
00:11:48,007 --> 00:11:49,442
that we'll have to do
next session.

346
00:11:49,442 --> 00:11:50,042
-What do you got?

347
00:11:50,042 --> 00:11:51,277
-I know one of them was

348
00:11:51,277 --> 00:11:52,845
if I'm in there
and I'm, you know,

349
00:11:52,845 --> 00:11:54,580
I want a briefing on this
or I need to

350
00:11:54,580 --> 00:11:56,582
I need a tone
change on this or,

351
00:11:56,582 --> 00:11:58,851
you know, summarize
this 25-page, whatever.

352
00:11:58,851 --> 00:12:00,953
And then I type
in at the end, Wow.

353
00:12:00,953 --> 00:12:03,723
This has been a really hard
three weeks. I'm exhausted.

354
00:12:03,723 --> 00:12:06,158
Does that mean I'm depressed?

355
00:12:06,158 --> 00:12:09,061
So one of them is like,
is that me asking for help?

356
00:12:09,061 --> 00:12:12,998
Is that asking... So is
that an emotional, like, cry?

357
00:12:13,265 --> 00:12:14,867
So like,
is that asking for a

358
00:12:14,867 --> 00:12:16,669
response
in an emotional way?

359
00:12:16,669 --> 00:12:20,206
So I think those are the
things, like,

360
00:12:20,206 --> 00:12:23,309
is that, like,
does it send me to 911?

361
00:12:23,342 --> 00:12:24,543
Yeah. Like
do you know what I mean?

362
00:12:24,543 --> 00:12:27,446
So I know those were the
things of like, I'm fine.

363
00:12:27,446 --> 00:12:29,515
Yeah,
but is that me typing it in?

364
00:12:29,515 --> 00:12:30,783
Do you know what I mean?

365
00:12:30,783 --> 00:12:32,184
So then what you're getting,
I would at least...

366
00:12:32,184 --> 00:12:34,553
then we're
getting into legal responses

367
00:12:34,553 --> 00:12:35,187
and I'm obviously

368
00:12:35,187 --> 00:12:36,689
like I'm somebody
that's okay and I have good

369
00:12:36,689 --> 00:12:38,257
personal relationships
and that's not that.

370
00:12:38,257 --> 00:12:39,158
But like,

371
00:12:39,158 --> 00:12:41,494
those are the concerns.
-I see. Real quick,

372
00:12:41,494 --> 00:12:43,229
and I want to move on to
another piece of it, please.

373
00:12:43,229 --> 00:12:43,429
Yeah.

374
00:12:43,429 --> 00:12:44,497
Was there something else
you wanted to...

375
00:12:44,497 --> 00:12:45,765
-I was also going to say,
you know,

376
00:12:45,765 --> 00:12:47,633
when we look at Facebook
and we say, you

377
00:12:47,633 --> 00:12:48,100
know, kind of

378
00:12:48,100 --> 00:12:49,502
you've been on Facebook
for a little while

379
00:12:49,502 --> 00:12:52,004
and all of a sudden
these ads start appearing

380
00:12:52,004 --> 00:12:53,305
because you

381
00:12:53,305 --> 00:12:55,741
you didn't think that you
were talking about anything.

382
00:12:55,741 --> 00:12:57,843
But, yeah, I was thinking
about buying new shoes

383
00:12:57,843 --> 00:12:59,712
and there's,
you know, all of that.

384
00:12:59,712 --> 00:13:02,314
So I'm concerned
about gathering information.

385
00:13:02,314 --> 00:13:03,182
Privacy.

386
00:13:03,182 --> 00:13:04,583
Privacy is one of the things

387
00:13:04,583 --> 00:13:05,050
that we have

388
00:13:05,050 --> 00:13:06,352
to really examine

389
00:13:06,352 --> 00:13:09,054
in terms of what they're
learning about this child

390
00:13:09,054 --> 00:13:11,323
and how it can be used,
perhaps in other ways.

391
00:13:11,323 --> 00:13:11,791
-Yeah.

392
00:13:11,791 --> 00:13:13,859
Can I put chatbots aside
for just a second?

393
00:13:13,859 --> 00:13:15,861
Just because I want to
make sure that I talk about

394
00:13:15,861 --> 00:13:18,798
another big bill that passed this
session, House Bill 1170.

395
00:13:18,798 --> 00:13:20,566
And maybe, Amy,
I can talk with you. Yeah.

396
00:13:20,566 --> 00:13:21,967
So this says that
when content

397
00:13:21,967 --> 00:13:24,970
is substantially modified
with generative AI,

398
00:13:25,171 --> 00:13:26,639
the information
has to be traceable

399
00:13:26,639 --> 00:13:28,307
using watermarks
or metadata.

400
00:13:28,307 --> 00:13:29,108
The big concern

401
00:13:29,108 --> 00:13:29,608
I heard here

402
00:13:29,608 --> 00:13:31,577
from an industry perspective
was that

403
00:13:31,577 --> 00:13:35,080
there's no guaranteed way
to do this, and bad actors

404
00:13:35,080 --> 00:13:35,848
are always going
to find a way

405
00:13:35,848 --> 00:13:37,049
to remove those watermarks.

406
00:13:37,049 --> 00:13:37,983
Can you break down

407
00:13:37,983 --> 00:13:39,985
some of the thoughts around
House Bill 1170?

408
00:13:39,985 --> 00:13:40,719
-We got

409
00:13:40,719 --> 00:13:41,654
to pro on that

410
00:13:41,654 --> 00:13:43,622
because by the time
we had introduced it

411
00:13:43,622 --> 00:13:46,192
and then by the time
it had come to fruition,

412
00:13:46,192 --> 00:13:46,625
number one,

413
00:13:46,625 --> 00:13:48,994
they had modeled it
after a state that had been

414
00:13:48,994 --> 00:13:50,396
under litigation
for three years.

415
00:13:50,396 --> 00:13:53,465
Again, please don't
model bills after states

416
00:13:53,465 --> 00:13:54,767
that are not doing it
correctly.

417
00:13:54,767 --> 00:13:56,502
To me
we should be leading the way

418
00:13:56,502 --> 00:13:57,703
on doing this correctly.

419
00:13:57,703 --> 00:14:00,072
-Okay.
-So don't do that. Yeah.

420
00:14:00,072 --> 00:14:01,507
Okay. Model it, please.

421
00:14:01,507 --> 00:14:03,309
Let's be the model
of how to do this right.

422
00:14:04,844 --> 00:14:06,512
That
was the issue there.

423
00:14:06,512 --> 00:14:07,246
-Okay.

424
00:14:07,246 --> 00:14:08,781
-We got to pro on it,
but it was

425
00:14:08,781 --> 00:14:10,049
there will be bad actors.

426
00:14:10,049 --> 00:14:11,116
But if we can do this right

427
00:14:11,116 --> 00:14:13,018
and it got
got to a much better place

428
00:14:13,018 --> 00:14:14,987
and the companies want to
hold themselves accountable.

429
00:14:14,987 --> 00:14:16,789
Those large actors want to
hold themselves accountable

430
00:14:16,789 --> 00:14:18,490
because then they can be
the ones leading it.

431
00:14:18,490 --> 00:14:19,358
-Okay. Do

432
00:14:19,358 --> 00:14:20,659
they have concerns that,
well,

433
00:14:20,659 --> 00:14:22,428
maybe I'll hold that
question for just a second

434
00:14:22,428 --> 00:14:24,530
because I want to get
some more input here.

435
00:14:24,530 --> 00:14:25,231
Dr. Davis,

436
00:14:25,231 --> 00:14:26,365
some thoughts
about this idea

437
00:14:26,365 --> 00:14:28,133
of requiring watermarks,
etc.

438
00:14:28,133 --> 00:14:30,803
Do you think House Bill 1170
is going to be effective?

439
00:14:30,803 --> 00:14:33,072
-I was less involved
with that House bill.

440
00:14:33,072 --> 00:14:34,740
However,
I do feel that there's

441
00:14:34,740 --> 00:14:38,010
real opportunity there
when it comes to

442
00:14:38,611 --> 00:14:41,614
developing children's AI
literacy and helping them

443
00:14:42,281 --> 00:14:45,217
to figure out
what is AI versus not

444
00:14:45,217 --> 00:14:47,853
and what should I trust
when I'm reading

445
00:14:47,853 --> 00:14:50,122
and encountering
content online and not so...

446
00:14:50,122 --> 00:14:53,125
I do see that
there's some value there

447
00:14:53,192 --> 00:14:53,792
in that bill

448
00:14:53,792 --> 00:14:55,227
that could be incorporated

449
00:14:55,227 --> 00:14:57,663
into educational
interventions

450
00:14:57,663 --> 00:14:59,765
to help develop
young people's AI

451
00:14:59,765 --> 00:15:00,933
literacy. 
-Got it.

452
00:15:00,933 --> 00:15:01,700
And this feels like

453
00:15:01,700 --> 00:15:03,502
one of those situations
where it's a moving

454
00:15:03,502 --> 00:15:05,237
target again,
but somewhat a toss-up.

455
00:15:05,237 --> 00:15:06,872
-Well,
I was also going to say that

456
00:15:06,872 --> 00:15:08,941
one of the things
that we did was to specify

457
00:15:08,941 --> 00:15:10,776
that if you use the word
nurse,

458
00:15:10,776 --> 00:15:12,344
you have to specify that

459
00:15:12,344 --> 00:15:15,014
that cannot be used
for an AI.

460
00:15:15,014 --> 00:15:16,248
In other words, I'm
a nurse, I'm

461
00:15:16,248 --> 00:15:17,483
giving you this information.

462
00:15:17,483 --> 00:15:20,552
It must be a person
who has a degree as a nurse.

463
00:15:20,552 --> 00:15:22,588
-I see. Yeah. Okay.
Thank you.

464
00:15:22,588 --> 00:15:22,788
That's

465
00:15:22,788 --> 00:15:25,858
and just overall, with 1170,
do you think it's going

466
00:15:25,858 --> 00:15:27,826
to be effective?

467
00:15:27,826 --> 00:15:29,428
What are your thoughts
about that piece?

468
00:15:29,428 --> 00:15:29,862
-Yeah, no,

469
00:15:29,862 --> 00:15:31,697
I think that we are going
to make a difference

470
00:15:31,697 --> 00:15:33,766
with all the bills
that we did.

471
00:15:33,766 --> 00:15:34,533
I think,

472
00:15:34,533 --> 00:15:35,701
you know, we're teaching

473
00:15:35,701 --> 00:15:38,337
digital literacy
is one of the really

474
00:15:38,337 --> 00:15:41,040
challenging things
that we're focusing in on

475
00:15:41,040 --> 00:15:43,242
and making sure
that people understand...

476
00:15:43,242 --> 00:15:44,643
you can only imagine

477
00:15:44,643 --> 00:15:47,947
the things you've seen
on YouTube, etc.

478
00:15:48,013 --> 00:15:48,881
-Yeah.

479
00:15:48,881 --> 00:15:50,249
-And you're being told that

480
00:15:50,249 --> 00:15:51,684
somebody
is saying something.

481
00:15:52,851 --> 00:15:54,687
You want to have
some degree of confidence

482
00:15:54,687 --> 00:15:57,056
that that is the person
saying it. -Yeah.

483
00:15:57,056 --> 00:15:59,458
-That what you're seeing
is authentic.

484
00:15:59,458 --> 00:16:00,159
-Yeah.

485
00:16:00,159 --> 00:16:02,094
You don't think people
can work around that?

486
00:16:02,094 --> 00:16:04,296
I've seen some amazing
things done with AI.

487
00:16:04,296 --> 00:16:05,798
What are your thoughts
about that?

488
00:16:05,798 --> 00:16:06,932
-Well,

489
00:16:06,932 --> 00:16:08,867
I will tell you
and I've said before that

490
00:16:08,867 --> 00:16:12,271
I am a science fiction nut
and I read a lot of science

491
00:16:12,271 --> 00:16:15,307
fiction,
and I love Isaac Asimov.

492
00:16:15,541 --> 00:16:16,842
-Oh, one of my faves.

493
00:16:16,842 --> 00:16:19,511
-So the three laws
of robotics. -Ah, there we go.

494
00:16:19,511 --> 00:16:22,214
-The robot
may not harm a human.

495
00:16:22,214 --> 00:16:23,582
I don't see any three laws,

496
00:16:23,582 --> 00:16:25,651
four laws
or five laws of AI.

497
00:16:25,651 --> 00:16:28,654
And I think that's
unfortunate, that companies...

498
00:16:28,654 --> 00:16:31,056
that we need to be doing
the work that we're doing.

499
00:16:31,056 --> 00:16:31,423
-Yeah.

500
00:16:31,423 --> 00:16:32,891
-And by the way,
we were ready.

501
00:16:32,891 --> 00:16:36,295
Now, the EU just passed their
AI program

502
00:16:36,662 --> 00:16:40,199
and I think we want to be
consistent across the world

503
00:16:40,499 --> 00:16:41,734
because all of our companies

504
00:16:41,734 --> 00:16:43,902
do a lot of business
in other parts of the world.

505
00:16:43,902 --> 00:16:45,037
-Yeah, yeah, that's true.

506
00:16:45,037 --> 00:16:47,272
-“I, Robot,” read the book, don't
watch the movie.

507
00:16:47,272 --> 00:16:49,008
I just want to make sure
I put that out there.

508
00:16:49,008 --> 00:16:50,743
Let me talk
about the industry of

509
00:16:50,743 --> 00:16:51,810
AI in the state.

510
00:16:51,810 --> 00:16:53,245
Maybe I'll just go
down the line here.

511
00:16:53,245 --> 00:16:56,281
Dr. Davis, this critique
that regulating AI

512
00:16:56,281 --> 00:16:58,350
too much
might have a bad effect

513
00:16:58,350 --> 00:17:00,085
on an important industry
in Washington.

514
00:17:00,085 --> 00:17:01,720
You have companies saying
new laws

515
00:17:01,720 --> 00:17:03,422
could stifle innovation
in our state.

516
00:17:03,422 --> 00:17:05,090
There's also a threat
from President Trump,

517
00:17:05,090 --> 00:17:05,691
who signed

518
00:17:05,691 --> 00:17:07,359
an executive order
back in December

519
00:17:07,359 --> 00:17:08,427
saying he would pull

520
00:17:08,427 --> 00:17:10,195
federal broadband
funding from states

521
00:17:10,195 --> 00:17:12,064
if the federal government
believes

522
00:17:12,064 --> 00:17:14,600
that states have passed
onerous AI laws.

523
00:17:14,600 --> 00:17:16,268
Regulation
hurting innovation.

524
00:17:16,268 --> 00:17:17,336
What's your stance on that?

525
00:17:18,303 --> 00:17:19,972
-Well,
what I would love to see

526
00:17:19,972 --> 00:17:23,876
is AI companies
compete in an innovative way

527
00:17:24,676 --> 00:17:28,747
to develop products
that support well-being

528
00:17:28,747 --> 00:17:30,049
rather than

529
00:17:30,049 --> 00:17:33,585
prolong engagement
and take advantage of

530
00:17:34,620 --> 00:17:35,721
our attachment system.

531
00:17:35,721 --> 00:17:37,456
-But where's the financial
incentive in that?

532
00:17:37,456 --> 00:17:37,823
-Well,

533
00:17:37,823 --> 00:17:40,526
if they are compelled
to by regulation,

534
00:17:40,526 --> 00:17:41,527
then I think they could get

535
00:17:41,527 --> 00:17:43,362
really creative
and innovative

536
00:17:43,362 --> 00:17:45,764
in the way
they design their products

537
00:17:45,764 --> 00:17:48,600
so that they're putting
well-being first.

538
00:17:48,600 --> 00:17:51,537
And I don't see how putting
well-being

539
00:17:51,537 --> 00:17:54,540
first
is at odds with innovation.

540
00:17:54,973 --> 00:17:57,709
But I just think that
the incentives right now

541
00:17:57,709 --> 00:18:01,180
are more to capture
market share and

542
00:18:01,980 --> 00:18:04,116
to put engagement
and capturing

543
00:18:04,116 --> 00:18:06,151
as many users as possible.

544
00:18:06,151 --> 00:18:08,921
That is first
and it really blocks out

545
00:18:08,921 --> 00:18:12,458
some well-intentioned actors
who maybe can't get in

546
00:18:13,058 --> 00:18:14,493
and establish a foothold.

547
00:18:14,493 --> 00:18:16,261
And so I feel like
regulation

548
00:18:16,261 --> 00:18:18,464
could really help
even the playing field

549
00:18:18,464 --> 00:18:21,467
so that companies
are compelled to innovate

550
00:18:22,067 --> 00:18:24,870
on the basis of well-being
rather than engagement.

551
00:18:24,870 --> 00:18:26,238
-Amy, let me ask you
this question.

552
00:18:26,238 --> 00:18:28,240
I know you've heard
some of these points before.

553
00:18:28,240 --> 00:18:30,509
When does regulation
become too much?

554
00:18:30,509 --> 00:18:32,611
When does it
stifle innovation?

555
00:18:32,611 --> 00:18:34,913
-We're a pretty heavily
regulated state.

556
00:18:34,913 --> 00:18:36,782
So, yes, it will
drive out our innovation.

557
00:18:36,782 --> 00:18:38,884
And that's
a WTIA position.

558
00:18:38,884 --> 00:18:41,587
There were 20 bills
that affected tech.

559
00:18:41,587 --> 00:18:43,989
Two passed this last
legislative session.

560
00:18:43,989 --> 00:18:45,357
So I think everybody's...

561
00:18:45,357 --> 00:18:47,726
I think tech is fine if
we're brought to the table.

562
00:18:47,726 --> 00:18:49,228
Again, we have subject
matter experts

563
00:18:49,228 --> 00:18:50,362
that have worked in this

564
00:18:50,362 --> 00:18:51,830
that want to see

565
00:18:51,830 --> 00:18:54,666
a national framework,
not a 50-state patchwork.

566
00:18:55,634 --> 00:18:57,302
A national
framework, from the

567
00:18:57,302 --> 00:18:58,971
last conference
I attended in D.C.,

568
00:18:58,971 --> 00:19:00,272
would involve

569
00:19:00,272 --> 00:19:02,908
national security,
child safety and protection

570
00:19:02,908 --> 00:19:03,942
for innovators.

571
00:19:03,942 --> 00:19:06,512
So as long as we are
protecting people that are

572
00:19:06,512 --> 00:19:09,715
putting themselves out there
and starting that

573
00:19:09,748 --> 00:19:14,052
we have to protect people
that are willing to

574
00:19:14,820 --> 00:19:16,021
start that,

575
00:19:16,021 --> 00:19:18,991
to start their American
dream, to start

576
00:19:18,991 --> 00:19:22,060
their passion,
I think we can do that.

577
00:19:22,060 --> 00:19:23,795
And with a lot of them,
what's hard is

578
00:19:23,795 --> 00:19:27,666
a lot of them are former
big, big tech employees.

579
00:19:27,666 --> 00:19:29,168
And we have that hub here.

580
00:19:29,168 --> 00:19:30,536
And what's funny is
when I'm there

581
00:19:30,536 --> 00:19:31,803
and when I was in D.C.

582
00:19:31,803 --> 00:19:33,839
and I was talking to people
from Arizona,

583
00:19:33,839 --> 00:19:35,541
Louisiana and Pennsylvania,
they're just like,

584
00:19:35,541 --> 00:19:37,009
wow, like Washington,
that's the model.

585
00:19:37,009 --> 00:19:39,178
And when you're here,
it doesn't feel that way.

586
00:19:39,178 --> 00:19:40,846
It's more of a burden then,
yeah,

587
00:19:40,846 --> 00:19:42,948
than this massive
something to clamor for.

588
00:19:42,948 --> 00:19:44,249
-But just in
looking at that,

589
00:19:44,249 --> 00:19:46,552
you still hold the concern
that these new laws

590
00:19:46,552 --> 00:19:48,520
that have been passed
will push out

591
00:19:48,520 --> 00:19:49,288
some tech companies?

592
00:19:49,288 --> 00:19:50,289
-I think what happened this

593
00:19:50,289 --> 00:19:52,357
last legislative session
because we were

594
00:19:52,357 --> 00:19:54,059
we were brought in
on some of these,

595
00:19:54,059 --> 00:19:55,327
but not all of them.

596
00:19:55,327 --> 00:19:55,928
And some of them

597
00:19:55,928 --> 00:19:57,162
it was like
maybe they just didn't pass

598
00:19:57,162 --> 00:19:58,497
because there wasn't enough
time.

599
00:19:58,497 --> 00:20:00,666
There's only a 60-day session.
-Right.

600
00:20:00,666 --> 00:20:01,700
-I'm not sure. So.

601
00:20:01,700 --> 00:20:03,402
-And one more follow-up
here.

602
00:20:03,402 --> 00:20:06,738
Do people
trust big tech, right?

603
00:20:06,738 --> 00:20:07,673
(overlapping conversation)

604
00:20:07,673 --> 00:20:09,408
-It’s more of the overarching
narrative. Yeah.

605
00:20:09,408 --> 00:20:09,841
It's funny,

606
00:20:09,841 --> 00:20:11,476
we have people
that would come in

607
00:20:11,476 --> 00:20:14,379
and on Zoom
and talk about how terrible

608
00:20:14,379 --> 00:20:16,081
we were and all of this,
and it was like,

609
00:20:16,081 --> 00:20:18,717
Wait a year,
you just disconnected it.

610
00:20:18,717 --> 00:20:20,552
Like you're here to tell me
how terrible I am.

611
00:20:20,552 --> 00:20:21,620
And it was like,
and look where we are.

612
00:20:21,620 --> 00:20:22,454
We're in the studio

613
00:20:22,454 --> 00:20:23,989
and we're doing this
amazing thing.

614
00:20:23,989 --> 00:20:26,225
And so it's like, we can't...
What a double-edged sword.

615
00:20:26,225 --> 00:20:27,092
So I understand

616
00:20:27,092 --> 00:20:29,194
the concerns
and the worries with it.

617
00:20:29,194 --> 00:20:30,963
It's almost the same
as Big Pharma.

618
00:20:30,963 --> 00:20:31,630
Like what?

619
00:20:31,630 --> 00:20:33,599
You know, you hate it
until you have cancer.

620
00:20:33,599 --> 00:20:34,466
So I don't know.

621
00:20:34,466 --> 00:20:35,300
I don't know what
to tell you.

622
00:20:35,300 --> 00:20:36,768
So I think we just need

623
00:20:36,768 --> 00:20:38,704
to continue working
through those things.

624
00:20:38,704 --> 00:20:41,807
And I, I completely
understand those worries.

625
00:20:41,807 --> 00:20:43,842
But what a
a fortune that we have here.

626
00:20:43,842 --> 00:20:45,177
-Okay, Senator

627
00:20:45,177 --> 00:20:45,477
Wellman,

628
00:20:45,477 --> 00:20:47,079
I know you've heard some of
these critiques, too,

629
00:20:47,079 --> 00:20:49,381
about how AI regulation
could impact

630
00:20:49,381 --> 00:20:51,049
a global industry
in our state.

631
00:20:51,049 --> 00:20:53,051
What is your message
to industry leaders

632
00:20:53,051 --> 00:20:53,852
here in Washington

633
00:20:53,852 --> 00:20:56,722
who are concerned about
maybe what the legislature

634
00:20:56,722 --> 00:20:57,789
is doing
to their businesses?

635
00:20:59,091 --> 00:21:01,560
-Well, first of
all, I have not heard that

636
00:21:01,560 --> 00:21:04,396
from any big tech companies,
and I have been lobbied

637
00:21:04,396 --> 00:21:05,564
by them all.

638
00:21:05,564 --> 00:21:07,699
I have not heard
that

639
00:21:07,699 --> 00:21:10,002
their business has been hurt
by what we're doing.

640
00:21:10,002 --> 00:21:14,106
And being able
to praise them

641
00:21:14,106 --> 00:21:15,907
and bring them forward
and say,

642
00:21:15,907 --> 00:21:18,143
look how this company
is responding

643
00:21:18,143 --> 00:21:20,245
to the concerns
that we've brought forth

644
00:21:20,245 --> 00:21:22,447
is the best advertising
you can get.

645
00:21:22,447 --> 00:21:24,549
So I don't think that
that's really the problem.

646
00:21:24,549 --> 00:21:26,585
I do think
that on a national level,

647
00:21:26,585 --> 00:21:28,620
we have some issues
as far as

648
00:21:28,620 --> 00:21:31,456
big tech is concerned
and that is we're now

649
00:21:31,456 --> 00:21:34,459
seeing some of big tech
leave the country

650
00:21:34,726 --> 00:21:38,363
because of being here
on visas.

651
00:21:38,430 --> 00:21:38,930
You're right,

652
00:21:38,930 --> 00:21:40,832
with immigration,
we have other pieces

653
00:21:40,832 --> 00:21:43,235
that are concerning,
much more concerning to me

654
00:21:43,235 --> 00:21:44,002
in our state,

655
00:21:44,002 --> 00:21:47,005
we really have lost
half of the bioscience

656
00:21:47,773 --> 00:21:49,308
industry of our state.

657
00:21:49,308 --> 00:21:51,343
Not that it's
not continuing,

658
00:21:51,343 --> 00:21:52,644
but where it's being done

659
00:21:52,644 --> 00:21:55,080
is going to be done
in another country, unfortunately.

660
00:21:55,080 --> 00:21:57,115
-A whole other
show there. Thank you.

661
00:21:57,115 --> 00:21:58,984
-We'll follow up.
-I can't wait. Okay.

662
00:21:58,984 --> 00:22:00,686
We do need to start
wrapping up now.

663
00:22:00,686 --> 00:22:01,086
And I,

664
00:22:01,086 --> 00:22:02,287
I think it's important
to point out

665
00:22:02,287 --> 00:22:05,290
that this conversation, this
has come up a few times.

666
00:22:05,290 --> 00:22:06,425
AI, when it comes
to regulation,

667
00:22:06,425 --> 00:22:07,359
it's constantly changing.

668
00:22:07,359 --> 00:22:09,594
It feels like a moving
target here in many ways.

669
00:22:09,594 --> 00:22:11,997
City of Seattle,
for example, is right now

670
00:22:11,997 --> 00:22:12,597
trying to figure out

671
00:22:12,597 --> 00:22:14,132
the best way
to roll out the use of

672
00:22:14,132 --> 00:22:16,468
Microsoft
Copilot among city workers,

673
00:22:16,468 --> 00:22:17,669
one of many
local governments

674
00:22:17,669 --> 00:22:20,372
grappling with the risks
and benefits of AI use.

675
00:22:20,372 --> 00:22:22,574
Senator Wellman,
maybe I can start with you.

676
00:22:22,574 --> 00:22:24,776
What advice do you have for
local leaders,

677
00:22:24,776 --> 00:22:26,712
maybe some state
leaders, too, trying

678
00:22:26,712 --> 00:22:29,081
to create some guidelines
about how to use AI

679
00:22:29,081 --> 00:22:29,915
properly?

680
00:22:29,915 --> 00:22:30,582
-Well, I think

681
00:22:30,582 --> 00:22:33,719
there are different issues
surrounding every use.

682
00:22:33,985 --> 00:22:34,986
I'm very excited.

683
00:22:34,986 --> 00:22:37,089
We know
we have a housing problem,

684
00:22:37,089 --> 00:22:38,690
a significant
housing problem.

685
00:22:38,690 --> 00:22:41,827
If we can speed up
the permitting process,

686
00:22:41,993 --> 00:22:43,395
which sometimes can go on

687
00:22:43,395 --> 00:22:45,030
for months
and months and months,

688
00:22:45,030 --> 00:22:46,832
where we can get more
housing out there.

689
00:22:46,832 --> 00:22:48,300
That's a very good thing.

690
00:22:48,300 --> 00:22:50,702
So we need to be
working in a partnership,

691
00:22:50,702 --> 00:22:52,170
but we have to know
that we're working

692
00:22:52,170 --> 00:22:53,405
with a trusted partner

693
00:22:53,405 --> 00:22:55,140
who is responsible
and listening

694
00:22:55,140 --> 00:22:58,443
to the concerns that we have.
-Yeah, trying to use AI

695
00:22:58,477 --> 00:23:00,178
for good, it sounds like. 
-Absolutely.

696
00:23:00,178 --> 00:23:02,114
There's so many things
I can think of where

697
00:23:02,114 --> 00:23:04,082
AI can make a really positive
difference.

698
00:23:04,082 --> 00:23:06,184
-Okay. Looking forward
to hearing more about that.

699
00:23:06,184 --> 00:23:07,352
Amy, same question to you.

700
00:23:07,352 --> 00:23:08,820
How should local governments

701
00:23:08,820 --> 00:23:10,889
be approaching this issue
of regulating AI?

702
00:23:10,889 --> 00:23:12,124
What should they be
keeping in mind? 

703
00:23:12,124 --> 00:23:15,660
-100%. We don't hear it because
it's not attention-grabbing.

704
00:23:15,660 --> 00:23:16,895
The Department of Natural
Resources

705
00:23:16,895 --> 00:23:18,964
is already using it
for detecting wildfires.

706
00:23:18,964 --> 00:23:21,199
I'm sure
you're aware of that. Why?

707
00:23:21,199 --> 00:23:22,801
Why is that
not getting more?

708
00:23:22,801 --> 00:23:24,269
I mean, this is a way
that we can help

709
00:23:24,269 --> 00:23:25,370
people save lives,

710
00:23:25,370 --> 00:23:27,072
get people out of
their homes quicker. -Yeah.

711
00:23:27,072 --> 00:23:28,106
-I mean, there's
there's already

712
00:23:28,106 --> 00:23:29,341
so many ways
that we're using it.

713
00:23:29,341 --> 00:23:30,675
So not sexy,

714
00:23:30,675 --> 00:23:32,778
not the headline,
but we have got it.

715
00:23:32,778 --> 00:23:34,713
We have got to be
integrating this quicker.

716
00:23:34,713 --> 00:23:36,381
-I was just
at a demonstration

717
00:23:36,381 --> 00:23:38,550
and seeing that
they can identify

718
00:23:38,550 --> 00:23:40,218
that this is a kind of fire

719
00:23:40,218 --> 00:23:41,553
you don't have
to worry about,

720
00:23:41,553 --> 00:23:43,455
and identifying
that this is a fire.

721
00:23:43,455 --> 00:23:44,890
Get the guys there
immediately.

722
00:23:44,890 --> 00:23:47,259
-I see saving lives.
I mean, some of this stuff.

723
00:23:47,259 --> 00:23:49,494
So, but like, that's not...
I get that's not clickbait.

724
00:23:49,494 --> 00:23:51,496
That's not...
I understand that.

725
00:23:51,496 --> 00:23:54,332
But that to me
is also on tech, like,

726
00:23:54,332 --> 00:23:55,734
why aren’t we telling this
story better?

727
00:23:55,734 --> 00:23:57,068
Yeah. So let's, let's,
let's do that.

728
00:23:57,068 --> 00:23:57,936
-You think there might be

729
00:23:57,936 --> 00:24:01,039
some other avenues too,
in terms of using AI

730
00:24:01,106 --> 00:24:03,809
for good? Okay. Okay.
All right. All right. I know

731
00:24:04,876 --> 00:24:05,710
an interesting

732
00:24:05,710 --> 00:24:07,646
piece ahead from you
to look at the story.

733
00:24:07,646 --> 00:24:08,513
I can't wait.

734
00:24:08,513 --> 00:24:11,049
I can't wait. I can't wait.

735
00:24:11,049 --> 00:24:13,752
Dr. Davis, I'll jump
to you to finish up here.

736
00:24:13,752 --> 00:24:15,720
You study
a lot of kids and families

737
00:24:15,720 --> 00:24:17,355
dealing
with the impact of AI.

738
00:24:17,355 --> 00:24:19,658
What sort of things should,
I guess the average person,

739
00:24:19,658 --> 00:24:21,359
the average family,
be considering

740
00:24:21,359 --> 00:24:23,962
moving forward when it comes
to this debate over AI?

741
00:24:23,962 --> 00:24:25,964
There's a lot to consider.
What do you have to say?

742
00:24:25,964 --> 00:24:28,533
-Yes. Well, that's right.

743
00:24:28,533 --> 00:24:32,838
So with all of my research,
we find over and over again

744
00:24:32,838 --> 00:24:34,406
that when it comes to

745
00:24:34,406 --> 00:24:38,243
figuring out and navigating
how to introduce kids

746
00:24:38,243 --> 00:24:41,246
to different technologies,
social media

747
00:24:41,546 --> 00:24:44,583
and the Internet,
cell phones, and now AI,

748
00:24:44,883 --> 00:24:48,720
a lot of those decisions
fall to individual families

749
00:24:48,720 --> 00:24:51,723
and individual
youth and parents.

750
00:24:52,257 --> 00:24:55,460
And so I see the same thing
happening here with AI.

751
00:24:55,494 --> 00:24:58,897
And I guess I would just
want families to know

752
00:24:59,164 --> 00:25:02,434
that absolutely learn
as much as you can about AI.

753
00:25:02,467 --> 00:25:05,303
Have conversations
with your kids

754
00:25:05,303 --> 00:25:09,541
as much as possible,
but also look to lawmakers

755
00:25:09,541 --> 00:25:12,544
and look to broader systems
for guidance.

756
00:25:12,711 --> 00:25:14,846
I really strongly
believe that

757
00:25:14,846 --> 00:25:16,081
all of this responsibility

758
00:25:16,081 --> 00:25:19,050
should not be foisted
on individual families

759
00:25:19,317 --> 00:25:22,554
because these are very
complicated situations

760
00:25:22,554 --> 00:25:24,155
and challenges
that we're dealing with.

761
00:25:25,323 --> 00:25:26,691
As researchers, we need

762
00:25:26,691 --> 00:25:29,728
to step up and figure out
what are the effects

763
00:25:29,728 --> 00:25:32,130
of young people's engagement
with AI.

764
00:25:32,130 --> 00:25:34,199
We need to communicate
that well.

765
00:25:34,199 --> 00:25:37,202
Lawmakers need to pass
robust legislation.

766
00:25:37,903 --> 00:25:41,006
And yeah, and then to really
support our families.

767
00:25:41,039 --> 00:25:42,040
-Just quickly on this,

768
00:25:42,040 --> 00:25:42,440
is there a

769
00:25:42,440 --> 00:25:44,175
resource that you like
to direct families

770
00:25:44,175 --> 00:25:46,478
to when they have questions about this? 
-Yes.

771
00:25:46,478 --> 00:25:48,013
So we at the Center

772
00:25:48,013 --> 00:25:50,515
for Digital Youth,
we have our website,

773
00:25:50,515 --> 00:25:53,518
but I also think there's
some great other resources.

774
00:25:54,085 --> 00:25:55,587
The American Academy of

775
00:25:55,587 --> 00:25:57,589
Pediatrics
has a wonderful website

776
00:25:57,589 --> 00:26:01,593
with a ton of resources
on every sort of technology

777
00:26:01,593 --> 00:26:02,794
that you can think of.

778
00:26:02,794 --> 00:26:05,897
Common Sense Media
has fantastic resources

779
00:26:05,897 --> 00:26:07,732
to help

780
00:26:07,732 --> 00:26:08,667
parents to navigate

781
00:26:08,667 --> 00:26:12,537
which applications
and devices are okay

782
00:26:12,537 --> 00:26:14,239
and safe for my kids
and which are not.

783
00:26:14,239 --> 00:26:15,206
-Okay, Thank you for that.

784
00:26:15,206 --> 00:26:17,275
And just a note
to folks at home.

785
00:26:17,275 --> 00:26:18,610
If you or someone you know

786
00:26:18,610 --> 00:26:20,011
is having a serious problem
with AI,

787
00:26:20,011 --> 00:26:22,047
or might be facing
a mental health crisis--

788
00:26:22,047 --> 00:26:23,582
local support is available.

789
00:26:23,582 --> 00:26:26,351
Make sure you call or text
988. We'll be right back.

790
00:26:27,485 --> 00:26:28,520
What are people saying

791
00:26:28,520 --> 00:26:31,523
online about regulating
artificial intelligence?

792
00:26:31,623 --> 00:26:32,624
One person writes:

793
00:26:43,602 --> 00:26:45,136
Another comments:

794
00:26:56,715 --> 00:26:58,516
We'd like to know
what you think.

795
00:26:58,516 --> 00:27:01,853
Send us an email at 
contact@seattlechannel.org,

796
00:27:02,053 --> 00:27:04,923
or find us on social media.

797
00:27:04,923 --> 00:27:06,024
Great to get that feedback.

798
00:27:06,024 --> 00:27:07,025
And a big thanks to Dr.

799
00:27:07,025 --> 00:27:08,493
Katie Davis, Amy Harris,

800
00:27:08,493 --> 00:27:10,261
and also State Senator
Lisa Wellman.

801
00:27:10,261 --> 00:27:12,497
Thanks a lot for being here.
-Thank you. -Thank you.

802
00:27:12,497 --> 00:27:14,432
And we'll see you next
time on City Inside/Out.

803
00:27:15,900 --> 00:27:27,712
♪♪
