ABOUT THE SPEAKER
Danielle Citron - Law professor, deepfake scholar
Danielle Citron writes, speaks and teaches her academic loves: privacy, free speech and civil rights. Through her work with privacy organizations, she also puts these ideas into practice.

Why you should listen

As a law professor, Danielle Citron puts her commitment to civil rights into practice. She is vice president of the Cyber Civil Rights Initiative, a nonprofit combatting privacy-invading online abuse that undermines civil rights and civil liberties.

When Citron began addressing cyber harassment ten years ago, it was commonly believed that it was "no big deal," and that any legal response would "break the internet." Those attitudes -- and the heartbreaking stories of victims who were terrorized, silenced and economically damaged -- drove Citron to write her 2014 book, Hate Crimes in Cyberspace. Ever since, she has been working with lawmakers, law enforcers, and tech companies to make online spaces and tools available to all on equal terms. Her latest book project focuses on the importance of sexual privacy and how we should protect it. 

TEDSummit 2019

Danielle Citron: How deepfakes undermine truth and threaten democracy

1,533,743 views

The use of deepfake technology to manipulate video and audio for malicious purposes -- whether it's to stoke violence or defame politicians and journalists -- is becoming a real threat. As these tools become more accessible and their products more realistic, how will they shape what we believe about the world? In a portentous talk, law professor Danielle Citron reveals how deepfakes magnify our distrust -- and suggests approaches to safeguarding the truth.

00:12
[This talk contains mature content]
00:17
Rana Ayyub is a journalist in India whose work has exposed government corruption and human rights violations. And over the years, she's gotten used to vitriol and controversy around her work. But none of it could have prepared her for what she faced in April 2018. She was sitting in a café with a friend when she first saw it: a two-minute, 20-second video of her engaged in a sex act. And she couldn't believe her eyes. She had never made a sex video. But unfortunately, thousands upon thousands of people would believe it was her.
00:58
I interviewed Ms. Ayyub about three months ago, in connection with my book on sexual privacy. I'm a law professor, lawyer and civil rights advocate. So it's incredibly frustrating knowing that right now, law could do very little to help her. And as we talked, she explained that she should have seen the fake sex video coming. She said, "After all, sex is so often used to demean and to shame women, especially minority women, and especially minority women who dare to challenge powerful men," as she had in her work.
01:37
The fake sex video went viral in 48 hours. All of her online accounts were flooded with screenshots of the video, with graphic rape and death threats and with slurs about her Muslim faith. Online posts suggested that she was "available" for sex. And she was doxed, which means that her home address and her cell phone number were spread across the internet. The video was shared more than 40,000 times.
02:09
Now, when someone is targeted with this kind of cybermob attack, the harm is profound. Rana Ayyub's life was turned upside down. For weeks, she could hardly eat or speak. She stopped writing and closed all of her social media accounts, which is, you know, a tough thing to do when you're a journalist. And she was afraid to go outside her family's home. What if the posters made good on their threats? The UN Council on Human Rights confirmed that she wasn't being crazy. It issued a public statement saying that they were worried about her safety.
02:48
What Rana Ayyub faced was a deepfake: machine-learning technology that manipulates or fabricates audio and video recordings to show people doing and saying things that they never did or said. Deepfakes appear authentic and realistic, but they're not; they're total falsehoods. Although the technology is still developing in its sophistication, it is widely available.
03:17
Now, the most recent attention to deepfakes arose, as so many things do online, with pornography. In early 2018, someone posted a tool on Reddit to allow users to insert faces into porn videos. And what followed was a cascade of fake porn videos featuring people's favorite female celebrities. And today, you can go on YouTube and pull up countless tutorials with step-by-step instructions on how to make a deepfake on your desktop application. And soon we may even be able to make them on our cell phones.
03:55
Now, it's the interaction of some of our most basic human frailties and network tools that can turn deepfakes into weapons. So let me explain. As human beings, we have a visceral reaction to audio and video. We believe they're true, on the notion that of course you can believe what your eyes and ears are telling you. And it's that mechanism that might undermine our shared sense of reality. Although we believe deepfakes to be true, they're not.
04:27
And we're attracted to the salacious, the provocative. We tend to believe and to share information that's negative and novel. And researchers have found that online hoaxes spread 10 times faster than accurate stories. Now, we're also drawn to information that aligns with our viewpoints. Psychologists call that tendency "confirmation bias." And social media platforms supercharge that tendency, by allowing us to instantly and widely share information that accords with our viewpoints.
05:08
Now, deepfakes have the potential to cause grave individual and societal harm. So, imagine a deepfake that shows American soldiers in Afghanistan burning a Koran. You can imagine that that deepfake would provoke violence against those soldiers. And what if the very next day there's another deepfake that drops, that shows a well-known imam based in London praising the attack on those soldiers? We might see violence and civil unrest, not only in Afghanistan and the United Kingdom, but across the globe. And you might say to me, "Come on, Danielle, that's far-fetched." But it's not. We've seen falsehoods spread on WhatsApp and other online message services lead to violence against ethnic minorities. And that was just text -- imagine if it were video.
06:06
Now, deepfakes have the potential to corrode the trust that we have in democratic institutions. So, imagine the night before an election. There's a deepfake showing one of the major party candidates gravely sick. The deepfake could tip the election and shake our sense that elections are legitimate. Imagine if the night before an initial public offering of a major global bank, there was a deepfake showing the bank's CEO drunkenly spouting conspiracy theories. The deepfake could tank the IPO, and worse, shake our sense that financial markets are stable. So deepfakes can exploit and magnify the deep distrust that we already have in politicians, business leaders and other influential leaders. They find an audience primed to believe them.
07:07
And the pursuit of truth is on the line as well. Technologists expect that with advances in AI, soon it may be difficult if not impossible to tell the difference between a real video and a fake one. So how can the truth emerge in a deepfake-ridden marketplace of ideas? Will we just proceed along the path of least resistance and believe what we want to believe, truth be damned? And not only might we believe the fakery, we might start disbelieving the truth. We've already seen people invoke the phenomenon of deepfakes to cast doubt on real evidence of their wrongdoing. We've heard politicians say of audio of their disturbing comments, "Come on, that's fake news. You can't believe what your eyes and ears are telling you." And it's that risk that professor Robert Chesney and I call the "liar's dividend": the risk that liars will invoke deepfakes to escape accountability for their wrongdoing.
08:18
So we've got our work cut out for us, there's no doubt about it. And we're going to need a proactive solution from tech companies, from lawmakers, law enforcers and the media. And we're going to need a healthy dose of societal resilience. Right now, we're engaged in a very public conversation about the responsibility of tech companies. And my advice to social media platforms has been to change their terms of service and community guidelines to ban deepfakes that cause harm. That determination, that's going to require human judgment, and it's expensive. But we need human beings to look at the content and context of a deepfake to figure out if it is a harmful impersonation or instead, if it's valuable satire, art or education.
09:16
So now, what about the law? Law is our educator. It teaches us about what's harmful and what's wrong. And it shapes behavior: it deters by punishing perpetrators and secures remedies for victims. Right now, law is not up to the challenge of deepfakes. Across the globe, we lack well-tailored laws designed to tackle digital impersonations that invade sexual privacy, that damage reputations and that cause emotional distress. What happened to Rana Ayyub is increasingly commonplace. Yet, when she went to law enforcement in Delhi, she was told nothing could be done. And the sad truth is that the same would be true in the United States and in Europe.
10:07
So we have a legal vacuum that needs to be filled. My colleague Dr. Mary Anne Franks and I are working with US lawmakers to devise legislation that would ban harmful digital impersonations that are tantamount to identity theft. And we've seen similar moves in Iceland, the UK and Australia. But of course, that's just a small piece of the regulatory puzzle.
10:34
Now, I know law is not a cure-all. Right? It's a blunt instrument. And we've got to use it wisely. It also has some practical impediments. You can't leverage law against people you can't identify and find. And if a perpetrator lives outside the country where a victim lives, then you may not be able to insist that the perpetrator come into local courts to face justice. And so we're going to need a coordinated international response.
11:07
Education has to be part of our response as well. Law enforcers are not going to enforce laws they don't know about and proffer problems they don't understand. In my research on cyberstalking, I found that law enforcement lacked the training to understand the laws available to them and the problem of online abuse. And so often they told victims, "Just turn your computer off. Ignore it. It'll go away." And we saw that in Rana Ayyub's case. She was told, "Come on, you're making such a big deal about this. It's boys being boys." And so we need to pair new legislation with efforts at training.
11:54
And education has to be aimed at the media as well. Journalists need educating about the phenomenon of deepfakes so they don't amplify and spread them. And this is the part where we're all involved. Each and every one of us needs educating. We click, we share, we like, and we don't even think about it. We need to do better. We need far better radar for fakery.
12:25
So as we're working through these solutions, there's going to be a lot of suffering to go around. Rana Ayyub is still wrestling with the fallout. She still doesn't feel free to express herself on- and offline. And as she told me, she still feels like there are thousands of eyes on her naked body, even though, intellectually, she knows it wasn't her body. And she has frequent panic attacks, especially when someone she doesn't know tries to take her picture. "What if they're going to make another deepfake?" she thinks to herself.
13:03
And so for the sake of individuals like Rana Ayyub and the sake of our democracy, we need to do something right now. Thank you.
13:12
(Applause)
