ABOUT THE SPEAKER
Yasmin Green - Geopolitical technologist
Yasmin Green is the director of research and development for Jigsaw, a unit within Alphabet Inc. focused on solving global security challenges through technology.

Why you should listen

Yasmin Green is the director of research and development for Jigsaw, a unit within Alphabet Inc. (formerly Google Ideas), focused on using tech tools to make the world safer, both on and offline. She has experience leading projects in some of the world’s toughest environments, including Iran, Syria, the UAE and Nigeria. In 2012, she led a multi-partner coalition to launch Against Violent Extremism, the world's first online network of former violent extremists and survivors of terrorism. Based on her own interviews with ISIS defectors and jailed recruits, in 2016 Yasmin launched the Redirect Method, a new deployment of targeted advertising and video to confront online radicalization.

Green is a senior advisor on innovation to Oxford Analytica, a member of the Aspen Cyber Strategy Group, and until 2015 co-chaired the European Commission's Working Group on Online Radicalization. She was named one of Fortune's "40 Under 40" most influential young leaders in 2017, and in 2016 she was named one of Fast Company's "Most Creative People in Business."

TED2018

Yasmin Green: How technology can fight extremism and online harassment

2,460,759 views

Can technology make people safer from threats like violent extremism, censorship and persecution? In this illuminating talk, technologist Yasmin Green details programs pioneered at Jigsaw (a unit within Alphabet Inc., the collection of companies that also includes Google) to counter radicalization and online harassment -- including a project that could give commenters real-time feedback about how their words might land, which has already increased spaces for dialogue. "If we ever thought that we could build an internet insulated from the dark side of humanity, we were wrong," Green says. "We have to throw our entire selves into building solutions that are as human as the problems they aim to solve."


00:13  My relationship with the internet reminds me of the setup
00:17  to a clichéd horror movie.
00:19  You know, the blissfully happy family moves in to their perfect new home,
00:24  excited about their perfect future,
00:26  and it's sunny outside and the birds are chirping ...
00:30  And then it gets dark.
00:32  And there are noises from the attic.
00:35  And we realize that that perfect new house isn't so perfect.
00:40  When I started working at Google in 2006,
00:43  Facebook was just a two-year-old,
00:45  and Twitter hadn't yet been born.
00:47  And I was in absolute awe of the internet and all of its promise
00:52  to make us closer
00:53  and smarter
00:55  and more free.
00:57  But as we were doing the inspiring work of building search engines
01:01  and video-sharing sites and social networks,
01:04  criminals, dictators and terrorists were figuring out
01:09  how to use those same platforms against us.
01:13  And we didn't have the foresight to stop them.
01:16  Over the last few years, geopolitical forces have come online to wreak havoc.
01:21  And in response,
01:23  Google supported a few colleagues and me to set up a new group called Jigsaw,
01:27  with a mandate to make people safer from threats like violent extremism,
01:32  censorship, persecution --
01:35  threats that feel very personal to me because I was born in Iran,
01:39  and I left in the aftermath of a violent revolution.
01:43  But I've come to realize that even if we had all of the resources
01:47  of all of the technology companies in the world,
01:51  we'd still fail
01:53  if we overlooked one critical ingredient:
01:57  the human experiences of the victims and perpetrators of those threats.
02:04  There are many challenges I could talk to you about today.
02:07  I'm going to focus on just two.
02:09  The first is terrorism.
02:13  So in order to understand the radicalization process,
02:16  we met with dozens of former members of violent extremist groups.
02:21  One was a British schoolgirl,
02:25  who had been taken off of a plane at London Heathrow
02:28  as she was trying to make her way to Syria to join ISIS.
02:34  And she was 13 years old.
02:37  So I sat down with her and her father, and I said, "Why?"
02:42  And she said,
02:44  "I was looking at pictures of what life is like in Syria,
02:47  and I thought I was going to go and live in the Islamic Disney World."
02:52  That's what she saw in ISIS.
02:54  She thought she'd meet and marry a jihadi Brad Pitt
02:58  and go shopping in the mall all day and live happily ever after.
03:02  ISIS understands what drives people,
03:05  and they carefully craft a message for each audience.
03:11  Just look at how many languages
03:12  they translate their marketing material into.
03:15  They make pamphlets, radio shows and videos
03:18  in not just English and Arabic,
03:20  but German, Russian, French, Turkish, Kurdish,
03:25  Hebrew,
03:26  Mandarin Chinese.
03:29  I've even seen an ISIS-produced video in sign language.
03:34  Just think about that for a second:
03:36  ISIS took the time and made the effort
03:38  to ensure their message is reaching the deaf and hard of hearing.
03:45  It's actually not tech-savviness
03:47  that is the reason why ISIS wins hearts and minds.
03:49  It's their insight into the prejudices, the vulnerabilities, the desires
03:54  of the people they're trying to reach
03:55  that does that.
03:57  That's why it's not enough
03:59  for the online platforms to focus on removing recruiting material.
04:04  If we want to have a shot at building meaningful technology
04:08  that's going to counter radicalization,
04:10  we have to start with the human journey at its core.
04:13  So we went to Iraq
04:16  to speak to young men who'd bought into ISIS's promise
04:18  of heroism and righteousness,
04:22  who'd taken up arms to fight for them
04:24  and then who'd defected
04:25  after they witnessed the brutality of ISIS's rule.
04:28  And I'm sitting there in this makeshift prison in the north of Iraq
04:32  with this 23-year-old who had actually trained as a suicide bomber
04:36  before defecting.
04:39  And he says,
04:41  "I arrived in Syria full of hope,
04:44  and immediately, I had two of my prized possessions confiscated:
04:48  my passport and my mobile phone."
04:52  The symbols of his physical and digital liberty
04:54  were taken away from him on arrival.
04:57  And then this is the way he described that moment of loss to me.
05:01  He said,
05:02  "You know in 'Tom and Jerry,'
05:06  when Jerry wants to escape, and then Tom locks the door
05:09  and swallows the key
05:10  and you see it bulging out of his throat as it travels down?"
05:14  And of course, I really could see the image that he was describing,
05:17  and I really did connect with the feeling that he was trying to convey,
05:21  which was one of doom,
05:23  when you know there's no way out.
05:26  And I was wondering:
05:28  What, if anything, could have changed his mind
05:31  the day that he left home?
05:32  So I asked,
05:33  "If you knew everything that you know now
05:37  about the suffering and the corruption, the brutality --
05:40  that day you left home,
05:41  would you still have gone?"
05:43  And he said, "Yes."
05:45  And I thought, "Holy crap, he said 'Yes.'"
05:48  And then he said,
05:49  "At that point, I was so brainwashed,
05:52  I wasn't taking in any contradictory information.
05:56  I couldn't have been swayed."
05:59  "Well, what if you knew everything that you know now
06:01  six months before the day that you left?"
06:05  "At that point, I think it probably would have changed my mind."
06:10  Radicalization isn't this yes-or-no choice.
06:14  It's a process, during which people have questions --
06:17  about ideology, religion, the living conditions.
06:20  And they're coming online for answers,
06:23  which is an opportunity to reach them.
06:25  And there are videos online from people who have answers --
06:28  defectors, for example, telling the story of their journey
06:31  into and out of violence;
06:33  stories like the one from that man I met in the Iraqi prison.
06:37  There are locals who've uploaded cell phone footage
06:40  of what life is really like in the caliphate under ISIS's rule.
06:44  There are clerics who are sharing peaceful interpretations of Islam.
06:48  But you know what?
06:50  These people don't generally have the marketing prowess of ISIS.
06:54  They risk their lives to speak up and confront terrorist propaganda,
06:58  and then they tragically don't reach the people
07:00  who most need to hear from them.
07:03  And we wanted to see if technology could change that.
07:06  So in 2016, we partnered with Moonshot CVE
07:10  to pilot a new approach to countering radicalization
07:13  called the "Redirect Method."
07:16  It uses the power of online advertising
07:19  to bridge the gap between those susceptible to ISIS's messaging
07:24  and those credible voices that are debunking that messaging.
07:28  And it works like this:
07:29  someone looking for extremist material --
07:31  say they search for "How do I join ISIS?" --
07:34  will see an ad appear
07:37  that invites them to watch a YouTube video of a cleric, of a defector --
07:42  someone who has an authentic answer.
07:44  And that targeting is based not on a profile of who they are,
07:48  but of determining something that's directly relevant
07:51  to their query or question.
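
To make the mechanics concrete, here is a minimal Python sketch of the idea just described: matching the wording of a risky search query, rather than any profile of the person searching, to a curated playlist of counter-narrative videos. The keyword lists, playlist names and function below are hypothetical illustrations; the actual Redirect Method runs on standard ad-platform keyword campaigns rather than on custom code like this.

```python
# Illustrative sketch only, not Jigsaw's or Moonshot CVE's implementation.
# The keywords and video names below are invented for the example.

# Curated counter-narrative videos, keyed by the theme a query suggests.
COUNTER_PLAYLISTS = {
    "recruitment": ["defector_testimony_1", "cleric_q_and_a"],
    "governance": ["life_under_isis_footage", "former_resident_interview"],
}

# Hypothetical keyword lists that would trigger an ad for each theme.
TRIGGER_KEYWORDS = {
    "recruitment": ["join isis", "travel to syria"],
    "governance": ["caliphate life", "living under isis"],
}

def match_counter_content(query: str) -> list[str]:
    """Pick videos relevant to the query itself, not to a profile of the searcher."""
    q = query.lower()
    for theme, keywords in TRIGGER_KEYWORDS.items():
        if any(keyword in q for keyword in keywords):
            return COUNTER_PLAYLISTS[theme]
    return []  # the query carries no risk signal, so no ad is shown

print(match_counter_content("How do I join ISIS?"))  # -> recruitment playlist
```

The design point the talk emphasizes is visible in the sketch: nothing about the searcher is stored or inferred; the only signal used is the query itself.
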
07:54  During our eight-week pilot in English and Arabic,
07:56  we reached over 300,000 people
08:00  who had expressed an interest in or sympathy towards a jihadi group.
08:06  These people were now watching videos
08:08  that could prevent them from making devastating choices.
08:13  And because violent extremism isn't confined to any one language,
08:17  religion or ideology,
08:18  the Redirect Method is now being deployed globally
08:22  to protect people being courted online by violent ideologues,
08:26  whether they're Islamists, white supremacists
08:28  or other violent extremists,
08:31  with the goal of giving them the chance to hear from someone
08:33  on the other side of that journey;
08:36  to give them the chance to choose a different path.
08:40  It turns out that often the bad guys are good at exploiting the internet,
08:46  not because they're some kind of technological geniuses,
08:50  but because they understand what makes people tick.
08:54  I want to give you a second example:
08:58  online harassment.
09:00  Online harassers also work to figure out what will resonate
09:04  with another human being.
09:05  But not to recruit them like ISIS does,
09:08  but to cause them pain.
09:11  Imagine this:
09:13  you're a woman,
09:15  you're married,
09:16  you have a kid.
09:18  You post something on social media,
09:20  and in a reply, you're told that you'll be raped,
09:24  that your son will be watching,
09:26  details of when and where.
09:29  In fact, your home address is put online for everyone to see.
09:33  That feels like a pretty real threat.
09:37  Do you think you'd go home?
09:39  Do you think you'd continue doing the thing that you were doing?
09:43  Would you continue doing that thing that's irritating your attacker?
09:48  Online abuse has been this perverse art
09:51  of figuring out what makes people angry,
09:54  what makes people afraid,
09:56  what makes people insecure,
09:58  and then pushing those pressure points until they're silenced.
10:02  When online harassment goes unchecked,
10:04  free speech is stifled.
10:07  And even the people hosting the conversation
10:09  throw up their arms and call it quits,
10:11  closing their comment sections and their forums altogether.
10:14  That means we're actually losing spaces online
10:17  to meet and exchange ideas.
10:19  And where online spaces remain,
10:22  we descend into echo chambers with people who think just like us.
10:27  But that enables the spread of disinformation;
10:30  that facilitates polarization.
10:34  What if technology instead could enable empathy at scale?
10:40  This was the question that motivated our partnership
10:42  with Google's Counter Abuse team,
10:44  Wikipedia
10:46  and newspapers like the New York Times.
10:47  We wanted to see if we could build machine-learning models
10:50  that could understand the emotional impact of language.
10:55  Could we predict which comments were likely to make someone else leave
10:58  the online conversation?
11:00  And that's no mean feat.
11:04  That's no trivial accomplishment
11:06  for AI to be able to do something like that.
11:08  I mean, just consider these two examples of messages
11:12  that could have been sent to me last week.
11:15  "Break a leg at TED!"
11:17  ... and
11:18  "I'll break your legs at TED."
11:20  (Laughter)
11:22  You are human,
11:23  that's why that's an obvious difference to you,
11:25  even though the words are pretty much the same.
11:28  But for AI, it takes some training to teach the models
11:31  to recognize that difference.
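
As a rough illustration of the kind of supervised training this involves, here is a minimal Python sketch that fits a simple word-and-phrase classifier on a tiny, invented set of labeled messages. Perspective's real models are trained on very large sets of human-annotated comments and are not reproduced here; the point of the sketch is only that the difference between "Break a leg" and "break your legs" has to be learned from labeled examples.

```python
# Minimal sketch of supervised training (not Perspective's actual model or data).
# The four labeled examples below are invented purely for illustration; a real
# system learns from very large sets of human-annotated comments.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "Break a leg at TED!",              # friendly idiom
    "Good luck with your talk today",   # friendly
    "I'll break your legs at TED.",     # threat
    "You should be scared to show up",  # threat
]
train_labels = [0, 0, 1, 1]  # 1 = likely to drive the recipient away

# Word and phrase (bigram) features let the model separate the two surface forms.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

for text in ["Break a leg at TED!", "I'll break your legs at TED."]:
    probability = model.predict_proba([text])[0][1]
    print(f"{text!r} -> estimated probability of being hostile: {probability:.2f}")
```
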
11:32  The beauty of building AI that can tell the difference
11:36  is that AI can then scale to the size of the online toxicity phenomenon,
11:41  and that was our goal in building our technology called Perspective.
11:45  With the help of Perspective,
11:46  the New York Times, for example,
11:48  has increased spaces online for conversation.
11:51  Before our collaboration,
11:52  they only had comments enabled on just 10 percent of their articles.
11:57  With the help of machine learning,
11:59  they have that number up to 30 percent.
12:01  So they've tripled it,
12:02  and we're still just getting started.
12:04  But this is about way more than just making moderators more efficient.
12:10  Right now I can see you,
12:11  and I can gauge how what I'm saying is landing with you.
12:16  You don't have that opportunity online.
12:18  Imagine if machine learning could give commenters,
12:22  as they're typing,
12:23  real-time feedback about how their words might land,
12:27  just like facial expressions do in a face-to-face conversation.
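
For developers, Perspective is available as a public API, which makes the kind of real-time feedback described above straightforward to prototype. The Python sketch below assumes the Comment Analyzer endpoint and request format from Google's public documentation at the time of writing; the API key is a placeholder, and the threshold and feedback wording are arbitrary choices for illustration, not a product feature.

```python
# Sketch of real-time feedback while typing, using Google's public Perspective
# API (Comment Analyzer). Endpoint and request shape follow the public docs at
# the time of writing; the key, threshold and feedback wording are placeholders.
from typing import Optional

import requests

API_KEY = "YOUR_PERSPECTIVE_API_KEY"  # placeholder, obtain a real key from Google
ANALYZE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze?key=" + API_KEY
)

def toxicity_score(text: str) -> float:
    """Ask Perspective how likely the text is to be perceived as toxic (0 to 1)."""
    body = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(ANALYZE_URL, json=body, timeout=10)
    response.raise_for_status()
    return response.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

def feedback_while_typing(draft: str, threshold: float = 0.8) -> Optional[str]:
    """Return a gentle nudge if the draft seems likely to land badly, else None."""
    if toxicity_score(draft) >= threshold:
        return "This might come across as hostile. Want to rephrase before posting?"
    return None

for draft in ["Break a leg at TED!", "I'll break your legs at TED."]:
    print(draft, "->", feedback_while_typing(draft))
```
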
12:32  Machine learning isn't perfect,
12:34  and it still makes plenty of mistakes.
12:37  But if we can build technology
12:38  that understands the emotional impact of language,
12:42  we can build empathy.
12:43  That means that we can have dialogue between people
12:46  with different politics,
12:47  different worldviews,
12:49  different values.
12:51  And we can reinvigorate the spaces online that most of us have given up on.
12:57  When people use technology to exploit and harm others,
13:01  they're preying on our human fears and vulnerabilities.
13:06  If we ever thought that we could build an internet
13:09  insulated from the dark side of humanity,
13:12  we were wrong.
13:14  If we want today to build technology
13:16  that can overcome the challenges that we face,
13:19  we have to throw our entire selves into understanding the issues
13:23  and into building solutions
13:25  that are as human as the problems they aim to solve.
13:30  Let's make that happen.
13:31  Thank you.
13:33  (Applause)
