ABOUT THE SPEAKER
Erin Marie Saltman - Policy researcher
Dr. Erin Marie Saltman manages Facebook's counterterrorism and counter-extremism policy work for Europe, the Middle East and Africa.

Why you should listen

Dr. Erin Marie Saltman's background and expertise includes both Far Right and Islamist extremist processes of radicalization within a range of regional and socio-political contexts. Her research and publications have focused on the evolving nature of online extremism and terrorism, gender dynamics within violent extremist organizations, and youth radicalization. Saltman has previously held senior research positions at the Quilliam Foundation and the Institute for Strategic Dialogue, where she managed international programs. She has also worked with local activists, artists and techies to challenge violent extremism.

As Facebook's Counterterrorism Policy Manager based in London, Saltman regularly speaks with both governments and NGOs on issues related to how Facebook counters terrorism and violent extremism. She has also helped establish the Global Internet Forum to Counter Terrorism, bringing together leading industry partners (Facebook, Google, Microsoft and Twitter) with smaller startups and tech companies to create cross-platform knowledge sharing, technology solutions and research. 

Saltman remains a Research Fellow at the Institute for Strategic Dialogue. She is a graduate of Columbia University (BA) and University College London (MA and PhD).

TEDxGhent

Erin Marie Saltman: How young people join violent extremist groups -- and how to stop them

1,214,427 views

Terrorists and extremists aren't all naturally violent sociopaths -- they're deliberately recruited and radicalized in a process that doesn't fit into a neat pattern. Erin Marie Saltman discusses the push and pull factors that cause people to join extremist groups and explains innovative ways of preventing and countering radicalization.


00:12
So in 2011, I altered my name so that I could participate in a Far Right youth camp in Hungary. I was doing a PhD looking at youth political socialization -- why young people were developing political ideologies in a post-communist setting -- and I saw that a lot of the young people I was talking to were joining the Far Right, and this was astounding to me. So I wanted to enroll in this youth camp to get a better understanding of why people were joining.

00:42
So a colleague enrolled me, and my last name sounds a little bit too Jewish. So Erin got turned into Iréna, and Saltman got turned into Sós, which means "salty" in Hungarian. And in Hungarian, your last name goes first, so my James Bond name turned into "Salty Irena," which is not something I would have naturally chosen for myself.

01:06
But going to this camp, I was further shocked to realize that it was actually really fun. They talked very little about politics. It was mostly learning how to ride horses, shooting a bow and arrow, live music at night, free food and alcohol, also some air-gun target practice using mainstream politicians' faces as targets.

01:30
And this seemed like a very friendly, inclusive group, actually, until you started talking or mentioning anything to do with the Roma population, Jewish people or immigrants, and then the discourse would become very hate-based very quickly.

01:46
So it led me into my work now, where we pose the question, "Why do people join violent extremist movements, and how do we effectively counter these processes?"

01:58
In the aftermath of horrible atrocities and attacks in places like Belgium and France, but also all over the world, sometimes it's easier for us to think, "Well, these must be sociopaths, these must be naturally violent individuals. They must have something wrong with their upbringing."

02:14
And what's really tragic is that oftentimes there's no one profile. Many people come from educated backgrounds, different socioeconomic backgrounds, men and women, different ages, some with families, some single. So why? What is this allure? And this is what I want to talk you through, as well as how we challenge this in a modern era.

02:38
We do know, through research, that there are quite a number of different things that affect somebody's process of radicalization, and we categorize these into push and pull factors. And these are pretty much similar for Far Right, neo-Nazi groups all the way to Islamist extremist and terrorist groups.

02:55
And push factors are basically what makes you vulnerable to a process of radicalization, to joining a violent extremist group. And these can be a lot of different things, but roughly, a sense of alienation, a sense of isolation, questioning your own identity, but also feeling that your in-group is under attack -- and your in-group might be based on a nationality or an ethnicity or a religion -- and feeling that larger powers around you are doing nothing to help.

03:24
Now, push factors alone do not make you a violent extremist, because if that were the case, those same factors would apply to a group like the Roma population, and they're not a violently mobilized group. So we have to look at the pull factors.

03:37
What are these violent extremist organizations offering that other groups are not offering? And actually, these are usually very positive things, very seemingly empowering things, such as brotherhood and sisterhood and a sense of belonging, as well as giving somebody a spiritual purpose, a divine purpose to build a utopian society if their goals can be met, but also a sense of empowerment and adventure.

04:02
When we look at foreign terrorist fighters, we see young men with the wind in their hair out in the desert and women going to join them to have nuptials out in the sunset. It's very romantic, and you become a hero. For both men and women, that's the propaganda being given.

04:19
So what extremist groups are very good at is taking a very complicated, confusing, nuanced world and simplifying that world into black and white, good and evil. And you become what is good, challenging what is evil.

04:36
So I want to talk a little bit about ISIS, Daesh, because they have been a game changer in how we look at these processes, through a lot of their material and their tactics.

04:48
They're very much a modern movement. One of the aspects is the internet and the usage of social media, as we've all seen in headlines, with tweeting and videos of beheadings. But the internet alone does not radicalize you. The internet is a tool. You don't go online shopping for shoes and accidentally become a jihadist. However, what the internet does do is act as a catalyst. It provides tools and scale and rapidity that don't exist elsewhere.

05:16
And with ISIS, all of a sudden, this idea of a cloaked, dark figure of a jihadist changed for us. All of a sudden, we were in their kitchens. We saw what they were eating for dinner. They were tweeting. We had foreign terrorist fighters tweeting in their own languages. We had women going out there talking about their wedding day, about the births of their children. We had gaming culture, all of a sudden, and references to Grand Theft Auto being made. So all of a sudden, they were homey. They became human.

05:47
And the problem is that, in trying to counter it, lots of governments and social media companies just tried to censor. How do we get rid of terrorist content? And it became a cat-and-mouse game where we would see accounts taken down and they'd just come back up, and an arrogance around somebody having a 25th account, and material that was disseminated everywhere.

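One concrete tactic that grew out of this cat-and-mouse game, and which the Global Internet Forum to Counter Terrorism described in the speaker bio later formalized, is cross-platform hash sharing: the digest of an uploaded file is checked against a shared list of digests of already-identified terrorist content, so a takedown on one platform helps catch the re-upload on another. Below is a minimal sketch of that idea, assuming a pre-populated shared hash set; the names are illustrative, not any platform's real API, and deployed systems use perceptual hashes that survive re-encoding rather than exact digests.

```python
import hashlib

# Hypothetical shared industry hash list: digests of media already
# confirmed as terrorist content by a partner platform.
KNOWN_EXTREMIST_HASHES: set[str] = set()

def content_digest(data: bytes) -> str:
    """Exact SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_flag_upload(data: bytes) -> bool:
    """Flag a re-upload whose digest matches the shared list.

    An exact digest only catches byte-identical re-uploads; a single
    re-encode defeats it, which is why real deployments rely on
    perceptual hashing rather than SHA-256.
    """
    return content_digest(data) in KNOWN_EXTREMIST_HASHES
```
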
06:06
But we also saw a dangerous trend -- violent extremists know the rules and regulations of social media, too. So we would see a banal conversation with a recruiter start on a mainstream platform, and at the point at which that conversation was going to become illegal, they would jump to a smaller, less regulated, more encrypted platform. So all of a sudden, we couldn't track where that conversation went. So this is a problem with censorship, which is why we need to develop alternatives to censorship.

06:36
ISIS is also a game changer because it's state-building. It's not just recruiting combatants; it's trying to build a state. And what that means is all of a sudden, your recruitment model is much broader. You're not just trying to get fighters -- now you need architects, engineers, accountants, hackers and women. We've actually seen a huge increase of women going in the last 24 months, but especially the last 12. In some countries, one in four of the people going over to join are now women. And so, this really changes who we're trying to counter this process with.

07:08
Now, it's not all doom and gloom. For the rest, I'd like to talk about some of the positive things and the new innovation in trying to prevent and counter violent extremism.

07:17
Preventing is very different than countering, and actually, you can think of it in medical terms. So preventative medicine is: How do we make it so you are naturally resilient to this process of radicalization? Whereas that is going to be different if somebody is already showing a symptom or a sign of belonging to a violent extremist ideology.

07:37
And so in preventative measures, we're talking more about really broad groups of people and exposure to ideas to make them resilient. Whereas it's very different if somebody is starting to question and agree with certain things online, and it's also very different if somebody already has a swastika tattoo and is very much embedded within a group. How do you reach them?

07:58
So I'd like to go through three examples, one for each of those levels, and talk you through what some of the new ways of engaging with people are becoming.

08:07
One is "Extreme Dialogue," an educational program that we helped develop. This one is from Canada, and it's meant to create dialogues within a classroom setting, using storytelling, because violent extremism can be very hard to try to explain, especially to younger individuals. So we have a network of former extremists and survivors of extremism who tell their stories through video and pose questions to classrooms, to start a conversation about the topic.

08:35
These two examples show Christianne, who lost her son, who was radicalized and died fighting for ISIS, and Daniel, a former and extremely violent neo-Nazi, and they pose questions about their lives and where they're at and their regrets, and force a classroom to have a dialogue around it.

08:53
Now, looking at that middle range of individuals, actually, we need a lot of civil society voices. How do you interact with people who are looking for information online, who are starting to toy with an ideology, who are asking those searching identity questions? How do we provide alternatives for that?

09:09
And that's when we combine large groups of civil society voices with creatives, techies, app developers, artists, comedians, and we can create really specific content and actually, online, disseminate it to very strategic audiences. So one example would be creating a satirical video which makes fun of Islamophobia, and targeting it to 15- to 20-year-olds online who have an interest in white power music and live specifically in Manchester. We can use these marketing tools to be very specific, so that we know when somebody's viewing, watching and engaging with that content, it's not just the average person, it's not me or you -- it's a very specific audience that we are looking to engage with.

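As a rough illustration of what "using these marketing tools to be very specific" can mean in practice, here is a minimal sketch of an audience spec for such a counter-speech campaign. The field names and the matches helper are hypothetical, not any ad platform's real API; real platforms express ages, locations and interests through their own targeting taxonomies.

```python
from dataclasses import dataclass, field

@dataclass
class AudienceSpec:
    """Hypothetical targeting spec for a counter-speech campaign."""
    min_age: int
    max_age: int
    city: str
    interests: set[str] = field(default_factory=set)

# The Manchester example from the talk, expressed as a spec.
campaign_audience = AudienceSpec(
    min_age=15,
    max_age=20,
    city="Manchester",
    interests={"white power music"},
)

def matches(spec: AudienceSpec, age: int, city: str, interests: set[str]) -> bool:
    """True if a (hypothetical) user profile falls inside the spec."""
    return (
        spec.min_age <= age <= spec.max_age
        and city == spec.city
        and bool(spec.interests & interests)
    )
```

The point of the sketch is only that targeting is an intersection of demographic, geographic and interest constraints, which is what lets content reach the narrow audience described here rather than the average viewer.
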
09:52
Even more downstream, we developed a pilot program called "One to One," where we took former extremists and had them reach out directly to a group of labeled neofascists as well as Islamist extremists, putting direct messages through Facebook Messenger into their inbox, saying, "Hey, I see where you're going. I've been there. If you want to talk, I'm here."

10:12
Now, we kind of expected death threats from this sort of interaction. It's a little alarming to have a former neo-Nazi say, "Hey, how are you?" But actually, we found that around 60 percent of the people reached out to responded, and of those, around another 60 percent had sustained engagement, meaning that they were having conversations with the hardest people to reach about what they were going through, planting seeds of doubt and giving them alternatives for talking about these subjects, and that's really important.

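Note that the two 60 percent figures compound: sustained engagement happened with roughly 0.6 × 0.6 = 0.36, or about one in three, of everyone contacted. A minimal sketch of that funnel arithmetic, using an assumed cohort size of 100 since the talk does not give one:

```python
# Illustrative outreach funnel for the "One to One" pilot.
# The cohort size is an assumption; only the two ~60 percent
# rates come from the talk.
contacted = 100
response_rate = 0.60    # share of contacted people who replied
sustained_rate = 0.60   # share of responders with sustained engagement

responded = contacted * response_rate   # 60.0
sustained = responded * sustained_rate  # 36.0

print(f"Sustained conversations: {sustained:.0f} of {contacted} contacted "
      f"({response_rate * sustained_rate:.0%} overall)")
# -> Sustained conversations: 36 of 100 contacted (36% overall)
```
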
10:41
So what we're trying to do is actually bring unlikely sectors to the table. We have amazing activists all over the world, but oftentimes, their messages are not strategic or they don't actually reach the audiences they want to reach. So we work with networks of former extremists. We work with networks of young people in different parts of the world. And we work with them to bring the tech sector to the table with artists and creatives and marketing expertise, so that we can mount a more robust challenge to extremism, one that works together.

11:12
So I would say that if you are in the audience and you happen to be a graphic designer, a poet, a marketing expert, somebody that works in PR, a comedian -- you might not think that this is your sector, but actually, the skills that you have right now might be exactly what is needed to help challenge extremism effectively.

11:32
Thank you.

11:33
(Applause)
