ABOUT THE SPEAKER
Erin Marie Saltman - Policy researcher
Dr. Erin Marie Saltman manages Facebook's counterterrorism and counter-extremism policy work for Europe, the Middle East and Africa.

Why you should listen

Dr. Erin Marie Saltman's background and expertise include both Far Right and Islamist extremist processes of radicalization within a range of regional and socio-political contexts. Her research and publications have focused on the evolving nature of online extremism and terrorism, gender dynamics within violent extremist organizations, and youth radicalization. Saltman has previously held senior research positions at Quilliam Foundation and the Institute for Strategic Dialogue, where she managed international programs. She has also worked with local activists, artists and techies to challenge violent extremism.

As Facebook's Counterterrorism Policy Manager based in London, Saltman regularly speaks with both governments and NGOs on issues related to how Facebook counters terrorism and violent extremism. She has also helped establish the Global Internet Forum to Counter Terrorism, bringing together leading industry partners (Facebook, Google, Microsoft and Twitter) with smaller startups and tech companies to create cross-platform knowledge sharing, technology solutions and research. 

Saltman remains a Research Fellow at the Institute for Strategic Dialogue. She is a graduate of Columbia University (BA) and University College London (MA and PhD).

More profile about the speaker
Erin Marie Saltman | Speaker | TED.com
TEDxGhent

Erin Marie Saltman: How young people join violent extremist groups -- and how to stop them


1,214,427 views

Terrorists and extremists are not inherently violent sociopaths -- they are deliberately recruited and radicalized through a process. Erin Marie Saltman discusses the push and pull factors that lead people to join extremist groups, and explains innovative ways of preventing and countering radicalization.


00:12
So in 2011, I altered my name so that I could participate in a Far Right youth camp in Hungary. I was doing a PhD looking at youth political socialization -- why young people were developing political ideologies in a post-communist setting, and I saw that a lot of young people I was talking to were joining the Far Right, and this was astounding to me. So I wanted to enroll in this youth camp to get a better understanding of why people were joining. So a colleague enrolled me, and my last name sounds a little bit too Jewish. So Erin got turned into Iréna, and Saltman got turned into Sós, which means "salty" in Hungarian. And in Hungarian, your last name goes first, so my James Bond name turned into "Salty Irena," which is not something I would have naturally chosen for myself.
01:06
But going to this camp, I was further shocked to realize that it was actually really fun. They talked very little about politics. It was mostly learning how to ride horses, shooting a bow and arrow, live music at night, free food and alcohol, also some air-gun target practice using mainstream politicians' faces as targets. And this seemed like a very, actually, friendly, inclusive group until you started talking or mentioning anything to do with the Roma population, Jewish people or immigrants, and then the discourse would become very hate-based very quickly.
01:46
So it led me into my work now, where we pose the question, "Why do people join violent extremist movements, and how do we effectively counter these processes?" In the aftermath of horrible atrocities and attacks in places like Belgium, France, but all over the world, sometimes it's easier for us to think, "Well, these must be sociopaths, these must be naturally violent individuals. They must have something wrong with their upbringing." And what's really tragic is that oftentimes there's no one profile. Many people come from educated backgrounds, different socioeconomic backgrounds, men and women, different ages, some with families, some single. So why? What is this allure? And this is what I want to talk you through, as well as how do we challenge this in a modern era?
02:38
We do know, through research, that there are quite a number of different things that affect somebody's process of radicalization, and we categorize these into push and pull factors. And these are pretty much similar for Far Right, neo-Nazi groups all the way to Islamist extremist and terrorist groups. And push factors are basically what makes you vulnerable to a process of radicalization, to joining a violent extremist group. And these can be a lot of different things, but roughly, a sense of alienation, a sense of isolation, questioning your own identity, but also feeling that your in-group is under attack, and your in-group might be based on a nationality or an ethnicity or a religion, and feeling that larger powers around you are doing nothing to help.
03:24
Now, push factors alone do not make you a violent extremist, because if that were the fact, those same factors would go towards a group like the Roma population, and they're not a violently mobilized group. So we have to look at the pull factors. What are these violent extremist organizations offering that other groups are not offering? And actually, this is usually very positive things, very seemingly empowering things, such as brotherhood and sisterhood and a sense of belonging, as well as giving somebody a spiritual purpose, a divine purpose to build a utopian society if their goals can be met, but also a sense of empowerment and adventure. When we look at foreign terrorist fighters, we see young men with the wind in their hair out in the desert and women going to join them to have nuptials out in the sunset. It's very romantic, and you become a hero. For both men and women, that's the propaganda being given.
04:19
So what extremist groups are very good at is taking a very complicated, confusing, nuanced world and simplifying that world into black and white, good and evil. And you become what is good, challenging what is evil.
04:36
So I want to talk a little bit about ISIS, Daesh, because they have been a game changer in how we look at these processes, and through a lot of the material and their tactics. They're very much a modern movement. One of the aspects is the internet and the usage of social media, as we've all seen in headlines tweeting and videos of beheadings. But the internet alone does not radicalize you. The internet is a tool. You don't go online shopping for shoes and accidentally become a jihadist. However, what the internet does do is it is a catalyst. It provides tools and scale and rapidity that doesn't exist elsewhere.
05:16
And with ISIS, all of a sudden, this idea of a cloaked, dark figure of a jihadist changed for us. All of a sudden, we were in their kitchens. We saw what they were eating for dinner. They were tweeting. We had foreign terrorist fighters tweeting in their own languages. We had women going out there talking about their wedding day, about the births of their children. We had gaming culture, all of a sudden, and references to Grand Theft Auto being made. So all of a sudden, they were homey. They became human.
05:47
And the problem is that trying to counter it, lots of governments and social media companies just tried to censor. How do we get rid of terrorist content? And it became a cat-and-mouse game where we would see accounts taken down and they'd just come back up, and an arrogance around somebody having a 25th account and material that was disseminated everywhere. But we also saw a dangerous trend -- violent extremists know the rules and regulations of social media, too. So we would see a banal conversation with a recruiter start on a mainstream platform, and at the point at which that conversation was going to become illegal, they would jump to a smaller, less regulated, more encrypted platform. So all of a sudden, we couldn't track where that conversation went. So this is a problem with censorship, which is why we need to develop alternatives to censorship.
06:36
ISIS is also a game-changer because it's state-building. It's not just recruiting combatants; it's trying to build a state. And what that means is all of a sudden, your recruitment model is much more broad. You're not just trying to get fighters -- now you need architects, engineers, accountants, hackers and women. We've actually seen a huge increase of women going in the last 24, but especially 12 months. Some countries, one in four of the people going over to join are now women. And so, this really changes who we're trying to counter this process with.
07:08
Now, not all doom and gloom. So the rest I'd like to talk about some of the positive things and the new innovation in trying to prevent and counter violent extremism. Preventing is very different than countering, and actually, you can think of it in medical terms. So preventative medicine is, how do we make it so you are naturally resilient to this process of radicalization, whereas that is going to be different if somebody is already showing a symptom or a sign of belonging to a violent extremist ideology. And so in preventative measures, we're talking more about really broad groups of people and exposure to ideas to make them resilient. Whereas it's very different if somebody is starting to question and agree with certain things online, and it's also very different if somebody already has a swastika tattoo and is very much embedded within a group. How do you reach them?
07:58
So I'd like to go through three examples of each one of those levels and talk you through what some of the new ways of engaging with people are becoming. One is "Extreme Dialogue," and it's an educational program that we helped develop. This one is from Canada, and it's meant to create dialogues within a classroom setting, using storytelling, because violent extremism can be very hard to try to explain, especially to younger individuals. So we have a network of former extremists and survivors of extremism that tell their stories through video and create question-giving to classrooms, to start a conversation about the topic. These two examples show Christianne, who lost her son, who radicalized and died fighting for ISIS, and Daniel is a former neo-Nazi who was an extremely violent neo-Nazi, and they pose questions about their lives and where they're at and regret, and force a classroom to have a dialogue around it.
08:53
Now, looking at that middle range of individuals, actually, we need a lot of civil society voices. How do you interact with people that are looking for information online, that are starting to toy with an ideology, that are doing those searching identity questions? How do we provide alternatives for that? And that's when we combine large groups of civil society voices with creatives, techies, app developers, artists, comedians, and we can create really specified content and actually, online, disseminate it to very strategic audiences. So one example would be creating a satirical video which makes fun of Islamophobia, and targeting it to 15- to 20-year-olds online that have an interest in white power music and live specifically in Manchester. We can use these marketing tools to be very specific, so that we know when somebody's viewing, watching and engaging with that content, it's not just the average person, it's not me or you -- it's a very specific audience that we are looking to engage with.
09:52
Even more downstream, we developed a pilot program called "One to One," where we took former extremists and we had them reach out directly to a group of labeled neofascists as well as Islamist extremists, and put direct messages through Facebook Messenger into their inbox, saying, "Hey, I see where you're going. I've been there. If you want to talk, I'm here." Now, we kind of expected death threats from this sort of interaction. It's a little alarming to have a former neo-Nazi say, "Hey, how are you?" But actually, we found that around 60 percent of the people reached out to responded, and of that, around another 60 percent had sustained engagement, meaning that they were having conversations with the hardest people to reach about what they were going through, planting seeds of doubt and giving them alternatives for talking about these subjects, and that's really important.
10:41
So what we're trying to do is actually bring unlikely sectors to the table. We have amazing activists all over the world, but oftentimes, their messages are not strategic or they don't actually reach the audiences they want to reach. So we work with networks of former extremists. We work with networks of young people in different parts of the world. And we work with them to bring the tech sector to the table with artists and creatives and marketing expertise so that we can actually have a more robust and challenging of extremism that works together.
11:12
So I would say that if you are in the audience and you happen to be a graphic designer, a poet, a marketing expert, somebody that works in PR, a comedian -- you might not think that this is your sector, but actually, the skills that you have right now might be exactly what is needed to help challenge extremism effectively.

11:32
Thank you.

11:33
(Applause)
Translated by Lilian Chiu
Reviewed by nr chan



Data provided by TED.
