ABOUT THE SPEAKER
Yasmin Green - Geopolitical technologist
Yasmin Green is the director of research and development for Jigsaw, a unit within Alphabet Inc. focused on solving global security challenges through technology.

Why you should listen

Yasmin Green is the director of research and development for Jigsaw, a unit within Alphabet Inc. (formerly Google Ideas), focused on using tech tools to make the world safer, both on and offline. She has experience leading projects in some of the world’s toughest environments, including Iran, Syria, the UAE and Nigeria. In 2012, she led a multi-partner coalition to launch Against Violent Extremism, the world's first online network of former violent extremists and survivors of terrorism. Based on her own interviews with ISIS defectors and jailed recruits, last year Yasmin launched the Redirect Method, a new deployment of targeted advertising and video to confront online radicalization.

Green is a senior advisor on innovation to Oxford Analytica, a member of the Aspen Cyber Strategy Group, and until 2015 co-chaired the European Commission's Working Group on Online Radicalization. She was named one of Fortune's "40 Under 40" most influential young leaders in 2017, and in 2016 she was named one of Fast Company's "Most Creative People in Business."

More profile about the speaker
Yasmin Green | Speaker | TED.com
TED2018

Yasmin Green: How technology can fight extremism and online harassment


Filmed:
2,460,759 views

Can technology make people safer from threats like violent extremism, censorship and persecution? In this talk, technologist Yasmin Green details programs pioneered at Jigsaw (a unit within Alphabet Inc., the collection of companies that also includes Google) to counter extremism and online harassment, including work that gives commenters real-time feedback about how their words might land, opening up more space for conversation. "If we ever thought that we could build an internet insulated from the dark side of humanity, we were wrong," Green says. "We have to throw our entire selves into building solutions that are as human as the problems they aim to solve."

00:13
My relationship with the internet reminds me of the setup
00:17
to a clichéd horror movie.
00:19
You know, the blissfully happy family moves in to their perfect new home,
00:24
excited about their perfect future,
00:26
and it's sunny outside and the birds are chirping ...
00:30
And then it gets dark.
00:32
And there are noises from the attic.
00:35
And we realize that that perfect new house isn't so perfect.
00:40
When I started working at Google in 2006,
00:43
Facebook was just a two-year-old,
00:45
and Twitter hadn't yet been born.
00:47
And I was in absolute awe of the internet and all of its promise
00:52
to make us closer
00:53
and smarter
00:55
and more free.
00:57
But as we were doing the inspiring work of building search engines
01:01
and video-sharing sites and social networks,
01:04
criminals, dictators and terrorists were figuring out
01:09
how to use those same platforms against us.
01:13
And we didn't have the foresight to stop them.
01:16
Over the last few years, geopolitical forces have come online to wreak havoc.
01:21
And in response,
01:23
Google supported a few colleagues and me to set up a new group called Jigsaw,
01:27
with a mandate to make people safer from threats like violent extremism,
01:32
censorship, persecution --
01:35
threats that feel very personal to me because I was born in Iran,
01:39
and I left in the aftermath of a violent revolution.
01:43
But I've come to realize that even if we had all of the resources
01:47
of all of the technology companies in the world,
01:51
we'd still fail
01:53
if we overlooked one critical ingredient:
01:57
the human experiences of the victims and perpetrators of those threats.
02:04
There are many challenges I could talk to you about today.
02:07
I'm going to focus on just two.
02:09
The first is terrorism.
02:13
So in order to understand the radicalization process,
02:16
we met with dozens of former members of violent extremist groups.
02:21
One was a British schoolgirl,
02:25
who had been taken off of a plane at London Heathrow
02:28
as she was trying to make her way to Syria to join ISIS.
02:34
And she was 13 years old.
02:37
So I sat down with her and her father, and I said, "Why?"
02:42
And she said,
02:44
"I was looking at pictures of what life is like in Syria,
02:47
and I thought I was going to go and live in the Islamic Disney World."
02:52
That's what she saw in ISIS.
02:54
She thought she'd meet and marry a jihadi Brad Pitt
02:58
and go shopping in the mall all day and live happily ever after.
03:02
ISIS understands what drives people,
03:05
and they carefully craft a message for each audience.
03:11
Just look at how many languages
03:12
they translate their marketing material into.
03:15
They make pamphlets, radio shows and videos
03:18
in not just English and Arabic,
03:20
but German, Russian, French, Turkish, Kurdish,
03:25
Hebrew,
03:26
Mandarin Chinese.
03:29
I've even seen an ISIS-produced video in sign language.
03:34
Just think about that for a second:
03:36
ISIS took the time and made the effort
03:38
to ensure their message is reaching the deaf and hard of hearing.
03:45
It's actually not tech-savviness
03:47
that is the reason why ISIS wins hearts and minds.
03:49
It's their insight into the prejudices, the vulnerabilities, the desires
03:54
of the people they're trying to reach
03:55
that does that.
03:57
That's why it's not enough
03:59
for the online platforms to focus on removing recruiting material.
04:04
If we want to have a shot at building meaningful technology
04:08
that's going to counter radicalization,
04:10
we have to start with the human journey at its core.
04:13
So we went to Iraq
04:16
to speak to young men who'd bought into ISIS's promise
04:18
of heroism and righteousness,
04:22
who'd taken up arms to fight for them
04:24
and then who'd defected
04:25
after they witnessed the brutality of ISIS's rule.
04:28
And I'm sitting there in this makeshift prison in the north of Iraq
04:32
with this 23-year-old who had actually trained as a suicide bomber
04:36
before defecting.
04:39
And he says,
04:41
"I arrived in Syria full of hope,
04:44
and immediately, I had two of my prized possessions confiscated:
04:48
my passport and my mobile phone."
04:52
The symbols of his physical and digital liberty
04:54
were taken away from him on arrival.
04:57
And then this is the way he described that moment of loss to me.
05:01
He said,
05:02
"You know in 'Tom and Jerry,'
05:06
when Jerry wants to escape, and then Tom locks the door
05:09
and swallows the key
05:10
and you see it bulging out of his throat as it travels down?"
05:14
And of course, I really could see the image that he was describing,
05:17
and I really did connect with the feeling that he was trying to convey,
05:21
which was one of doom,
05:23
when you know there's no way out.
05:26
And I was wondering:
05:28
What, if anything, could have changed his mind
05:31
the day that he left home?
05:32
So I asked,
05:33
"If you knew everything that you know now
05:37
about the suffering and the corruption, the brutality --
05:40
that day you left home,
05:41
would you still have gone?"
05:43
And he said, "Yes."
05:45
And I thought, "Holy crap, he said 'Yes.'"
05:48
And then he said,
05:49
"At that point, I was so brainwashed,
05:52
I wasn't taking in any contradictory information.
05:56
I couldn't have been swayed."
05:59
"Well, what if you knew everything that you know now
06:01
six months before the day that you left?"
06:05
"At that point, I think it probably would have changed my mind."
06:10
Radicalization isn't this yes-or-no choice.
06:14
It's a process, during which people have questions --
06:17
about ideology, religion, the living conditions.
06:20
And they're coming online for answers,
06:23
which is an opportunity to reach them.
06:25
And there are videos online from people who have answers --
06:28
defectors, for example, telling the story of their journey
06:31
into and out of violence;
06:33
stories like the one from that man I met in the Iraqi prison.
06:37
There are locals who've uploaded cell phone footage
06:40
of what life is really like in the caliphate under ISIS's rule.
06:44
There are clerics who are sharing peaceful interpretations of Islam.
06:48
But you know what?
06:50
These people don't generally have the marketing prowess of ISIS.
06:54
They risk their lives to speak up and confront terrorist propaganda,
06:58
and then they tragically don't reach the people
07:00
who most need to hear from them.
07:03
And we wanted to see if technology could change that.
07:06
So in 2016, we partnered with Moonshot CVE
07:10
to pilot a new approach to countering radicalization
07:13
called the "Redirect Method."
07:16
It uses the power of online advertising
07:19
to bridge the gap between those susceptible to ISIS's messaging
07:24
and those credible voices that are debunking that messaging.
07:28
And it works like this:
07:29
someone looking for extremist material --
07:31
say they search for "How do I join ISIS?" --
07:34
will see an ad appear
07:37
that invites them to watch a YouTube video of a cleric, of a defector --
07:42
someone who has an authentic answer.
07:44
And that targeting is based not on a profile of who they are,
07:48
but of determining something that's directly relevant
07:51
to their query or question.
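The targeting described above is keyed to the query itself rather than to a profile of the searcher. As a purely illustrative sketch, assuming a hypothetical keyword table and placeholder video URLs (not Jigsaw's or Moonshot CVE's actual system), query-based redirection might look something like this:

```python
# A minimal, purely illustrative sketch of query-based targeting in the
# spirit of the Redirect Method. The keyword list, the video URLs and the
# function name are hypothetical -- this is not the actual implementation.
from typing import Optional

COUNTER_NARRATIVE_ADS = {
    # search keyword -> counter-narrative video to surface as an ad
    "join isis": "https://youtube.example/defector-testimony",
    "caliphate life": "https://youtube.example/local-phone-footage",
    "hijra to syria": "https://youtube.example/cleric-response",
}

def pick_redirect_ad(search_query: str) -> Optional[str]:
    """Choose an ad based only on the words of the query itself;
    no profile of who the searcher is gets consulted."""
    normalized = search_query.lower()
    for keyword, video_url in COUNTER_NARRATIVE_ADS.items():
        if keyword in normalized:
            return video_url
    return None  # benign query: no redirect ad is shown

print(pick_redirect_ad("How do I join ISIS?"))  # -> defector-testimony video
```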
07:54
During our eight-week pilot in English and Arabic,
07:56
we reached over 300,000 people
08:00
who had expressed an interest in or sympathy towards a jihadi group.
08:06
These people were now watching videos
08:08
that could prevent them from making devastating choices.
08:13
And because violent extremism isn't confined to any one language,
08:17
religion or ideology,
08:18
the Redirect Method is now being deployed globally
08:22
to protect people being courted online by violent ideologues,
08:26
whether they're Islamists, white supremacists
08:28
or other violent extremists,
08:31
with the goal of giving them the chance to hear from someone
08:33
on the other side of that journey;
08:36
to give them the chance to choose a different path.
08:40
It turns out that often the bad guys are good at exploiting the internet,
08:46
not because they're some kind of technological geniuses,
08:50
but because they understand what makes people tick.
08:54
I want to give you a second example:
08:58
online harassment.
09:00
Online harassers also work to figure out what will resonate
09:04
with another human being.
09:05
But not to recruit them like ISIS does,
09:08
but to cause them pain.
09:11
Imagine this:
09:13
you're a woman,
09:15
you're married,
09:16
you have a kid.
09:18
You post something on social media,
09:20
and in a reply, you're told that you'll be raped,
09:24
that your son will be watching,
09:26
details of when and where.
09:29
In fact, your home address is put online for everyone to see.
09:33
That feels like a pretty real threat.
09:37
Do you think you'd go home?
09:39
Do you think you'd continue doing the thing that you were doing?
09:43
Would you continue doing that thing that's irritating your attacker?
09:48
Online abuse has been this perverse art
09:51
of figuring out what makes people angry,
09:54
what makes people afraid,
09:56
what makes people insecure,
09:58
and then pushing those pressure points until they're silenced.
10:02
When online harassment goes unchecked,
10:04
free speech is stifled.
10:07
And even the people hosting the conversation
10:09
throw up their arms and call it quits,
10:11
closing their comment sections and their forums altogether.
10:14
That means we're actually losing spaces online
10:17
to meet and exchange ideas.
10:19
And where online spaces remain,
10:22
we descend into echo chambers with people who think just like us.
10:27
But that enables the spread of disinformation;
10:30
that facilitates polarization.
10:34
What if technology instead could enable empathy at scale?
10:40
This was the question that motivated our partnership
10:42
with Google's Counter Abuse team,
10:44
Wikipedia
10:46
and newspapers like the New York Times.
10:47
We wanted to see if we could build machine-learning models
10:50
that could understand the emotional impact of language.
10:55
Could we predict which comments were likely to make someone else leave
10:58
the online conversation?
11:00
And that's no mean feat.
11:04
That's no trivial accomplishment
11:06
for AI to be able to do something like that.
11:08
I mean, just consider these two examples of messages
11:12
that could have been sent to me last week.
11:15
"Break a leg at TED!"
11:17
... and
11:18
"I'll break your legs at TED."
11:20
(Laughter)
11:22
You are human,
11:23
that's why that's an obvious difference to you,
11:25
even though the words are pretty much the same.
11:28
But for AI, it takes some training to teach the models
11:31
to recognize that difference.
11:32
The beauty of building AI that can tell the difference
11:36
is that AI can then scale to the size of the online toxicity phenomenon,
11:41
and that was our goal in building our technology called Perspective.
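Perspective is offered by Jigsaw as a public API. The sketch below shows roughly how a client might ask it to score the two messages from the example above; the endpoint, request fields and TOXICITY attribute follow the public documentation as best understood here, and the API key is a placeholder, so treat this as an assumption-laden illustration rather than a definitive integration.

```python
# Rough sketch of scoring comments with Jigsaw's public Perspective API.
# Endpoint and field names follow the public docs as understood here;
# verify against current documentation and supply your own API key.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: issued through Google Cloud
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       f"comments:analyze?key={API_KEY}")

def toxicity_score(text: str) -> float:
    """Return the summary TOXICITY probability (0.0 to 1.0) for a comment."""
    payload = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(URL, json=payload, timeout=10)
    response.raise_for_status()
    result = response.json()
    return result["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# The two messages from the talk: nearly the same words, very different intent.
print(toxicity_score("Break a leg at TED!"))           # expected to score low
print(toxicity_score("I'll break your legs at TED."))  # expected to score high
```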
11:45
With the help of Perspective,
11:46
the New York Times, for example,
11:48
has increased spaces online for conversation.
11:51
Before our collaboration,
11:52
they only had comments enabled on just 10 percent of their articles.
11:57
With the help of machine learning,
11:59
they have that number up to 30 percent.
12:01
So they've tripled it,
12:02
and we're still just getting started.
12:04
But this is about way more than just making moderators more efficient.
12:10
Right now I can see you,
12:11
and I can gauge how what I'm saying is landing with you.
12:16
You don't have that opportunity online.
12:18
Imagine if machine learning could give commenters,
12:22
as they're typing,
12:23
real-time feedback about how their words might land,
12:27
just like facial expressions do in a face-to-face conversation.
12:32
Machine learning isn't perfect,
12:34
and it still makes plenty of mistakes.
12:37
But if we can build technology
12:38
that understands the emotional impact of language,
12:42
we can build empathy.
12:43
That means that we can have dialogue between people
12:46
with different politics,
12:47
different worldviews,
12:49
different values.
12:51
And we can reinvigorate the spaces online that most of us have given up on.
12:57
When people use technology to exploit and harm others,
13:01
they're preying on our human fears and vulnerabilities.
13:06
If we ever thought that we could build an internet
13:09
insulated from the dark side of humanity,
13:12
we were wrong.
13:14
If we want today to build technology
13:16
that can overcome the challenges that we face,
13:19
we have to throw our entire selves into understanding the issues
13:23
and into building solutions
13:25
that are as human as the problems they aim to solve.
13:30
Let's make that happen.
13:31
Thank you.
13:33
(Applause)
Translated by Lilian Chiu
Reviewed by Yanyan Hong
