TEDxSkoll
Yvette Alberdingk Thijm: The power of citizen video to create undeniable truths
Filmed:
1,134,924 views
Could smartphones and cameras be our most powerful weapons for social justice? Through her organization WITNESS, Yvette Alberdingk Thijm develops strategies and technologies that help activists use video to protect and defend their human rights. She shares stories of the growing power of distant witnesses and calls on all of us to use the tools at our disposal to capture injustice.
Yvette Alberdingk Thijm - Human rights activist
Yvette Alberdingk Thijm helps activists use video and technology to protect and defend human rights. Full bio
00:12
It's 1996 in Uvira in eastern Congo. This is Bukeni. Militia commanders walk into his village, knock on his neighbors' doors and whisk their children away to training camps. Bukeni borrows a video camera from a local wedding photographer, he disguises as a journalist and he walks into the camps to negotiate the release of the children. He filmed footage of the children being trained as soldiers.
00:40
[Soldiers don't worry!] [You'll wear uniforms!] [You'll have free cars!] [Free beans!]
00:47
Many of these children are under 15 years old, and that is a war crime.
00:53
[Free!]
00:55
But you don't have to go to eastern Congo to find human rights abuses. In America, a country with a rapidly aging population, experts estimate that one in 10 people over 60 will experience abuse. It's a hidden epidemic, and most of that abuse actually happens at the hands of close caretakers or family.
01:19
This is Vicky. Vicky put an iron gate on her bedroom door and she became a prisoner, in fact, in her own house, out of fear for her nephew who had taken over her home as a drug den.
01:33
And this is Mary. Mary picked up a video camera for the first time in her life when she was 65 years old, and she asked Vicky and 99 other older people who had experienced abuse to tell their stories on camera.
01:50
And I am Dutch, so in the Netherlands we are obsessed with the truth. Now, when you are a child, that's a great thing, because you can basically get away with anything, like "Yes, Mama, it was me who smoked the cigars."
02:03
(Laughter)
02:05
But I think this is why I have dedicated my life to promoting citizen video to expose human rights violations, because I believe in the power of video to create undeniable truths.
02:19
And my organization, WITNESS, helped use the Congolese videos to help convict and send a notorious warlord called Thomas Lubanga to jail.
02:31
And the videos that Mary shot, we trained Mary and many other elder justice advocates to make sure that the stories of elder abuse reached lawmakers, and those stories helped convince lawmakers to pass landmark legislation to protect older Americans.
02:50
So I wonder, billions of us now have this powerful tool right at our fingertips. It's a camera. So why are all of us not a more powerful army of civic witnesses, like Mary and Bukeni? Why is it that so much more video is not leading to more rights and more justice?
03:13
And I think it is because being an eyewitness is hard. Your story will get denied, your video will get lost in a sea of images, your story will not be trusted, and you will be targeted. So how do we help witnesses?
03:31
In Oaxaca, in Mexico, the teachers' movement organized a protest after the president pushed down very undemocratic reforms. The federal police came down in buses and started shooting at the protesters. At least seven people died and many, many more were wounded.
03:48
Images started circulating of the shootings, and the Mexican government did what it always does. It issued a formal statement, and the statement basically accused the independent media of creating fake news. It said, "We were not there, that was not us doing the shooting, this did not happen."
04:08
But we had just trained activists in Mexico to use metadata strategically with their images. Now, metadata is the kind of information that your camera captures that shows the date, the location, the temperature, the weather. It can even show the very unique way you hold your camera when you capture something. So the images started recirculating, and this time with the very verifying, validating information on top of them. And the federal government had to retract their statement.
04:41
Now, justice for the people of Oaxaca is still far off, but their stories, their truths, can no longer be denied.
04:51
So we started thinking: What if you had "Proof Mode"? What if everybody had a camera in their hands and all the platforms had that kind of validating ability? So we developed -- together with amazing Android developers called the Guardian Project -- we developed something called a technology that's called Proof Mode, that marries those metadata together with your image, and it validates and it verifies your video.
05:16
Now, imagine there is a deluge of images coming from the world's camera phones. Imagine if that information could be trusted just a little bit more, what the potential would be for journalists, for human rights investigators, for human rights lawyers.
05:35
So we started sharing Proof Mode with our partners in Brazil who are an amazing media collective called Coletivo Papo Reto.
05:44
Brazil is a tough place for human rights. The Brazilian police kills thousands of people every year. The only time that there's an investigation, guess when? When there's video.
06:00
Seventeen-year-old Eduardo was killed in broad daylight by the Rio police, and look what happens after they kill him. They put a gun in the dead boy's hand, they shoot the gun twice --
06:15
(Shot)
06:17
to fabricate their story of self-defense.
06:22
The woman who filmed this was a very, very courageous eyewitness, and she had to go into hiding after she posted her video for fear of her life.
06:31
But people are filming, and they're not going to stop filming, so we're now working together with media collectives so the residents on their WhatsApp frequently get guidance and tips: how to film safely, how to upload the video that you shoot safely, how to capture a scene so that it can actually count as evidence.
06:52
And here is an inspiration from a group called Mídia Ninja in Brazil.
06:58
The man on the left is a heavily armed military policeman. He walks up to a protester -- when you protest in Brazil, you can be arrested or worse -- and he says to the protester, "Watch me, I am going to search you right now."
07:15
And the protester is a live-streaming activist -- he wears a little camera -- and he says to the military policeman, he says, "I am watching you, and there are 5,000 people watching you with me."
07:28
Now, the tables are turned. The distant witnesses, the watching audience, they matter.
07:35
So we started thinking, what if you could tap into that power, the power of distant witnesses? What if you could pull in their expertise, their leverage, their solidarity, their skills when a frontline community needs them to be there?
07:50
And we started developing a project that's called Mobilize Us, because many of us, I would assume, want to help and lend our skills and our expertise, but we are often not there when a frontline community or a single individual faces an abuse.
08:10
And it could be as simple as this little app that we created that just shows the perpetrator on the other side of the phone how many people are watching him.
08:20
But now, imagine that you could put a layer of computer task routing on top of that. Imagine that you're a community facing an immigration raid, and at that very moment, at that right moment, via livestream, you could pull in a hundred legal observers. How would that change the situation?
08:41
So we started piloting this with our partner communities in Brazil. This is a woman called Camilla, and she was able -- she's the leader in a favela called Favela Skol -- she was able to pull in distant witnesses via livestream to help translation, to help distribution, to help amplify her story after her community was forcibly evicted to make room for a very glossy Olympic event last summer.
09:12
So we're talking about good witnessing, but what happens if the perpetrators are filming? What happens if a bystander films and doesn't do anything?
09:23
This is the story of Chrissy. Chrissy is a transgender woman who walked into a McDonald's in Maryland to use the women's bathroom. Two teens viciously beat her for using that women's bathroom, and the McDonald's employee filmed this on his mobile phone. And he posted his video, and it has garnered thousands of racist and transphobic comments.
09:52
So we started a project that's called Capturing Hate. We took a very, very small sample of eyewitness videos that showed abuse against transgender and gender-nonconforming people. We searched two words, "tranny fight" and "stud fight." And those 329 videos were watched and are still being watched as we sit here in this theater, a stunning almost 90 million times, and there are hundreds of thousands of comments with these videos, egging on to more violence and more hate.
10:30
So we started developing a methodology that took all that unquantified visual evidence and turned it into data, turning video into data, and with that tool, LGBT organizations are now using that data to fight for rights.
10:49
And we take that data and we take it back to Silicon Valley, and we say to them: "How is it possible that these videos are still out there in a climate of hate, egging on more hate, summoning more violence, when you have policies that actually say you do not allow this kind of content?" -- urging them to change their policies.
11:16
So I have hope. I have hope that we can turn more video into more rights and more justice.
11:24
Ten billion video views on Snapchat per day. So what if we could turn that Snapchat generation into effective and safe civic witnesses? What if they could become the Bukenis of this new generation?
11:44
In India, women have already started using Snapchat filters to protect their identity when they speak out about domestic violence.
11:52
[They tortured me at home and never let me go out.]
11:55
The truth is, the real truth, the truth that doesn't fit into any TED Talk, is fighting human rights abuse is hard. There are no easy solutions for human rights abuse. And there's not a single piece of technology that can ever stop the perpetrators.
12:11
But for the survivors, for the victims, for the marginalized communities, their stories, their truths, matter. And that is where justice begins.
12:25
Thank you.
12:26
(Applause)
ABOUT THE SPEAKER
Yvette Alberdingk Thijm - Human rights activist
Yvette Alberdingk Thijm helps activists use video and technology to protect and defend human rights.
Why you should listen
Yvette Alberdingk Thijm leads WITNESS.org, a global team of human rights activists who help anyone use video and technology to protect and defend human rights. WITNESS supports marginalized and vulnerable communities to expose their truths, counter harmful and abusive narratives, and mobilize their communities to build a just world. Yvette believes that the right and power to tell your own story are where dignity, justice, and equality begin.
More profile about the speaker: Yvette Alberdingk Thijm | Speaker | TED.com