ABOUT THE SPEAKER
Nita Farahany - Legal scholar, ethicist
Nita A. Farahany is a leading scholar on the ethical, legal, and social implications of biosciences and emerging technologies, particularly those related to neuroscience and behavioral genetics.

Why you should listen

Nita A. Farahany is a professor of law and philosophy, the founding director of the Duke Initiative for Science & Society and chair of the MA in Bioethics & Science Policy at Duke University. In 2010, Farahany was appointed by President Obama to the Presidential Commission for the Study of Bioethical Issues, and she served as a member until 2017. She is a member of the Neuroethics Division of the Multi-Council Working Group for the BRAIN Initiative, on the President's Research Council of the Canadian Institute for Advanced Research (CIFAR), and past member of the Global Agenda Council for Privacy, Technology and Governance at the World Economic Forum. 

Farahany presents her work to diverse audiences and is a frequent commentator for national media and radio shows. She is an elected member of the American Law Institute, President-Elect and board member of the International Neuroethics Society, co-editor-in-chief and co-founder of the Journal of Law and the Biosciences, and an editorial board member of the American Journal of Bioethics (Neuroscience). She's on the Ethics Advisory Board for Illumina, Inc., the Scientific Advisory Board of Helix, and the Board of Advisors of Scientific American.

Farahany received her AB in genetics, cell and developmental biology at Dartmouth College, a JD and MA from Duke University, as well as a PhD in philosophy. She also holds an ALM in biology from Harvard University. In 2004-2005, Farahany clerked for Judge Judith W. Rogers of the US Court of Appeals for the D.C. Circuit, after which she joined the faculty at Vanderbilt University. In 2011, Farahany was the Leah Kaplan Visiting Professor of Human Rights at Stanford Law School.

More profile about the speaker
Nita Farahany | Speaker | TED.com
TED Salon Zebra Technologies

Nita Farahany: When technology can read minds, how will we protect our privacy?

1,819,292 views

Tech that can decode your brain activity and reveal what you're thinking is close at hand, says legal scholar and ethicist Nita Farahany. What does this trend mean for our already-violated sense of privacy? In this cautionary talk, Farahany warns of a society where people could be arrested merely for thinking about committing a crime (as in "Minority Report") and where our brain data could be sold for private gain. She makes the case for a right to cognitive liberty that protects our freedom of thought and self-determination.

00:13
In the months following the 2009 presidential election in Iran, protests erupted across the country. The Iranian government violently suppressed what came to be known as the Iranian Green Movement, even blocking mobile signals to cut off communication between the protesters.

00:34
My parents, who emigrated to the United States in the late 1960s, spend substantial time there, where all of my large, extended family live. When I would call my family in Tehran during some of the most violent crackdowns of the protest, none of them dared discuss with me what was happening. They or I knew to quickly steer the conversation to other topics. All of us understood what the consequences could be of a perceived dissident action.

01:06
But I still wish I could have known what they were thinking or what they were feeling. What if I could have? Or more frighteningly, what if the Iranian government could have? Would they have arrested them based on what their brains revealed?

01:22
That day may be closer than you think. With our growing capabilities in neuroscience, artificial intelligence and machine learning, we may soon know a lot more of what's happening in the human brain.

01:37
As a bioethicist, a lawyer, a philosopher and an Iranian-American, I'm deeply concerned about what this means for our freedoms and what kinds of protections we need. I believe we need a right to cognitive liberty, as a human right that needs to be protected. If not, our freedom of thought, access and control over our own brains and our mental privacy will be threatened.

02:05
Consider this: the average person thinks thousands of thoughts each day. As a thought takes form, like a math calculation or a number, a word, neurons are interacting in the brain, creating a minuscule electrical discharge. When you have a dominant mental state, like relaxation, hundreds of thousands of neurons are firing in the brain, creating concurrent electrical discharges in characteristic patterns that can be measured with electroencephalography, or EEG.
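
To make the relaxation example concrete: a relaxed state shows up in EEG as elevated power in the alpha band, roughly 8 to 12 Hz. The sketch below is a minimal illustration, not part of the talk or its device: it estimates alpha- and beta-band power from a simulated single-channel trace with a standard Welch spectral estimate, and every name and value in it is an assumption made for the example.

```python
# Minimal sketch (illustrative, not from the talk): gauge how "relaxed" an
# EEG trace looks by comparing alpha-band (8-12 Hz) power, which rises
# during relaxation, against beta-band (13-30 Hz) power.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz; consumer headsets often run at 128-256 Hz

def band_power(signal, fs, low, high):
    """Mean spectral power of `signal` between `low` and `high` Hz (Welch PSD)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Simulate 10 s of one EEG channel: broadband noise plus a strong 10 Hz
# alpha rhythm, standing in for a real recording from a wearable device.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
eeg = rng.normal(0.0, 1.0, t.size) + 2.0 * np.sin(2 * np.pi * 10 * t)

alpha = band_power(eeg, FS, 8, 12)
beta = band_power(eeg, FS, 13, 30)
print(f"alpha/beta power ratio: {alpha / beta:.2f}")  # high ratio -> crude 'relaxed' marker
```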
02:41
In fact, that's what you're seeing right now. You're seeing my brain activity that was recorded in real time with a simple device that was worn on my head. What you're seeing is my brain activity when I was relaxed and curious.

02:58
To share this information with you, I wore one of the early consumer-based EEG devices like this one, which recorded the electrical activity in my brain in real time. It's not unlike the fitness trackers that some of you may be wearing to measure your heart rate or the steps that you've taken, or even your sleep activity.

03:19
It's hardly the most sophisticated neuroimaging technique on the market. But it's already the most portable and the most likely to impact our everyday lives.

03:29
This is extraordinary. Through a simple, wearable device, we can literally see inside the human brain and learn aspects of our mental landscape without ever uttering a word. While we can't reliably decode complex thoughts just yet, we can already gauge a person's mood, and with the help of artificial intelligence, we can even decode some single-digit numbers or shapes or simple words that a person is thinking or hearing, or seeing.
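
A note on what "decode" means in practice: it is typically supervised classification of labeled recordings, not open-ended mind reading. The sketch below is hypothetical and not from the talk: synthetic feature vectors stand in for per-trial EEG band powers, and the labels stand in for the digit a subject was thinking of; real studies add epoching, artifact rejection and careful cross-validation.

```python
# Hedged sketch of EEG "decoding" as supervised classification. The data is
# synthetic: feature vectors stand in for per-trial band powers, and labels
# stand in for the single digit (0-9) a subject was thinking of.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_trials, n_features = 500, 32          # e.g., 8 channels x 4 frequency bands
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 10, size=n_trials)  # digit the subject was (hypothetically) thinking
X += 0.1 * y[:, None]                   # inject a weak class-dependent signal to learn

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out decoding accuracy: {clf.score(X_test, y_test):.2f}")  # chance is 0.10
```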
04:06
Despite some inherent limitations in EEG, I think it's safe to say that with our advances in technology, more and more of what's happening in the human brain can and will be decoded over time.

04:21
Already, using one of these devices, an epileptic can know they're going to have an epileptic seizure before it happens. A paraplegic can type on a computer with their thoughts alone. A US-based company has developed a technology to embed these sensors into the headrest of automobiles so they can track driver concentration, distraction and cognitive load while driving. Nissan, insurance companies and AAA have all taken note.

04:51
You could even watch this choose-your-own-adventure movie "The Moment," which, with an EEG headset, changes the movie based on your brain-based reactions, giving you a different ending every time your attention wanes.

05:11
This may all sound great, and as a bioethicist, I am a huge proponent of empowering people to take charge of their own health and well-being by giving them access to information about themselves, including this incredible new brain-decoding technology. But I worry.

05:31
I worry that we will voluntarily or involuntarily give up our last bastion of freedom, our mental privacy. That we will trade our brain activity for rebates or discounts on insurance, or free access to social-media accounts ... or even to keep our jobs.

05:54
In fact, in China, the train drivers on the Beijing-Shanghai high-speed rail, the busiest of its kind in the world, are required to wear EEG devices to monitor their brain activity while driving. According to some news sources, in government-run factories in China, the workers are required to wear EEG sensors to monitor their productivity and their emotional state at work. Workers are even sent home if their brains show less-than-stellar concentration on their jobs, or emotional agitation.

06:35
It's not going to happen tomorrow, but we're headed to a world of brain transparency. And I don't think people understand that that could change everything. Everything from our definitions of data privacy to our laws, to our ideas about freedom.

06:50
In fact, in my lab at Duke University, we recently conducted a nationwide study in the United States to see if people appreciated the sensitivity of their brain information. We asked people to rate their perceived sensitivity of 33 different kinds of information, from their social security numbers to the content of their phone conversations, their relationship history, their emotions, their anxiety, the mental images in their mind and the thoughts in their mind. Shockingly, people rated their social security number as far more sensitive than any other kind of information, including their brain data.

07:32
I think this is because people don't yet understand or believe the implications of this new brain-decoding technology. After all, if we can know the inner workings of the human brain, our social security numbers are the least of our worries.

07:46
(Laughter)

07:47
Think about it. In a world of total brain transparency, who would dare have a politically dissident thought? Or a creative one? I worry that people will self-censor in fear of being ostracized by society, or that people will lose their jobs because of their waning attention or emotional instability, or because they're contemplating collective action against their employers. That coming out will no longer be an option, because people's brains will long ago have revealed their sexual orientation, their political ideology or their religious preferences, well before they were ready to consciously share that information with other people.

08:31
I worry about the ability of our laws to keep up with technological change. Take the First Amendment of the US Constitution, which protects freedom of speech. Does it also protect freedom of thought? And if so, does that mean that we're free to alter our thoughts however we want? Or can the government or society tell us what we can do with our own brains? Can the NSA spy on our brains using these new mobile devices? Can the companies that collect the brain data through their applications sell this information to third parties?

09:05
Right now, no laws prevent them from doing so. It could be even more problematic in countries that don't share the same freedoms enjoyed by people in the United States. What would've happened during the Iranian Green Movement if the government had been monitoring my family's brain activity, and had believed them to be sympathetic to the protesters?

09:30
Is it so far-fetched to imagine a society in which people are arrested based on their thoughts of committing a crime, like in the science-fiction dystopian society in "Minority Report"?

09:42
Already, in the United States, in Indiana, an 18-year-old was charged with attempting to intimidate his school by posting a video of himself shooting people in the hallways ... Except the people were zombies, and the video was of him playing an augmented-reality video game, all interpreted to be a mental projection of his subjective intent.

10:10
This is exactly why our brains need special protection. If our brains are just as subject to data tracking and aggregation as our financial records and transactions, if our brains can be hacked and tracked like our online activities, our mobile phones and applications, then we're on the brink of a dangerous threat to our collective humanity.

10:33
Before you panic, I believe that there are solutions to these concerns, but we have to start by focusing on the right things. When it comes to privacy protections in general, I think we're fighting a losing battle by trying to restrict the flow of information. Instead, we should be focusing on securing rights and remedies against the misuse of our information.

10:57
If people had the right to decide how their information was shared, and more importantly, have legal redress if their information was misused against them, say to discriminate against them in an employment setting or in health care or education, this would go a long way to build trust. In fact, in some instances, we want to be sharing more of our personal information.

11:20
Studying aggregated information can tell us so much about our health and our well-being, but to be able to safely share our information, we need special protections for mental privacy.

11:33
This is why we need a right to cognitive liberty. This right would secure for us our freedom of thought and rumination, our freedom of self-determination, and it would ensure that we have the right to consent to or refuse access and alteration of our brains by others. This right could be recognized as part of the Universal Declaration of Human Rights, which has established mechanisms for the enforcement of these kinds of social rights.

12:03
During the Iranian Green Movement, the protesters used the internet and good old-fashioned word of mouth to coordinate their marches. And some of the most oppressive restrictions in Iran were lifted as a result.

12:20
But what if the Iranian government had used brain surveillance to detect and prevent the protest? Would the world have ever heard the protesters' cries?

12:33
The time has come for us to call for a cognitive liberty revolution. To make sure that we responsibly advance technology that could enable us to embrace the future while fiercely protecting all of us from any person, company or government that attempts to unlawfully access or alter our innermost lives.

12:58
Thank you.

12:59
(Applause)
Translated by Lilian Chiu
Reviewed by congmei Han


Data provided by TED.
