ABOUT THE SPEAKER
Nita Farahany - Legal scholar, ethicist
Nita A. Farahany is a leading scholar on the ethical, legal, and social implications of biosciences and emerging technologies, particularly those related to neuroscience and behavioral genetics.

Why you should listen

Nita A. Farahany is a professor of law and philosophy, the founding director of the Duke Initiative for Science & Society and chair of the MA in Bioethics & Science Policy at Duke University. In 2010, Farahany was appointed by President Obama to the Presidential Commission for the Study of Bioethical Issues, and she served as a member until 2017. She is a member of the Neuroethics Division of the Multi-Council Working Group for the BRAIN Initiative, serves on the President's Research Council of the Canadian Institute for Advanced Research (CIFAR), and is a past member of the Global Agenda Council for Privacy, Technology and Governance at the World Economic Forum.

Farahany presents her work to diverse audiences and is a frequent commentator for national media and radio shows. She is an elected member of the American Law Institute and President-Elect of the International Neuroethics Society, serving on its board; she is also a co-editor-in-chief and co-founder of the Journal of Law and the Biosciences and an editorial board member of the American Journal of Bioethics (Neuroscience). She's on the Ethics Advisory Board for Illumina, Inc., the Scientific Advisory Board of Helix, and the Board of Advisors of Scientific American.

Farahany received her AB in genetics, cell and developmental biology at Dartmouth College, a JD and MA from Duke University, as well as a PhD in philosophy. She also holds an ALM in biology from Harvard University. In 2004-2005, Farahany clerked for Judge Judith W. Rogers of the US Court of Appeals for the D.C. Circuit, after which she joined the faculty at Vanderbilt University. In 2011, Farahany was the Leah Kaplan Visiting Professor of Human Rights at Stanford Law School.

More profile about the speaker
Nita Farahany | Speaker | TED.com
TED Salon Zebra Technologies

Nita Farahany: When technology can read minds, how will we protect our privacy?

1,819,292 views

Tech that can decode your brain activity and reveal what you're thinking and feeling is on the horizon, says legal scholar and ethicist Nita Farahany. What will it mean for our already violated sense of privacy? In a cautionary talk, Farahany warns of a society where people are arrested for merely thinking about committing a crime (like in "Minority Report") and private interests sell our brain data -- and makes the case for a right to cognitive liberty that protects our freedom of thought and self-determination.

00:13
In the months following the 2009 presidential election in Iran, protests erupted across the country. The Iranian government violently suppressed what came to be known as the Iranian Green Movement, even blocking mobile signals to cut off communication between the protesters. My parents, who emigrated to the United States in the late 1960s, spend substantial time there, where all of my large, extended family live. When I would call my family in Tehran during some of the most violent crackdowns of the protest, none of them dared discuss with me what was happening. They and I knew to quickly steer the conversation to other topics. All of us understood what the consequences could be of a perceived dissident action.

01:06
But I still wish I could have known what they were thinking or what they were feeling. What if I could have? Or more frighteningly, what if the Iranian government could have? Would they have arrested them based on what their brains revealed?

01:22
That day may be closer than you think. With our growing capabilities in neuroscience, artificial intelligence and machine learning, we may soon know a lot more of what's happening in the human brain.

01:37
As a bioethicist, a lawyer, a philosopher and an Iranian-American, I'm deeply concerned about what this means for our freedoms and what kinds of protections we need. I believe we need a right to cognitive liberty, as a human right that needs to be protected. If not, our freedom of thought, access and control over our own brains and our mental privacy will be threatened.

02:05
Consider this: the average person thinks thousands of thoughts each day. As a thought takes form, like a math calculation or a number, a word, neurons are interacting in the brain, creating a minuscule electrical discharge. When you have a dominant mental state, like relaxation, hundreds of thousands of neurons are firing in the brain, creating concurrent electrical discharges in characteristic patterns that can be measured with electroencephalography, or EEG.

02:41
In fact, that's what you're seeing right now. You're seeing my brain activity that was recorded in real time with a simple device that was worn on my head. What you're seeing is my brain activity when I was relaxed and curious.

02:58
To share this information with you, I wore one of the early consumer-based EEG devices like this one, which recorded the electrical activity in my brain in real time. It's not unlike the fitness trackers that some of you may be wearing to measure your heart rate or the steps that you've taken, or even your sleep activity.

03:19
It's hardly the most sophisticated neuroimaging technique on the market. But it's already the most portable and the most likely to impact our everyday lives.

03:29
This is extraordinary. Through a simple, wearable device, we can literally see inside the human brain and learn aspects of our mental landscape without ever uttering a word. While we can't reliably decode complex thoughts just yet, we can already gauge a person's mood, and with the help of artificial intelligence, we can even decode some single-digit numbers or shapes or simple words that a person is thinking or hearing, or seeing.

04:06
Despite some inherent limitations in EEG, I think it's safe to say that with our advances in technology, more and more of what's happening in the human brain can and will be decoded over time. Already, using one of these devices, an epileptic can know they're going to have an epileptic seizure before it happens. A paraplegic can type on a computer with their thoughts alone.

04:34
A US-based company has developed a technology to embed these sensors into the headrest of automobiles so they can track driver concentration, distraction and cognitive load while driving. Nissan, insurance companies and AAA have all taken note.

04:51
You could even watch this choose-your-own-adventure movie "The Moment," which, with an EEG headset, changes the movie based on your brain-based reactions, giving you a different ending every time your attention wanes.

05:11
This may all sound great, and as a bioethicist, I am a huge proponent of empowering people to take charge of their own health and well-being by giving them access to information about themselves, including this incredible new brain-decoding technology.

05:29
But I worry. I worry that we will voluntarily or involuntarily give up our last bastion of freedom, our mental privacy. That we will trade our brain activity for rebates or discounts on insurance, or free access to social-media accounts ... or even to keep our jobs.

05:54
In fact, in China, the train drivers on the Beijing-Shanghai high-speed rail, the busiest of its kind in the world, are required to wear EEG devices to monitor their brain activity while driving. According to some news sources, in government-run factories in China, the workers are required to wear EEG sensors to monitor their productivity and their emotional state at work. Workers are even sent home if their brains show less-than-stellar concentration on their jobs, or emotional agitation.

06:35
It's not going to happen tomorrow, but we're headed to a world of brain transparency. And I don't think people understand that that could change everything. Everything from our definitions of data privacy to our laws, to our ideas about freedom.

06:50
In fact, in my lab at Duke University, we recently conducted a nationwide study in the United States to see if people appreciated the sensitivity of their brain information. We asked people to rate their perceived sensitivity of 33 different kinds of information, from their social security numbers to the content of their phone conversations, their relationship history, their emotions, their anxiety, the mental images in their mind and the thoughts in their mind. Shockingly, people rated their social security number as far more sensitive than any other kind of information, including their brain data.

07:32
I think this is because people don't yet understand or believe the implications of this new brain-decoding technology. After all, if we can know the inner workings of the human brain, our social security numbers are the least of our worries.

(Laughter)

07:47
Think about it. In a world of total brain transparency, who would dare have a politically dissident thought? Or a creative one? I worry that people will self-censor in fear of being ostracized by society, or that people will lose their jobs because of their waning attention or emotional instability, or because they're contemplating collective action against their employers. That coming out will no longer be an option, because people's brains will long ago have revealed their sexual orientation, their political ideology or their religious preferences, well before they were ready to consciously share that information with other people.

08:31
I worry about the ability of our laws to keep up with technological change. Take the First Amendment of the US Constitution, which protects freedom of speech. Does it also protect freedom of thought? And if so, does that mean that we're free to alter our thoughts however we want? Or can the government or society tell us what we can do with our own brains? Can the NSA spy on our brains using these new mobile devices? Can the companies that collect the brain data through their applications sell this information to third parties?

09:05
Right now, no laws prevent them from doing so. It could be even more problematic in countries that don't share the same freedoms enjoyed by people in the United States. What would've happened during the Iranian Green Movement if the government had been monitoring my family's brain activity, and had believed them to be sympathetic to the protesters?

09:30
Is it so far-fetched to imagine a society in which people are arrested based on their thoughts of committing a crime, like in the science-fiction dystopian society in "Minority Report"? Already, in the United States, in Indiana, an 18-year-old was charged with attempting to intimidate his school by posting a video of himself shooting people in the hallways ... Except the people were zombies and the video was of him playing an augmented-reality video game, all interpreted to be a mental projection of his subjective intent.

10:10
This is exactly why our brains need special protection. If our brains are just as subject to data tracking and aggregation as our financial records and transactions, if our brains can be hacked and tracked like our online activities, our mobile phones and applications, then we're on the brink of a dangerous threat to our collective humanity.

10:33
Before you panic, I believe that there are solutions to these concerns, but we have to start by focusing on the right things. When it comes to privacy protections in general, I think we're fighting a losing battle by trying to restrict the flow of information. Instead, we should be focusing on securing rights and remedies against the misuse of our information. If people had the right to decide how their information was shared, and, more importantly, had legal redress if their information was misused against them, say to discriminate against them in an employment setting or in health care or education, this would go a long way to build trust. In fact, in some instances, we want to be sharing more of our personal information. Studying aggregated information can tell us so much about our health and our well-being, but to be able to safely share our information, we need special protections for mental privacy.

11:33
This is why we need a right to cognitive liberty. This right would secure for us our freedom of thought and rumination, our freedom of self-determination, and it would ensure that we have the right to consent to or refuse access and alteration of our brains by others. This right could be recognized as part of the Universal Declaration of Human Rights, which has established mechanisms for the enforcement of these kinds of social rights.

12:03
During the Iranian Green Movement, the protesters used the internet and good old-fashioned word of mouth to coordinate their marches. And some of the most oppressive restrictions in Iran were lifted as a result.

12:20
But what if the Iranian government had used brain surveillance to detect and prevent the protest? Would the world have ever heard the protesters' cries?

12:33
The time has come for us to call for a cognitive liberty revolution. To make sure that we responsibly advance technology that could enable us to embrace the future while fiercely protecting all of us from any person, company or government that attempts to unlawfully access or alter our innermost lives.

12:58
Thank you.

12:59
(Applause)
