ABOUT THE SPEAKER
Christopher Soghoian - Privacy researcher and activist
Christopher Soghoian researches and exposes the high-tech surveillance tools that governments use to spy on their own citizens, and he is a champion of digital privacy rights.

Why you should listen

TED Fellow Christopher Soghoian is a champion of digital privacy rights, with a focus on the role that third-party service providers play in enabling governments to monitor citizens. As the principal technologist at the American Civil Liberties Union, he explores the intersection of federal surveillance and citizens' rights.

Before joining the ACLU, he was the first-ever technologist for the Federal Trade Commission's Division of Privacy and Identity Protection, where he worked on investigations of Facebook, Twitter, MySpace and Netflix. Soghoian is also the creator of Do Not Track, an anti-tracking feature that all major web browsers now support, and his work has been cited in court.

TEDSummit

Christopher Soghoian: Your smartphone is a civil rights issue

1,581,538 views

The smartphone you use reflects more than just personal taste... it could also determine how closely you can be tracked. Privacy expert and TED Fellow Christopher Soghoian details a glaring difference between the encryption used on Apple and Android devices, and urges us to pay attention to the growing digital security divide. "If the only people who can protect themselves from the gaze of the government are the rich and powerful, we have a problem," he says. "It's not just a cybersecurity problem -- it's a civil rights problem."

00:12
In the spring of 2016,
00:15
a legal battle between Apple and the Federal Bureau of Investigation
00:19
captured the world's attention.
00:21
Apple has built security features into its mobile products
00:24
which protect data on its devices from everyone but the owner.
00:28
That means that criminals, hackers and yes, even governments
00:33
are all locked out.
00:35
For Apple's customers, this is a great thing.
00:38
But governments are not so happy.
00:41
You see, Apple has made a conscious decision
00:43
to get out of the surveillance business.
00:46
Apple has tried to make surveillance as difficult as possible
00:49
for governments and any other actors.
00:53
There are really two smartphone operating systems
00:56
in the global smartphone market:
00:57
iOS and Android.
00:59
iOS is made by Apple. Android is made by Google.
01:03
Apple has spent a lot of time and money
01:06
to make sure that its products are as secure as possible.
01:10
Apple encrypts all data stored on iPhones by default,
01:13
and text messages sent from one Apple customer to another Apple customer
01:17
are encrypted by default
01:19
without the user having to take any actions.
01:22
What this means is that,
01:24
if the police seize an iPhone and it has a password,
01:28
they'll have a difficult time getting any data off of it,
01:32
if they can do it at all.
01:34
In contrast, the security of Android just really isn't as good.
01:38
Android phones, or at least most of the Android phones
01:41
that have been sold to consumers,
01:43
do not encrypt data stored on the device by default,
01:46
and the built-in text messaging app in Android does not use encryption.
01:51
So if the police seize an Android phone,
01:54
chances are, they'll be able to get all the data they want
01:57
off of that device.
01:59
Two smartphones
02:01
from two of the biggest companies in the world;
02:04
one that protects data by default,
02:06
and one that doesn't.
02:08
Apple is a seller of luxury goods.
02:12
It dominates the high end of the market.
02:14
And we would expect a manufacturer of luxury goods to have products
02:19
that include more features.
02:21
But not everyone can afford an iPhone.
02:23
That's where Android really, really dominates:
02:26
at the middle and low end of the market,
02:29
smartphones for the billion and a half people
02:32
who cannot or will not spend
02:34
600 dollars on a phone.
02:41
But the dominance of Android has led to what I call
02:47
the "digital security divide."
02:49
That is, there is now increasingly a gap
02:52
between the privacy and security of the rich,
02:56
who can afford devices that secure their data by default,
03:00
and of the poor,
03:01
whose devices do very little to protect them by default.
03:07
So, think of the average Apple customer:
03:12
a banker, a lawyer, a doctor, a politician.
03:17
These individuals now increasingly have smartphones in their pockets
03:22
that encrypt their calls, their text messages,
03:26
all the data on the device,
03:27
without them doing really anything to secure their information.
03:32
In contrast, the poor and the most vulnerable in our societies
03:36
are using devices that leave them completely vulnerable to surveillance.
03:43
In the United States, where I live,
03:45
African-Americans are more likely to be seen as suspicious
03:49
or more likely to be profiled,
03:51
and are more likely to be targeted by the state with surveillance.
03:56
But African-Americans are also disproportionately likely
03:58
to use Android devices that do nothing at all
04:01
to protect them from that surveillance.
04:04
This is a problem.
04:07
We must remember that surveillance is a tool.
04:10
It's a tool used by those in power
04:13
against those who have no power.
04:17
And while I think it's absolutely great
04:21
that companies like Apple are making it easy for people to encrypt,
04:26
if the only people who can protect themselves
04:30
from the gaze of the government
04:31
are the rich and powerful,
04:33
that's a problem.
04:35
And it's not just a privacy or a cybersecurity problem.
04:39
It's a civil rights problem.
04:42
So the lack of default security in Android
04:45
is not just a problem for the poor and vulnerable users
04:51
who are depending on these devices.
04:53
This is actually a problem for our democracy.
04:56
I'll explain what I mean.
04:58
Modern social movements rely on technology --
05:01
from Black Lives Matter to the Arab Spring to Occupy Wall Street.
05:07
The organizers of these movements and the members of these movements
05:11
increasingly communicate and coordinate with smartphones.
05:16
And so, naturally governments that feel threatened by these movements
05:20
will also target the organizers and their smartphones.
05:25
Now, it's quite possible
05:27
that a future Martin Luther King or a Mandela or a Gandhi
05:31
will have an iPhone and be protected from government surveillance.
05:36
But chances are,
05:37
they'll probably have a cheap, $20 Android phone in their pocket.
05:42
And so if we do nothing to address the digital security divide,
05:46
if we do nothing to ensure that everyone in our society
05:51
gets the same benefits of encryption
05:53
and is equally able to protect themselves from surveillance by the state,
05:57
not only will the poor and vulnerable be exposed to surveillance,
06:02
but future civil rights movements may be crushed
06:05
before they ever reach their full potential.
06:07
Thank you.
06:09
(Applause)
06:15
Helen Walters: Chris, thank you so much.
06:17
I have a question for you.
06:19
We saw recently in the press
06:21
that Mark Zuckerberg from Facebook covers over his camera
06:27
and does something with his headphone mic jack.
06:30
So I wanted to ask you a personal question, which is:
06:32
Do you do that?
06:33
And, on behalf of everyone here, particularly myself,
06:36
Should we be doing that?
06:37
Should we be covering these things?
06:39
Christopher Soghoian: Putting a sticker -- actually, I like Band-Aids,
06:43
because you can remove them and put them back on
06:46
whenever you want to make a call or a Skype call.
06:48
Putting a sticker over your web cam
06:50
is probably the best thing you can do for your privacy
06:52
in terms of bang for buck.
06:54
There really is malware, malicious software out there
06:58
that can take over your web cam,
07:00
even without the light turning on.
07:02
This is used by criminals. This is used by stalkers.
07:05
You can buy $19.99 "spy on your ex-girlfriend" software online.
07:10
It's really terrifying.
07:11
And then, of course, it's used by governments.
07:14
And there's obviously a sexual violence component to this,
07:17
which is that this kind of surveillance can be used most effectively
07:20
against women and other people who can be shamed in our society.
07:28
Even if you think you have nothing to hide,
07:30
at the very least, if you have children, teenagers in your lives,
07:35
make sure you put a sticker on their camera and protect them.
07:38
HW: Wow. Thank you so much. CS: Thank you.
07:40
HW: Thanks, Chris.
07:41
(Applause)
