ABOUT THE SPEAKERS
Kashmir Hill - Technology journalist
Kashmir Hill writes about privacy and technology.

Why you should listen

Kashmir Hill is a senior reporter for the Gizmodo Media Group. As she writes: "I started out in journalism blogging at what was essentially an online tabloid for lawyers. Then I got interested in privacy, and that forced me to write about Facebook and eventually about other technologies; along the way people started describing me as a technology journalist instead of a legal blogger.

"I've always wanted my writing to be approachable for a lay audience, so I usually use humor, a first-person approach or, ideally, both. So I've hacked a smart home, lived in a monitored one, created a fake business and bought it a fake reputation, worked as a crowdsourced girlfriend, lived on Bitcoin and spent a whole week WRITING IN CAPS LOCK. The best way to prepare people for future possible tech dystopias is for me to live in them and report back."

More profile about the speaker
Kashmir Hill | Speaker | TED.com
Surya Mattu - Artist, investigative journalist, engineer
Think of Surya Mattu as a data detective. As he writes: "I am interested in figuring out the ways in which algorithmic systems harm people."

Why you should listen

Surya Mattu is a data reporter on Gizmodo's Special Projects Desk and an R&D journalism resident at Eyebeam NYC. As he writes: "My practice combines art, investigative journalism, engineering and creative technology. The aim is to reverse-engineer the specific ways in which the tools or technology we create are imbued with the ethics of the culture in which they're created. Currently, I am a data reporter. Previously, I was a contributing researcher at ProPublica, where I worked on "Machine Bias," a series that aims to highlight how algorithmic systems can be biased and discriminate against people."

More profile about the speaker
Surya Mattu | Speaker | TED.com
TED2018

Kashmir Hill and Surya Mattu: What your smart devices know (and share) about you


Filmed:
2,030,169 views

Once your smart devices can talk to you, who else are they talking to? That's what Kashmir Hill and Surya Mattu wanted to find out. So they outfitted Hill's apartment with 18 different internet-connected devices and built a special router to track how often those devices contacted their servers, and they discovered what information was being sent back. The results were surprising, and more than a little creepy. Learn what the data from your smart devices reveals about your sleep schedule, your TV binges and even your tooth-brushing habits, and how tech companies could use that information to target and profile you. (This talk contains adult language.)


00:12 Kashmir Hill: So for my birthday last year,
00:15 my husband got me an Amazon Echo.
00:17 I was kind of shocked, actually,
00:19 because we both work in privacy and security.
00:22 (Laughter)
00:24 And this was a device that would sit in the middle of our home
00:28 with a microphone on,
00:29 constantly listening.
00:31 We're not alone, though.
00:32 According to a survey by NPR and Edison Research,
00:35 one in six American adults now has a smart speaker,
00:39 which means that they have a virtual assistant at home.
00:42 Like, that's wild.
00:44 The future, or the future dystopia, is getting here fast.
00:48 Beyond that, companies are offering us all kinds of internet-connected devices.
00:53 There are smart lights, smart locks, smart toilets, smart toys,
00:58 smart sex toys.
01:00 Being smart means the device can connect to the internet,
01:03 it can gather data,
01:04 and it can talk to its owner.
01:06 But once your appliances can talk to you,
01:09 who else are they going to be talking to?
01:12 I wanted to find out,
01:13 so I went all-in and turned my one-bedroom apartment in San Francisco
01:17 into a smart home.
01:18 I even connected our bed to the internet.
01:22 As far as I know, it was just measuring our sleeping habits.
01:26 I can now tell you that the only thing worse
01:28 than getting a terrible night's sleep
01:30 is to have your smart bed tell you the next day
01:32 that you "missed your goal and got a low sleep score."
01:35 (Laughter)
01:37 It's like, "Thanks, smart bed.
01:38 As if I didn't already feel like shit today."
01:41 (Laughter)
01:42 All together, I installed 18 internet-connected devices in my home.
01:47 I also installed a Surya.
01:49 Surya Mattu: Hi, I'm Surya.
01:50 (Laughter)
01:51 I monitored everything the smart home did.
01:54 I built a special router that let me look at all the network activity.
01:58 You can think of my router sort of like a security guard,
02:01 compulsively logging all the network packets
02:03 as they entered and left the smart home.
02:06 KH: Surya and I are both journalists, he's not my husband,
02:08 we just work together at Gizmodo.
02:10 SM: Thank you for clarifying.
02:11 The devices Kashmir bought --
02:13 we were interested in understanding
02:15 what they were saying to their manufacturers.
02:17 But we were also interested in understanding
02:19 what the home's digital emissions look like
02:21 to the internet service provider.
02:23 We were seeing what the ISP could see, but more importantly,
02:26 what they could sell.
02:28 KH: We ran the experiment for two months.
02:30 In that two months,
02:31 there wasn't a single hour of digital silence in the house --
02:34 not even when we went away for a week.
02:36 SM: Yeah, it's so true.
02:37 Based on the data, I knew when you guys woke up and went to bed.
02:40 I even knew when Kashmir brushed her teeth.
02:42 I'm not going to out your brushing habits,
02:44 but let's just say it was very clear to me when you were working from home.
02:48 KH: Uh, I think you just outed them to, like, a lot of people here.
02:51 SM: Don't be embarrassed, it's just metadata.
02:54 I knew when you turned on your TV and how long you watched it for.
02:57 Fun fact about the Hill household:
02:59 they don't watch a lot of television,
03:01 but when they do, it's usually in binge mode.
03:03 Favorite shows include "Difficult People" and "Party Down."
03:06 KH: OK, you're right, I loved "Party Down."
03:08 It's a great show, and you should definitely watch it.
03:10 But "Difficult People" was all my husband, Trevor.
03:13 And Trevor was actually a little upset that you knew about his binges,
03:16 because even though he'd been the one to connect the TV to the router,
03:20 he forgot that the TV was watching us.
03:23 It's actually not the first time that our TV has spied on us.
03:26 The company that made it, VIZIO,
03:28 paid a 2.2 million-dollar settlement to the government just last year,
03:32 because it had been collecting second-by-second information
03:35 about what millions of people were watching on TV, including us,
03:39 and then it was selling that information to data brokers and advertisers.
03:43 SM: Ah, classic surveillance economy move.
03:46 The devices Kashmir bought almost all pinged their servers daily.
03:50 But do you know which device was especially chatty?
03:53 The Amazon Echo.
03:54 It contacted its servers every three minutes,
03:56 regardless of whether you were using it or not.
03:59 KH: In general, it was disconcerting
04:01 that all these devices were having ongoing conversations
04:04 that were invisible to me.
04:05 I mean, I would have had no idea, without your router.
04:08 If you buy a smart device, you should probably know --
04:12 you're going to own the device,
04:14 but in general, the company is going to own your data.
04:17 And you know, I mean, maybe that's to be expected --
04:20 you buy an internet-connected device, it's going to use the internet.
04:24 But it's strange to have these devices
04:26 moving into the intimate space that is the home
04:28 and allowing companies to track our really basic behavior there.
04:32 SM: So true.
04:33 Even the most banal-seeming data can be mined by the surveillance economy.
04:36 For example, who cares how often you brush your teeth?
04:39 Well, as it turns out, there's a dental insurance company called Beam.
04:43 They've been monitoring their customers' smart toothbrushes since 2015 --
04:46 for discounts on their premiums, of course.
04:49 KH: We know what some of you are thinking:
04:51 this is the contract of the modern world.
04:54 You give up a little privacy,
04:55 and you get some convenience or some price breaks in return.
04:59 But that wasn't my experience in my smart home.
05:01 It wasn't convenient, it was infuriating.
05:05 I'll admit, I love my smart vacuum,
05:08 but many other things in the house drove me insane:
05:10 we ran out of electrical outlets,
05:12 and I had to download over a dozen apps to my phone
05:16 to control everything.
05:17 And then every device had its own log-in,
05:19 my toothbrush had a password ...
05:22 (Laughter)
05:23 And smart coffee, especially, was just a world of hell.
05:28 SM: Wait, really? Cloud-powered coffee wasn't really working for you?
05:32 KH: I mean, maybe I'm naive, but I thought it was going to be great.
05:35 I thought we'd just wake up in the morning and we'd say, "Alexa, make us coffee."
05:39 But that's not how it went down.
05:41 We had to use this really particular, brand-specific phrase to make it work.
05:45 It was, "Alexa, ask the Behmor to run quick start."
05:51 And this was just, like, really hard to remember
05:54 first thing in the morning,
05:55 before you have had your caffeine.
05:57 (Laughter)
05:58 And apparently, it was hard to say,
06:00 because the Echo Dot that was right next to our bed
06:03 just couldn't understand us.
06:05 So we would basically start every day by screaming this phrase at the Echo Dot.
06:10 (Laughter)
06:11 And Trevor hated this.
06:13 He'd be like, "Please, Kashmir,
06:15 just let me go to the kitchen and push the button to make the coffee run."
06:19 And I'd be like, "No, you can't!
06:21 We have to do it the smart way!"
06:23 (Laughter)
06:25 I'm happy to report that our marriage survived the experiment,
06:28 but just barely.
06:30 SM: If you decide to make your home smart,
06:32 hopefully, you'll find it less infuriating than Kashmir did.
06:35 But regardless, the smart things you buy
06:37 can and probably are used to target and profile you.
06:41 Just the number of devices you have can be used to predict
06:44 how rich or poor you are.
06:45 Facebook's made this tech, and they've also patented it.
06:48 KH: All the anxiety you currently feel every time you go online,
06:52 about being tracked,
06:53 is about to move into your living room.
06:55 Or into your bedroom.
06:57 There's this sex toy called the We-Vibe.
07:00 You might wonder why a sex toy connects to the internet,
07:03 but it's for two people who are in a long-distance relationship,
07:06 so they can share their love from afar.
07:10 Some hackers took a close look at this toy
07:12 and saw it was sending a lot of information
07:14 back to the company that made it --
07:16 when it was used, how long it was used for,
07:19 what the vibration settings were, how hot the toy got.
07:23 It was all going into a database.
07:25 So I reached out to the company,
07:28 and I said, "Why are you collecting this really sensitive data?"
07:32 And they said, "Well, it's great for market research."
07:36 But they were data-mining their customers' orgasms.
07:39 And they weren't telling them about it.
07:41 I mean, even if you're cavalier about privacy,
07:43 I hope that you would admit that's a step too far.
07:46 SM: This is why I want to keep my sex toys dumb.
07:49 KH: That's great.
07:50 We're all very glad to know that.
07:52 (Laughter)
07:53 SM: A data point I'm willing to share.
07:55 (Laughter)
07:57 The devices Kashmir bought range from useful to annoying.
08:00 But the thing they all had in common
08:02 was sharing data with the companies that made them.
08:04 With email service providers and social media,
08:07 we've long been told that if it's free, you're the product.
08:10 But with the internet of things, it seems,
08:12 even if you pay, you're still the product.
08:14 So you really have to ask:
08:15 Who's the true beneficiary of your smart home,
08:17 you or the company mining you?
08:19 KH: Look, we're a tech savvy crowd here.
08:21 I think most of us know that these things connect to the internet
08:24 and send data out.
08:25 And fine, maybe you're OK with living in that commercial panopticon,
08:29 but others aren't.
08:31 We need the companies to rethink the design of these devices
08:34 with our privacy in mind,
08:35 because we're not all willing to participate in "market research,"
08:38 just because a device we bought has a Wi-Fi connection.
08:42 And I have to tell you,
08:43 even when you're aware, generally, this is happening,
08:45 it's really easy to forget that normal household items are spying on you.
08:50 It's easy to forget these things are watching you,
08:53 because they don't look like cameras.
08:55 They could look like ...
08:56 well, they could look like a dildo.
08:59 Thank you.
09:00 (Applause)
Translated by Maurício Kakuei Tanaka
Reviewed by Leonardo Silva



Data provided by TED.
