Kashmir Hill and Surya Mattu: What your smart devices know (and share) about you
My husband got me an Amazon Echo for my birthday last year,
in privacy and security.
in the middle of our home
constantly listening.
by NPR and Edison Research,
now has a smart speaker,
a virtual assistant at home.
is getting here fast.
all kinds of internet-connected devices.
smart toilets, smart toys,
can connect to the internet,
collect data,
and communicate with you,
one-bedroom apartment in San Francisco
measuring our sleeping habits.
that the only thing worse
than a terrible night's sleep
tell you the next day
and got a low sleep score."
as if I didn't already feel like shit today."
internet-connected devices in my home.
the smart home did.
at all the network activity.
sort of like a security guard,
compulsively logging all the network packets
as they entered and left the smart home.
he's not my husband,
what they were saying to their manufacturers.
in understanding
what the emissions look like
to the internet service provider.
but more importantly, what they could sell.
one hour of digital silence in the house --
you guys woke up and went to bed.
brushed her teeth.
to me when you were working from home.
to, like, a lot of people here.
it's just metadata.
and how long you watched it for.
it's usually in binge mode.
"Difficult People" and "Party Down."
I loved "Party Down."
and you should definitely watch it.
"Difficult People" was all my husband, Trevor.
that you knew about his binges,
to connect the TV to the router,
was watching us.
that our TV has spied on us.
to the government just last year,
second-by-second information
were watching on TV, including us,
to data brokers and advertisers.
of the surveillance economy.
The devices Kashmir bought
almost all pinged their servers daily.
was especially chatty?
every three minutes,
you were using it or not.
were having ongoing conversations
no idea, without your router.
you should probably know --
is going to own your data.
maybe that's to be expected --
it's going to use the internet.
moving into the intimate space that is the home
our really basic behavior there.
can be mined by the surveillance economy.
how often you brush your teeth?
a dental insurance company called Beam.
their customers' smart toothbrushes since 2015 --
premiums, of course.
some of you are thinking:
some convenience or some price breaks in return.
in my smart home.
my smart vacuum,
drove me insane:
over a dozen apps to my phone
its own username,
was just a world of hell.
coffee wasn't really working for you?
but I thought it was going to be great.
and we'd say, "Alexa, make us coffee."
brand-specific phrase to make it work.
to run quick start."
really hard to remember
simply couldn't understand us.
that was right next to our bed
by screaming this phrase at the Echo Dot.
and push the button to make the coffee run."
the smart way!"
survived the experiment,
but just barely.
your smart home,
less infuriating than Kashmir did.
can and probably is being used
to target and profile you.
can be used to predict
how rich or poor you are.
and they've also patented it.
every time you go online,
in our living room,
why a sex toy connects to the internet,
who are in a long-distance relationship,
their love from afar.
a lot of information
back to the company that made it:
how long it was used for,
how hot the toy got.
this really sensitive data?"
for market research."
their customers' orgasms.
you're cavalier about privacy,
that's a step too far.
to keep my sex toys dumb.
very happy to know that.
that I want to share.
range from useful to annoying.
the companies that made them.
with email and social media,
if it's free, you're the product.
you're still the product.
of your smart home,
that these things connect to the internet
and send data out.
in that commercial panopticon,
rethink the design of these devices
to participate in "market research,"
has a Wi-Fi connection.
generally, this is happening,
your ordinary household items are spying on you.
these things are watching you,
because they don't look like cameras.
ABOUT THE SPEAKERS
Kashmir Hill - Technology journalist
Kashmir Hill writes about privacy and technology.
Why you should listen
Kashmir Hill is a senior reporter for the Gizmodo Media Group. As she writes: "I started out in journalism blogging at what was essentially an online tabloid for lawyers. Then I got interested in privacy, and that forced me to write about Facebook and eventually about other technologies; along the way people started describing me as a technology journalist instead of a legal blogger.
"I've always wanted my writing to be approachable for a lay audience, so I usually use humor, a first-person approach or, ideally, both. So I've hacked a smart home, lived in a monitored one, created a fake business and bought it a fake reputation, worked as a crowdsourced girlfriend, lived on Bitcoin and spent a whole week WRITING IN CAPS LOCK. The best way to prepare people for future possible tech dystopias is for me to live in them and report back."
Surya Mattu - Artist, investigative journalist, engineer
Think of Surya Mattu as a data detective. As he writes: "I am interested in figuring out the ways in which algorithmic systems harm people."
Why you should listen
Surya Mattu is a data reporter on Gizmodo's Special Projects Desk and an R&D journalism resident at Eyebeam NYC. As he writes: "My practice combines art, investigative journalism, engineering and creative technology. The aim is to reverse-engineer the specific ways in which the tools or technology we create are imbued with the ethics of the culture in which they're created. Currently, I am a data reporter. Previously, I was a contributing researcher at ProPublica, where I worked on "Machine Bias," a series that aims to highlight how algorithmic systems can be biased and discriminate against people."