Zeynep Tufekci: We're building a dystopia just to make people click on ads
of artificial intelligence,
of humanoid robots run amok.
something to consider,
for the 21st century.
will do to us on its own,
will use artificial intelligence
and our dignity in the near-term future
and selling our data and our attention
bolstering their business as well.
like artificial intelligence
of many areas of study and research.
a famous Hollywood philosopher,
comes prodigious risk."
of our digital lives, online ads.
of being followed on the web
we searched or read.
you around everywhere you go.
and buy them, they're still following you around.
of basic, cheap manipulation.
"You know what? These things don't work."
"Ma znate što? Takve stvari ne pale."
let's think of a physical world example.
at supermarkets, near the cashier,
at the eye level of kids?
whine at their parents
are about to sort of check out.
in every supermarket.
are kind of limited,
so many things by the cashier. Right?
it's the same for everyone,
whiny little humans beside them.
we live with those limitations.
can be built at the scale of billions
to everyone's phone private screen,
that artificial intelligence can do.
plane tickets to Vegas. Right?
of some demographics to target
and what you can guess.
a high limit on their credit card,
amounts of data and machine learning,
that Facebook has on you:
that you uploaded there.
and change your mind and delete it,
and analyzes them, too.
to match you with your offline data.
a lot of data from data brokers.
from your financial records
your browsing history.
such data is routinely collected,
these machine-learning algorithms --
learning algorithms --
the characteristics of people
how to apply this to new people.
is likely to buy a ticket to Vegas or not.
an offer to buy tickets to Vegas.
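What this passage sketches is ordinary supervised classification: an algorithm learns the characteristics of people who bought tickets before and then scores new people. A minimal, hypothetical sketch in Python (invented feature names and toy data; not Facebook's actual system):

```python
# Hypothetical sketch: learn from past users' traits, then score a new user.
# Feature names and data are invented for illustration only.
from sklearn.linear_model import LogisticRegression

# Each row: [age, past_travel_purchases, late_night_activity_score]
past_users = [
    [34, 5, 0.9],
    [52, 0, 0.1],
    [41, 3, 0.7],
    [29, 1, 0.2],
]
bought_vegas_ticket = [1, 0, 1, 0]  # labels observed from past behavior

model = LogisticRegression()
model.fit(past_users, bought_vegas_ticket)

# Score a new user the model has never seen before.
new_user = [[38, 4, 0.8]]
print(model.predict_proba(new_user)[0][1])  # estimated probability of buying
```

The point the talk goes on to make is that, at scale and with thousands of features, even the people running such a model may not be able to say why it flags a particular person.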
how these complex algorithms work.
how they're doing this categorization.
thousands of rows and columns,
how exactly it's operating
what I was thinking right now
a cross section of my brain.
that we don't truly understand.
if there's an enormous amount of data,
deep surveillance on all of us
algorithms can work.
to collect all the data it can about you.
that we do not understand
to sell Vegas tickets
and about to enter the manic phase.
overspenders, compulsive gamblers.
that's what they were picking up on.
to a bunch of computer scientists once
"That's why I couldn't publish it."
"Zato to nisam mogao objaviti."
figure out the onset of mania
before clinical symptoms,
or what it was picking up on.
if he doesn't publish it,
this kind of technology,
is just off the shelf.
meaning to watch one video
has this column on the right
that you might be interested in
have found on your own.
and what people like you have watched,
what you're interested in,
and useful feature,
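The mechanism described here — surfacing "what people like you have watched" — can be reduced, very roughly, to co-occurrence counting over watch histories. A toy sketch with invented video IDs; real recommender systems are far more elaborate:

```python
# Toy "people like you also watched" recommender based on co-occurrence.
# Watch histories and video IDs are invented for illustration.
from collections import Counter

watch_histories = [
    {"rally_speech", "news_clip_a", "commentary_x"},
    {"rally_speech", "commentary_x", "extreme_video_1"},
    {"commentary_x", "extreme_video_1", "extreme_video_2"},
]

def recommend(my_history, histories, k=3):
    counts = Counter()
    for other in histories:
        if other & my_history:                 # any overlap with my viewing
            counts.update(other - my_history)  # count what else they watched
    return [video for video, _ in counts.most_common(k)]

print(recommend({"rally_speech"}, watch_histories))
```

Nothing in a mechanism like this understands content; it only propagates whatever kept similar viewers watching, which is the dynamic the talk describes next.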
of then-candidate Donald Trump
the movement supporting him.
so I was studying it, too.
about one of his rallies,
a few times on YouTube.
white supremacist videos
Hillary Clinton or Bernie Sanders content,
and autoplays conspiracy left,
this is politics, but it's not.
figuring out human behavior.
about vegetarianism on YouTube
and autoplayed a video about being vegan.
hardcore enough for YouTube.
show them something more hardcore,
going down that rabbit hole
the ethics of the store,
anti-Semitic content,
target it at them.
anti-Semitic content on their profile
may be susceptible to such messages,
you target the ads.
like an implausible example,
do this on Facebook,
offered up suggestions
on Google, and very quickly they found,
spent about 30 dollars
Donald Trump's social media manager disclosed
to demobilize people,
not to vote at all.
they targeted specifically,
in key cities like Philadelphia,
exactly what he said.
we want to see it see it.
to turn these people out."
arranges the posts
or the pages you follow.
everything chronologically.
that the algorithm thinks will entice you
somebody is snubbing you on Facebook.
be showing your post to them.
some of them and burying the others.
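The contrast being drawn — a chronological feed versus one reordered by what the algorithm thinks will entice you — can be illustrated with a toy ranking example (invented posts and scores; nothing like Facebook's actual model):

```python
# Toy contrast between a chronological feed and an engagement-ranked feed.
# Posts and "predicted engagement" scores are invented for illustration.
posts = [
    {"id": "p1", "posted_at": 3, "predicted_engagement": 0.10},
    {"id": "p2", "posted_at": 2, "predicted_engagement": 0.92},
    {"id": "p3", "posted_at": 1, "predicted_engagement": 0.45},
]

chronological = sorted(posts, key=lambda p: p["posted_at"], reverse=True)
ranked = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

print([p["id"] for p in chronological])  # newest first: p1, p2, p3
print([p["id"] for p in ranked])         # most "enticing" first: p2, p3, p1

# Showing only the top of the ranked list surfaces some posts
# and effectively buries the others.
```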
can affect your emotions.
political behavior.
on 61 million people in the US
"Today is election day,"
"Danas su izbori",
the one with that tiny tweak
clicked on "I voted."
they repeated the same experiment.
that was shown only once
2016 US presidential election
very easily infer what your politics are,
disclosed them on the site.
can do that quite easily.
of one candidate over the other?
seemingly innocuous --
that follow us around --
if we're seeing the same information
the beginning stages of this.
personality traits,
use of addictive substances,
are partially concealed.
to detect people's sexual orientation
on dating sites.
to be 100 percent right,
the temptation to use these technologies
some false positives,
a whole other layer of problems.
it has on its citizens.
face detection technology
of surveillance authoritarianism
Orwell's authoritarianism.
is using overt fear to terrorize us,
are using these algorithms
the troublemakers and the rebels,
persuasion architectures at scale
weaknesses and vulnerabilities,
and neighbors are seeing,
will envelop us like a spider's web
as a persuasion architecture.
whether you're selling shoes
more susceptible to ads
personal and social information flows,
because they provide us with great value.
with friends and family around the world.
social media is for social movements.
these technologies can be used
you know, Facebook or Google
or the world more polarized
well-intentioned statements
people in technology make that matter,
and business models they're building.
of half a trillion dollars
as a persuasion architecture,
is of great concern.
digital technology operates.
technology is developed
the incentives, economic and otherwise,
created by the proprietary algorithms,
of machine learning's opacity,
that's being collected about us.
artificial intelligence
by our human values.
on what those terms mean.
depend on for so much operate,
this conversation anymore.
that are financed by ads
that we are the product that's being sold.
authoritarian or demagogue.
that Hollywood paraphrase,
and digital technology to blossom,
this prodigious menace,
ABOUT THE SPEAKER
Zeynep Tufekci - Techno-sociologist
Techno-sociologist Zeynep Tufekci asks big questions about our societies and our lives, as both algorithms and digital connectivity spread.
Why you should listen
We've entered an era of digital connectivity and machine intelligence. Complex algorithms are increasingly used to make consequential decisions about us. Many of these decisions are subjective and have no right answer: who should be hired, fired or promoted; what news should be shown to whom; which of your friends do you see updates from; which convict should be paroled. With increasing use of machine learning in these systems, we often don't even understand how exactly they are making these decisions. Zeynep Tufekci studies what this historic transition means for culture, markets, politics and personal life.
Tufekci is a contributing opinion writer at the New York Times, an associate professor at the School of Information and Library Science at the University of North Carolina, Chapel Hill, and a faculty associate at Harvard's Berkman Klein Center for Internet and Society.
Her book, Twitter and Tear Gas: The Power and Fragility of Networked Protest, was published in 2017 by Yale University Press. Her next book, from Penguin Random House, will be about algorithms that watch, judge and nudge us.