Zeynep Tufekci: We're building a dystopia just to make people click on ads
Techno-sociologist Zeynep Tufekci asks big questions about our societies and our lives, as both algorithms and digital connectivity spread.
So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that's a distant threat. Or, we fret about digital surveillance with metaphors from the past. "1984," George Orwell's "1984," it's hitting the bestseller lists again. It's a great book, but it's not the correct dystopia for the 21st century. What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent. Now, artificial intelligence has started bolstering their business as well.
And it may seem like artificial intelligence is just the next thing after online ads. It's not. It's a jump in category. It's a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research. But to paraphrase a famous Hollywood philosopher, "With prodigious potential comes prodigious risk."
Now let's look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective. We've all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they're still following you around. We're kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, "You know what? These things don't work."
Except, online, the digital technologies are not just ads. Now, to understand that, let's think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there's candy and gum at the eye level of kids? That's designed to make them whine at their parents just as the parents are about to sort of check out. Now, that's a persuasion architecture. It's not nice, but it kind of works. That's why you see it in every supermarket. Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it's the same for everyone, even though it mostly works only for people who have whiny little humans beside them. In the physical world, we live with those limitations.
In the digital world, though, persuasion architectures can be built at the scale of billions, and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone's phone private screen, so it's not visible to us. And that's different. And that's just one of the basic things that artificial intelligence can do.
Now, let's take an example. Let's say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That's what you would do in the past. With big data and machine learning, that's not how it works anymore. So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules.
So what happens then is, the algorithm, these machine-learning algorithms -- that's what they're called, learning algorithms -- they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people. So if they're presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not.
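To make that idea concrete for readers of the transcript: the sketch below, in Python with scikit-learn, only illustrates the kind of "learn from past buyers, then score a new person" classifier being described. Every feature, number and person in it is invented for illustration; the real systems are vastly larger, use far more data, and are proprietary.

```python
# A toy version of the classifier described above: learn the characteristics
# of (hypothetical) people who bought a Vegas ticket before, then score a
# new person. All data here is made up.
from sklearn.linear_model import LogisticRegression

# Each row is one hypothetical person: [age, credit_limit_usd, trips_last_year]
past_people = [
    [29, 12000, 4],
    [62, 3000, 0],
    [41, 20000, 6],
    [35, 1500, 1],
]
bought_ticket = [1, 0, 1, 0]  # 1 = bought a ticket to Vegas before

model = LogisticRegression()
model.fit(past_people, bought_ticket)  # learn from existing data

# Apply what was learned to someone the system has never seen.
new_person = [[33, 18000, 5]]
print(model.predict_proba(new_person)[0][1])  # estimated probability of buying
```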
Fine. You're thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn't that. The problem is, we no longer really understand how these complex algorithms work. We don't understand how they're doing this categorization. It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it's operating any more than you'd know what I was thinking right now if you were shown a cross section of my brain. It's like we're not programming anymore, we're growing intelligence that we don't truly understand.
And these things only work if there's an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That's why Facebook wants to collect all the data it can about you. The algorithms work better.
So let's push that Vegas example a bit. What if the system that we do not understand was picking up that it's easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you'd have no clue that's what they were picking up on. And I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, "That's why I couldn't publish it." I was like, "Couldn't publish what?" He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on.
Now, the problem isn't solved if he doesn't publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf.
This is not very difficult anymore. Do you ever go on YouTube meaning to watch one video and an hour later you've watched 27? You know how YouTube has this column on the right that says, "Up next" and it autoplays something? It's an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It's not a human editor. It's what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you're interested in, what you want more of, and just shows you more. It sounds like a benign and useful feature, except when it isn't.
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there. Well, you might be thinking, this is politics, but it's not. This isn't about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It's like you're never hardcore enough for YouTube.
So what's going on? Now, YouTube's algorithm is proprietary, but here's what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they're more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads. Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads. They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too. Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn't even expensive. The ProPublica reporter spent about 30 dollars to target this category.
So last year, Donald Trump's social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I'm going to read exactly what he said. I'm quoting. They were using "nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it. We modeled this. It will dramatically affect her ability to turn these people out." What's in those dark posts? We have no idea. Facebook won't tell us.
So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow. It doesn't show you everything chronologically. It puts the order in the way that the algorithm thinks will entice you to stay on the site longer. Now, so this has a lot of consequences. You may be thinking somebody is snubbing you on Facebook. The algorithm may never be showing your post to them. The algorithm is prioritizing some of them and burying the others.
Experiments show that what the algorithm picks to show you can affect your emotions. But that's not all. It also affects political behavior. So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, "Today is election day," the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on "I voted." This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls. A fluke? No. Because in 2012, they repeated the same experiment. And that time, the civic message shown just once turned out an additional 270,000 voters. For reference, the 2016 US presidential election was decided by about 100,000 votes. Now, Facebook can also very easily infer what your politics are, even if you've never disclosed them on the site. Right? These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it?
Now, we started from someplace seemingly innocuous -- online ads following us around -- and we've landed someplace else. As a public and as citizens, we no longer know if we're seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we're just at the beginning stages of this. These algorithms can quite easily infer things like your ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and gender, just from Facebook likes. These algorithms can identify protesters even if their faces are partially concealed.
These algorithms may be able to detect people's sexual orientation just from their dating profile pictures. Now, these are probabilistic guesses, so they're not going to be 100 percent right, but I don't see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems.
Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people. And here's the tragedy: we're building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won't be Orwell's authoritarianism. This isn't "1984."
Now, if authoritarianism is using overt fear to terrorize us, we'll all be scared, but we'll know it, we'll hate it and we'll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they're doing it at scale through our private screens so that we don't even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider's web and we may not even know we're in it.
So Facebook's market capitalization is approaching half a trillion dollars. It's because it works great as a persuasion architecture. But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change.
Now, don't get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world. I've written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world.
But it's not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it's not the intent or the statements people in technology make that matter, it's the structures and business models they're building.
And that's the core of the problem. Either Facebook is a giant con of half a trillion dollars and ads don't work on the site, it doesn't work as a persuasion architecture, or its power of influence is of great concern. It's either one or the other. It's similar for Google, too.
So what can we do? This needs to change. Now, I can't offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system. We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning's opacity, all this indiscriminate data that's being collected about us.
We have a big task in front of us. We have to mobilize our technology, our creativity and, yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values. And I understand this won't be easy. We might not even easily agree on what those terms mean. But if we take seriously how these systems that we depend on for so much operate, I don't see how we can postpone this conversation anymore. These structures are organizing how we function, and they're controlling what we can and we cannot do. And many of these ad-financed platforms boast that they're free. In this context, it means that we are the product that's being sold. We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue.
So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now. Thank you.
ABOUT THE SPEAKER
Zeynep Tufekci - Techno-sociologist
Why you should listen
We've entered an era of digital connectivity and machine intelligence. Complex algorithms are increasingly used to make consequential decisions about us. Many of these decisions are subjective and have no right answer: who should be hired, fired or promoted; what news should be shown to whom; which of your friends do you see updates from; which convict should be paroled. With increasing use of machine learning in these systems, we often don't even understand how exactly they are making these decisions. Zeynep Tufekci studies what this historic transition means for culture, markets, politics and personal life.
Tufekci is a contributing opinion writer at the New York Times, an associate professor at the School of Information and Library Science at the University of North Carolina, Chapel Hill, and a faculty associate at Harvard's Berkman Klein Center for Internet and Society.
Her book, Twitter and Tear Gas: The Power and Fragility of Networked Protest, was published in 2017 by Yale University Press. Her next book, from Penguin Random House, will be about algorithms that watch, judge and nudge us.