Zeynep Tufekci: We're building a dystopia just to make people click on ads
Techno-sociologist Zeynep Tufekci asks big questions about our societies and our lives, as both algorithms and digital connectivity spread.
When people voice fears of artificial intelligence, they often invoke images of humanoid robots run amok. That may be something to consider, but it is not the right dystopia for the 21st century.
What we should fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control and manipulate us. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention, and artificial intelligence has started bolstering their business as well.
A technology like artificial intelligence could accelerate our understanding of many areas of study and research. But to paraphrase a famous Hollywood philosopher, "With prodigious potential comes prodigious risk."
Let's start with a basic fact of our digital lives: online ads. We have all had the experience of being followed around the web by an ad based on something we searched or read. You look up a pair of boots, and for a week that boot follows you around everywhere you go. Even after you give in and buy them, the ads are still following you. We are inured to that kind of basic, cheap manipulation, so we roll our eyes and think, "You know what? These things don't work."
To see what is really going on, let's think of a physical world example. You know how, at supermarkets, near the cashier, candy and gum are placed at the eye level of kids? That layout is designed to make them whine at their parents just as the parents are about to check out. It is a persuasion architecture, and it works; that is why you see it in every supermarket. In the physical world, though, such persuasion architectures are limited, because you can only fit so many things by the cashier, right? And the candy and gum are the same for everyone, even though the trick mostly works on shoppers who have whiny little humans beside them. In the physical world, we live with those limitations.
In the digital world, persuasion architectures can be built at the scale of billions, tailored to individuals one by one, and delivered to everyone's private phone screen. That is one of the basic things that artificial intelligence can do.
Say you want to sell plane tickets to Vegas. Right? In the old world, you would think of some demographics to target based on experience and what you can guess: perhaps people with a high limit on their credit card. With machine learning, it works differently. Think of all the data that Facebook has on you: every status update, every photo that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps and analyzes that, too. Increasingly, it tries to match you with your offline data, and it also purchases a lot of data from data brokers, everything from your financial records to a good chunk of your browsing history. In the US, such data is routinely collected, collated and sold.
These machine-learning algorithms (that is why they are called learning algorithms) learn the characteristics of people who bought tickets to Vegas before, and having learned that from existing data, they learn how to apply it to new people. Presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not. Fine, you may think: it is only an offer to buy tickets to Vegas, and I can ignore it.
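The learning step described here (learn the traits of past Vegas buyers from existing data, then score people the system has never seen) can be sketched with a toy classifier. Everything below, the features, the numbers and the people, is invented for illustration; real systems use vastly more data and far more complex models.

```python
import math

# Minimal logistic regression trained by gradient descent: a toy stand-in
# for "learning the characteristics of people who bought tickets before."
def train(rows, labels, lr=0.1, epochs=200):
    w = [0.0] * (len(rows[0]) + 1)          # bias + one weight per feature
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            p = 1 / (1 + math.exp(-z))      # predicted probability of buying
            err = y - p
            w[0] += lr * err
            for i, xi in enumerate(x):
                w[i + 1] += lr * err * xi
    return w

def predict(w, x):
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1 / (1 + math.exp(-z))

# Hypothetical features: [recent casino-related searches, spending score]
past_people = [[5, 0.9], [4, 0.8], [0, 0.1], [1, 0.2]]
bought      = [1, 1, 0, 0]                  # did each person buy a ticket?

w = train(past_people, bought)
print(predict(w, [6, 0.95]))                # scoring someone never seen before
```

The point of the sketch is the workflow, not the model: past behavior in, learned weights out, then a score for any new person.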
But that is not the problem. The problem is that we no longer really understand how these complex algorithms work. We do not understand how they do this categorization. It is giant matrices, thousands of rows and columns, and nobody who looks at them understands how exactly the system is operating, any more than you would know what I was thinking right now if you were shown a cross section of my brain. We are growing an intelligence that we don't truly understand. And these systems only work if there's an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine-learning algorithms can work.
That is why Facebook wants to collect all the data it can about you. Now push the Vegas example further. What if a system that we do not understand has figured out that it is easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase, people who tend to become overspenders and compulsive gamblers? The system could do this, and you would have no clue that is what it was picking up on. I gave this example to a bunch of computer scientists once, and afterwards one of them came up to me, troubled, and said, "That's why I couldn't publish it." He had tried to see whether you could indeed figure out the onset of mania from social media posts before clinical symptoms appear, and it had worked, though he had no idea how it worked or what it was picking up on. The problem is not solved if he doesn't publish it, because companies are already developing this kind of technology, and much of it is just off the shelf.
Have you ever gone on YouTube meaning to watch one video and ended up watching dozens? YouTube has that column on the right that suggests what you might be interested in, based on what you have watched and what people like you have watched, and autoplays it. It infers what you're interested in and serves up more. It sounds like a benign and useful feature, except when it isn't.
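YouTube's actual recommender is proprietary, but the "what people like you have watched" mechanic is, in its simplest form, collaborative filtering. Here is a toy sketch with invented watch histories; the video names are hypothetical:

```python
from collections import Counter

def recommend(my_history, all_histories, n=1):
    """Suggest videos co-watched by users whose history overlaps mine."""
    my_set = set(my_history)
    scores = Counter()
    for history in all_histories:
        overlap = len(my_set & set(history))   # how much this user is "like you"
        if overlap == 0:
            continue
        for video in history:
            if video not in my_set:
                scores[video] += overlap       # weight votes by similarity
    return [v for v, _ in scores.most_common(n)]

histories = [["rally", "debate", "conspiracy"],
             ["rally", "conspiracy"],
             ["cooking", "gardening"]]
print(recommend(["rally"], histories))         # prints ['conspiracy']
```

Even this crude version shows the dynamic the talk describes: whatever the most engaged similar users watched next is exactly what gets pushed at you.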
In 2016, I attended rallies of then-candidate Donald Trump to study, as a scholar, the movement supporting him. I study social movements, so I was studying it, too. I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube then began recommending and autoplaying white supremacist videos to me, each more extreme than the last. And if you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy-left videos, and it goes downhill from there.
You might think this is about politics, but it's not; the algorithm is simply figuring out human behavior. I once watched a video about vegetarianism on YouTube, and YouTube recommended and autoplayed a video about being vegan. It's as if you're never hardcore enough for YouTube. The algorithm seems to have figured out that if it can show people something more hardcore, they are more likely to stay on the site, watching video after video, going down that rabbit hole while being served ads.
With nobody minding the ethics of the store, these platforms let advertisers profile people who post explicit anti-Semitic content and target them with ads. They can also find look-alike audiences: people who have no such content on their profile but whom the algorithm detects may be susceptible to such messages, and advertisers can target them, too. This may sound like an implausible example, but it is real. ProPublica found you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden the audience. BuzzFeed tried it on Google, and very quickly they found you can do it there, too, and it was not even expensive: the reporter spent about 30 dollars.
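The "look-alike audience" idea, finding users who resemble a seed set the advertiser already targets, can be sketched as nearest-to-centroid matching. The users, feature vectors and numbers below are all invented; real platforms use far richer signals:

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def lookalikes(seed_vectors, candidates, k=2):
    """Return the k candidate user ids closest to the seed set's centroid."""
    n = len(seed_vectors[0])
    centroid = [sum(v[i] for v in seed_vectors) / len(seed_vectors)
                for i in range(n)]
    ranked = sorted(candidates.items(),
                    key=lambda kv: cosine(kv[1], centroid), reverse=True)
    return [uid for uid, _ in ranked[:k]]

seed = [[1.0, 0.9, 0.1], [0.9, 1.0, 0.0]]        # users already being targeted
others = {"u1": [0.95, 0.9, 0.05],               # behaves like the seed users
          "u2": [0.1, 0.0, 1.0],                 # behaves nothing like them
          "u3": [0.9, 0.8, 0.1]}
print(lookalikes(seed, others))                  # prints ['u1', 'u3']
```

Note what this implies: the expanded audience never expressed the targeted trait themselves; they were merely close to people who did.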
Last year, Donald Trump's social media manager disclosed that the campaign had used Facebook dark posts to demobilize people, that is, to convince them not to vote at all. They targeted specific groups, for example African-American men in key cities like Philadelphia. Here is what he said: these were nonpublic posts whose viewership the campaign controls, so that "only the people we want to see it see it," with the goal of dramatically affecting her ability "to turn these people out."
Facebook also algorithmically arranges the posts from your friends and from the pages you follow. It does not show you everything chronologically; it orders the feed in whatever way the algorithm thinks will entice you to stay longer. You may think somebody is snubbing you on Facebook, when in fact the algorithm may never be showing your post to them: it is prioritizing some posts and burying the others. And what the algorithm picks to show you can affect your emotions.
It can also affect political behavior. In the 2010 midterms, Facebook ran an experiment on 61 million people in the US. Some people were shown a plain "Today is election day" message, while others saw the one with a tiny tweak: thumbnails of friends who had clicked "I voted." That small difference measurably increased turnout, and in 2012 they repeated the same experiment with a similar effect. For scale, the 2016 US presidential election was decided by about 100,000 votes. Facebook can also very easily infer what your politics are, even if you have never disclosed them on the site; these algorithms can do that quite easily. What would happen if a platform with that kind of power decided to turn out supporters of one candidate and not the other?
We started from something seemingly innocuous, online ads following us around, and we have landed somewhere else: as a public, we no longer know whether we are seeing the same information as anyone else, and we are only at the beginning stages of this. Such algorithms can infer people's personality traits and use of addictive substances, and can identify protesters even if their faces are partially concealed. There are even claims of systems that detect people's sexual orientation. These are probabilistic guesses and will not be 100 percent right, but I do not see the powerful resisting the temptation to use these technologies merely because there are some false positives (cases flagged as positive that actually are not), which will of course create a whole other layer of problems.
Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people. And here is the tragedy: we are building this infrastructure of surveillance authoritarianism merely to get people to click on ads. This will not be Orwell's authoritarianism. If a regime is using overt fear to terrorize us, we will all be scared, but we will know it and we will resist it. But if the people in power are using these algorithms to quietly watch us, judge us and nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one through their personal weaknesses and vulnerabilities, all on private screens so that we do not even know what our fellow citizens and neighbors are seeing, then that authoritarianism will envelop us like a spider's web, and we may not even know we're in it.
Facebook works well as a persuasion architecture, but the structure is the same whether you are selling shoes or selling politics. The same algorithms set loose on us to make us more pliable for ads are also organizing our political, personal and social information flows. Now, don't get me wrong: we use these platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world, and I have written about how crucial social media is for social movements.
I have also studied how these technologies can be used to circumvent censorship. It is not that the people who run, you know, Facebook or Google are maliciously trying to make the country or the world more polarized; I have read the many well-intentioned statements they put out. But it is not the intent or the statements people in technology make that matter, it is the structures and business models they're building. Either Facebook is a giant con of half a trillion dollars and ads don't work on the site, meaning it does not function as a persuasion architecture, or its power of influence is of great concern. It is one or the other.
We need to restructure the whole way our digital technology operates: everything from how technology is developed to how the incentives, economic and otherwise, are built into the system. We have to face the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning's opacity, and all the indiscriminate data that's being collected about us. We have to build artificial intelligence that supports our human goals and is constrained by our human values. I understand this will not be easy; we might not even easily agree on what those terms mean. But if we take seriously how these systems that we depend on for so much operate, I do not see how we can postpone this conversation anymore. Many of these platforms boast that they are free; in this context, that means we are the product that's being sold. We need a digital economy where our data and our attention are not for sale to the highest-bidding authoritarian or demagogue.
To return to that Hollywood paraphrase: we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now.
ABOUT THE SPEAKER
Zeynep Tufekci - Techno-sociologist
Why you should listen
We've entered an era of digital connectivity and machine intelligence. Complex algorithms are increasingly used to make consequential decisions about us. Many of these decisions are subjective and have no right answer: who should be hired, fired or promoted; what news should be shown to whom; which of your friends do you see updates from; which convict should be paroled. With increasing use of machine learning in these systems, we often don't even understand how exactly they are making these decisions. Zeynep Tufekci studies what this historic transition means for culture, markets, politics and personal life.
Tufekci is a contributing opinion writer at the New York Times, an associate professor at the School of Information and Library Science at the University of North Carolina, Chapel Hill, and a faculty associate at Harvard's Berkman Klein Center for Internet and Society.
Her book, Twitter and Tear Gas: The Power and Fragility of Networked Protest, was published in 2017 by Yale University Press. Her next book, from Penguin Random House, will be about algorithms that watch, judge and nudge us.