ABOUT THE SPEAKER
Zeynep Tufekci - Techno-sociologist
Techno-sociologist Zeynep Tufekci asks big questions about our societies and our lives, as both algorithms and digital connectivity spread.

Why you should listen

We've entered an era of digital connectivity and machine intelligence. Complex algorithms are increasingly used to make consequential decisions about us. Many of these decisions are subjective and have no right answer: who should be hired, fired or promoted; what news should be shown to whom; which of your friends do you see updates from; which convict should be paroled. With increasing use of machine learning in these systems, we often don't even understand how exactly they are making these decisions. Zeynep Tufekci studies what this historic transition means for culture, markets, politics and personal life.

Tufekci is a contributing opinion writer at the New York Times, an associate professor at the School of Information and Library Science at the University of North Carolina, Chapel Hill, and a faculty associate at Harvard's Berkman Klein Center for Internet and Society.

Her book, Twitter and Tear Gas: The Power and Fragility of Networked Protest, was published in 2017 by Yale University Press. Her next book, from Penguin Random House, will be about algorithms that watch, judge and nudge us.

TEDGlobal>NYC

Zeynep Tufekci: We're building a dystopia just to make people click on ads

2,866,905 views

We're building an artificial intelligence-powered dystopia, one click at a time, says techno-sociologist Zeynep Tufekci. In an eye-opening talk, she details how the same algorithms companies like Facebook, Google and Amazon use to get you to click on ads are also used to organize your access to political and social information. And the machines aren't even the real threat. What we need to understand is how the powerful might use AI to control us -- and what we can do in response.

00:12
So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok. You know? Terminator? You know, that might be something to consider, but that's a distant threat. Or, we fret about digital surveillance with metaphors from the past. "1984," George Orwell's "1984," it's hitting the bestseller lists again. It's a great book, but it's not the correct dystopia for the 21st century.

00:44
What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways. Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others: Facebook, Google, Amazon, Alibaba, Tencent.

01:26
Now, artificial intelligence has started bolstering their business as well. And it may seem like artificial intelligence is just the next thing after online ads. It's not. It's a jump in category. It's a whole different world, and it has great potential. It could accelerate our understanding of many areas of study and research. But to paraphrase a famous Hollywood philosopher, "With prodigious potential comes prodigious risk."
02:01
Now let's look at a basic fact of our digital lives, online ads. Right? We kind of dismiss them. They seem crude, ineffective. We've all had the experience of being followed on the web by an ad based on something we searched or read. You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go. Even after you succumb and buy them, they're still following you around. We're kind of inured to that kind of basic, cheap manipulation. We roll our eyes and we think, "You know what? These things don't work." Except, online, the digital technologies are not just ads.

02:40
Now, to understand that, let's think of a physical world example. You know how, at the checkout counters at supermarkets, near the cashier, there's candy and gum at the eye level of kids? That's designed to make them whine at their parents just as the parents are about to sort of check out. Now, that's a persuasion architecture. It's not nice, but it kind of works. That's why you see it in every supermarket. Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right? And the candy and gum, it's the same for everyone, even though it mostly works only for people who have whiny little humans beside them. In the physical world, we live with those limitations.

03:34
In the digital world, though, persuasion architectures can be built at the scale of billions, and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone's private phone screen, so it's not visible to us. And that's different. And that's just one of the basic things that artificial intelligence can do.
04:04
Now, let's take an example. Let's say you want to sell plane tickets to Vegas. Right? So in the old world, you could think of some demographics to target based on experience and what you can guess. You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right? That's what you would do in the past. With big data and machine learning, that's not how it works anymore.

04:33
So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there. If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too. Increasingly, it tries to match you with your offline data. It also purchases a lot of data from data brokers. It could be everything from your financial records to a good chunk of your browsing history. Right? In the US, such data is routinely collected, collated and sold. In Europe, they have tougher rules.

05:23
So what happens then is, by churning through all that data, these machine-learning algorithms -- that's why they're called learning algorithms -- they learn to understand the characteristics of people who purchased tickets to Vegas before. When they learn this from existing data, they also learn how to apply this to new people. So if they're presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not.
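To make the mechanism concrete: a minimal sketch of this kind of propensity model, written in Python with scikit-learn and entirely made-up data, might look like the following. It illustrates the idea of learning from past buyers and scoring new people; it is not any platform's actual system, and every feature and label here is hypothetical.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical feature matrix: one row per user, columns are behavioral
# signals (posts, check-ins, purchase history, etc.). Label: 1 if that
# user bought a Vegas ticket in the past. All values are synthetic.
rng = np.random.default_rng(0)
X_past = rng.normal(size=(10_000, 50))
y_past = (X_past[:, 0] + 0.5 * X_past[:, 3] + rng.normal(size=10_000) > 1).astype(int)

# The "learning" step: fit on people who did or did not buy before.
model = LogisticRegression(max_iter=1000).fit(X_past, y_past)

# The "apply to new people" step: score users the model has never seen.
X_new = rng.normal(size=(5, 50))
print(model.predict_proba(X_new)[:, 1])  # estimated probability each new user will buy

Real systems use far richer features and far less interpretable models than a logistic regression, which is exactly the point she makes next.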
05:57
Fine. You're thinking, an offer to buy tickets to Vegas. I can ignore that. But the problem isn't that. The problem is, we no longer really understand how these complex algorithms work. We don't understand how they're doing this categorization. It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it's operating any more than you'd know what I was thinking right now if you were shown a cross section of my brain. It's like we're not programming anymore, we're growing intelligence that we don't truly understand.
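The "giant matrices" are literal: a trained model's behavior lives in arrays of weights that carry no human-readable rules. A toy sketch, again with synthetic data and scikit-learn assumed, shows that even a tiny network already resists inspection.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2_000, 20))
y = (np.sin(X[:, 0]) + X[:, 1] * X[:, 2] > 0).astype(int)  # some hidden rule

net = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500).fit(X, y)

# The entire "explanation" of its decisions is these matrices of numbers.
for i, W in enumerate(net.coefs_):
    print(f"layer {i} weight matrix shape: {W.shape}")
print(net.coefs_[0][:3, :3])  # a peek at the numbers: nothing human-readable here

Production models have millions or billions of such entries, which is why even their builders cannot read off how a particular categorization was made.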
06:52
And these things only work if there's an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work. That's why Facebook wants to collect all the data it can about you. The algorithms work better.

07:08
So let's push that Vegas example a bit. What if the system that we do not understand was picking up that it's easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase. Such people tend to become overspenders, compulsive gamblers. They could do this, and you'd have no clue that's what they were picking up on. I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me. He was troubled and he said, "That's why I couldn't publish it." I was like, "Couldn't publish what?" He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on.

08:06
Now, the problem isn't solved if he doesn't publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf. This is not very difficult anymore.

08:21
Do you ever go on YouTube meaning to watch one video and an hour later you've watched 27? You know how YouTube has this column on the right that says, "Up next" and it autoplays something? It's an algorithm picking what it thinks that you might be interested in and maybe not find on your own. It's not a human editor. It's what algorithms do. It picks up on what you have watched and what people like you have watched, and infers that that must be what you're interested in, what you want more of, and just shows you more. It sounds like a benign and useful feature, except when it isn't.
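The "what you watched and what people like you watched" logic she describes is, at its simplest, collaborative filtering. A minimal Python sketch with a made-up watch matrix gives the flavor; YouTube's actual ranker is proprietary and far more complex (deep models optimizing predicted watch time), so treat this only as an illustration of the inference step.

import numpy as np

# Rows = users, columns = videos; 1 means "watched". Hypothetical data.
watches = np.array([
    [1, 1, 0, 0, 1],
    [1, 1, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 1, 1, 1],
], dtype=float)

def up_next(user_row, watches):
    """Recommend the unwatched video favored by users most similar to this one."""
    norms = np.linalg.norm(watches, axis=1) * np.linalg.norm(user_row)
    sim = watches @ user_row / np.where(norms == 0, 1, norms)  # cosine similarity to other users
    scores = sim @ watches                                     # weight their watches by similarity
    scores[user_row > 0] = -np.inf                             # don't re-recommend what's already watched
    return int(np.argmax(scores))

me = np.array([1, 0, 0, 0, 1], dtype=float)
print("Up next:", up_next(me, watches))

Nothing in this loop asks whether the next item is true, healthy or extreme; it only asks what similar viewers kept watching.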
09:01
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him. I study social movements, so I was studying it, too. And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube. YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too. If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.

09:52
Well, you might be thinking, this is politics, but it's not. This isn't about politics. This is just the algorithm figuring out human behavior. I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan. It's like you're never hardcore enough for YouTube.

(Laughter)

10:14
So what's going on? Now, YouTube's algorithm is proprietary, but here's what I think is going on. The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they're more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.

10:43
Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads. They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too.
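"Look-alike audience" expansion is, in essence, a similarity search: start from a seed set of users and rank everyone else by how closely their profiles resemble it. A stripped-down Python sketch with synthetic user embeddings conveys the mechanic; the real ad platforms' implementations are not public, so this is an assumption-laden illustration, not their method.

import numpy as np

rng = np.random.default_rng(0)
profiles = rng.normal(size=(100_000, 64))                  # hypothetical embedding per user
seed_ids = rng.choice(100_000, size=200, replace=False)    # users who matched the original targeting

# Rank all other users by similarity to the seed users' average profile.
centroid = profiles[seed_ids].mean(axis=0)
sims = profiles @ centroid / (
    np.linalg.norm(profiles, axis=1) * np.linalg.norm(centroid) + 1e-9
)
sims[seed_ids] = -np.inf                                   # exclude the seed itself
lookalikes = np.argsort(sims)[::-1][:1000]                 # the 1,000 most similar users
print(lookalikes[:10])

The expansion step never inspects what the seed audience has in common; if the seed is defined by hateful interests, the look-alikes inherit that targeting silently.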
11:30
Now, this may sound like an implausible example, but this is real. ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience. BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too. And it wasn't even expensive. The ProPublica reporter spent about 30 dollars to target this category.

12:02
So last year, Donald Trump's social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all. And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I'm going to read exactly what he said. I'm quoting. They were using "nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it. We modeled this. It will dramatically affect her ability to turn these people out."

12:45
What's in those dark posts? We have no idea. Facebook won't tell us.

12:52
So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow. It doesn't show you everything chronologically. It puts the order in the way that the algorithm thinks will entice you to stay on the site longer.
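The difference between a chronological timeline and an engagement-ranked feed fits in a few lines. The scoring function below is invented for illustration (Facebook's ranking signals and weights are not public); it only shows how optimizing for predicted time-on-site reorders what you see.

# Hypothetical candidate posts with a few signals a ranker might predict.
posts = [
    {"id": "a", "age_hours": 1.0, "predicted_dwell_seconds": 4.0,  "predicted_click": 0.02},
    {"id": "b", "age_hours": 9.0, "predicted_dwell_seconds": 35.0, "predicted_click": 0.20},
    {"id": "c", "age_hours": 0.2, "predicted_dwell_seconds": 2.0,  "predicted_click": 0.01},
]

def engagement_score(p):
    # Toy objective: expected time on site, lightly discounted by post age.
    return p["predicted_dwell_seconds"] * (1 + 5 * p["predicted_click"]) / (1 + 0.05 * p["age_hours"])

chronological = sorted(posts, key=lambda p: p["age_hours"])
ranked = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in chronological])  # what a plain timeline would show: c, a, b
print([p["id"] for p in ranked])         # what an engagement-optimized feed shows: b, a, c

Whatever the real weights are, the consequence she describes follows: some posts get surfaced and others get buried, on criteria the reader never sees.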
13:11
Now, so this has a lot of consequences. You may be thinking somebody is snubbing you on Facebook. The algorithm may never be showing your post to them. The algorithm is prioritizing some of them and burying the others.

13:29
Experiments show that what the algorithm picks to show you can affect your emotions. But that's not all. It also affects political behavior.

13:41
So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact. So some people were shown, "Today is election day," the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on "I voted." This simple tweak. OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls.

14:32
A fluke? No. Because in 2012, they repeated the same experiment. And that time, that civic message shown just once turned out an additional 270,000 voters. For reference, the 2016 US presidential election was decided by about 100,000 votes.

15:01
Now, Facebook can also very easily infer what your politics are, even if you've never disclosed them on the site. Right? These algorithms can do that quite easily. What if a platform with that kind of power decides to turn out supporters of one candidate over the other? How would we even know about it?

15:25
Now, we started from someplace seemingly innocuous -- online ads following us around -- and we've landed someplace else. As a public and as citizens, we no longer know if we're seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we're just at the beginning stages of this. These algorithms can quite easily infer things like people's ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and gender, just from Facebook likes.
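The research she alludes to fit fairly simple models to patterns of likes. A toy version, with entirely synthetic data standing in for both the likes and the private trait, shows how little machinery such inference needs; the specific pages and traits here are invented for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 20_000, 500
likes = (rng.random((n_users, n_pages)) < 0.05).astype(float)  # hypothetical user-by-page "like" matrix

# Pretend a handful of pages correlate with some private trait (synthetic stand-in label).
trait = (likes[:, :10].sum(axis=1) + rng.normal(scale=0.5, size=n_users) > 1).astype(int)

# A plain logistic regression already recovers the trait from likes alone.
model = LogisticRegression(max_iter=2000).fit(likes[:15_000], trait[:15_000])
print("held-out accuracy:", model.score(likes[15_000:], trait[15_000:]))

These are probabilistic guesses, as she says next, but they require no special access: just the likes and a label to train against.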
16:13
These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people's sexual orientation just from their dating profile pictures.

16:33
Now, these are probabilistic guesses, so they're not going to be 100 percent right, but I don't see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people.

17:05
And here's the tragedy: we're building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won't be Orwell's authoritarianism. This isn't "1984." Now, if authoritarianism is using overt fear to terrorize us, we'll all be scared, but we'll know it, we'll hate it and we'll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they're doing it at scale through our private screens so that we don't even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider's web and we may not even know we're in it.

18:22
So Facebook's market capitalization is approaching half a trillion dollars. It's because it works great as a persuasion architecture. But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change.

19:02
Now, don't get me wrong, we use digital platforms because they provide us with great value. I use Facebook to keep in touch with friends and family around the world. I've written about how crucial social media is for social movements. I have studied how these technologies can be used to circumvent censorship around the world.

19:27
But it's not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism. I read the many well-intentioned statements that these people put out. But it's not the intent or the statements people in technology make that matter, it's the structures and business models they're building. And that's the core of the problem. Either Facebook is a giant con of half a trillion dollars and ads don't work on the site, it doesn't work as a persuasion architecture, or its power of influence is of great concern. It's either one or the other. It's similar for Google, too.

20:24
So what can we do? This needs to change. Now, I can't offer a simple recipe, because we need to restructure the whole way our digital technology operates. Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system. We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning's opacity, all this indiscriminate data that's being collected about us.

21:05
We have a big task in front of us. We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values. And I understand this won't be easy. We might not even easily agree on what those terms mean. But if we take seriously how these systems that we depend on for so much operate, I don't see how we can postpone this conversation anymore. These structures are organizing how we function and they're controlling what we can and we cannot do. And many of these ad-financed platforms, they boast that they're free. In this context, it means that we are the product that's being sold.

22:10
We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue.

(Applause)

22:30
So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now.

22:48
Thank you.

(Applause)
