ABOUT THE SPEAKER
Kevin Slavin - Algoworld expert
Kevin Slavin navigates in the algoworld, the expanding space in our lives that’s determined and run by algorithms.

Why you should listen

Are you addicted to the dead-simple numbers game Drop 7 or Facebook’s Parking Wars? Blame Kevin Slavin and the game development company he co-founded in 2005, Area/Code, which makes clever game entertainments that enter the fabric of reality.

All this fun is powered by algorithms -- as, increasingly, is our daily life. From the Google algorithms to the algos that give you “recommendations” online to those that automatically play the stock markets (and sometimes crash them): we may not realize it, but we live in the algoworld.

He says: "The quickest way to find out what the boundaries of reality are is to figure where they break."

More profile about the speaker
Kevin Slavin | Speaker | TED.com
TEDGlobal 2011

Kevin Slavin: How algorithms shape our world

4,199,898 views

We live in a world run by algorithms, computer programs that make decisions or solve problems for us. In this riveting, funny talk, Kevin Slavin shows how modern algorithms determine stock prices, espionage tactics, even the movies you watch. But, he asks: If we depend on complex algorithms to manage our daily decisions -- when do we start to lose control?

Transcript

00:15
This is a photograph by the artist Michael Najjar, and it's real, in the sense that he went there to Argentina to take the photo. But it's also a fiction. There's a lot of work that went into it after that. And what he's done is he's actually reshaped, digitally, all of the contours of the mountains to follow the vicissitudes of the Dow Jones index. So what you see, that precipice, that high precipice with the valley, is the 2008 financial crisis. The photo was made when we were deep in the valley over there. I don't know where we are now. This is the Hang Seng index for Hong Kong. And similar topography. I wonder why. And this is art. This is metaphor.

01:00
But I think the point is that this is metaphor with teeth, and it's with those teeth that I want to propose today that we rethink a little bit about the role of contemporary math -- not just financial math, but math in general. That its transition from being something that we extract and derive from the world to something that actually starts to shape it -- the world around us and the world inside us. And it's specifically algorithms, which are basically the math that computers use to decide stuff. They acquire the sensibility of truth because they repeat over and over again, and they ossify and calcify, and they become real.

01:42
And I was thinking about this, of all places, on a transatlantic flight a couple of years ago, because I happened to be seated next to a Hungarian physicist about my age and we were talking about what life was like during the Cold War for physicists in Hungary. And I said, "So what were you doing?" And he said, "Well we were mostly breaking stealth." And I said, "That's a good job. That's interesting. How does that work?" And to understand that, you have to understand a little bit about how stealth works.

02:11
And so -- this is an over-simplification -- but basically, it's not like you can just pass a radar signal right through 156 tons of steel in the sky. It's not just going to disappear. But if you can take this big, massive thing, and you could turn it into a million little things -- something like a flock of birds -- well then the radar that's looking for that has to be able to see every flock of birds in the sky. And if you're a radar, that's a really bad job.

02:44
And he said, "Yeah." He said, "But that's if you're a radar. So we didn't use a radar; we built a black box that was looking for electrical signals, electronic communication. And whenever we saw a flock of birds that had electronic communication, we thought, 'Probably has something to do with the Americans.'"

03:01
And I said, "Yeah. That's good. So you've effectively negated 60 years of aeronautic research. What's your act two? What do you do when you grow up?" And he said, "Well, financial services." And I said, "Oh." Because those had been in the news lately. And I said, "How does that work?" And he said, "Well there's 2,000 physicists on Wall Street now, and I'm one of them." And I said, "What's the black box for Wall Street?" And he said, "It's funny you ask that, because it's actually called black box trading. And it's also sometimes called algo trading, algorithmic trading."

03:41
And algorithmic trading evolved in part because institutional traders have the same problems that the United States Air Force had, which is that they're moving these positions -- whether it's Procter & Gamble or Accenture, whatever -- they're moving a million shares of something through the market. And if they do that all at once, it's like playing poker and going all in right away. You just tip your hand. And so they have to find a way -- and they use algorithms to do this -- to break up that big thing into a million little transactions. And the magic and the horror of that is that the same math that you use to break up the big thing into a million little things can be used to find a million little things and sew them back together and figure out what's actually happening in the market.

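A minimal Python sketch of that slicing idea: one large parent order is split into hundreds of smaller, size-jittered child orders so that it is harder to spot in the stream of trades. The function, the slice count and the jitter are illustrative only, not any real execution algorithm; the firms that hunt these orders effectively run the inverse, clustering the small prints back into a single footprint.

    import random

    def slice_order(total_shares, n_slices, jitter=0.3, seed=42):
        """Split one large parent order into many small, irregular child orders."""
        rng = random.Random(seed)
        # Jittered weights make the child sizes irregular, so they are harder
        # to attribute to a single parent order.
        weights = [1 + rng.uniform(-jitter, jitter) for _ in range(n_slices)]
        scale = total_shares / sum(weights)
        sizes = [int(w * scale) for w in weights]
        sizes[-1] += total_shares - sum(sizes)  # absorb rounding drift
        return sizes

    children = slice_order(total_shares=1_000_000, n_slices=500)
    print(len(children), sum(children), min(children), max(children))
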
04:27
So if you need to have some image of what's happening in the stock market right now, what you can picture is a bunch of algorithms that are basically programmed to hide, and a bunch of algorithms that are programmed to go find them and act. And all of that's great, and it's fine. And that's 70 percent of the United States stock market, 70 percent of the operating system formerly known as your pension, your mortgage.

04:55
And what could go wrong? What could go wrong is that a year ago, nine percent of the entire market just disappeared in five minutes, and they called it the Flash Crash of 2:45. All of a sudden, nine percent just goes away, and nobody to this day can even agree on what happened, because nobody ordered it, nobody asked for it. Nobody had any control over what was actually happening. All they had was just a monitor in front of them that had the numbers on it and just a red button that said, "Stop."

05:30
And that's the thing: we're writing things, we're writing these things that we can no longer read. And we've rendered something illegible, and we've lost the sense of what's actually happening in this world that we've made. And we're starting to make our way.

05:50
There's a company in Boston called Nanex,
144
335000
3000
05:53
and they use math and magic
145
338000
2000
05:55
and I don't know what,
146
340000
2000
05:57
and they reach into all the market data
147
342000
2000
05:59
and they find, actually sometimes, some of these algorithms.
148
344000
3000
06:02
And when they find them they pull them out
149
347000
3000
06:05
and they pin them to the wall like butterflies.
150
350000
3000
06:08
And they do what we've always done
151
353000
2000
06:10
when confronted with huge amounts of data that we don't understand --
152
355000
3000
06:13
which is that they give them a name
153
358000
2000
06:15
and a story.
154
360000
2000
06:17
So this is one that they found,
155
362000
2000
06:19
they called the Knife,
156
364000
4000
06:23
the Carnival,
157
368000
2000
06:25
the Boston Shuffler,
158
370000
4000
06:29
Twilight.
159
374000
2000
06:31
And the gag is
160
376000
2000
06:33
that, of course, these aren't just running through the market.
161
378000
3000
06:36
You can find these kinds of things wherever you look,
162
381000
3000
06:39
once you learn how to look for them.
163
384000
2000
06:41
You can find it here: this book about flies
164
386000
3000
06:44
that you may have been looking at on Amazon.
165
389000
2000
06:46
You may have noticed it
166
391000
2000
06:48
when its price started at 1.7 million dollars.
167
393000
2000
06:50
It's out of print -- still ...
168
395000
2000
06:52
(Laughter)
169
397000
2000
06:54
If you had bought it at 1.7, it would have been a bargain.
170
399000
3000
06:57
A few hours later, it had gone up
171
402000
2000
06:59
to 23.6 million dollars,
172
404000
2000
07:01
plus shipping and handling.
173
406000
2000
07:03
And the question is:
174
408000
2000
07:05
Nobody was buying or selling anything; what was happening?
175
410000
2000
07:07
And you see this behavior on Amazon as surely as you see it on Wall Street. And when you see this kind of behavior, what you see is the evidence of algorithms in conflict, algorithms locked in loops with each other, without any human oversight, without any adult supervision to say, "Actually, 1.7 million is plenty." (Laughter)

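A toy sketch of that kind of loop, with two repricing bots and nobody watching: one always prices just below its rival, the other always prices at a markup over its rival (betting, say, on better seller ratings). Because the product of the two multipliers is greater than one, each pass ratchets the price upward. The starting prices and multipliers here are illustrative, not the actual sellers' settings.

    # Two pricing bots with no human in the loop.
    a_price, b_price = 35.0, 40.0      # hypothetical starting prices
    UNDERCUT, MARKUP = 0.998, 1.27     # illustrative multipliers

    for day in range(1, 31):
        a_price = UNDERCUT * b_price   # bot A undercuts bot B
        b_price = MARKUP * a_price     # bot B marks up over bot A
        print(f"day {day:2d}: A=${a_price:,.2f}  B=${b_price:,.2f}")

After a month of cycles the pair is already quoting tens of thousands of dollars for a book nobody is buying; left alone long enough, you get to 23.6 million.
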
07:30
And as with Amazon, so it is with Netflix. And so Netflix has gone through several different algorithms over the years. They started with Cinematch, and they've tried a bunch of others -- there's Dinosaur Planet; there's Gravity. They're using Pragmatic Chaos now. Pragmatic Chaos is, like all of Netflix's algorithms, trying to do the same thing. It's trying to get a grasp on you, on the firmware inside the human skull, so that it can recommend what movie you might want to watch next -- which is a very, very difficult problem.

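The simplest version of that problem is collaborative filtering: find people whose ratings look like yours and borrow their opinions about the things you haven't seen. The sketch below is a toy user-based variant with made-up ratings; it has nothing like the scale or sophistication of Cinematch or Pragmatic Chaos, but the basic move is the same.

    import math

    # Toy ratings: user -> {movie: rating}.
    ratings = {
        "ana":  {"Alien": 5, "Heat": 4, "Up": 1},
        "ben":  {"Alien": 4, "Heat": 5, "Drive": 4},
        "cleo": {"Up": 5, "Drive": 2, "Heat": 1},
    }

    def cosine(u, v):
        """Cosine similarity between two sparse rating vectors."""
        shared = set(u) & set(v)
        if not shared:
            return 0.0
        dot = sum(u[m] * v[m] for m in shared)
        norm = (math.sqrt(sum(x * x for x in u.values()))
                * math.sqrt(sum(x * x for x in v.values())))
        return dot / norm

    def recommend(user, k=1):
        """Score unseen movies by similarity-weighted ratings from other users."""
        seen = ratings[user]
        scores = {}
        for other, theirs in ratings.items():
            if other == user:
                continue
            sim = cosine(seen, theirs)
            for movie, r in theirs.items():
                if movie not in seen:
                    scores[movie] = scores.get(movie, 0.0) + sim * r
        return sorted(scores, key=scores.get, reverse=True)[:k]

    print(recommend("ana"))
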
07:59
But the difficulty of the problem, and the fact that we don't really quite have it down, doesn't take away from the effects Pragmatic Chaos has. Pragmatic Chaos, like all Netflix algorithms, determines, in the end, 60 percent of what movies end up being rented. So one piece of code with one idea about you is responsible for 60 percent of those movies.

08:25
But what if you could rate those movies before they get made? Wouldn't that be handy? Well, a few data scientists from the U.K. are in Hollywood, and they have "story algorithms" -- a company called Epagogix. And you can run your script through there, and they can tell you, quantifiably, that that's a 30 million dollar movie or a 200 million dollar movie.

08:47
And the thing is, this isn't Google. This isn't information. These aren't financial stats; this is culture. And what you see here, or what you don't really see normally, is that these are the physics of culture. And if these algorithms, like the algorithms on Wall Street, just crashed one day and went awry, how would we know? What would it look like?

09:12
And they're in your house. They're in your house. These are two algorithms competing for your living room. These are two different cleaning robots that have very different ideas about what clean means. And you can see it if you slow it down and attach lights to them, and they're sort of like secret architects in your bedroom. And the idea that architecture itself is somehow subject to algorithmic optimization is not far-fetched. It's super-real and it's happening around you.

09:40
You feel it most when you're in a sealed metal box, a new-style elevator; they're called destination-control elevators. These are the ones where you have to press what floor you're going to go to before you get in the elevator. And it uses what's called a bin-packing algorithm. So none of this mishegas of letting everybody go into whatever car they want. Everybody who wants to go to the 10th floor goes into car two, and everybody who wants to go to the third floor goes into car five.

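A greedy sketch of that grouping step: passengers waiting in the lobby are bucketed by destination floor, and the buckets are packed into cars so that each car makes as few stops as possible. Real destination-dispatch controllers also weigh wait times, travel times and where the cars already are; the floors, car count and capacity below are made up.

    from collections import Counter

    def assign_cars(requests, n_cars, capacity):
        """First-fit-decreasing packing of destination groups into elevator cars."""
        cars = [{"floors": [], "load": 0} for _ in range(n_cars)]
        # Biggest destination groups first; a group larger than one car would
        # need splitting, which is omitted here for brevity.
        for floor, count in Counter(requests).most_common():
            for car in cars:
                if car["load"] + count <= capacity:
                    car["floors"].append(floor)
                    car["load"] += count
                    break
        return cars

    lobby = [10, 10, 10, 3, 3, 7, 10, 3, 7, 12]   # one entry per waiting passenger
    for i, car in enumerate(assign_cars(lobby, n_cars=3, capacity=6), start=1):
        print(f"car {i}: stops at {sorted(car['floors'])} with {car['load']} riders")
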
10:04
And the problem with that is that people freak out. People panic. And you see why. You see why. It's because the elevator is missing some important instrumentation, like the buttons. (Laughter) Like the things that people use. All it has is just the number that moves up or down and that red button that says, "Stop."

10:29
And this is what we're designing for. We're designing for this machine dialect. And how far can you take that? How far can you take it? You can take it really, really far.

10:41
So let me take it back to Wall Street. Because the algorithms of Wall Street are dependent on one quality above all else, which is speed. And they operate on milliseconds and microseconds. And just to give you a sense of what microseconds are, it takes you 500,000 microseconds just to click a mouse. But if you're a Wall Street algorithm and you're five microseconds behind, you're a loser.

11:07
So if you were an algorithm, you'd look for an architect like the one that I met in Frankfurt who was hollowing out a skyscraper -- throwing out all the furniture, all the infrastructure for human use, and just running steel on the floors to get ready for the stacks of servers to go in -- all so an algorithm could get close to the Internet. And you think of the Internet as this kind of distributed system. And of course, it is, but it's distributed from places. In New York, this is where it's distributed from: the Carrier Hotel located on Hudson Street. And this is really where the wires come right up into the city. And the reality is that the further away you are from that, you're a few microseconds behind every time.

11:49
These guys down on Wall Street, Marco Polo and Cherokee Nation, they're eight microseconds behind all these guys going into the empty buildings being hollowed out up around the Carrier Hotel. And that's going to keep happening. We're going to keep hollowing them out, because you, inch for inch and pound for pound and dollar for dollar, none of you could squeeze revenue out of that space like the Boston Shuffler could.

12:20
But if you zoom out, if you zoom out, you would see an 825-mile trench between New York City and Chicago that's been built over the last few years by a company called Spread Networks. This is a fiber optic cable that was laid between those two cities to just be able to traffic one signal 37 times faster than you can click a mouse -- just for these algorithms, just for the Carnival and the Knife.

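Those numbers hang together if you assume light in optical fiber travels at roughly two-thirds of its vacuum speed, about five microseconds per kilometer one way; the back-of-the-envelope check below uses only that assumption plus the figures from the talk.

    # Rough consistency check on the latency figures in the talk.
    route_km = 825 * 1.609            # the 825-mile trench, in kilometers
    one_way_us = route_km * 5         # ~5 microseconds per km in fiber
    round_trip_ms = 2 * one_way_us / 1000
    mouse_click_ms = 500_000 / 1000   # the 500,000-microsecond mouse click

    print(f"round trip : ~{round_trip_ms:.1f} ms")
    print(f"mouse click: {mouse_click_ms:.0f} ms, about "
          f"{mouse_click_ms / round_trip_ms:.0f}x slower")

A round trip of roughly 13 milliseconds against a 500-millisecond click lands right around the "37 times faster" figure in the talk.
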
12:51
And when you think about this, that we're running through the United States with dynamite and rock saws so that an algorithm can close the deal three microseconds faster, all for a communications framework that no human will ever know, that's a kind of manifest destiny; and we'll always look for a new frontier. Unfortunately, we have our work cut out for us.

13:18
This is just theoretical. This is some mathematicians at MIT. And the truth is I don't really understand a lot of what they're talking about. It involves light cones and quantum entanglement, and I don't really understand any of that. But I can read this map, and what this map says is that, if you're trying to make money on the markets where the red dots are, that's where people are, where the cities are, you're going to have to put the servers where the blue dots are to do that most effectively. And the thing that you might have noticed about those blue dots is that a lot of them are in the middle of the ocean.

13:51
So that's what we'll do: we'll build bubbles or something, or platforms. We'll actually part the water to pull money out of the air, because it's a bright future if you're an algorithm. (Laughter)

14:06
And it's not the money that's so interesting actually. It's what the money motivates, that we're actually terraforming the Earth itself with this kind of algorithmic efficiency. And in that light, you go back and you look at Michael Najjar's photographs, and you realize that they're not metaphor, they're prophecy. They're prophecy for the kind of seismic, terrestrial effects of the math that we're making.

14:34
And the landscape was always made by this sort of weird, uneasy collaboration between nature and man. But now there's this third co-evolutionary force: algorithms -- the Boston Shuffler, the Carnival. And we will have to understand those as nature, and in a way, they are.

14:54
Thank you. (Applause)

