TEDGlobal 2011

Kevin Slavin: How algorithms shape our world

July 13, 2011

Kevin Slavin argues that we're living in a world designed for -- and increasingly controlled by -- algorithms. In this riveting talk from TEDGlobal, he shows how these complex computer programs determine espionage tactics, stock prices, movie scripts, and architecture. And he warns that we are writing code we can't understand, with implications we can't control.

Kevin Slavin - Algoworld expert
Kevin Slavin navigates in the algoworld, the expanding space in our lives that’s determined and run by algorithms.

This is a photograph
00:15
by the artist Michael Najjar,
00:17
and it's real,
00:19
in the sense that he went there to Argentina
00:21
to take the photo.
00:23
But it's also a fiction. There's a lot of work that went into it after that.
00:25
And what he's done
00:28
is he's actually reshaped, digitally,
00:30
all of the contours of the mountains
00:32
to follow the vicissitudes of the Dow Jones index.
00:34
So what you see,
00:37
that precipice, that high precipice with the valley,
00:39
is the 2008 financial crisis.
00:41
The photo was made
00:43
when we were deep in the valley over there.
00:45
I don't know where we are now.
00:47
This is the Hang Seng index
00:49
for Hong Kong.
00:51
And similar topography.
00:53
I wonder why.
00:55
And this is art. This is metaphor.
00:57
But I think the point is
01:00
that this is metaphor with teeth,
01:02
and it's with those teeth that I want to propose today
01:04
that we rethink a little bit
01:07
about the role of contemporary math --
01:09
not just financial math, but math in general.
01:12
That its transition
01:15
from being something that we extract and derive from the world
01:17
to something that actually starts to shape it --
01:20
the world around us and the world inside us.
01:23
And it's specifically algorithms,
01:26
which are basically the math
01:28
that computers use to decide stuff.
01:30
They acquire the sensibility of truth
01:33
because they repeat over and over again,
01:35
and they ossify and calcify,
01:37
and they become real.
01:40
And I was thinking about this, of all places,
01:42
on a transatlantic flight a couple of years ago,
01:45
because I happened to be seated
01:48
next to a Hungarian physicist about my age
01:50
and we were talking
01:52
about what life was like during the Cold War
01:54
for physicists in Hungary.
01:56
And I said, "So what were you doing?"
01:58
And he said, "Well we were mostly breaking stealth."
02:00
And I said, "That's a good job. That's interesting.
02:02
How does that work?"
02:04
And to understand that,
02:06
you have to understand a little bit about how stealth works.
02:08
And so -- this is an over-simplification --
02:11
but basically, it's not like
02:14
you can just pass a radar signal
02:16
right through 156 tons of steel in the sky.
02:18
It's not just going to disappear.
02:21
But if you can take this big, massive thing,
02:24
and you could turn it into
02:27
a million little things --
02:30
something like a flock of birds --
02:32
well then the radar that's looking for that
02:34
has to be able to see
02:36
every flock of birds in the sky.
02:38
And if you're a radar, that's a really bad job.
02:40
And he said, "Yeah." He said, "But that's if you're a radar.
02:44
So we didn't use a radar;
02:47
we built a black box that was looking for electrical signals,
02:49
electronic communication.
02:52
And whenever we saw a flock of birds that had electronic communication,
02:55
we thought, 'Probably has something to do with the Americans.'"
02:58
And I said, "Yeah.
03:01
That's good.
03:03
So you've effectively negated
03:05
60 years of aeronautic research.
03:07
What's your act two?
03:09
What do you do when you grow up?"
03:11
And he said,
03:13
"Well, financial services."
03:15
And I said, "Oh."
03:17
Because those had been in the news lately.
03:19
And I said, "How does that work?"
03:22
And he said, "Well there's 2,000 physicists on Wall Street now,
03:24
and I'm one of them."
03:26
And I said, "What's the black box for Wall Street?"
03:28
And he said, "It's funny you ask that,
03:31
because it's actually called black box trading.
03:33
And it's also sometimes called algo trading,
03:36
algorithmic trading."
03:38
And algorithmic trading evolved in part
03:41
because institutional traders have the same problems
03:44
that the United States Air Force had,
03:47
which is that they're moving these positions --
03:50
whether it's Procter & Gamble or Accenture, whatever --
03:53
they're moving a million shares of something
03:55
through the market.
03:57
And if they do that all at once,
03:59
it's like playing poker and going all in right away.
04:01
You just tip your hand.
04:03
And so they have to find a way --
04:05
and they use algorithms to do this --
04:07
to break up that big thing
04:09
into a million little transactions.
04:11
And the magic and the horror of that
04:13
is that the same math
04:15
that you use to break up the big thing
04:17
into a million little things
04:19
can be used to find a million little things
04:21
and sew them back together
04:23
and figure out what's actually happening in the market.
04:25
So if you need to have some image
04:27
of what's happening in the stock market right now,
04:29
what you can picture is a bunch of algorithms
04:32
that are basically programmed to hide,
04:34
and a bunch of algorithms that are programmed to go find them and act.
04:37
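
To make that hide-and-seek a little more concrete, here is a minimal Python sketch of the "breaking up" side: slicing one large parent order into many small, irregular child orders so the parent is harder to spot. Every number and the randomization scheme here are invented for illustration; real execution algorithms (VWAP, TWAP, implementation shortfall and so on) are far more sophisticated, and the "seeking" algorithms Slavin mentions are essentially the inverse, pattern-matching on the stream of small trades to stitch the parent order back together.

import random

def slice_order(total_shares, n_slices, avg_gap_ms=250):
    """Break one large parent order into many small child orders.

    Purely illustrative: sizes and delays are jittered so the parent
    order is harder to reconstruct from the stream of small trades.
    """
    remaining = total_shares
    schedule = []
    for i in range(n_slices):
        slices_left = n_slices - i
        base = remaining // slices_left
        # Jitter each slice by up to +/-50% so the pattern is irregular.
        size = min(remaining, max(1, int(base * random.uniform(0.5, 1.5))))
        if i == n_slices - 1:
            size = remaining  # the last slice takes whatever is left
        delay_ms = random.expovariate(1.0 / avg_gap_ms)  # irregular timing
        schedule.append((round(delay_ms), size))
        remaining -= size
    return schedule

for delay, size in slice_order(total_shares=1_000_000, n_slices=20):
    print(f"wait {delay:>4} ms, then send a child order for {size} shares")
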
And all of that's great, and it's fine.
04:40
And that's 70 percent
04:43
of the United States stock market,
04:45
70 percent of the operating system
04:47
formerly known as your pension,
04:49
your mortgage.
04:52
And what could go wrong?
04:55
What could go wrong
04:57
is that a year ago,
04:59
nine percent of the entire market just disappears in five minutes,
05:01
and they called it the Flash Crash of 2:45.
05:04
All of a sudden, nine percent just goes away,
05:07
and nobody to this day
05:10
can even agree on what happened
05:12
because nobody ordered it, nobody asked for it.
05:14
Nobody had any control over what was actually happening.
05:17
All they had
05:20
was just a monitor in front of them
05:22
that had the numbers on it
05:24
and just a red button
05:26
that said, "Stop."
05:28
And that's the thing,
05:30
is that we're writing things,
05:32
we're writing these things that we can no longer read.
05:34
And we've rendered something
05:37
illegible,
05:39
and we've lost the sense
05:41
of what's actually happening
05:44
in this world that we've made.
05:46
And we're starting to make our way.
05:48
There's a company in Boston called Nanex,
05:50
and they use math and magic
05:53
and I don't know what,
05:55
and they reach into all the market data
05:57
and they find, actually sometimes, some of these algorithms.
05:59
And when they find them they pull them out
06:02
and they pin them to the wall like butterflies.
06:05
And they do what we've always done
06:08
when confronted with huge amounts of data that we don't understand --
06:10
which is that they give them a name
06:13
and a story.
06:15
So this is one that they found,
06:17
they called the Knife,
06:19
the Carnival,
06:23
the Boston Shuffler,
06:25
Twilight.
06:29
And the gag is
06:31
that, of course, these aren't just running through the market.
06:33
You can find these kinds of things wherever you look,
06:36
once you learn how to look for them.
06:39
You can find it here: this book about flies
06:41
that you may have been looking at on Amazon.
06:44
You may have noticed it
06:46
when its price started at 1.7 million dollars.
06:48
It's out of print -- still ...
06:50
(Laughter)
06:52
If you had bought it at 1.7, it would have been a bargain.
06:54
A few hours later, it had gone up
06:57
to 23.6 million dollars,
06:59
plus shipping and handling.
07:01
And the question is:
07:03
Nobody was buying or selling anything; what was happening?
07:05
And you see this behavior on Amazon
07:07
as surely as you see it on Wall Street.
07:09
And when you see this kind of behavior,
07:11
what you see is the evidence
07:13
of algorithms in conflict,
07:15
algorithms locked in loops with each other,
07:17
without any human oversight,
07:19
without any adult supervision
07:21
to say, "Actually, 1.7 million is plenty."
07:24
(Laughter)
07:27
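
The fly-book episode is usually explained as two sellers' repricing bots feeding on each other: one reportedly set its price just below the other's, while the other set its price at roughly 1.27 times the first, so every update cycle multiplied the price. Here is a toy Python simulation of that loop, using the widely reported multipliers purely for illustration; the starting price is an assumption.

# Two repricing rules locked in a loop, in the spirit of the 2011
# "Making of a Fly" incident. The multipliers are the figures widely
# reported at the time; the starting price is assumed for illustration.
UNDERCUT = 0.9983    # seller A: price slightly below seller B's
MARKUP = 1.270589    # seller B: price well above seller A's

price_a = price_b = 35.00
rounds = 0
while price_b < 23_600_000:       # the $23.6 million figure from the talk
    price_a = UNDERCUT * price_b  # A undercuts B
    price_b = MARKUP * price_a    # B marks up over A
    rounds += 1

print(f"after {rounds} repricing rounds, seller B asks ${price_b:,.2f}")
# Each round multiplies the price by 0.9983 * 1.270589, about 1.268,
# so a few dozen daily updates are enough to reach tens of millions.
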
And as with Amazon, so it is with Netflix.
07:30
And so Netflix has gone through
07:33
several different algorithms over the years.
07:35
They started with Cinematch, and they've tried a bunch of others --
07:37
there's Dinosaur Planet; there's Gravity.
07:40
They're using Pragmatic Chaos now.
07:42
Pragmatic Chaos is, like all of Netflix's algorithms,
07:44
trying to do the same thing.
07:46
It's trying to get a grasp on you,
07:48
on the firmware inside the human skull,
07:50
so that it can recommend what movie
07:52
you might want to watch next --
07:54
which is a very, very difficult problem.
07:56
But the difficulty of the problem
07:59
and the fact that we don't really quite have it down,
08:01
it doesn't take away
08:04
from the effects Pragmatic Chaos has.
08:06
Pragmatic Chaos, like all Netflix algorithms,
08:08
determines, in the end,
08:11
60 percent
08:13
of what movies end up being rented.
08:15
So one piece of code
08:17
with one idea about you
08:19
is responsible for 60 percent of those movies.
08:22
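
Netflix has never published Pragmatic Chaos, so nothing below is its algorithm. It is just a bare-bones, user-based collaborative filter in Python, with made-up ratings, to show the general shape of the problem: guess what someone might want to watch next from what similar people have rated.

from math import sqrt

# Toy ratings: user -> {movie: rating}. All of this data is made up.
ratings = {
    "ann":  {"Alien": 5, "Heat": 4, "Amelie": 1},
    "bob":  {"Alien": 4, "Heat": 5, "Up": 2},
    "cara": {"Amelie": 5, "Up": 4, "Heat": 2},
}

def cosine(a, b):
    """Cosine similarity between two {movie: rating} vectors."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    dot = sum(a[m] * b[m] for m in common)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

def recommend(user, k=2):
    """Score movies the user hasn't seen by similar users' ratings."""
    others = sorted(
        ((cosine(ratings[user], ratings[u]), u) for u in ratings if u != user),
        reverse=True,
    )
    scores = {}
    for sim, u in others[:k]:
        for movie, rating in ratings[u].items():
            if movie not in ratings[user]:
                scores[movie] = scores.get(movie, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ann"))  # suggests "Up", based on what similar users rated
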
But what if you could rate those movies
08:25
before they get made?
08:27
Wouldn't that be handy?
08:29
Well, a few data scientists from the U.K. are in Hollywood,
08:31
and they have "story algorithms" --
08:34
a company called Epagogix.
08:36
And you can run your script through there,
08:38
and they can tell you, quantifiably,
08:41
that that's a 30 million dollar movie
08:43
or a 200 million dollar movie.
08:45
And the thing is, this isn't Google.
08:47
This isn't information.
08:49
These aren't financial stats; this is culture.
08:51
And what you see here,
08:53
or what you don't really see normally,
08:55
is that these are the physics of culture.
08:57
And if these algorithms,
09:01
like the algorithms on Wall Street,
09:03
just crashed one day and went awry,
09:05
how would we know?
09:08
What would it look like?
09:10
And they're in your house. They're in your house.
09:12
These are two algorithms competing for your living room.
09:15
These are two different cleaning robots
09:17
that have very different ideas about what clean means.
09:19
And you can see it
09:22
if you slow it down and attach lights to them,
09:24
and they're sort of like secret architects in your bedroom.
09:27
And the idea that architecture itself
09:30
is somehow subject to algorithmic optimization
09:33
is not far-fetched.
09:35
It's super-real and it's happening around you.
09:37
You feel it most
09:40
when you're in a sealed metal box,
09:42
a new-style elevator;
09:44
they're called destination-control elevators.
09:46
These are the ones where you have to press what floor you're going to go to
09:48
before you get in the elevator.
09:51
And it uses what's called a bin-packing algorithm.
09:53
So none of this mishegas
09:55
of letting everybody go into whatever car they want.
09:57
Everybody who wants to go to the 10th floor goes into car two,
09:59
and everybody who wants to go to the third floor goes into car five.
10:01
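
The grouping idea is easy to sketch. The version below is a toy first-fit heuristic in Python, not any real elevator controller: passengers punch in a destination at the lobby keypad, and the dispatcher packs whole destination groups into cars with spare capacity, so each car makes as few stops as possible. Actual destination-dispatch systems also optimize wait and ride times, car positions, and much more; the capacity and the requests here are invented.

from collections import defaultdict

CAR_CAPACITY = 4  # assumed capacity per car, purely for illustration

def assign_cars(requests, n_cars):
    """First-fit grouping of lobby requests by destination floor.

    requests: one destination floor per waiting passenger.
    Returns {floor: car_index}, a crude stand-in for the bin-packing
    flavor of destination-dispatch controllers.
    """
    by_floor = defaultdict(int)
    for floor in requests:               # group passengers by destination
        by_floor[floor] += 1

    loads = [0] * n_cars                 # passengers assigned to each car
    assignment = {}
    for floor, count in sorted(by_floor.items(), key=lambda kv: -kv[1]):
        # First fit: the whole group goes to the first car with room.
        for car in range(n_cars):
            if loads[car] + count <= CAR_CAPACITY:
                break
        else:
            car = min(range(n_cars), key=loads.__getitem__)  # spill over
        loads[car] += count
        assignment[floor] = car
    return assignment

# Everyone bound for floor 10 rides together; floors 3 and 7 share a car.
print(assign_cars([10, 10, 10, 3, 3, 7, 10, 3], n_cars=3))
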
And the problem with that
10:04
is that people freak out.
10:06
People panic.
10:08
And you see why. You see why.
10:10
It's because the elevator
10:12
is missing some important instrumentation, like the buttons.
10:14
(Laughter)
10:17
Like the things that people use.
10:19
All it has
10:21
is just the number that moves up or down
10:23
and that red button that says, "Stop."
10:26
And this is what we're designing for.
10:29
We're designing
10:32
for this machine dialect.
10:34
And how far can you take that? How far can you take it?
10:36
You can take it really, really far.
10:39
So let me take it back to Wall Street.
10:41
Because the algorithms of Wall Street
10:45
are dependent on one quality above all else,
10:47
which is speed.
10:50
And they operate on milliseconds and microseconds.
10:52
And just to give you a sense of what microseconds are,
10:55
it takes you 500,000 microseconds
10:57
just to click a mouse.
10:59
But if you're a Wall Street algorithm
11:01
and you're five microseconds behind,
11:03
you're a loser.
11:05
So if you were an algorithm,
11:07
you'd look for an architect like the one that I met in Frankfurt
11:09
who was hollowing out a skyscraper --
11:12
throwing out all the furniture, all the infrastructure for human use,
11:14
and just running steel on the floors
11:17
to get ready for the stacks of servers to go in --
11:20
all so an algorithm
11:23
could get close to the Internet.
11:25
And you think of the Internet as this kind of distributed system.
11:28
And of course, it is, but it's distributed from places.
11:31
In New York, this is where it's distributed from:
11:34
the Carrier Hotel
11:36
located on Hudson Street.
11:38
And this is really where the wires come right up into the city.
11:40
And the reality is that the further away you are from that,
11:43
you're a few microseconds behind every time.
11:47
These guys down on Wall Street,
11:49
Marco Polo and Cherokee Nation,
11:51
they're eight microseconds
11:53
behind all these guys
11:55
going into the empty buildings being hollowed out
11:57
up around the Carrier Hotel.
12:01
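
The microseconds here are mostly propagation delay. Light in optical fiber travels at roughly two-thirds of its vacuum speed, about 200,000 km per second, so every kilometer of extra cable between you and the feed costs about 5 microseconds each way. A back-of-the-envelope sketch in Python:

SPEED_IN_FIBER_KM_PER_S = 200_000  # roughly 2/3 of c, typical for fiber

def one_way_delay_us(extra_fiber_km):
    """Extra one-way latency, in microseconds, from extra fiber length."""
    return extra_fiber_km / SPEED_IN_FIBER_KM_PER_S * 1_000_000

# A building a kilometer or two of cable farther from the feed is already
# several microseconds behind on every message, each way.
for km in (0.2, 1, 2, 5):
    print(f"{km:>4} km of extra fiber ≈ {one_way_delay_us(km):.0f} µs one way")
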
And that's going to keep happening.
12:03
We're going to keep hollowing them out,
12:06
because you, inch for inch
12:08
and pound for pound and dollar for dollar,
12:11
none of you could squeeze revenue out of that space
12:14
like the Boston Shuffler could.
12:17
But if you zoom out,
12:20
if you zoom out,
12:22
you would see an 825-mile trench
12:24
between New York City and Chicago
12:28
that's been built over the last few years
12:30
by a company called Spread Networks.
12:32
This is a fiber optic cable
12:35
that was laid between those two cities
12:37
to just be able to traffic one signal
12:39
37 times faster than you can click a mouse --
12:42
just for these algorithms,
12:45
just for the Carnival and the Knife.
12:48
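
The same constant makes the Spread Networks numbers hang together: 825 miles is about 1,330 km of fiber, roughly 6.6 milliseconds one way, so a round trip is on the order of 13 milliseconds, close to the talk's "37 times faster than you can click a mouse." A quick check:

MILES_TO_KM = 1.609344
SPEED_IN_FIBER_KM_PER_S = 200_000          # roughly 2/3 of c, as above

route_km = 825 * MILES_TO_KM               # the New York-Chicago trench
one_way_s = route_km / SPEED_IN_FIBER_KM_PER_S
round_trip_us = 2 * one_way_s * 1_000_000

MOUSE_CLICK_US = 500_000                   # the talk's figure for one click
print(f"round trip ≈ {round_trip_us:,.0f} µs; a mouse click "
      f"({MOUSE_CLICK_US:,} µs) is about {MOUSE_CLICK_US / round_trip_us:.0f} times slower")
# Roughly 37-38x with these assumptions, consistent with the talk's figure.
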
And when you think about this,
12:51
that we're running through the United States
12:53
with dynamite and rock saws
12:55
so that an algorithm can close the deal
12:58
three microseconds faster,
13:00
all for a communications framework
13:03
that no human will ever know,
13:05
that's a kind of manifest destiny;
13:09
and we'll always look for a new frontier.
13:12
Unfortunately, we have our work cut out for us.
13:15
This is just theoretical.
13:18
This is some mathematicians at MIT.
13:20
And the truth is I don't really understand
13:22
a lot of what they're talking about.
13:24
It involves light cones and quantum entanglement,
13:26
and I don't really understand any of that.
13:29
But I can read this map,
13:31
and what this map says
13:33
is that, if you're trying to make money on the markets where the red dots are,
13:35
that's where people are, where the cities are,
13:38
you're going to have to put the servers where the blue dots are
13:40
to do that most effectively.
13:43
And the thing that you might have noticed about those blue dots
13:45
is that a lot of them are in the middle of the ocean.
13:48
So that's what we'll do: we'll build bubbles or something,
13:51
or platforms.
13:54
We'll actually part the water
13:56
to pull money out of the air,
13:58
because it's a bright future
14:00
if you're an algorithm.
14:02
(Laughter)
14:04
And it's not the money that's so interesting actually.
14:06
It's what the money motivates,
14:09
that we're actually terraforming
14:11
the Earth itself
14:13
with this kind of algorithmic efficiency.
14:15
And in that light,
14:17
you go back
14:19
and you look at Michael Najjar's photographs,
14:21
and you realize that they're not metaphor, they're prophecy.
14:23
They're prophecy
14:26
for the kind of seismic, terrestrial effects
14:28
of the math that we're making.
14:32
And the landscape was always made
14:34
by this sort of weird, uneasy collaboration
14:37
between nature and man.
14:40
But now there's this third co-evolutionary force: algorithms --
14:43
the Boston Shuffler, the Carnival.
14:46
And we will have to understand those as nature,
14:49
and in a way, they are.
14:52
Thank you.
14:54
(Applause)
14:56


Kevin Slavin - Algoworld expert
Kevin Slavin navigates in the algoworld, the expanding space in our lives that’s determined and run by algorithms.

Why you should listen

Are you addicted to the dead-simple numbers game Drop 7 or Facebook’s Parking Wars? Blame Kevin Slavin and the game development company he co-founded in 2005, Area/Code, which makes clever game entertainments that enter the fabric of reality.

All this fun is powered by algorithms -- as, increasingly, is our daily life. From Google's algorithms to the algos that give you “recommendations” online to those that automatically play the stock markets (and sometimes crash them): we may not realize it, but we live in the algoworld.

He says: "The quickest way to find out what the boundaries of reality are is to figure where they break."

The original video is available on TED.com


Data provided by TED.
