15:30
TED2015

Chris Urmson: How a driverless car sees the road

Statistically, the least reliable part of the car is ... the driver. Chris Urmson heads up Google's driverless car program, one of several efforts to remove humans from the driver's seat. He talks about where his program is right now, and shares fascinating footage that shows how the car sees the road and makes autonomous decisions about what to do next.

- Roboticist
Chris Urmson is the Director of Self-Driving Cars at Google[x].

So in 1885, Karl Benz
invented the automobile.
00:12
Later that year, he took it out
for the first public test drive,
00:16
and -- true story --
crashed into a wall.
00:20
For the last 130 years,
00:24
we've been working around that least
reliable part of the car, the driver.
00:26
We've made the car stronger.
00:30
We've added seat belts,
we've added air bags,
00:32
and in the last decade, we've actually
started trying to make the car smarter
00:34
to fix that bug, the driver.
00:38
Now, today I'm going to talk to you
a little bit about the difference
00:41
between patching around the problem
with driver assistance systems
00:44
and actually having fully
self-driving cars
00:48
and what they can do for the world.
00:51
I'm also going to talk to you
a little bit about our car
00:53
and allow you to see how it sees the world
and how it reacts and what it does,
00:56
but first I'm going to talk
a little bit about the problem.
01:00
And it's a big problem:
01:03
1.2 million people are killed
on the world's roads every year.
01:05
In America alone, 33,000 people
are killed each year.
01:08
To put that in perspective,
01:12
that's the same as a 737
falling out of the sky every working day.
01:14
It's kind of unbelievable.
01:19
Cars are sold to us like this,
01:21
but really, this is what driving's like.
01:23
Right? It's not sunny, it's rainy,
01:26
and you want to do anything
other than drive.
01:28
And the reason why is this:
01:31
Traffic is getting worse.
01:32
In America, between 1990 and 2010,
01:34
the vehicle miles traveled
increased by 38 percent.
01:38
We grew our roads by only six percent,
01:42
so it's not just in your head.
01:44
Traffic really is substantially worse
than it was not very long ago.
01:46
And all of this has a very human cost.
01:50
So if you take the average commute time
in America, which is about 50 minutes,
01:53
you multiply that by the 120 million
workers we have,
01:57
that turns out to be
about six billion minutes
02:01
wasted in commuting every day.
02:03
Now, that's a big number,
so let's put it in perspective.
02:05
You take that six billion minutes
02:08
and you divide it by the average
life expectancy of a person,
02:09
that turns out to be 162 lifetimes
02:13
spent every day, wasted,
02:16
just getting from A to B.
02:19
It's unbelievable.
02:21
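The commute arithmetic above checks out as a quick back-of-the-envelope calculation. The life-expectancy figure here is an assumed ~70 years, which the talk does not state:

```python
# Rough check of the commute-time arithmetic from the talk.
avg_commute_min = 50           # average US commute, per the talk
workers = 120_000_000          # US workers, per the talk

minutes_per_day = avg_commute_min * workers
print(minutes_per_day)         # 6,000,000,000 -> "about six billion minutes"

# Assumed average life expectancy of ~70 years (not stated in the talk).
life_minutes = 70 * 365.25 * 24 * 60
lifetimes_per_day = minutes_per_day / life_minutes
print(round(lifetimes_per_day))  # roughly the talk's "162 lifetimes"
```

With a ~70-year lifespan the ratio lands at about 163; the talk's 162 implies a slightly different assumed life expectancy, but the order of magnitude is the same.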
And then, there are those of us
who don't have the privilege
02:23
of sitting in traffic.
02:26
So this is Steve.
02:28
He's an incredibly capable guy,
02:29
but he just happens to be blind,
02:31
and that means instead of a 30-minute
drive to work in the morning,
02:33
it's a two-hour ordeal
of piecing together bits of public transit
02:37
or asking friends and family for a ride.
02:41
He doesn't have that same freedom
that you and I have to get around.
02:43
We should do something about that.
02:47
Now, conventional wisdom would say
02:49
that we'll just take
these driver assistance systems
02:51
and we'll kind of push them
and incrementally improve them,
02:54
and over time, they'll turn
into self-driving cars.
02:57
Well, I'm here to tell you
that's like me saying
03:00
that if I work really hard at jumping,
one day I'll be able to fly.
03:02
We actually need to do
something a little different.
03:06
And so I'm going to talk to you
about three different ways
03:09
that self-driving systems are different
than driver assistance systems.
03:12
And I'm going to start
with some of our own experience.
03:15
So back in 2013,
03:18
we had the first test
of a self-driving car
03:20
where we let regular people use it.
03:23
Well, almost regular --
they were 100 Googlers,
03:25
but they weren't working on the project.
03:27
And we gave them the car and we allowed
them to use it in their daily lives.
03:29
But unlike a real self-driving car,
this one had a big asterisk with it:
03:33
They had to pay attention,
03:36
because this was an experimental vehicle.
03:38
We tested it a lot,
but it could still fail.
03:40
And so we gave them two hours of training,
03:44
we put them in the car,
we let them use it,
03:46
and what we heard back
was something awesome,
03:48
as someone trying
to bring a product into the world.
03:50
Every one of them told us they loved it.
03:53
In fact, we had a Porsche driver
who came in and told us on the first day,
03:55
"This is completely stupid.
What are we thinking?"
03:58
But at the end of it, he said,
"Not only should I have it,
04:01
everyone else should have it,
because people are terrible drivers."
04:04
So this was music to our ears,
04:09
but then we started to look at what
the people inside the car were doing,
04:10
and this was eye-opening.
04:14
Now, my favorite story is this gentleman
04:16
who looks down at his phone
and realizes the battery is low,
04:18
so he turns around like this in the car
and digs around in his backpack,
04:22
pulls out his laptop,
04:27
puts it on the seat,
04:29
goes in the back again,
04:30
digs around, pulls out
the charging cable for his phone,
04:32
futzes around, puts it into the laptop,
puts it on the phone.
04:35
Sure enough, the phone is charging.
04:39
All the time he's been doing
65 miles per hour down the freeway.
04:41
Right? Unbelievable.
04:45
So we thought about this and we said,
it's kind of obvious, right?
04:47
The better the technology gets,
04:50
the less reliable
the driver is going to get.
04:53
So by just making the cars
incrementally smarter,
04:55
we're probably not going to see
the wins we really need.
04:57
Let me talk about something
a little technical for a moment here.
05:00
So we're looking at this graph,
and along the bottom
05:04
is how often does the car
apply the brakes when it shouldn't.
05:06
You can ignore most of that axis,
05:09
because if you're driving around town,
and the car starts stopping randomly,
05:11
you're never going to buy that car.
05:15
And the vertical axis is how often
the car is going to apply the brakes
05:17
when it's supposed to,
to help you avoid an accident.
05:20
Now, if we look at
the bottom left corner here,
05:23
this is your classic car.
05:25
It doesn't apply the brakes for you,
it doesn't do anything goofy,
05:27
but it also doesn't get you
out of an accident.
05:30
Now, if we want to bring
a driver assistance system into a car,
05:33
say with collision mitigation braking,
05:36
we're going to put some package
of technology on there,
05:38
and that's this curve, and it's going
to have some operating properties,
05:40
but it's never going to avoid
all of the accidents,
05:44
because it doesn't have that capability.
05:46
But we'll pick some place
along the curve here,
05:48
and maybe it avoids half of accidents
that the human driver misses,
05:51
and that's amazing, right?
05:54
We just reduced accidents on our roads
by a factor of two.
05:55
There are now 17,000 fewer people
dying every year in America.
05:58
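The arithmetic behind that claim is a straight halving of the US fatality figure quoted earlier in the talk; the "half of accidents" operating point is the talk's assumption, and the result is rounded up:

```python
# Lives-saved arithmetic for the driver-assistance operating point.
us_road_deaths = 33_000     # US deaths per year, per the talk

# The talk picks an operating point on the curve that avoids about
# half of the accidents the human driver misses:
avoided_fraction = 0.5
lives_saved = us_road_deaths * avoided_fraction
print(int(lives_saved))     # 16500, which the talk rounds to "17,000"
```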
But if we want a self-driving car,
06:02
we need a technology curve
that looks like this.
06:04
We're going to have to put
more sensors in the vehicle,
06:06
and we'll pick some
operating point up here
06:09
where it basically never
gets into a crash.
06:11
They'll happen, but at very low frequency.
06:13
Now you and I could look at this
and we could argue
06:15
about whether it's incremental, and
I could say something like "80-20 rule,"
06:18
and it's really hard to move up
to that new curve.
06:21
But let's look at it
from a different direction for a moment.
06:24
So let's look at how often
the technology has to do the right thing.
06:27
And so this green dot up here
is a driver assistance system.
06:30
It turns out that human drivers
06:34
make mistakes that lead
to traffic accidents
06:36
about once every 100,000 miles in America.
06:39
In contrast, a self-driving system
is probably making decisions
06:42
about 10 times per second,
06:45
so order of magnitude,
06:49
that's about 1,000 times per mile.
06:50
So if you compare the distance
between these two,
06:53
it's about 10 to the eighth, right?
06:56
Eight orders of magnitude.
06:58
That's like comparing how fast I run
07:00
to the speed of light.
07:03
It doesn't matter how hard I train,
I'm never actually going to get there.
07:05
So there's a pretty big gap there.
07:09
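The "eight orders of magnitude" figure above follows from the two rates the talk gives. The implied speed of roughly 36 mph (100 seconds per mile at 10 decisions per second) is an inference, not something the talk states:

```python
# Order-of-magnitude comparison from the talk.
human_miles_per_mistake = 100_000   # one accident-causing human error
                                    # per ~100,000 miles, per the talk

decisions_per_second = 10           # self-driving system, per the talk
# "about 1,000 times per mile" implies ~100 seconds per mile,
# i.e. city speeds of about 36 mph (an inference, not stated).
decisions_per_mile = 1_000

# Decisions the system makes over the distance in which a human
# makes one accident-causing mistake:
gap = human_miles_per_mistake * decisions_per_mile
print(f"{gap:.0e}")                 # 1e+08 -> "eight orders of magnitude"
```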
And then finally, there's how
the system can handle uncertainty.
07:11
So this pedestrian here might be
stepping into the road, might not be.
07:15
I can't tell,
nor can any of our algorithms,
07:18
but in the case of
a driver assistance system,
07:22
that means it can't take action,
because again,
07:24
if it presses the brakes unexpectedly,
that's completely unacceptable.
07:27
Whereas a self-driving system
can look at that pedestrian and say,
07:30
I don't know what they're about to do,
07:33
slow down, take a better look,
and then react appropriately after that.
07:35
So it can be much safer than
a driver assistance system can ever be.
07:39
So that's enough about
the differences between the two.
07:43
Let's spend some time talking about
how the car sees the world.
07:45
So this is our vehicle.
07:49
It starts by understanding
where it is in the world,
07:50
by taking a map and its sensor data
and aligning the two,
07:53
and then we layer on top of that
what it sees in the moment.
07:55
So here, all the purple boxes you can see
are other vehicles on the road,
07:58
and the red thing on the side
over there is a cyclist,
08:02
and up in the distance,
if you look really closely,
08:05
you can see some cones.
08:07
Then we know where the car
is in the moment,
08:09
but we have to do better than that:
we have to predict what's going to happen.
08:12
So here the pickup truck in the top right
is about to make a left lane change
08:15
because the road in front of it is closed,
08:19
so it needs to get out of the way.
08:21
Knowing that one pickup truck is great,
08:23
but we really need to know
what everybody's thinking,
08:25
so it becomes quite a complicated problem.
08:27
And then given that, we can figure out
how the car should respond in the moment,
08:30
so what trajectory it should follow, how
quickly it should slow down or speed up.
08:34
And then that all turns into
just following a path:
08:38
turning the steering wheel left or right,
pressing the brake or gas.
08:41
It's really just two numbers
at the end of the day.
08:45
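The loop described above, from perceiving obstacles to predicting them to planning a response, really does bottom out in those "two numbers": steering and brake/gas. A minimal sketch, where every name, type, threshold, and rule is a made-up illustration rather than Google's actual stack:

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str          # "vehicle", "cyclist", "pedestrian", ...
    distance_m: float  # distance until it reaches our path
    speed_mps: float   # closing speed toward our path

def plan(obstacles: list[Obstacle], target_speed_mps: float,
         current_speed_mps: float) -> tuple[float, float]:
    """Reduce the scene to the talk's 'two numbers':
    a steering command and a brake/throttle command (both illustrative)."""
    steering = 0.0  # stay in lane by default
    # Predict: time until the nearest obstacle reaches our path.
    ttc = min((o.distance_m / max(o.speed_mps, 0.1) for o in obstacles),
              default=float("inf"))
    if ttc < 2.0:
        throttle = -1.0   # conflict imminent: brake
    elif current_speed_mps < target_speed_mps:
        throttle = 0.3    # clear road: gently speed up
    else:
        throttle = 0.0    # at speed: coast
    return steering, throttle

# A cyclist nudging toward our path, as in the talk's example:
cmd = plan([Obstacle("cyclist", 4.0, 3.0)], 13.0, 10.0)
print(cmd)   # (0.0, -1.0): brake, since time-to-conflict is ~1.3 s
```

The real problem, as the talk goes on to show, is that the prediction step hiding inside `ttc` is where nearly all of the difficulty lives.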
So how hard can it really be?
08:47
Back when we started in 2009,
08:50
this is what our system looked like.
08:52
So you can see our car in the middle
and the other boxes on the road,
08:54
driving down the highway.
08:57
The car needs to understand where it is
and roughly where the other vehicles are.
08:58
It's really a geometric
understanding of the world.
09:02
Once we started driving
on neighborhood and city streets,
09:05
the problem becomes a whole
new level of difficulty.
09:08
You see pedestrians crossing in front
of us, cars crossing in front of us,
09:10
going every which way,
09:13
the traffic lights, crosswalks.
09:15
It's an incredibly complicated
problem by comparison.
09:17
And then once you have
that problem solved,
09:20
the vehicle has to be able
to deal with construction.
09:22
So here are the cones on the left
forcing it to drive to the right,
09:24
but not just construction
in isolation, of course.
09:27
It has to deal with other people moving
through that construction zone as well.
09:30
And of course, if anyone's
breaking the rules, the police are there
09:34
and the car has to understand that
that flashing light on the top of the car
09:37
means that it's not just a car,
it's actually a police officer.
09:40
Similarly, the orange box
on the side here,
09:43
it's a school bus,
09:46
and we have to treat that
differently as well.
09:47
When we're out on the road,
other people have expectations:
09:50
So, when a cyclist puts up their arm,
09:53
it means they're expecting the car
to yield to them and make room for them
09:55
to make a lane change.
09:58
And when a police officer
stands in the road,
10:01
our vehicle should understand
that this means stop,
10:03
and when they signal to go,
we should continue.
10:05
Now, the way we accomplish this
is by sharing data between the vehicles.
10:09
The first, most crude model of this
10:13
is when one vehicle
sees a construction zone,
10:14
having another know about it
so it can be in the correct lane
10:17
to avoid some of the difficulty.
10:20
But we actually have a much
deeper understanding of this.
10:21
We could take all of the data
that the cars have seen over time,
10:24
the hundreds of thousands
of pedestrians, cyclists,
10:27
and vehicles that have been out there
10:29
and understand what they look like
10:31
and use that to infer
what other vehicles should look like
10:33
and other pedestrians should look like.
10:36
And then, even more importantly,
we could take from that a model
10:37
of how we expect them
to move through the world.
10:40
So here the yellow box is a pedestrian
crossing in front of us.
10:43
Here the blue box is a cyclist
and we anticipate
10:46
that they're going to nudge out
and around the car to the right.
10:48
Here there's a cyclist
coming down the road
10:52
and we know they're going to continue
to drive down the shape of the road.
10:54
Here somebody makes a right turn,
10:57
and in a moment here, somebody's
going to make a U-turn in front of us,
10:59
and we can anticipate that behavior
and respond safely.
11:02
Now, that's all well and good
for things that we've seen,
11:05
but of course, you encounter
lots of things that you haven't
11:08
seen in the world before.
11:11
And so just a couple of months ago,
11:12
our vehicles were driving
through Mountain View,
11:14
and this is what we encountered.
11:16
This is a woman in an electric wheelchair
11:17
chasing a duck in circles on the road.
(Laughter)
11:20
Now it turns out, there is nowhere
in the DMV handbook
11:22
that tells you how to deal with that,
11:25
but our vehicles were able
to encounter that,
11:28
slow down, and drive safely.
11:30
Now, we don't have to deal
with just ducks.
11:32
Watch this bird fly across in front of us.
The car reacts to that.
11:34
Here we're dealing with a cyclist
11:38
that you would never expect to see
anywhere other than Mountain View.
11:39
And of course, we have
to deal with drivers,
11:43
even the very small ones.
11:45
Watch to the right as someone
jumps out of this truck at us.
11:48
And now, watch the left as the car
with the green box decides
11:54
he needs to make a right turn
at the last possible moment.
11:57
Here, as we make a lane change,
the car to our left decides
12:00
it wants to as well.
12:03
And here, we watch a car
blow through a red light
12:07
and our vehicle yields to it.
12:09
And similarly, here, a cyclist
blowing through that light as well.
12:11
And of course,
the vehicle responds safely.
12:15
And of course, we have people
who do I don't know what
12:18
sometimes on the road, like this guy
pulling out between two self-driving cars.
12:21
You have to ask, "What are you thinking?"
12:24
(Laughter)
12:26
Now, I just fire-hosed you
with a lot of stuff there,
12:28
so I'm going to break one of these
down pretty quickly.
12:30
So what we're looking at is the scene
with the cyclist again,
12:33
and you might notice in the bottom,
we can't actually see the cyclist yet,
12:36
but the car can: it's that little
blue box up there,
12:39
and that comes from the laser data.
12:42
And that's not actually
really easy to understand,
12:44
so what I'm going to do is I'm going
to turn that laser data and look at it,
12:46
and if you're really good at looking
at laser data, you can see
12:50
a few dots on the curve there,
12:53
right there, and that blue box
is that cyclist.
12:54
Now as our light is red,
12:57
the cyclist's light
has turned yellow already,
12:58
and if you squint, you can see that
in the imagery.
13:00
But the cyclist, we see, is going
to proceed through the intersection.
13:03
Our light has now turned green,
his is solidly red,
13:06
and we now anticipate that this bike
is going to come all the way across.
13:08
Unfortunately the other drivers next to us
were not paying as much attention.
13:13
They started to pull forward,
and fortunately for everyone,
13:16
this cyclist reacts, avoids,
13:19
and makes it through the intersection.
13:22
And off we go.
13:25
Now, as you can see, we've made
some pretty exciting progress,
13:26
and at this point we're pretty convinced
13:29
this technology is going
to come to market.
13:31
We do three million miles of testing
in our simulators every single day,
13:33
so you can imagine the experience
that our vehicles have.
13:38
We are looking forward to having
this technology on the road,
13:41
and we think the right path
is to go through the self-driving
13:43
rather than driver assistance approach
13:46
because the urgency is so large.
13:48
In the time I have given this talk today,
13:51
34 people have died on America's roads.
13:53
How soon can we bring it out?
13:56
Well, it's hard to say because
it's a really complicated problem,
13:59
but these are my two boys.
14:02
My oldest son is 11, and that means
in four and a half years,
14:05
he's going to be able
to get his driver's license.
14:08
My team and I are committed
to making sure that doesn't happen.
14:11
Thank you.
14:14
(Laughter) (Applause)
14:16
Chris Anderson: Chris,
I've got a question for you.
14:21
Chris Urmson: Sure.
14:23
CA: So certainly, the mind of your cars
is pretty mind-boggling.
14:26
On this debate between
driver-assisted and fully driverless --
14:30
I mean, there's a real debate
going on out there right now.
14:34
So some of the companies,
for example, Tesla,
14:37
are going the driver-assisted route.
14:40
What you're saying is that
that's kind of going to be a dead end
14:42
because you can't just keep improving
that route and get to fully driverless
14:48
at some point, and then a driver
is going to say, "This feels safe,"
14:53
and climb into the back,
and something ugly will happen.
14:57
CU: Right. No, that's exactly right,
and it's not to say
14:59
that the driver assistance systems
aren't going to be incredibly valuable.
15:02
They can save a lot of lives
in the interim,
15:05
but to see the transformative opportunity
to help someone like Steve get around,
15:08
to really get to the end case in safety,
15:11
to have the opportunity
to change our cities
15:13
and move parking out and get rid of
these urban craters we call parking lots,
15:16
it's the only way to go.
15:20
CA: We will be tracking your progress
with huge interest.
15:21
Thanks so much, Chris.
CU: Thank you. (Applause)
15:24

About the Speaker:

Chris Urmson - Roboticist
Chris Urmson is the Director of Self-Driving Cars at Google[x].

Why you should listen

Since 2009, Chris Urmson has headed up Google’s self-driving car program. So far, the team’s vehicles have driven over three quarters of a million miles. While early models included a driverless Prius that TEDsters got to test- ... um, -not-drive in 2011, more and more the team is building vehicles from the ground up, custom-made to go driverless.

Prior to joining Google, Urmson was on the faculty of the Robotics Institute at Carnegie Mellon University, where his research focused on motion planning and perception for robotic vehicles. During his time at Carnegie Mellon, he served as Director of Technology for the team that won the 2007 DARPA Urban Challenge.
