ABOUT THE SPEAKER
Chris Urmson - Roboticist
Chris Urmson is the Director of Self-Driving Cars at Google[x].

Why you should listen

Since 2009, Chris Urmson has headed up Google’s self-driving car program. So far, the team’s vehicles have driven over three quarters of a million miles. While early models included a driverless Prius that TEDsters got to test- ... um, -not-drive in 2011, more and more the team is building vehicles from the ground up, custom-made to go driverless.

Prior to joining Google, Urmson was on the faculty of the Robotics Institute at Carnegie Mellon University, where his research focused on motion planning and perception for robotic vehicles. During his time at Carnegie Mellon, he served as Director of Technology for the team that won the 2007 DARPA Urban Challenge.

More profile about the speaker
Chris Urmson | Speaker | TED.com
TED2015

Chris Urmson: How a driverless car sees the road


Filmed:
2,536,355 views

Statistically, the least reliable part of the car is... the driver. Chris Urmson heads up Google's driverless car program, one of several efforts to take humans out of the driver's seat. He talks about where the program stands now, and shares fascinating footage showing how these cars read the road and decide, autonomously, what to do next.

00:12
So in 1885, Karl Benz invented the automobile.
00:16
Later that year, he took it out for the first public test drive,
00:20
and -- true story -- crashed into a wall.
00:24
For the last 130 years,
00:26
we've been working around that least reliable part of the car, the driver.
00:30
We've made the car stronger.
00:32
We've added seat belts, we've added air bags,
00:34
and in the last decade, we've actually started trying to make the car smarter
00:38
to fix that bug, the driver.
00:41
Now, today I'm going to talk to you a little bit about the difference
00:44
between patching around the problem with driver assistance systems
00:48
and actually having fully self-driving cars
00:51
and what they can do for the world.
00:53
I'm also going to talk to you a little bit about our car
00:56
and allow you to see how it sees the world and how it reacts and what it does,
01:00
but first I'm going to talk a little bit about the problem.
01:03
And it's a big problem:
01:05
1.2 million people are killed on the world's roads every year.
01:08
In America alone, 33,000 people are killed each year.
01:12
To put that in perspective,
01:14
that's the same as a 737 falling out of the sky every working day.
01:19
It's kind of unbelievable.
01:21
Cars are sold to us like this,
01:23
but really, this is what driving's like.
01:26
Right? It's not sunny, it's rainy,
01:28
and you want to do anything other than drive.
01:31
And the reason why is this:
01:32
Traffic is getting worse.
01:34
In America, between 1990 and 2010,
01:38
the vehicle miles traveled increased by 38 percent.
01:42
We grew by six percent of roads,
01:44
so it's not in your brains.
01:46
Traffic really is substantially worse than it was not very long ago.
01:50
And all of this has a very human cost.
01:53
So if you take the average commute time in America, which is about 50 minutes,
01:57
you multiply that by the 120 million workers we have,
02:01
that turns out to be about six billion minutes
02:03
wasted in commuting every day.
02:05
Now, that's a big number, so let's put it in perspective.
02:08
You take that six billion minutes
02:09
and you divide it by the average life expectancy of a person,
02:13
that turns out to be 162 lifetimes
02:16
spent every day, wasted,
02:19
just getting from A to B.
02:21
It's unbelievable.
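The commute arithmetic above checks out; here is a quick sketch, assuming a life expectancy of about 70 years (the talk does not state the figure it used, so the result lands at 163 rather than exactly 162):

```python
# Sanity check of the talk's commute arithmetic.
avg_commute_min = 50            # average US commute time, per the talk
workers = 120_000_000           # US workers, per the talk

wasted_min_per_day = avg_commute_min * workers
print(f"{wasted_min_per_day:,}")  # 6,000,000,000 -> "about six billion minutes"

# Assumed: ~70-year average life expectancy (not stated in the talk).
lifetime_min = 70 * 365.25 * 24 * 60
lifetimes_per_day = wasted_min_per_day / lifetime_min
print(round(lifetimes_per_day))   # 163 -> close to the talk's "162 lifetimes"
```

A slightly longer assumed life expectancy reproduces the talk's 162 exactly.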
02:23
And then, there are those of us who don't have the privilege
02:26
of sitting in traffic.
02:28
So this is Steve.
02:29
He's an incredibly capable guy,
02:31
but he just happens to be blind,
02:33
and that means instead of a 30-minute drive to work in the morning,
02:37
it's a two-hour ordeal of piecing together bits of public transit
02:41
or asking friends and family for a ride.
02:43
He doesn't have that same freedom that you and I have to get around.
02:47
We should do something about that.
02:49
Now, conventional wisdom would say
02:51
that we'll just take these driver assistance systems
02:54
and we'll kind of push them and incrementally improve them,
02:57
and over time, they'll turn into self-driving cars.
03:00
Well, I'm here to tell you that's like me saying
03:02
that if I work really hard at jumping, one day I'll be able to fly.
03:06
We actually need to do something a little different.
03:09
And so I'm going to talk to you about three different ways
03:12
that self-driving systems are different than driver assistance systems.
03:15
And I'm going to start with some of our own experience.
03:18
So back in 2013,
03:20
we had the first test of a self-driving car
03:23
where we let regular people use it.
03:25
Well, almost regular -- they were 100 Googlers,
03:27
but they weren't working on the project.
03:29
And we gave them the car and we allowed them to use it in their daily lives.
03:33
But unlike a real self-driving car, this one had a big asterisk with it:
03:36
They had to pay attention,
03:38
because this was an experimental vehicle.
03:40
We tested it a lot, but it could still fail.
03:44
And so we gave them two hours of training,
03:46
we put them in the car, we let them use it,
03:48
and what we heard back was something awesome,
03:50
as someone trying to bring a product into the world.
03:53
Every one of them told us they loved it.
03:55
In fact, we had a Porsche driver who came in and told us on the first day,
03:58
"This is completely stupid. What are we thinking?"
04:01
But at the end of it, he said, "Not only should I have it,
04:04
everyone else should have it, because people are terrible drivers."
04:09
So this was music to our ears,
04:10
but then we started to look at what the people inside the car were doing,
04:14
and this was eye-opening.
04:16
Now, my favorite story is this gentleman
04:18
who looks down at his phone and realizes the battery is low,
04:22
so he turns around like this in the car and digs around in his backpack,
04:27
pulls out his laptop,
04:29
puts it on the seat,
04:30
goes in the back again,
04:32
digs around, pulls out the charging cable for his phone,
04:35
futzes around, puts it into the laptop, puts it on the phone.
04:39
Sure enough, the phone is charging.
04:41
All the time he's been doing 65 miles per hour down the freeway.
04:45
Right? Unbelievable.
04:47
So we thought about this and we said, it's kind of obvious, right?
04:50
The better the technology gets,
04:53
the less reliable the driver is going to get.
04:55
So by just making the cars incrementally smarter,
04:57
we're probably not going to see the wins we really need.
05:00
Let me talk about something a little technical for a moment here.
05:04
So we're looking at this graph, and along the bottom
05:06
is how often does the car apply the brakes when it shouldn't.
05:09
You can ignore most of that axis,
05:11
because if you're driving around town, and the car starts stopping randomly,
05:15
you're never going to buy that car.
05:17
And the vertical axis is how often the car is going to apply the brakes
05:20
when it's supposed to, to help you avoid an accident.
05:23
Now, if we look at the bottom left corner here,
05:25
this is your classic car.
05:27
It doesn't apply the brakes for you, it doesn't do anything goofy,
05:30
but it also doesn't get you out of an accident.
05:33
Now, if we want to bring a driver assistance system into a car,
05:36
say with collision mitigation braking,
05:38
we're going to put some package of technology on there,
05:40
and that's this curve, and it's going to have some operating properties,
05:44
but it's never going to avoid all of the accidents,
05:46
because it doesn't have that capability.
05:48
But we'll pick some place along the curve here,
05:51
and maybe it avoids half of accidents that the human driver misses,
05:54
and that's amazing, right?
05:55
We just reduced accidents on our roads by a factor of two.
05:58
There are now 17,000 less people dying every year in America.
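The "factor of two" claim follows from the fatality figure quoted earlier in the talk; a minimal check (the talk rounds the result up to 17,000):

```python
# Halving the accidents that human drivers miss, applied to the
# annual US road-fatality figure quoted earlier in the talk.
us_road_deaths = 33_000     # per year, per the talk
lives_saved = us_road_deaths // 2
print(lives_saved)          # 16500 -> the talk rounds this to "17,000"
```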
06:02
But if we want a self-driving car,
06:04
we need a technology curve that looks like this.
06:06
We're going to have to put more sensors in the vehicle,
06:09
and we'll pick some operating point up here
06:11
where it basically never gets into a crash.
06:13
They'll happen, but very low frequency.
06:15
Now you and I could look at this and we could argue
06:18
about whether it's incremental, and I could say something like "80-20 rule,"
06:21
and it's really hard to move up to that new curve.
06:24
But let's look at it from a different direction for a moment.
06:27
So let's look at how often the technology has to do the right thing.
06:30
And so this green dot up here is a driver assistance system.
06:34
It turns out that human drivers
06:36
make mistakes that lead to traffic accidents
06:39
about once every 100,000 miles in America.
06:42
In contrast, a self-driving system is probably making decisions
06:45
about 10 times per second,
06:49
so order of magnitude,
06:50
that's about 1,000 times per mile.
06:53
So if you compare the distance between these two,
06:56
it's about 10 to the eighth, right?
06:58
Eight orders of magnitude.
07:00
That's like comparing how fast I run
07:03
to the speed of light.
07:05
It doesn't matter how hard I train, I'm never actually going to get there.
07:09
So there's a pretty big gap there.
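The eight-orders-of-magnitude figure can be reproduced; a sketch, assuming the ~36 mph average speed implied by "1,000 times per mile" at 10 decisions per second (the talk does not state a speed):

```python
import math

human_mistake_interval_miles = 100_000  # one accident-causing mistake, per the talk
decision_rate_hz = 10                   # self-driving decisions per second, per the talk

# Assumed: 36 mph average speed, so one mile takes 100 seconds,
# which yields the talk's "about 1,000 times per mile".
seconds_per_mile = 3600 / 36
decisions_per_mile = decision_rate_hz * seconds_per_mile
print(decisions_per_mile)               # 1000.0

# Decisions made over the distance between two human mistakes:
gap = decisions_per_mile * human_mistake_interval_miles
print(f"10^{math.log10(gap):.0f}")      # 10^8 -> "eight orders of magnitude"
```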
07:11
And then finally, there's how the system can handle uncertainty.
07:15
So this pedestrian here might be stepping into the road, might not be.
07:18
I can't tell, nor can any of our algorithms,
07:22
but in the case of a driver assistance system,
07:24
that means it can't take action, because again,
07:27
if it presses the brakes unexpectedly, that's completely unacceptable.
07:30
Whereas a self-driving system can look at that pedestrian and say,
07:33
I don't know what they're about to do,
07:35
slow down, take a better look, and then react appropriately after that.
07:39
So it can be much safer than a driver assistance system can ever be.
07:43
So that's enough about the differences between the two.
07:45
Let's spend some time talking about how the car sees the world.
07:49
So this is our vehicle.
07:50
It starts by understanding where it is in the world,
07:53
by taking a map and its sensor data and aligning the two,
07:55
and then we layer on top of that what it sees in the moment.
07:58
So here, all the purple boxes you can see are other vehicles on the road,
08:02
and the red thing on the side over there is a cyclist,
08:05
and up in the distance, if you look really closely,
08:07
you can see some cones.
08:09
Then we know where the car is in the moment,
08:12
but we have to do better than that: we have to predict what's going to happen.
08:15
So here the pickup truck in top right is about to make a left lane change
08:19
because the road in front of it is closed,
08:21
so it needs to get out of the way.
08:23
Knowing that one pickup truck is great,
08:25
but we really need to know what everybody's thinking,
08:27
so it becomes quite a complicated problem.
08:30
And then given that, we can figure out how the car should respond in the moment,
08:34
so what trajectory it should follow, how quickly it should slow down or speed up.
08:38
And then that all turns into just following a path:
08:41
turning the steering wheel left or right, pressing the brake or gas.
08:45
It's really just two numbers at the end of the day.
08:47
So how hard can it really be?
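The "just two numbers" reduction — one steering command and one brake/throttle command — can be sketched as a toy controller. This is purely illustrative, assuming a simple proportional control law; it is not Google's actual control stack:

```python
from dataclasses import dataclass

@dataclass
class Command:
    steering: float  # normalized, -1 (full left) .. +1 (full right)
    accel: float     # normalized, -1 (full brake) .. +1 (full throttle)

def follow(lateral_error_m: float, speed_error_mps: float) -> Command:
    """Toy proportional controller: steer toward the planned path and
    accelerate or brake toward the planned speed.

    lateral_error_m: signed offset of the planned path relative to the
    vehicle (positive = path is to the right). speed_error_mps: planned
    speed minus current speed. Gains are illustrative assumptions.
    """
    k_steer, k_speed = 0.2, 0.1
    clamp = lambda x: max(-1.0, min(1.0, x))
    return Command(clamp(k_steer * lateral_error_m),
                   clamp(k_speed * speed_error_mps))

# Path is 1 m to the right, and we are 3 m/s below the planned speed:
cmd = follow(lateral_error_m=1.0, speed_error_mps=3.0)
print(round(cmd.steering, 2), round(cmd.accel, 2))  # 0.2 0.3
```

However the planner represents the world, its output ultimately collapses to this pair of numbers, which is the point the talk is making.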
08:50
Back when we started in 2009,
08:52
this is what our system looked like.
08:54
So you can see our car in the middle and the other boxes on the road,
08:57
driving down the highway.
08:58
The car needs to understand where it is and roughly where the other vehicles are.
09:02
It's really a geometric understanding of the world.
09:05
Once we started driving on neighborhood and city streets,
09:08
the problem becomes a whole new level of difficulty.
09:10
You see pedestrians crossing in front of us, cars crossing in front of us,
09:13
going every which way,
09:15
the traffic lights, crosswalks.
09:17
It's an incredibly complicated problem by comparison.
09:20
And then once you have that problem solved,
09:22
the vehicle has to be able to deal with construction.
09:24
So here are the cones on the left forcing it to drive to the right,
09:27
but not just construction in isolation, of course.
09:30
It has to deal with other people moving through that construction zone as well.
09:34
And of course, if anyone's breaking the rules, the police are there
09:37
and the car has to understand that that flashing light on the top of the car
09:40
means that it's not just a car, it's actually a police officer.
09:43
Similarly, the orange box on the side here,
09:46
it's a school bus,
09:47
and we have to treat that differently as well.
09:50
When we're out on the road, other people have expectations:
09:53
So, when a cyclist puts up their arm,
09:55
it means they're expecting the car to yield to them and make room for them
09:58
to make a lane change.
10:01
And when a police officer stood in the road,
10:03
our vehicle should understand that this means stop,
10:05
and when they signal to go, we should continue.
10:09
Now, the way we accomplish this is by sharing data between the vehicles.
10:13
The first, most crude model of this
10:14
is when one vehicle sees a construction zone,
10:17
having another know about it so it can be in the correct lane
10:20
to avoid some of the difficulty.
10:21
But we actually have a much deeper understanding of this.
10:24
We could take all of the data that the cars have seen over time,
10:27
the hundreds of thousands of pedestrians, cyclists,
10:29
and vehicles that have been out there
10:31
and understand what they look like
10:33
and use that to infer what other vehicles should look like
10:36
and other pedestrians should look like.
10:37
And then, even more importantly, we could take from that a model
10:40
of how we expect them to move through the world.
10:43
So here the yellow box is a pedestrian crossing in front of us.
10:46
Here the blue box is a cyclist and we anticipate
10:48
that they're going to nudge out and around the car to the right.
10:52
Here there's a cyclist coming down the road
10:54
and we know they're going to continue to drive down the shape of the road.
10:57
Here somebody makes a right turn,
10:59
and in a moment here, somebody's going to make a U-turn in front of us,
11:02
and we can anticipate that behavior and respond safely.
11:05
Now, that's all well and good for things that we've seen,
11:08
but of course, you encounter lots of things that you haven't
11:11
seen in the world before.
11:12
And so just a couple of months ago,
11:14
our vehicles were driving through Mountain View,
11:16
and this is what we encountered.
11:17
This is a woman in an electric wheelchair
11:20
chasing a duck in circles on the road. (Laughter)
11:22
Now it turns out, there is nowhere in the DMV handbook
11:25
that tells you how to deal with that,
11:28
but our vehicles were able to encounter that,
11:30
slow down, and drive safely.
11:32
Now, we don't have to deal with just ducks.
11:34
Watch this bird fly across in front of us. The car reacts to that.
11:38
Here we're dealing with a cyclist
11:39
that you would never expect to see anywhere other than Mountain View.
11:43
And of course, we have to deal with drivers,
11:45
even the very small ones.
11:48
Watch to the right as someone jumps out of this truck at us.
11:54
And now, watch the left as the car with the green box decides
11:57
he needs to make a right turn at the last possible moment.
12:00
Here, as we make a lane change, the car to our left decides
12:03
it wants to as well.
12:07
And here, we watch a car blow through a red light
12:09
and yield to it.
12:11
And similarly, here, a cyclist blowing through that light as well.
12:15
And of course, the vehicle responds safely.
12:18
And of course, we have people who do I don't know what
12:21
sometimes on the road, like this guy pulling out between two self-driving cars.
12:24
You have to ask, "What are you thinking?"
12:26
(Laughter)
12:28
Now, I just fire-hosed you with a lot of stuff there,
12:30
so I'm going to break one of these down pretty quickly.
12:33
So what we're looking at is the scene with the cyclist again,
12:36
and you might notice in the bottom, we can't actually see the cyclist yet,
12:39
but the car can: it's that little blue box up there,
12:42
and that comes from the laser data.
12:44
And that's not actually really easy to understand,
12:46
so what I'm going to do is I'm going to turn that laser data and look at it,
12:50
and if you're really good at looking at laser data, you can see
12:53
a few dots on the curve there,
12:54
right there, and that blue box is that cyclist.
12:57
Now as our light is red,
12:58
the cyclist's light has turned yellow already,
13:00
and if you squint, you can see that in the imagery.
13:03
But the cyclist, we see, is going to proceed through the intersection.
13:06
Our light has now turned green, his is solidly red,
13:08
and we now anticipate that this bike is going to come all the way across.
13:13
Unfortunately the other drivers next to us were not paying as much attention.
13:16
They started to pull forward, and fortunately for everyone,
13:19
this cyclist reacts, avoids,
13:22
and makes it through the intersection.
13:25
And off we go.
13:26
Now, as you can see, we've made some pretty exciting progress,
13:29
and at this point we're pretty convinced
13:31
this technology is going to come to market.
13:33
We do three million miles of testing in our simulators every single day,
13:38
so you can imagine the experience that our vehicles have.
13:41
We are looking forward to having this technology on the road,
13:43
and we think the right path is to go through the self-driving
13:46
rather than driver assistance approach
13:48
because the urgency is so large.
13:51
In the time I have given this talk today,
13:53
34 people have died on America's roads.
13:56
How soon can we bring it out?
13:59
Well, it's hard to say because it's a really complicated problem,
14:02
but these are my two boys.
14:05
My oldest son is 11, and that means in four and a half years,
14:08
he's going to be able to get his driver's license.
14:11
My team and I are committed to making sure that doesn't happen.
14:14
Thank you.
14:16
(Laughter) (Applause)
14:21
Chris Anderson: Chris, I've got a question for you.
14:23
Chris Urmson: Sure.
14:26
CA: So certainly, the mind of your cars is pretty mind-boggling.
14:30
On this debate between driver-assisted and fully driverless --
14:34
I mean, there's a real debate going on out there right now.
14:37
So some of the companies, for example, Tesla,
14:40
are going the driver-assisted route.
14:42
What you're saying is that that's kind of going to be a dead end
14:48
because you can't just keep improving that route and get to fully driverless
14:53
at some point, and then a driver is going to say, "This feels safe,"
14:57
and climb into the back, and something ugly will happen.
14:59
CU: Right. No, that's exactly right, and it's not to say
15:02
that the driver assistance systems aren't going to be incredibly valuable.
15:05
They can save a lot of lives in the interim,
15:08
but to see the transformative opportunity to help someone like Steve get around,
15:11
to really get to the end case in safety,
15:13
to have the opportunity to change our cities
15:16
and move parking out and get rid of these urban craters we call parking lots,
15:20
it's the only way to go.
15:21
CA: We will be tracking your progress with huge interest.
15:24
Thanks so much, Chris. CU: Thank you. (Applause)
Translated by Lee Li
Reviewed by Yuanqing Edberg

