ABOUT THE SPEAKER
Jennifer Healey - Research scientist
A research scientist at Intel, Jennifer Healey develops the mobile internet devices of the future.

Why you should listen

Jennifer Healey imagines a future where computers and smartphones are capable of being sensitive to human emotions and where cars are able to talk to each other, and thus keep their drivers away from accidents. A scientist at Intel Corporation Research Labs, she researches devices and systems that would allow for these major innovations.

Healey holds a PhD from MIT in electrical engineering and computer science. While there, she pioneered “Affective Computing” with Rosalind Picard and developed the first wearable computer with physiological sensors and a video camera that allows the wearer to track their daily activities and how they feel while doing them. From there, she moved to IBM where she worked on the next generation of multi-modal interactive smartphones and helped architect the "Interaction Mark-Up language" that allows users to switch from voice to speech input seamlessly.

Healey has also used her interest in embedded devices in the field of healthcare. While an instructor at Harvard Medical School and at Beth Israel Deaconess Medical Center, she worked on new ways to use heart rate to predict cardiac health. She then joined HP Research in Cambridge to further develop wearable sensors for health monitoring and continued this research when she joined Intel Digital Health.

More about the speaker:
Jennifer Healey | Speaker | TED.com
TED@Intel

Jennifer Healey: If cars could talk, accidents might be avoidable

Filmed:
908,454 views

When we drive, we get into a glass bubble, lock the doors and press the accelerator, relying on our eyes to guide us -- even though we can only see the few cars ahead of and behind us. But what if cars could share data with each other about their position and velocity, and use predictive models to calculate the safest routes for everyone on the road? Jennifer Healey imagines a world without car accidents.


00:12
Let's face it: Driving is dangerous. It's one of the things that we don't like to think about, but the fact that religious icons and good luck charms show up on dashboards around the world betrays the fact that we know this to be true. Car accidents are the leading cause of death in people ages 16 to 19 in the United States -- leading cause of death -- and 75 percent of these accidents have nothing to do with drugs or alcohol.

00:49
So what happens? No one can say for sure, but I remember my first accident. I was a young driver out on the highway, and the car in front of me, I saw the brake lights go on. I'm like, "Okay, all right, this guy is slowing down, I'll slow down too." I step on the brake. But no, this guy isn't slowing down. This guy is stopping, dead stop, dead stop on the highway. It was just going 65 -- to zero? I slammed on the brakes. I felt the ABS kick in, and the car is still going, and it's not going to stop, and I know it's not going to stop, and the air bag deploys, the car is totaled, and fortunately, no one was hurt. But I had no idea that car was stopping, and I think we can do a lot better than that. I think we can transform the driving experience by letting our cars talk to each other.

01:44
I just want you to think a little bit about what the experience of driving is like now. Get into your car. Close the door. You're in a glass bubble. You can't really directly sense the world around you. You're in this extended body. You're tasked with navigating it down partially seen roadways, in and amongst other metal giants, at superhuman speeds. Okay? And all you have to guide you are your two eyes. Okay, so that's all you have, eyes that weren't really designed for this task, but then people ask you to do things like, you want to make a lane change, what's the first thing they ask you to do? Take your eyes off the road. That's right. Stop looking where you're going, turn, check your blind spot, and drive down the road without looking where you're going. You and everyone else. This is the safe way to drive. Why do we do this? Because we have to, we have to make a choice: do I look here or do I look here? What's more important? And usually we do a fantastic job picking and choosing what we attend to on the road. But occasionally we miss something. Occasionally we sense something wrong or too late.

02:57
In countless accidents, the driver says, "I didn't see it coming." And I believe that. I believe that. We can only watch so much. But the technology exists now that can help us improve that. In the future, with cars exchanging data with each other, we will be able to see not just three cars ahead and three cars behind, to the right and left, all at the same time, bird's eye view, we will actually be able to see into those cars. We will be able to see the velocity of the car in front of us, to see how fast that guy's going or stopping. If that guy's going down to zero, I'll know.

03:38
And with computation and algorithms and predictive models, we will be able to see the future. You may think that's impossible. How can you predict the future? That's really hard. Actually, no. With cars, it's not impossible. Cars are three-dimensional objects that have a fixed position and velocity. They travel down roads. Often they travel on pre-published routes. It's really not that hard to make reasonable predictions about where a car's going to be in the near future.

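The talk doesn't spell out the math, but the prediction it describes rests on a simple constant-velocity model: given a car's position and velocity, extrapolate forward a few seconds. A minimal sketch in Python (an illustration only, not Healey's code; the speeds and the 40-meter gap are invented numbers):

```python
import math
from dataclasses import dataclass

@dataclass
class CarState:
    # Position in meters (along-road x, lateral y) and velocity in m/s,
    # as a car might broadcast them. Field names are illustrative.
    x: float
    y: float
    vx: float
    vy: float

def predict_position(state, dt):
    """Constant-velocity ('Newtonian object') prediction dt seconds ahead."""
    return (state.x + state.vx * dt, state.y + state.vy * dt)

def time_to_collision(follower, leader):
    """Seconds until the follower closes the current gap to the leader,
    assuming both hold their current same-lane speeds; inf if not closing."""
    gap = leader.x - follower.x
    closing_speed = follower.vx - leader.vx
    return gap / closing_speed if closing_speed > 0 else math.inf

# Invented numbers: I'm doing ~65 mph (29 m/s); the car 40 m ahead is at a dead stop.
me = CarState(x=0.0, y=0.0, vx=29.0, vy=0.0)
stopped = CarState(x=40.0, y=0.0, vx=0.0, vy=0.0)
print(predict_position(me, 1.0))                 # (29.0, 0.0): where I'll be in one second
print(round(time_to_collision(me, stopped), 2))  # ~1.38 s of warning in which to brake
```

With the gap and closing speed known from shared data, a following car gets a time-to-collision estimate before its driver ever sees the brake lights.
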
04:09
Even if, when you're in your car and some motorcyclist comes -- bshoom! -- 85 miles an hour down, lane-splitting -- I know you've had this experience -- that guy didn't "just come out of nowhere." That guy's been on the road probably for the last half hour. (Laughter) Right? I mean, somebody's seen him. Ten, 20, 30 miles back, someone's seen that guy, and as soon as one car sees that guy and puts him on the map, he's on the map -- position, velocity, good estimate he'll continue going 85 miles an hour. You'll know, because your car will know, because that other car will have whispered something in his ear, like, "By the way, five minutes, motorcyclist, watch out." You can make reasonable predictions about how cars behave. I mean, they're Newtonian objects. That's very nice about them.

04:57
So how do we get there? We can start with something as simple as sharing our position data between cars, just sharing GPS. If I have a GPS and a camera in my car, I have a pretty precise idea of where I am and how fast I'm going. With computer vision, I can estimate where the cars around me are, sort of, and where they're going. And same with the other cars. They can have a precise idea of where they are, and sort of a vague idea of where the other cars are. What happens if two cars share that data, if they talk to each other? I can tell you exactly what happens. Both models improve. Everybody wins.

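"Both models improve" is the behavior you get from standard estimate fusion: my precise fix on myself and another car's vaguer view of me can be combined into something tighter than either. A toy sketch using inverse-variance weighting of two Gaussian estimates (my own illustration, not the specific method used in the simulations; the numbers are made up):

```python
def fuse(estimate_a, estimate_b):
    """Combine two independent Gaussian estimates (mean, variance) of the same
    quantity by inverse-variance weighting. The fused variance is smaller than
    either input, which is the sense in which 'both models improve'."""
    mean_a, var_a = estimate_a
    mean_b, var_b = estimate_b
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused_mean = (w_a * mean_a + w_b * mean_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused_mean, fused_var

# My GPS knows my position along the road to ~1 m; the other car's camera sees me to ~3 m.
my_gps_fix = (12.0, 1.0 ** 2)    # (mean position in m, variance)
their_camera = (13.5, 3.0 ** 2)
print(fuse(my_gps_fix, their_camera))  # ≈ (12.15, 0.9): tighter than either estimate alone
```
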
05:36
Professor Bob Wang and his team have done computer simulations of what happens when fuzzy estimates combine, even in light traffic, when cars just share GPS data, and we've moved this research out of the computer simulation and into robot test beds that have the actual sensors that are in cars now on these robots: stereo cameras, GPS, and the two-dimensional laser range finders that are common in backup systems. We also attach a dedicated short-range communication radio, and the robots talk to each other. When these robots come at each other, they track each other's position precisely, and they can avoid each other.

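The avoidance logic running on the test-bed robots isn't described in the talk, but once two robots exchange position and velocity over the radio, a conflict check can be as simple as stepping both constant-velocity trajectories forward and looking at the closest approach. A rough sketch under that assumption (field names, the 5-second horizon, and the 1-meter safety radius are all invented):

```python
def closest_approach(a, b, horizon=5.0, step=0.1):
    """Step both constant-velocity trajectories forward and return the minimum
    separation (m) and the time (s) at which it occurs within the horizon."""
    min_dist, min_t = float("inf"), 0.0
    t = 0.0
    while t <= horizon:
        ax, ay = a["x"] + a["vx"] * t, a["y"] + a["vy"] * t
        bx, by = b["x"] + b["vx"] * t, b["y"] + b["vy"] * t
        dist = ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
        if dist < min_dist:
            min_dist, min_t = dist, t
        t += step
    return min_dist, min_t

# Two robots approaching the same crossing point at 2 m/s each (made-up numbers).
robot_a = {"x": -10.0, "y": 0.0, "vx": 2.0, "vy": 0.0}
robot_b = {"x": 0.0, "y": -10.0, "vx": 0.0, "vy": 2.0}
dist, t = closest_approach(robot_a, robot_b)
if dist < 1.0:  # inside the safety radius: one robot should slow down or re-route
    print(f"conflict in {t:.1f} s, separation {dist:.2f} m")
```
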
06:16
We're now adding more and more robots into the mix, and we encountered some problems. One of the problems, when you get too much chatter, it's hard to process all the packets, so you have to prioritize, and that's where the predictive model helps you. If your robot cars are all tracking the predicted trajectories, you don't pay as much attention to those packets. You prioritize the one guy who seems to be going a little off course. That guy could be a problem. And you can predict the new trajectory. So you don't only know that he's going off course, you know how. And you know which drivers you need to alert to get out of the way.

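One way to read "prioritize the one guy who seems to be going a little off course" in code: score each incoming packet by how far its sender has drifted from the trajectory you predicted for it, and only process the worst offenders when bandwidth or compute runs short. A hedged sketch (the packet format, field names, and numbers are my own, not from the talk):

```python
import heapq

def deviation(reported_pos, predicted_pos):
    """Distance in meters between where a car says it is and where our model predicted it."""
    dx = reported_pos[0] - predicted_pos[0]
    dy = reported_pos[1] - predicted_pos[1]
    return (dx * dx + dy * dy) ** 0.5

def prioritize(packets, predictions, budget):
    """Keep only the `budget` packets whose senders deviate most from their
    predicted trajectories; cars behaving as predicted can safely be skipped."""
    scored = [(-deviation(pkt["pos"], predictions[pkt["car_id"]]), i, pkt)
              for i, pkt in enumerate(packets)]
    return [pkt for _, _, pkt in heapq.nsmallest(budget, scored)]

# Hypothetical packets: car B is drifting 4 m off its predicted course.
packets = [
    {"car_id": "A", "pos": (100.0, 0.2)},
    {"car_id": "B", "pos": (150.0, 4.0)},
    {"car_id": "C", "pos": (200.0, -0.1)},
]
predictions = {"A": (100.0, 0.0), "B": (150.0, 0.0), "C": (200.0, 0.0)}
print(prioritize(packets, predictions, budget=1))  # -> only the packet from car B
```
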
06:50
And what we wanted to know was: how can we best alert everyone? How can these cars whisper, "You need to get out of the way"? Well, it depends on two things: one, the ability of the car, and second, the ability of the driver. If one guy has a really great car, but they're on their phone or, you know, doing something, they're probably not in the best position to react in an emergency.

07:12
So we started a separate line of research doing driver state modeling. And now, using a series of three cameras, we can detect if a driver is looking forward, looking away, looking down, on the phone, or having a cup of coffee. We can predict the accident and we can predict who, which cars, are in the best position to move out of the way to calculate the safest route for everyone. Fundamentally, these technologies exist today.

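The talk lists the driver states the cameras can distinguish, but not how driver state and car ability are combined to pick who should move. One plausible reading, purely for illustration: score each nearby car by driver attentiveness times braking ability and alert the highest-scoring car first (the state categories come from the talk; the weights and field names are invented):

```python
def alert_priority(car):
    """Rank cars by how able they are to get out of the way right now:
    an attentive driver in a capable car scores highest. Weights are arbitrary."""
    driver_scores = {"eyes_forward": 1.0, "looking_away": 0.5,
                     "on_phone": 0.2, "drinking_coffee": 0.4}
    return driver_scores.get(car["driver_state"], 0.0) * car["braking_ability"]

cars = [
    {"id": "teen_suv",  "driver_state": "on_phone",     "braking_ability": 0.9},
    {"id": "old_sedan", "driver_state": "eyes_forward", "braking_ability": 0.6},
]
best = max(cars, key=alert_priority)
print(best["id"])  # -> "old_sedan": an attentive driver beats a better car with a distracted one
```
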
07:44
I think the biggest problem that we face is our own willingness to share our data. I think it's a very disconcerting notion, this idea that our cars will be watching us, talking about us to other cars, that we'll be going down the road in a sea of gossip. But I believe it can be done in a way that protects our privacy, just like right now, when I look at your car from the outside, I don't really know about you. If I look at your license plate number, I don't really know who you are. I believe our cars can talk about us behind our backs. (Laughter)

08:22
And I think it's going to be a great thing. I want you to consider for a moment if you really don't want the distracted teenager behind you to know that you're braking, that you're coming to a dead stop. By sharing our data willingly, we can do what's best for everyone. So let your car gossip about you. It's going to make the roads a lot safer. Thank you. (Applause)

Translated by Joseph Geni
Reviewed by Morton Bast
