TED2011

Dennis Hong: Making a car for blind drivers

March 4, 2011

Using robotics, laser rangefinders, GPS and smart feedback tools, Dennis Hong is building a car for drivers who are blind. It's not a "self-driving" car, he's careful to note, but a car in which a non-sighted driver can determine speed, proximity and route -- and drive independently.

Dennis Hong - Roboticist
Dennis Hong is the founder and director of RoMeLa -- a Virginia Tech robotics lab that has pioneered several breakthroughs in robot design and engineering.

Many believe driving is an activity
00:15
solely reserved for those who can see.
00:18
A blind person driving a vehicle safely and independently
00:20
was thought to be an impossible task, until now.
00:23
Hello, my name is Dennis Hong,
00:26
and we're bringing freedom and independence to the blind
00:28
by building a vehicle for the visually impaired.
00:30
So before I talk about this car for the blind,
00:33
let me briefly tell you about another project that I worked on
00:36
called the DARPA Urban Challenge.
00:38
Now this was about building a robotic car
00:40
that can drive itself.
00:42
You press start, nobody touches anything,
00:44
and it can reach its destination fully autonomously.
00:46
So in 2007, our team won half a million dollars
00:49
by placing third in this competition.
00:52
So about that time,
00:54
the National Federation of the Blind, or NFB,
00:56
challenged the research community
00:58
about who can develop a car
01:00
that lets a blind person drive safely and independently.
01:02
We decided to give it a try,
01:04
because we thought, "Hey, how hard could it be?"
01:06
We already have an autonomous vehicle.
01:08
We just put a blind person in it and we're done, right?
01:10
(Laughter)
01:12
We couldn't have been more wrong.
01:14
What NFB wanted
01:16
was not a vehicle that can drive a blind person around,
01:18
but a vehicle where a blind person can make active decisions and drive.
01:21
So we had to throw everything out the window
01:24
and start from scratch.
01:26
So to test this crazy idea,
01:28
we developed a small dune buggy prototype vehicle
01:30
to test the feasibility.
01:32
And in the summer of 2009,
01:34
we invited dozens of blind youth from all over the country
01:36
and gave them a chance to take it for a spin.
01:39
It was an absolutely amazing experience.
01:41
But the problem with this car was
01:43
it was designed to only be driven in a very controlled environment,
01:45
in a flat, closed-off parking lot --
01:48
with even the lanes defined by red traffic cones.
01:50
So with this success,
01:52
we decided to take the next big step,
01:54
to develop a real car that can be driven on real roads.
01:56
So how does it work?
01:59
Well, it's a rather complex system,
02:01
but let me try to explain it, maybe simplify it.
02:03
So we have three steps.
02:06
We have perception, computation
02:08
and non-visual interfaces.
02:10
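The three-step loop just named -- perceive, compute, convey -- can be sketched in miniature. All names and numbers below are illustrative, not from the actual project:

```python
# Hypothetical sketch of the three-step loop the talk describes:
# perception -> computation -> non-visual interfaces.

def perceive():
    """Gather raw sensor data (IMU, GPS, cameras, laser rangefinders)."""
    return {"speed": 4.2, "lane_offset": -0.3, "obstacle_ahead_m": 12.0}

def compute(sensors):
    """Turn raw data into an environment model and a driving suggestion."""
    steer = -sensors["lane_offset"]          # steer back toward lane center
    brake = sensors["obstacle_ahead_m"] < 5  # brake if an obstacle is close
    return {"steer": steer, "brake": brake}

def convey(plan):
    """Send cues to the non-visual interfaces (vibration, sound, air)."""
    return f"steer {plan['steer']:+.1f}, brake={plan['brake']}"

cue = convey(compute(perceive()))
print(cue)  # -> steer +0.3, brake=False
```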
Now obviously the driver cannot see,
02:12
so the system needs to perceive the environment
02:14
and gather information for the driver.
02:16
For that, we use an inertial measurement unit.
02:18
So it measures acceleration and angular velocity --
02:21
like a human ear, inner ear.
02:23
We fuse that information with a GPS unit
02:25
to get an estimate of the location of the car.
02:27
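Fusing dead-reckoned IMU data with GPS fixes can be illustrated with a toy one-dimensional complementary filter -- the real vehicle's estimator is far more involved, and the gain and readings here are invented:

```python
# Illustrative 1-D complementary filter: blend a dead-reckoned position
# (integrated from IMU acceleration) with noisy GPS fixes.

def fuse(position, velocity, accel, gps, dt=0.1, gain=0.2):
    velocity += accel * dt            # dead-reckon from the IMU...
    position += velocity * dt
    position += gain * (gps - position)  # ...then nudge toward the GPS fix
    return position, velocity

pos, vel = 0.0, 0.0
for gps_fix in [0.1, 0.25, 0.4, 0.55]:   # fake GPS readings, metres
    pos, vel = fuse(pos, vel, accel=1.0, gps=gps_fix)
print(round(pos, 3))  # -> 0.282
```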
We also use two cameras to detect the lanes of the road.
02:30
And we also use three laser range finders.
02:33
The lasers scan the environment to detect obstacles --
02:35
a car approaching from the front, the back
02:38
and also any obstacles that run into the road,
02:40
any obstacles around the vehicle.
02:43
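Obstacle detection over a laser sweep reduces, at its simplest, to flagging any beam that returns a range inside a safety threshold. Real scans have hundreds of beams per rangefinder; this tiny sketch is only meant to show the idea:

```python
# Toy obstacle check over a laser rangefinder sweep: each reading is a
# (bearing_degrees, distance_metres) pair; anything nearer than the
# threshold is flagged as an obstacle.

def find_obstacles(scan, threshold=5.0):
    return [(angle, dist) for angle, dist in scan if dist < threshold]

sweep = [(-90, 20.0), (-45, 12.5), (0, 3.2), (45, 8.0), (90, 4.1)]
print(find_obstacles(sweep))  # -> [(0, 3.2), (90, 4.1)]
```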
So all this vast amount of information is then fed into the computer,
02:45
and the computer can do two things.
02:48
First of all, it processes this information
02:50
to build an understanding of the environment --
02:53
these are the lanes of the road, there's the obstacles --
02:55
and convey this information to the driver.
02:58
The system is also smart enough
03:00
to figure out the safest way to operate the car.
03:02
So we can also generate instructions
03:04
on how to operate the controls of the vehicle.
03:06
But the problem is this: How do we convey
03:08
this information and instructions
03:10
to a person who cannot see
03:12
quickly and accurately enough so he can drive?
03:14
So for this, we developed many different types
03:17
of non-visual user interface technology.
03:19
So starting from a three-dimensional ping sound system,
03:22
a vibrating vest,
03:24
a click wheel with voice commands, a leg strip,
03:26
even a shoe that applies pressure to the foot.
03:29
But today we're going to talk about
03:31
three of these non-visual user interfaces.
03:33
Now the first interface is called a DriveGrip.
03:35
So these are a pair of gloves,
03:38
and it has vibrating elements on the knuckle part
03:40
so you can convey instructions about how to steer --
03:42
the direction and the intensity.
03:45
Another device is called SpeedStrip.
03:47
So this is a chair -- as a matter of fact, it's actually a massage chair.
03:49
We gutted it and rearranged the vibrating elements in different patterns,
03:52
and we actuate them to convey information about the speed,
03:56
and also instructions on how to use the gas and the brake pedal.
03:59
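The SpeedStrip's gas-and-brake cue amounts to comparing the commanded speed with the current speed and picking a vibration pattern. The pattern names and tolerance below are invented for illustration:

```python
# Sketch of a SpeedStrip-style cue: compare commanded speed with current
# speed and pick a seat-vibration pattern telling the driver what to do.

def speed_strip_cue(current_kmh, target_kmh, tolerance=2.0):
    diff = target_kmh - current_kmh
    if diff > tolerance:
        return "pulse-forward"   # press the gas
    if diff < -tolerance:
        return "pulse-backward"  # press the brake
    return "steady"              # hold speed

print(speed_strip_cue(30, 40))  # -> pulse-forward
print(speed_strip_cue(42, 40))  # -> steady
print(speed_strip_cue(50, 40))  # -> pulse-backward
```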
So over here, you can see
04:02
how the computer understands the environment,
04:04
and because you cannot see the vibration,
04:06
we actually put red LEDs on the driver so that you can see what's happening.
04:08
This is the sensory data,
04:11
and that data is transferred to the devices through the computer.
04:13
So these two devices, DriveGrip and SpeedStrip,
04:16
are very effective.
04:18
But the problem is
04:20
these are instructional cue devices.
04:22
So this is not really freedom, right?
04:24
The computer tells you how to drive --
04:26
turn left, turn right, speed up, stop.
04:28
We call this the "backseat-driver problem."
04:30
So we're moving away from the instructional cue devices,
04:32
and we're now focusing more
04:35
on the informational devices.
04:37
A good example of this kind of informational non-visual user interface
04:39
is called AirPix.
04:41
So think of it as a monitor for the blind.
04:43
So it's a small tablet, has many holes in it,
04:45
and compressed air comes out,
04:47
so it can actually draw images.
04:49
So even though you are blind, you can put your hand over it,
04:51
you can see the lanes of the road and obstacles.
04:53
Actually, you can also change the frequency of the air coming out
04:55
and possibly the temperature.
04:58
So it's actually a multi-dimensional user interface.
05:00
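A "monitor for the blind" built from a grid of air holes can be sketched as drawing lane edges and obstacles onto a small 2-D array. The grid size and symbols here are invented; the real device drives each hole with compressed air rather than printing characters:

```python
# Toy AirPix-style renderer: '|' marks lane edges, 'O' an obstacle,
# '.' holes that stay off.

def render_airpix(width=7, height=5, obstacle=(2, 3)):
    grid = [["." for _ in range(width)] for _ in range(height)]
    for row in grid:                 # lane edges down both sides
        row[0] = row[-1] = "|"
    r, c = obstacle
    grid[r][c] = "O"                 # obstacle inside the lane
    return "\n".join("".join(row) for row in grid)

print(render_airpix())
```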
So here you can see the left camera, the right camera from the vehicle
05:03
and how the computer interprets that and sends that information to the AirPix.
05:06
Here, in a simulator, we're showing
05:09
a blind person driving using the AirPix.
05:11
This simulator was also very useful for training the blind drivers
05:14
and also quickly testing different types of ideas
05:17
for different types of non-visual user interfaces.
05:19
So basically that's how it works.
05:21
So just a month ago,
05:23
on January 29th,
05:25
we unveiled this vehicle for the very first time to the public
05:27
at the world-famous Daytona International Speedway
05:29
during the Rolex 24 racing event.
05:32
We also had some surprises. Let's take a look.
05:34
(Music)
05:37
(Video) Announcer: This is an historic day in January.
05:47
He's coming up to the grandstand, fellow Federationists.
05:51
(Cheering)
05:55
(Honking)
06:01
There's the grandstand now.
06:04
And he's [unclear] following that van that's out in front of him.
06:06
Well there comes the first box.
06:10
Now let's see if Mark avoids it.
06:12
He does. He passes it on the right.
06:15
Third box is out. The fourth box is out.
06:20
And he's perfectly making his way between the two.
06:23
He's closing in on the van
06:26
to make the moving pass.
06:28
Well this is what it's all about,
06:32
this kind of dynamic display of audacity and ingenuity.
06:34
He's approaching the end of the run,
06:39
makes his way between the barrels that are set up there.
06:42
(Honking)
06:47
(Applause)
06:50
Dennis Hong: I'm so happy for you.
06:56
Mark's going to give me a ride back to the hotel.
06:58
Mark Riccobono: Yes.
07:00
(Applause)
07:05
DH: So since we started this project,
07:14
we've been getting hundreds of letters, emails, phone calls
07:16
from people from all around the world.
07:19
Letters thanking us, but sometimes you also get funny letters like this one:
07:21
"Now I understand why there is Braille on a drive-up ATM machine."
07:24
(Laughter)
07:28
But sometimes --
07:30
(Laughter)
07:32
But sometimes I also do get --
07:34
I wouldn't call it hate mail --
07:36
but letters of really strong concern:
07:38
"Dr. Hong, are you insane,
07:40
trying to put blind people on the road?
07:42
You must be out of your mind."
07:44
But this vehicle is a prototype vehicle,
07:46
and it's not going to be on the road
07:48
until it's proven as safe as, or safer than, today's vehicles.
07:50
And I truly believe that this can happen.
07:52
But still, will society
07:55
accept such a radical idea?
07:57
How are we going to handle insurance?
07:59
How are we going to issue driver's licenses?
08:01
There are many hurdles like these, besides the technology challenges,
08:03
that we need to address before this becomes a reality.
08:06
Of course, the main goal of this project
08:09
is to develop a car for the blind.
08:11
But potentially more important than this
08:13
is the tremendous value of the spin-off technology
08:15
that can come from this project.
08:18
The sensors that are used can see through the dark,
08:20
the fog and rain.
08:22
And together with these new types of interfaces,
08:24
we can use these technologies
08:26
and apply them to safer cars for sighted people.
08:28
Or for the blind, everyday home appliances --
08:30
in the educational setting, in the office setting.
08:33
Just imagine, in a classroom a teacher writes on the blackboard
08:35
and a blind student can read what's written
08:38
using these non-visual interfaces.
08:41
This is priceless.
08:43
The things I've shown you today are just the beginning.
08:46
Thank you very much.
08:49
(Applause)
08:51


Dennis Hong - Roboticist

Why you should listen

As director of a groundbreaking robotics lab, Dennis Hong guides his team of students through projects on robot locomotion and mechanism design, creating award-winning humanoid robots like DARwIn (Dynamic Anthropomorphic Robot with Intelligence). His team is known as RoMeLa (Robotics & Mechanisms Laboratory) and operates at Virginia Tech.

Hong has also pioneered various innovations in soft-body robots, using a “whole-skin locomotion” as inspired by amoebae. Marrying robotics with biochemistry, he has been able to generate new types of motion with these ingenious forms. For his contributions to the field, Hong was selected as a NASA Summer Faculty Fellow in 2005, given the CAREER award by the National Science Foundation in 2007 and in 2009, named as one of Popular Science's Brilliant 10. He is also a gourmet chef and a magician, performing shows for charity and lecturing on the science of magic.

The original video is available on TED.com


Data provided by TED.
