TEDWomen 2010

Cynthia Breazeal: The rise of personal robots

December 8, 2010

As a grad student, Cynthia Breazeal wondered why we were using robots on Mars, but not in our living rooms. The key, she realized: training robots to interact with people. Now she dreams up and builds robots that teach, learn -- and play. Watch for amazing demo footage of a new interactive game for kids.

Cynthia Breazeal - Roboticist
At MIT, Cynthia Breazeal and her team are building robots with social intelligence that communicate and learn the same way people do.

Ever since I was a little girl
00:15
seeing "Star Wars" for the first time,
00:18
I've been fascinated by this idea
00:20
of personal robots.
00:22
And as a little girl,
00:24
I loved the idea of a robot that interacted with us
00:26
much more like a helpful, trusted sidekick --
00:28
something that would delight us, enrich our lives
00:31
and help us save a galaxy or two.
00:33
I knew robots like that didn't really exist,
00:37
but I knew I wanted to build them.
00:40
So 20 years pass --
00:42
I am now a graduate student at MIT
00:44
studying artificial intelligence,
00:46
the year is 1997,
00:48
and NASA has just landed the first robot on Mars.
00:50
But robots are still not in our home, ironically.
00:53
And I remember thinking about
00:56
all the reasons why that was the case.
00:58
But one really struck me.
01:00
Robotics had really been about interacting with things,
01:02
not with people --
01:05
certainly not in a social way that would be natural for us
01:07
and would really help people accept robots
01:09
into our daily lives.
01:11
For me, that was the white space; that's what robots could not do yet.
01:13
And so that year, I started to build this robot, Kismet,
01:16
the world's first social robot.
01:19
Three years later --
01:22
a lot of programming,
01:24
working with other graduate students in the lab --
01:26
Kismet was ready to start interacting with people.
01:28
(Video) Scientist: I want to show you something.
01:30
Kismet: (Nonsense)
01:32
Scientist: This is a watch that my girlfriend gave me.
01:34
Kismet: (Nonsense)
01:37
Scientist: Yeah, look, it's got a little blue light in it too.
01:39
I almost lost it this week.
01:41
Cynthia Breazeal: So Kismet interacted with people
01:44
like kind of a non-verbal child or pre-verbal child,
01:47
which I assume was fitting because it was really the first of its kind.
01:50
It didn't speak language, but it didn't matter.
01:53
This little robot was somehow able
01:55
to tap into something deeply social within us --
01:57
and with that, the promise of an entirely new way
02:00
we could interact with robots.
02:02
So over the past several years
02:04
I've been continuing to explore this interpersonal dimension of robots,
02:06
now at the media lab
02:08
with my own team of incredibly talented students.
02:10
And one of my favorite robots is Leonardo.
02:12
We developed Leonardo in collaboration with Stan Winston Studio.
02:15
And so I want to show you a special moment for me of Leo.
02:18
This is Matt Berlin interacting with Leo,
02:21
introducing Leo to a new object.
02:23
And because it's new, Leo doesn't really know what to make of it.
02:25
But sort of like us, he can actually learn about it
02:28
from watching Matt's reaction.
02:30
(Video) Matt Berlin: Hello, Leo.
02:33
Leo, this is Cookie Monster.
02:38
Can you find Cookie Monster?
02:44
Leo, Cookie Monster is very bad.
02:52
He's very bad, Leo.
02:56
Cookie Monster is very, very bad.
03:00
He's a scary monster.
03:07
He wants to get your cookies.
03:09
(Laughter)
03:12
CB: All right, so Leo and Cookie
03:14
might have gotten off to a little bit of a rough start,
03:17
but they get along great now.
03:19
So what I've learned
03:22
through building these systems
03:24
is that robots are actually
03:26
a really intriguing social technology,
03:28
where it's actually their ability
03:30
to push our social buttons
03:32
and to interact with us like a partner
03:34
that is a core part of their functionality.
03:36
And with that shift in thinking, we can now start to imagine
03:39
new questions, new possibilities for robots
03:41
that we might not have thought about otherwise.
03:44
But what do I mean when I say "push our social buttons?"
03:47
Well, one of the things that we've learned
03:49
is that, if we design these robots to communicate with us
03:51
using the same body language,
03:53
the same sort of non-verbal cues that people use --
03:55
like Nexi, our humanoid robot, is doing here --
03:57
what we find is that people respond to robots
04:00
a lot like they respond to people.
04:02
People use these cues to determine things like how persuasive someone is,
04:04
how likable, how engaging,
04:07
how trustworthy.
04:09
It turns out it's the same for robots.
04:11
It's turning out now
04:13
that robots are actually becoming a really interesting new scientific tool
04:15
to understand human behavior.
04:18
To answer questions like, how is it that, from a brief encounter,
04:20
we're able to make an estimate of how trustworthy another person is?
04:23
Mimicry's believed to play a role, but how?
04:26
Is it the mimicking of particular gestures that matters?
04:29
It turns out it's really hard
04:32
to learn this or understand this from watching people
04:34
because when we interact we do all of these cues automatically.
04:36
We can't carefully control them because they're subconscious for us.
04:39
But with the robot, you can.
04:41
And so in this video here --
04:43
this is a video taken from David DeSteno's lab at Northeastern University.
04:45
He's a psychologist we've been collaborating with.
04:48
There's actually a scientist carefully controlling Nexi's cues
04:50
to be able to study this question.
04:53
And the bottom line is -- the reason why this works is
04:56
because it turns out people just behave like people
04:58
even when interacting with a robot.
05:00
So given that key insight,
05:03
we can now start to imagine
05:05
new kinds of applications for robots.
05:07
For instance, if robots do respond to our non-verbal cues,
05:10
maybe they would be a cool, new communication technology.
05:13
So imagine this:
05:17
What about a robot accessory for your cellphone?
05:19
You call your friend, she puts her handset in a robot,
05:21
and, bam! You're a MeBot --
05:23
you can make eye contact, you can talk with your friends,
05:25
you can move around, you can gesture --
05:28
maybe the next best thing to really being there, or is it?
05:30
To explore this question,
05:33
my student, Siggy Adalgeirsson, did a study
05:35
where we brought human participants, people, into our lab
05:38
to do a collaborative task
05:41
with a remote collaborator.
05:43
The task involved things
05:45
like looking at a set of objects on the table,
05:47
discussing them in terms of their importance and relevance to performing a certain task --
05:49
this ended up being a survival task --
05:52
and then rating them in terms
05:54
of how valuable and important they thought they were.
05:56
The remote collaborator was an experimenter from our group
05:58
who used one of three different technologies
06:01
to interact with the participants.
06:03
The first was just the screen.
06:05
This is just like video conferencing today.
06:07
The next was to add mobility -- so, have the screen on a mobile base.
06:10
This is like, if you're familiar with any of the telepresence robots today --
06:13
this is mirroring that situation.
06:16
And then the fully expressive MeBot.
06:19
So after the interaction,
06:21
we asked people to rate their quality of interaction
06:23
with the technology, with a remote collaborator
06:26
through this technology, in a number of different ways.
06:28
We looked at psychological involvement --
06:31
how much empathy did you feel for the other person?
06:33
We looked at overall engagement.
06:35
We looked at their desire to cooperate.
06:37
And this is what we see when they use just the screen.
06:39
It turns out, when you add mobility -- the ability to roll around the table --
06:42
you get a little more of a boost.
06:45
And you get even more of a boost when you add the full expression.
06:47
So it seems like this physical, social embodiment
06:50
actually really makes a difference.
06:52
Now let's try to put this into a little bit of context.
06:54
Today we know that families are living further and further apart,
06:57
and that definitely takes a toll on family relationships
07:00
and family bonds over distance.
07:02
For me, I have three young boys,
07:04
and I want them to have a really good relationship
07:06
with their grandparents.
07:08
But my parents live thousands of miles away,
07:10
so they just don't get to see each other that often.
07:12
We try Skype, we try phone calls,
07:14
but my boys are little -- they don't really want to talk;
07:16
they want to play.
07:18
So I love the idea of thinking about robots
07:20
as a new kind of distance-play technology.
07:22
I imagine a time not too far from now --
07:25
my mom can go to her computer,
07:28
open up a browser and jack into a little robot.
07:30
And as grandma-bot,
07:32
she can now play, really play,
07:35
with my sons, with her grandsons,
07:37
in the real world with their real toys.
07:39
I could imagine grandmothers being able to do social play
07:42
with their granddaughters, with their friends,
07:44
and to be able to share all kinds of other activities around the house,
07:46
like sharing a bedtime story.
07:48
And through this technology,
07:50
being able to be an active participant
07:52
in their grandchildren's lives
07:54
in a way that's not possible today.
07:56
Let's think about some other domains,
07:58
like maybe health.
08:00
So in the United States today,
08:02
over 65 percent of people are either overweight or obese,
08:04
and now it's a big problem with our children as well.
08:07
And we know that as you get older in life,
08:09
if you're obese when you're younger, that can lead to chronic diseases
08:11
that not only reduce your quality of life,
08:14
but are a tremendous economic burden on our health care system.
08:16
But if robots can be engaging,
08:19
if we like to cooperate with robots,
08:21
if robots are persuasive,
08:23
maybe a robot can help you
08:25
maintain a diet and exercise program,
08:27
maybe they can help you manage your weight.
08:29
Sort of like a digital Jiminy --
08:32
as in the well-known fairy tale --
08:34
a kind of friendly, supportive presence that's always there
08:36
to be able to help you make the right decision
08:38
in the right way at the right time
08:40
to help you form healthy habits.
08:42
So we actually explored this idea in our lab.
08:44
This is a robot, Autom.
08:46
Cory Kidd developed this robot for his doctoral work.
08:48
And it was designed to be a robot diet-and-exercise coach.
08:51
It had a couple of simple non-verbal skills it could do.
08:54
It could make eye contact with you.
08:56
It could share information looking down at a screen.
08:58
You'd use a screen interface to enter information,
09:00
like how many calories you ate that day,
09:02
how much exercise you got.
09:04
And then it could help track that for you.
09:06
And the robot spoke with a synthetic voice
09:08
to engage you in a coaching dialogue
09:10
modeled after trainers
09:12
and patients and so forth.
09:14
And it would build a working alliance with you
09:16
through that dialogue.
09:18
It could help you set goals and track your progress,
09:20
and it would help motivate you.
09:22
So an interesting question is,
09:24
does the social embodiment really matter? Does it matter that it's a robot?
09:26
Is it really just the quality of advice and information that matters?
09:29
To explore that question,
09:32
we did a study in the Boston area
09:34
where we put one of three interventions in people's homes
09:36
for a period of several weeks.
09:39
One case was the robot you saw there, Autom.
09:41
Another was a computer that ran the same touch-screen interface,
09:44
ran exactly the same dialogues.
09:47
The quality of advice was identical.
09:49
And the third was just a pen and paper log,
09:51
because that's the standard intervention you typically get
09:53
when you start a diet-and-exercise program.
09:55
So one of the things we really wanted to look at
09:58
was not how much weight people lost,
10:01
but really how long they interacted with the robot.
10:04
Because the challenge is not losing weight, it's actually keeping it off.
10:07
And the longer you could interact with one of these interventions,
10:10
well that's indicative, potentially, of longer-term success.
10:13
So the first thing I want to look at is how long,
10:16
how long did people interact with these systems.
10:18
It turns out that people interacted with the robot
10:20
significantly more,
10:22
even though the quality of the advice was identical to the computer.
10:24
When we asked people to rate it in terms of the quality of the working alliance,
10:28
people rated the robot higher
10:31
and they trusted the robot more.
10:33
(Laughter)
10:35
And when you look at emotional engagement,
10:37
it was completely different.
10:39
People would name the robots.
10:41
They would dress the robots.
10:43
(Laughter)
10:45
And even when we would come up to pick up the robots at the end of the study,
10:47
they would come out to the car and say good-bye to the robots.
10:50
They didn't do this with a computer.
10:52
The last thing I want to talk about today
10:54
is the future of children's media.
10:56
We know that kids spend a lot of time behind screens today,
10:58
whether it's television or computer games or whatnot.
11:01
My sons, they love the screen. They love the screen.
11:04
But I want them to play; as a mom, I want them to play,
11:07
like, real-world play.
11:10
And so I have a new project in my group I wanted to present to you today
11:12
called Playtime Computing
11:15
that's really trying to think about how we can take
11:17
what's so engaging about digital media
11:19
and literally bring it off the screen
11:21
into the real world of the child,
11:23
where it can take on many of the properties of real-world play.
11:25
So here's the first exploration of this idea,
11:29
where characters can be physical or virtual,
11:33
and where the digital content
11:36
can literally come off the screen
11:38
into the world and back.
11:40
I like to think of this
11:42
as the Atari Pong
11:44
of this blended-reality play.
11:46
But we can push this idea further.
11:48
What if --
11:50
(Game) Nathan: Here it comes. Yay!
11:52
CB: -- the character itself could come into your world?
11:55
It turns out that kids love it
11:58
when the character becomes real and enters into their world.
12:00
And when it's in their world,
12:03
they can relate to it and play with it in a way
12:05
that's fundamentally different from how they play with it on the screen.
12:07
Another important idea is this notion
12:09
of persistence of character across realities.
12:11
So changes that children make in the real world
12:14
need to translate to the virtual world.
12:16
So here, Nathan has changed the letter A to the number 2.
12:18
You can imagine maybe these symbols
12:21
give the characters special powers when they go into the virtual world.
12:23
So they are now sending the character back into that world.
12:26
And now it's got number power.
12:29
And then finally, what I've been trying to do here
12:32
is create a really immersive experience for kids,
12:34
where they really feel like they are part of that story,
12:37
a part of that experience.
12:40
And I really want to spark their imaginations
12:42
the way mine was sparked as a little girl watching "Star Wars."
12:44
But I want to do more than that.
12:47
I actually want them to create those experiences.
12:49
I want them to be able to literally build their imagination
12:52
into these experiences and make them their own.
12:54
So we've been exploring a lot of ideas
12:56
in telepresence and mixed reality
12:58
to literally allow kids to project their ideas into this space
13:00
where other kids can interact with them
13:03
and build upon them.
13:05
I really want to come up with new forms of children's media
13:07
that foster creativity and learning and innovation.
13:10
I think that's very, very important.
13:13
So this is a new project.
13:16
We've invited a lot of kids into this space,
13:18
and they think it's pretty cool.
13:20
But I can tell you, the thing that they love the most
13:23
is the robot.
13:25
What they care about is the robot.
13:27
Robots touch something deeply human within us.
13:30
And so whether they're helping us
13:33
to become creative and innovative,
13:35
or whether they're helping us
13:37
to feel more deeply connected despite distance,
13:39
or whether they are our trusted sidekick
13:41
who's helping us attain our personal goals
13:43
in becoming our highest and best selves,
13:45
for me, robots are all about people.
13:47
Thank you.
13:50
(Applause)
13:52


Cynthia Breazeal - Roboticist

Why you should listen

Cynthia Breazeal founded and directs the Personal Robots Group at MIT’s Media Lab. Her research focuses on developing the principles and technologies for building personal robots that are socially intelligent—that interact and communicate with people in human-centric terms, work with humans as peers, and learn from people as an apprentice.

She has developed some of the world’s most famous robotic creatures, ranging from small hexapod robots to highly expressive humanoids, including the social robot Kismet and the expressive robot Leonardo. Her recent work investigates the impact of social robots on helping people of all ages to achieve personal goals that contribute to quality of life, in domains such as physical performance, learning and education, health, and family communication and play over distance.

The original video is available on TED.com


Data provided by TED.

This website is owned and operated by Tokyo English Network.