ABOUT THE SPEAKER
Cynthia Breazeal - Roboticist
At MIT, Cynthia Breazeal and her team are building robots with social intelligence that communicate and learn the same way people do.

Why you should listen

Cynthia Breazeal founded and directs the Personal Robots Group at MIT’s Media Lab. Her research focuses on developing the principles and technologies for building personal robots that are socially intelligent—that interact and communicate with people in human-centric terms, work with humans as peers, and learn from people as an apprentice.

She has developed some of the world’s most famous robotic creatures, ranging from small hexapod robots to highly expressive humanoids, including the social robot Kismet and the expressive robot Leonardo. Her recent work investigates the impact of social robots on helping people of all ages to achieve personal goals that contribute to quality of life, in domains such as physical performance, learning and education, health, and family communication and play over distance.

TEDWomen 2010

Cynthia Breazeal: The rise of personal robots

1,201,057 views

Cynthia Breazeal wonders: Why can we use robots on Mars, but not in our living rooms? The key, she says, is in training robots to interact with people. Now she dreams up and builds robots that teach, learn -- and play. Watch for amazing demo footage of a new interactive game for kids.


00:15
Ever since I was a little girl, seeing "Star Wars" for the first time, I've been fascinated by this idea of personal robots. And as a little girl, I loved the idea of a robot that interacted with us much more like a helpful, trusted sidekick -- something that would delight us, enrich our lives and help us save a galaxy or two.

00:37
I knew robots like that didn't really exist, but I knew I wanted to build them. So 20 years pass -- I am now a graduate student at MIT studying artificial intelligence, the year is 1997, and NASA has just landed the first robot on Mars. But robots are still not in our home, ironically. And I remember thinking about all the reasons why that was the case. But one really struck me. Robotics had really been about interacting with things, not with people -- certainly not in a social way that would be natural for us and would really help people accept robots into our daily lives. For me, that was the white space; that's what robots could not do yet. And so that year, I started to build this robot, Kismet, the world's first social robot.

01:22
Three years later -- a lot of programming, working with other graduate students in the lab -- Kismet was ready to start interacting with people.
01:30
(Video) Scientist: I want to show you something.
32
75000
2000
01:32
Kismet: (Nonsense)
33
77000
2000
01:34
Scientist: This is a watch that my girlfriend gave me.
34
79000
3000
01:37
Kismet: (Nonsense)
35
82000
2000
01:39
Scientist: Yeah, look, it's got a little blue light in it too.
36
84000
2000
01:41
I almost lost it this week.
37
86000
3000
01:44
Cynthia Breazeal: So Kismet interacted with people
38
89000
3000
01:47
like kind of a non-verbal child or pre-verbal child,
39
92000
3000
01:50
which I assume was fitting because it was really the first of its kind.
40
95000
3000
01:53
It didn't speak language, but it didn't matter.
41
98000
2000
01:55
This little robot was somehow able
42
100000
2000
01:57
to tap into something deeply social within us --
43
102000
3000
02:00
and with that, the promise of an entirely new way
44
105000
2000
02:02
we could interact with robots.
45
107000
2000
02:04
So over the past several years, I've been continuing to explore this interpersonal dimension of robots, now at the Media Lab with my own team of incredibly talented students. And one of my favorite robots is Leonardo. We developed Leonardo in collaboration with Stan Winston Studio. And so I want to show you a special moment for me of Leo. This is Matt Berlin interacting with Leo, introducing Leo to a new object. And because it's new, Leo doesn't really know what to make of it. But sort of like us, he can actually learn about it from watching Matt's reaction.
02:33
(Video) Matt Berlin: Hello, Leo.
58
138000
2000
02:38
Leo, this is Cookie Monster.
59
143000
3000
02:44
Can you find Cookie Monster?
60
149000
3000
02:52
Leo, Cookie Monster is very bad.
61
157000
3000
02:56
He's very bad, Leo.
62
161000
2000
03:00
Cookie Monster is very, very bad.
63
165000
3000
03:07
He's a scary monster.
64
172000
2000
03:09
He wants to get your cookies.
65
174000
2000
03:12
(Laughter)
66
177000
2000
03:14
CB: All right, so Leo and Cookie
67
179000
3000
03:17
might have gotten off to a little bit of a rough start,
68
182000
2000
03:19
but they get along great now.
69
184000
3000
03:22
So what I've learned
70
187000
2000
03:24
through building these systems
71
189000
2000
03:26
is that robots are actually
72
191000
2000
03:28
a really intriguing social technology,
73
193000
2000
03:30
where it's actually their ability
74
195000
2000
03:32
to push our social buttons
75
197000
2000
03:34
and to interact with us like a partner
76
199000
2000
03:36
that is a core part of their functionality.
77
201000
3000
03:39
And with that shift in thinking, we can now start to imagine
78
204000
2000
03:41
new questions, new possibilities for robots
79
206000
3000
03:44
that we might not have thought about otherwise.
80
209000
3000
03:47
But what do I mean when I say "push our social buttons?"
81
212000
2000
03:49
Well, one of the things that we've learned
82
214000
2000
03:51
is that, if we design these robots to communicate with us
83
216000
2000
03:53
using the same body language,
84
218000
2000
03:55
the same sort of non-verbal cues that people use --
85
220000
2000
03:57
like Nexi, our humanoid robot, is doing here --
86
222000
3000
04:00
what we find is that people respond to robots
87
225000
2000
04:02
a lot like they respond to people.
88
227000
2000
04:04
People use these cues to determine things like how persuasive someone is,
89
229000
3000
04:07
how likable, how engaging,
90
232000
2000
04:09
how trustworthy.
91
234000
2000
04:11
It turns out it's the same for robots.
92
236000
2000
04:13
It's turning out now
93
238000
2000
04:15
that robots are actually becoming a really interesting new scientific tool
94
240000
3000
04:18
to understand human behavior.
95
243000
2000
04:20
To answer questions like, how is it that, from a brief encounter,
96
245000
3000
04:23
we're able to make an estimate of how trustworthy another person is?
97
248000
3000
04:26
Mimicry's believed to play a role, but how?
98
251000
3000
04:29
Is it the mimicking of particular gestures that matters?
99
254000
3000
04:32
It turns out it's really hard
100
257000
2000
04:34
to learn this or understand this from watching people
101
259000
2000
04:36
because when we interact we do all of these cues automatically.
102
261000
3000
04:39
We can't carefully control them because they're subconscious for us.
103
264000
2000
04:41
But with the robot, you can.
104
266000
2000
04:43
And so in this video here --
105
268000
2000
04:45
this is a video taken from David DeSteno's lab at Northeastern University.
106
270000
3000
04:48
He's a psychologist we've been collaborating with.
107
273000
2000
04:50
There's actually a scientist carefully controlling Nexi's cues
108
275000
3000
04:53
to be able to study this question.
109
278000
3000
04:56
And the bottom line is -- the reason why this works is
110
281000
2000
04:58
because it turns out people just behave like people
111
283000
2000
05:00
even when interacting with a robot.
112
285000
3000
05:03
So given that key insight,
113
288000
2000
05:05
we can now start to imagine
114
290000
2000
05:07
new kinds of applications for robots.
115
292000
3000
05:10
For instance, if robots do respond to our non-verbal cues,
116
295000
3000
05:13
maybe they would be a cool, new communication technology.
117
298000
4000
05:17
So imagine this:
118
302000
2000
05:19
What about a robot accessory for your cellphone?
119
304000
2000
05:21
You call your friend, she puts her handset in a robot,
120
306000
2000
05:23
and, bam! You're a MeBot --
121
308000
2000
05:25
you can make eye contact, you can talk with your friends,
122
310000
3000
05:28
you can move around, you can gesture --
123
313000
2000
05:30
maybe the next best thing to really being there, or is it?
124
315000
3000
05:33
To explore this question,
125
318000
2000
05:35
my student, Siggy Adalgeirsson, did a study
126
320000
3000
05:38
where we brought human participants, people, into our lab
127
323000
3000
05:41
to do a collaborative task
128
326000
2000
05:43
with a remote collaborator.
129
328000
2000
05:45
The task involved things
130
330000
2000
05:47
like looking at a set of objects on the table,
131
332000
2000
05:49
discussing them in terms of their importance and relevance to performing a certain task --
132
334000
3000
05:52
this ended up being a survival task --
133
337000
2000
05:54
and then rating them in terms
134
339000
2000
05:56
of how valuable and important they thought they were.
135
341000
2000
05:58
The remote collaborator was an experimenter from our group
136
343000
3000
06:01
who used one of three different technologies
137
346000
2000
06:03
to interact with the participants.
138
348000
2000
06:05
The first was just the screen.
139
350000
2000
06:07
This is just like video conferencing today.
140
352000
3000
06:10
The next was to add mobility -- so, have the screen on a mobile base.
141
355000
3000
06:13
This is like, if you're familiar with any of the telepresence robots today --
142
358000
3000
06:16
this is mirroring that situation.
143
361000
3000
06:19
And then the fully expressive MeBot.
144
364000
2000
06:21
So after the interaction,
145
366000
2000
06:23
we asked people to rate their quality of interaction
146
368000
3000
06:26
with the technology, with a remote collaborator
147
371000
2000
06:28
through this technology, in a number of different ways.
148
373000
3000
06:31
We looked at psychological involvement --
149
376000
2000
06:33
how much empathy did you feel for the other person?
150
378000
2000
06:35
We looked at overall engagement.
151
380000
2000
06:37
We looked at their desire to cooperate.
152
382000
2000
06:39
And this is what we see when they use just the screen.
153
384000
3000
06:42
It turns out, when you add mobility -- the ability to roll around the table --
154
387000
3000
06:45
you get a little more of a boost.
155
390000
2000
06:47
And you get even more of a boost when you add the full expression.
156
392000
3000
06:50
So it seems like this physical, social embodiment
157
395000
2000
06:52
actually really makes a difference.
158
397000
2000
06:54
Now let's try to put this into a little bit of context. Today we know that families are living further and further apart, and that definitely takes a toll on family relationships and family bonds over distance. For me, I have three young boys, and I want them to have a really good relationship with their grandparents. But my parents live thousands of miles away, so they just don't get to see each other that often. We try Skype, we try phone calls, but my boys are little -- they don't really want to talk; they want to play.

07:20
So I love the idea of thinking about robots as a new kind of distance-play technology. I imagine a time not too far from now -- my mom can go to her computer, open up a browser and jack into a little robot. And as grandma-bot, she can now play, really play, with my sons, with her grandsons, in the real world with their real toys. I could imagine grandmothers being able to do social plays with their granddaughters, with their friends, and to be able to share all kinds of other activities around the house, like sharing a bedtime story. And through this technology, being able to be an active participant in their grandchildren's lives in a way that's not possible today.
07:58
Let's think about some other domains,
188
463000
2000
08:00
like maybe health.
189
465000
2000
08:02
So in the United States today,
190
467000
2000
08:04
over 65 percent of people are either overweight or obese,
191
469000
3000
08:07
and now it's a big problem with our children as well.
192
472000
2000
08:09
And we know that as you get older in life,
193
474000
2000
08:11
if you're obese when you're younger, that can lead to chronic diseases
194
476000
3000
08:14
that not only reduce your quality of life,
195
479000
2000
08:16
but are a tremendous economic burden on our health care system.
196
481000
3000
08:19
But if robots can be engaging,
197
484000
2000
08:21
if we like to cooperate with robots,
198
486000
2000
08:23
if robots are persuasive,
199
488000
2000
08:25
maybe a robot can help you
200
490000
2000
08:27
maintain a diet and exercise program,
201
492000
2000
08:29
maybe they can help you manage your weight.
202
494000
3000
08:32
Sort of like a digital Jiminy --
203
497000
2000
08:34
as in the well-known fairy tale --
204
499000
2000
08:36
a kind of friendly, supportive presence that's always there
205
501000
2000
08:38
to be able to help you make the right decision
206
503000
2000
08:40
in the right way at the right time
207
505000
2000
08:42
to help you form healthy habits.
208
507000
2000
08:44
So we actually explored this idea in our lab.
209
509000
2000
08:46
This is a robot, Autom.
210
511000
2000
08:48
Cory Kidd developed this robot for his doctoral work.
211
513000
3000
08:51
And it was designed to be a robot diet-and-exercise coach.
212
516000
3000
08:54
It had a couple of simple non-verbal skills it could do.
213
519000
2000
08:56
It could make eye contact with you.
214
521000
2000
08:58
It could share information looking down at a screen.
215
523000
2000
09:00
You'd use a screen interface to enter information,
216
525000
2000
09:02
like how many calories you ate that day,
217
527000
2000
09:04
how much exercise you got.
218
529000
2000
09:06
And then it could help track that for you.
219
531000
2000
09:08
And the robot spoke with a synthetic voice
220
533000
2000
09:10
to engage you in a coaching dialogue
221
535000
2000
09:12
modeled after trainers
222
537000
2000
09:14
and patients and so forth.
223
539000
2000
09:16
And it would build a working alliance with you
224
541000
2000
09:18
through that dialogue.
225
543000
2000
09:20
It could help you set goals and track your progress,
226
545000
2000
09:22
and it would help motivate you.
227
547000
2000
09:24
So an interesting question is, does the social embodiment really matter? Does it matter that it's a robot? Is it really just the quality of advice and information that matters? To explore that question, we did a study in the Boston area where we put one of three interventions in people's homes for a period of several weeks. One case was the robot you saw there, Autom. Another was a computer that ran the same touch-screen interface, ran exactly the same dialogues. The quality of advice was identical. And the third was just a pen and paper log, because that's the standard intervention you typically get when you start a diet-and-exercise program.

09:58
So one of the things we really wanted to look at was not how much weight people lost, but really how long they interacted with the robot. Because the challenge is not losing weight, it's actually keeping it off. And the longer you could interact with one of these interventions, well that's indicative, potentially, of longer-term success. So the first thing I want to look at is how long people interacted with these systems. It turns out that people interacted with the robot significantly more, even though the quality of the advice was identical to the computer.

10:28
When we asked people to rate it in terms of the quality of the working alliance, people rated the robot higher, and they trusted the robot more.

10:35
(Laughter)

10:37
And when you look at emotional engagement, it was completely different. People would name the robots. They would dress the robots.

10:45
(Laughter)

10:47
And even when we would come up to pick up the robots at the end of the study, they would come out to the car and say good-bye to the robots. They didn't do this with a computer.
10:54
The last thing I want to talk about today
265
639000
2000
10:56
is the future of children's media.
266
641000
2000
10:58
We know that kids spend a lot of time behind screens today,
267
643000
3000
11:01
whether it's television or computer games or whatnot.
268
646000
3000
11:04
My sons, they love the screen. They love the screen.
269
649000
3000
11:07
But I want them to play; as a mom, I want them to play,
270
652000
3000
11:10
like, real-world play.
271
655000
2000
11:12
And so I have a new project in my group I wanted to present to you today
272
657000
3000
11:15
called Playtime Computing
273
660000
2000
11:17
that's really trying to think about how we can take
274
662000
2000
11:19
what's so engaging about digital media
275
664000
2000
11:21
and literally bring it off the screen
276
666000
2000
11:23
into the real world of the child,
277
668000
2000
11:25
where it can take on many of the properties of real-world play.
278
670000
3000
11:29
So here's the first exploration of this idea,
279
674000
4000
11:33
where characters can be physical or virtual,
280
678000
3000
11:36
and where the digital content
281
681000
2000
11:38
can literally come off the screen
282
683000
2000
11:40
into the world and back.
283
685000
2000
11:42
I like to think of this
284
687000
2000
11:44
as the Atari Pong
285
689000
2000
11:46
of this blended-reality play.
286
691000
2000
11:48
But we can push this idea further.
287
693000
2000
11:50
What if --
288
695000
2000
11:52
(Game) Nathan: Here it comes. Yay!
289
697000
3000
11:55
CB: -- the character itself could come into your world?
290
700000
3000
11:58
It turns out that kids love it
291
703000
2000
12:00
when the character becomes real and enters into their world.
292
705000
3000
12:03
And when it's in their world,
293
708000
2000
12:05
they can relate to it and play with it in a way
294
710000
2000
12:07
that's fundamentally different from how they play with it on the screen.
295
712000
2000
12:09
Another important idea is this notion
296
714000
2000
12:11
of persistence of character across realities.
297
716000
3000
12:14
So changes that children make in the real world
298
719000
2000
12:16
need to translate to the virtual world.
299
721000
2000
12:18
So here, Nathan has changed the letter A to the number 2.
300
723000
3000
12:21
You can imagine maybe these symbols
301
726000
2000
12:23
give the characters special powers when it goes into the virtual world.
302
728000
3000
12:26
So they are now sending the character back into that world.
303
731000
3000
12:29
And now it's got number power.
304
734000
3000
12:32
And then finally, what I've been trying to do here
305
737000
2000
12:34
is create a really immersive experience for kids,
306
739000
3000
12:37
where they really feel like they are part of that story,
307
742000
3000
12:40
a part of that experience.
308
745000
2000
12:42
And I really want to spark their imaginations
309
747000
2000
12:44
the way mine was sparked as a little girl watching "Star Wars."
310
749000
3000
12:47
But I want to do more than that.
311
752000
2000
12:49
I actually want them to create those experiences.
312
754000
3000
12:52
I want them to be able to literally build their imagination
313
757000
2000
12:54
into these experiences and make them their own.
314
759000
2000
12:56
So we've been exploring a lot of ideas
315
761000
2000
12:58
in telepresence and mixed reality
316
763000
2000
13:00
to literally allow kids to project their ideas into this space
317
765000
3000
13:03
where other kids can interact with them
318
768000
2000
13:05
and build upon them.
319
770000
2000
13:07
I really want to come up with new ways of children's media
320
772000
3000
13:10
that foster creativity and learning and innovation.
321
775000
3000
13:13
I think that's very, very important.
322
778000
3000
13:16
So this is a new project.
323
781000
2000
13:18
We've invited a lot of kids into this space,
324
783000
2000
13:20
and they think it's pretty cool.
325
785000
3000
13:23
But I can tell you, the thing that they love the most
326
788000
2000
13:25
is the robot.
327
790000
2000
13:27
What they care about is the robot.
328
792000
3000
13:30
Robots touch something deeply human within us.
329
795000
3000
13:33
And so whether they're helping us
330
798000
2000
13:35
to become creative and innovative,
331
800000
2000
13:37
or whether they're helping us
332
802000
2000
13:39
to feel more deeply connected despite distance,
333
804000
2000
13:41
or whether they are our trusted sidekick
334
806000
2000
13:43
who's helping us attain our personal goals
335
808000
2000
13:45
in becoming our highest and best selves,
336
810000
2000
13:47
for me, robots are all about people.
337
812000
3000
13:50
Thank you.
338
815000
2000
13:52
(Applause)
339
817000
5000


