ABOUT THE SPEAKER
Leila Takayama - Social scientist
Leila Takayama conducts research on human-robot interaction.

Why you should listen

Leila Takayama is an acting associate professor of Psychology at the University of California, Santa Cruz, where she founded and leads the Re-Embodied Cognition Lab. Her lab examines how people make sense of, interact with, and relate to new technologies. Prior to academia, she was a researcher at GoogleX and Willow Garage, where she developed a taste for working alongside engineers, designers, animators, and more. Her interdisciplinary research continues in her current work on what happens when people interact with robots and through robots.

Takayama is a World Economic Forum Global Futures Council Member and Young Global Leader. In 2015, she was presented the IEEE Robotics & Automation Society Early Career Award. In 2012, she was named a TR35 winner and one of the 100 most creative people in business by Fast Company. She completed her PhD in Communication at Stanford University in 2008, advised by Professor Clifford Nass. She also holds a PhD minor in Psychology from Stanford, a master's degree in Communication from Stanford, and bachelor of arts degrees in Psychology and Cognitive Science from UC Berkeley (2003). During her graduate studies, she was a research assistant in the User Interface Research (UIR) group at Palo Alto Research Center (PARC).

Photo: Melissa DeWitt

Leila Takayama | Speaker | TED.com
TEDxPaloAlto

Leila Takayama: What's it like to be a robot?

1,183,118 views

We already live in a world full of robots: machines like dishwashers and thermostats are so integrated into our lives that we would never think to call them robots. What will the robots of the future look like? Social scientist Leila Takayama shares some of the unique challenges of designing human-robot interactions, and how experimenting with robotic futures can lead us to a better understanding of ourselves.

00:12
You only get one chance to make a first impression, and that's true if you're a robot as well as if you're a person. The first time that I met one of these robots was at a place called Willow Garage in 2008. When I went to visit there, my host walked me into the building and we met this little guy. He was rolling into the hallway, came up to me, sat there, stared blankly past me, did nothing for a while, rapidly spun his head around 180 degrees and then ran away. And that was not a great first impression.

00:42
The thing that I learned about robots that day is that they kind of do their own thing, and they're not totally aware of us. And I think as we're experimenting with these possible robot futures, we actually end up learning a lot more about ourselves as opposed to just these machines. And what I learned that day was that I had pretty high expectations for this little dude. He was not only supposed to be able to navigate the physical world, but also be able to navigate my social world -- he's in my space; it's a personal robot. Why didn't it understand me?

01:11
My host explained to me, "Well, the robot is trying to get from point A to point B, and you were an obstacle in his way, so he had to replan his path, figure out where to go, and then get there some other way," which was actually not a very efficient thing to do. If that robot had figured out that I was a person, not a chair, and that I was willing to get out of its way if it was trying to get somewhere, then it actually would have been more efficient at getting its job done if it had bothered to notice that I was a human and that I have different affordances than things like chairs and walls do.
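What the host describes is ordinary path planning: the robot searches for a route from A to B, and when something blocks that route, it marks the blocker as an obstacle and searches again. Here is a minimal sketch of that replanning loop in Python; the grid, the breadth-first search and the cell coordinates are illustrative assumptions, not Willow Garage's actual navigation stack:

```python
# A minimal sketch of "replan the path around the obstacle" on a small grid.
from collections import deque

def plan_path(start, goal, blocked, size=10):
    """Return a list of grid cells from start to goal, avoiding blocked cells."""
    frontier, came_from = deque([start]), {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:                 # goal reached: walk the parent links back
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in blocked and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return None                          # no route exists at all

path = plan_path((0, 0), (9, 0), blocked=set())
person = path[4]                                     # someone steps onto the route
path = plan_path((0, 0), (9, 0), blocked={person})   # the robot replans around them
```

Note what the sketch leaves out, because that is exactly her point: nothing in the planner distinguishes a person, who could simply be asked to step aside, from a chair.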
01:41
You know, we tend to think of these robots as being from outer space and from the future and from science fiction, and while that could be true, I'd actually like to argue that robots are here today, and they live and work amongst us right now.

01:54
These are two robots that live in my home. They vacuum the floors and they cut the grass every single day, which is more than I would do if I actually had time to do these tasks, and they probably do it better than I would, too. This one actually takes care of my kitty. Every single time he uses the box, it cleans it, which is not something I'm willing to do, and it actually makes his life better as well as mine. And while we call these robot products -- it's a "robot vacuum cleaner," it's a "robot lawnmower," it's a "robot litter box" -- I think there's actually a bunch of other robots hiding in plain sight that have just become so darn useful and so darn mundane that we call them things like "dishwasher," right? They get new names. They don't get called robot anymore, because they actually serve a purpose in our lives.

02:38
Similarly, a thermostat, right? I know my robotics friends out there are probably cringing at me calling this a robot, but it has a goal. Its goal is to make my house 66 degrees Fahrenheit, and it senses the world. It knows it's a little bit cold, it makes a plan and then it acts on the physical world. It's robotics. Even if it might not look like Rosie the Robot, it's doing something that's really useful in my life, so I don't have to take care of turning the temperature up and down myself.
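The thermostat she describes is the textbook sense-plan-act loop: observe the world, compare the observation to a goal, act on the world. A minimal sketch under that framing; the read_temperature and set_heater callables are hypothetical stand-ins for a real thermostat's hardware interface, and only the 66-degree goal comes from the talk:

```python
# A minimal sketch of the sense-plan-act loop described above.

GOAL_F = 66.0      # the goal from the talk: keep the house at 66 degrees Fahrenheit
DEADBAND_F = 0.5   # small hysteresis so the heater doesn't toggle constantly

def control_step(read_temperature, set_heater):
    temp = read_temperature()          # sense: "it senses the world"
    if temp < GOAL_F - DEADBAND_F:     # plan: "it knows it's a little bit cold"
        set_heater(True)               # act: change the physical world
    elif temp > GOAL_F + DEADBAND_F:
        set_heater(False)
```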
03:04
And I think these systems live and work amongst us now, and not only are these systems living amongst us, but you are probably a robot operator, too. When you drive your car, it feels like you are operating machinery. You are also going from point A to point B, but your car probably has power steering, it probably has automatic braking systems, it might have an automatic transmission and maybe even adaptive cruise control. And while it might not be a fully autonomous car, it has bits of autonomy, and they're so useful and they make us drive safer, and we just sort of feel like they're invisible-in-use, right? So when you're driving your car, you should just feel like you're going from one place to another. It doesn't feel like it's this big thing that you have to deal with and operate and use these controls, because we spent so long learning how to drive that they've become extensions of ourselves. When you park that car in that tight little garage space, you know where your corners are. And when you drive a rental car that maybe you haven't driven before, it takes some time to get used to your new robot body.

04:05
And this is also true for people who operate other types of robots, so I'd like to share with you a few stories about that: dealing with the problem of remote collaboration. So, at Willow Garage I had a coworker named Dallas, and Dallas looked like this. He worked from his home in Indiana in our company in California. He was a voice in a box on the table in most of our meetings, which was kind of OK except that, you know, if we had a really heated debate and we didn't like what he was saying, we might just hang up on him.

04:32
(Laughter)

04:33
Then we might have a meeting after that meeting and actually make the decisions in the hallway afterwards when he wasn't there anymore. So that wasn't so great for him. And as a robotics company at Willow, we had some extra robot body parts laying around, so Dallas and his buddy Curt put together this thing, which looks kind of like Skype on a stick on wheels, which seems like a techy, silly toy, but really it's probably one of the most powerful tools that I've seen ever made for remote collaboration.

04:59
So now, if I didn't answer Dallas' email question, he could literally roll into my office, block my doorway and ask me the question again --

05:07
(Laughter)

05:08
until I answered it. And I'm not going to turn him off, right? That's kind of rude. Not only was it good for these one-on-one communications, but also for just showing up at the company all-hands meeting. Getting your butt in that chair and showing people that you're present and committed to your project is a big deal and can help remote collaboration a ton. We saw this over the period of months and then years, not only at our company but at others, too.

05:32
The best thing that can happen with these systems is that it starts to feel like you're just there. It's just you, it's just your body, and so people actually start to give these things personal space. So when you're having a stand-up meeting, people will stand around the space just as they would if you were there in person. That's great until there's breakdowns, and then it's not. People, when they first see these robots, are like, "Wow, where's the components? There must be a camera over there," and they start poking your face. "You're talking too softly, I'm going to turn up your volume," which is like having a coworker walk up to you and say, "You're speaking too softly, I'm going to turn up your face." That's awkward and not OK, and so we end up having to build these new social norms around using these systems.

06:12
Similarly, as you start feeling like it's your body, you start noticing things like, "Oh, my robot is kind of short." Dallas would say things to me -- he was six foot tall -- and we would take him via robot to cocktail parties and things like that, as you do, and the robot was about five foot tall, which is close to my height. And he would tell me, "You know, people are not really looking at me. I feel like I'm just looking at this sea of shoulders, and it's just -- we need a taller robot." And I told him, "Um, no. You get to walk in my shoes for today. You get to see what it's like to be on the shorter end of the spectrum." And he actually ended up building a lot of empathy for that experience, which was kind of great. So when he'd come visit in person, he no longer stood over me as he was talking to me, he would sit down and talk to me eye to eye, which was kind of a beautiful thing.

06:59
So we actually decided to look at this in the laboratory and see what other kinds of differences things like robot height would make. And so half of the people in our study used a shorter robot, half of the people in our study used a taller robot, and we actually found that the exact same person, who has the exact same body and says the exact same things as someone, is more persuasive and perceived as being more credible if they're in a taller robot form. It makes no rational sense, but that's why we study psychology.

07:25
And really, you know, the way that Cliff Nass would put this is that we're having to deal with these new technologies despite the fact that we have very old brains. Human psychology is not changing at the same speed that tech is, and so we're always playing catch-up, trying to make sense of this world where these autonomous things are running around. Usually, things that talk are people, not machines, right? And so we breathe a lot of meaning into things like just the height of a machine, not a person, and attribute that to the person using the system.

07:55
You know, this, I think, is really important when you're thinking about robotics. It's not so much about reinventing humans, it's more about figuring out how we extend ourselves, right? And we end up using things in ways that are sort of surprising. So these guys can't play pool because the robots don't have arms, but they can heckle the guys who are playing pool, and that can be an important thing for team bonding, which is kind of neat.

08:18
People who get really good at operating these systems will even do things like make up new games, like robot soccer in the middle of the night, pushing the trash cans around. But not everyone's good. A lot of people have trouble operating these systems. This is actually a guy who logged into the robot, and his eyeball was turned 90 degrees to the left. He didn't know that, so he ended up just bashing around the office, running into people's desks, getting super embarrassed, laughing about it -- his volume was way too high. And this guy here in the image is telling me, "We need a robot mute button." And by that, what he really meant was we don't want it to be so disruptive.

08:51
So as a robotics company, we added some obstacle avoidance to the system. It got a little laser range finder that could see the obstacles, and if I as a robot operator try to, say, run into a chair, it wouldn't let me, it would just plan a path around, which seems like a good idea.
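One way to picture that safeguard: the operator's drive command passes through a filter that checks the laser range finder and refuses to move toward anything too close (the real system went a step further and planned a path around the obstacle). A minimal sketch; the scan format and the half-meter threshold are illustrative assumptions, not the actual system:

```python
# A minimal sketch of filtering an operator's command with a range sensor.

SAFETY_DISTANCE_M = 0.5

def filter_command(speed, heading_deg, laser_scan):
    """Pass the operator's command through unless an obstacle is too close ahead.

    laser_scan maps heading in degrees to the measured range in meters.
    """
    if laser_scan.get(heading_deg, float("inf")) < SAFETY_DISTANCE_M:
        return 0.0, heading_deg      # veto: don't let the operator hit the chair
    return speed, heading_deg        # otherwise the operator stays in control

# An operator with a strong internal locus of control, trying anyway:
print(filter_command(1.0, 0, {0: 0.4}))   # -> (0.0, 0): the autonomy wins
```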
09:06
People did hit fewer obstacles using that system, obviously, but actually, for some of the people, it took them a lot longer to get through our obstacle course, and we wanted to know why. It turns out that there's this important human dimension -- a personality dimension called locus of control -- and people who have a strong internal locus of control -- they need to be the masters of their own destiny -- really don't like giving up control to an autonomous system, so much so that they will fight the autonomy: "If I want to hit that chair, I'm going to hit that chair." And so they would actually suffer from having that autonomous assistance, which is an important thing for us to know as we're building increasingly autonomous, say, cars, right? How are different people going to grapple with that loss of control? It's going to be different depending on human dimensions. We can't treat humans as if we're just one monolithic thing. We vary by personality, by culture, we even vary by emotional state moment to moment, and to be able to design these systems, these human-robot interaction systems, we need to take into account the human dimensions, not just the technological ones.

10:11
Along with a sense of control also comes a sense of responsibility. And if you were a robot operator using one of these systems, this is what the interface would look like. It looks a little bit like a video game, which can be good, because that's very familiar to people, but it can also be bad, because it makes people feel like it's a video game. We had a bunch of kids over at Stanford play with the system and drive the robot around our office in Menlo Park, and the kids started saying things like, "10 points if you hit that guy over there. 20 points for that one." And they would chase them down the hallway.

10:42
(Laughter)

10:43
I told them, "Um, those are real people. They're actually going to bleed and feel pain if you hit them." And they'd be like, "OK, got it." But five minutes later, they would be like, "20 points for that guy over there, he just looks like he needs to get hit."

10:55
It's a little bit like "Ender's Game," right? There is a real world on that other side, and I think it's our responsibility as people designing these interfaces to help people remember that there are real consequences to their actions, and to feel a sense of responsibility when they're operating these increasingly autonomous things.

11:13
These are kind of a great example of experimenting with one possible robotic future, and I think it's pretty cool that we can extend ourselves, and learn about the ways that we extend ourselves into these machines, while at the same time being able to express our humanity and our personality. We also build empathy for others in terms of being shorter, taller, faster, slower, and maybe even armless, which is kind of neat.

11:38
We also build empathy for the robots themselves. This is one of my favorite robots. It's called the Tweenbot. And this guy has a little flag that says, "I'm trying to get to this intersection in Manhattan," and it's cute and rolls forward, that's it. It doesn't know how to build a map, it doesn't know how to see the world, it just asks for help. The nice thing about people is that it can actually depend upon the kindness of strangers. It did make it across the park to the other side of Manhattan -- which is pretty great -- just because people would pick it up and point it in the right direction.

12:09
(Laughter)

12:10
And that's great, right? We're trying to build this human-robot world in which we can coexist and collaborate with one another, and we don't need to be fully autonomous and just do things on our own. We actually do things together. And to make that happen, we actually need help from people like the artists and the designers, the policy makers, the legal scholars, psychologists, sociologists, anthropologists -- we need more perspectives in the room if we're going to do the thing that Stu Card says we should do, which is invent the future that we actually want to live in. And I think we can continue to experiment with these different robotic futures together, and in doing so, we will end up learning a lot more about ourselves.

12:50
Thank you.

12:51
(Applause)

