ABOUT THE SPEAKER
Leila Takayama - Social scientist
Leila Takayama conducts research on human-robot interaction.

Why you should listen

Leila Takayama is an acting associate professor of Psychology at the University of California, Santa Cruz, where she founded and leads the Re-Embodied Cognition Lab. Her lab examines how people make sense of, interact with, and relate to new technologies. Prior to academia, she was a researcher at GoogleX and Willow Garage, where she developed a taste for working alongside engineers, designers, animators, and more. Her interdisciplinary research continues in her current work on what happens when people interact with robots and through robots.

Takayama is a World Economic Forum Global Futures Council Member and Young Global Leader. In 2015, she was presented with the IEEE Robotics & Automation Society Early Career Award. In 2012, she was named a TR35 winner and one of the 100 most creative people in business by Fast Company. She completed her PhD in Communication at Stanford University in 2008, advised by Professor Clifford Nass. She also holds a PhD minor in Psychology from Stanford, a master's degree in Communication from Stanford, and bachelor of arts degrees in Psychology and Cognitive Science from UC Berkeley (2003). During her graduate studies, she was a research assistant in the User Interface Research (UIR) group at Palo Alto Research Center (PARC).

Photo: Melissa DeWitt

TEDxPaloAlto

Leila Takayama: What's it like to be a robot?

1,183,118 views

We already live among robots: tools and appliances like dishwashers and thermostats that are so integrated into our lives that we'd never think to call them robots. What would a future with even more robots look like? Social scientist Leila Takayama shares some of the unique challenges of designing for human-robot interaction, and how experimenting with robotic futures actually leads us to a better understanding of ourselves.


00:12 You only get one chance to make a first impression,
00:15 and that's true if you're a robot as well as if you're a person.
00:18 The first time that I met one of these robots
00:21 was at a place called Willow Garage in 2008.
00:24 When I went to visit there, my host walked me into the building
00:27 and we met this little guy.
00:29 He was rolling into the hallway,
00:30 came up to me, sat there,
00:32 stared blankly past me,
00:35 did nothing for a while,
00:36 rapidly spun his head around 180 degrees
00:38 and then ran away.
00:40 And that was not a great first impression.
00:42 The thing that I learned about robots that day
00:44 is that they kind of do their own thing,
00:46 and they're not totally aware of us.
00:49 And I think as we're experimenting with these possible robot futures,
00:52 we actually end up learning a lot more about ourselves
00:54 as opposed to just these machines.
00:56 And what I learned that day
00:58 was that I had pretty high expectations for this little dude.
01:01 He was not only supposed to be able to navigate the physical world,
01:04 but also be able to navigate my social world --
01:07 he's in my space; it's a personal robot.
01:09 Why didn't it understand me?
01:11 My host explained to me,
01:12 "Well, the robot is trying to get from point A to point B,
01:16 and you were an obstacle in his way,
01:17 so he had to replan his path,
01:19 figure out where to go,
01:21 and then get there some other way,"
01:22 which was actually not a very efficient thing to do.
01:25 If that robot had figured out that I was a person, not a chair,
01:28 and that I was willing to get out of its way
01:30 if it was trying to get somewhere,
01:32 then it actually would have been more efficient
01:34 at getting its job done
01:35 if it had bothered to notice that I was a human
01:38 and that I have different affordances than things like chairs and walls do.
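That "replan the path" behavior is easy to picture in miniature. Below is a minimal sketch, not Willow Garage's actual planner: a toy grid where the robot plans a route with A* search, a cell becomes blocked (a chair, or in this story, a person), and the robot simply searches again. Every name and value in it is illustrative.

```python
# Illustrative only: a toy planner that treats a person exactly like a chair.
from heapq import heappush, heappop

def astar(grid, start, goal):
    """A* on a 4-connected grid; grid[r][c] == 1 means blocked."""
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start, [start])]
    seen = set()
    while frontier:
        _, node, path = heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                # cost = steps taken so far + Manhattan-distance heuristic
                cost = len(path) + abs(nr - goal[0]) + abs(nc - goal[1])
                heappush(frontier, (cost, (nr, nc), path + [(nr, nc)]))
    return None  # boxed in: no route exists

grid = [[0] * 5 for _ in range(5)]   # open 5x5 room
print(astar(grid, (0, 0), (4, 4)))   # original plan: straight to the goal
grid[2][2] = 1                       # an obstacle appears on the route
print(astar(grid, (0, 0), (4, 4)))   # replan: detour around cell (2, 2)
```

Nothing in that search distinguishes a person from furniture, which is exactly the inefficiency the talk is pointing at.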
01:41 You know, we tend to think of these robots as being from outer space
01:45 and from the future and from science fiction,
01:47 and while that could be true,
01:48 I'd actually like to argue that robots are here today,
01:51 and they live and work amongst us right now.
01:54 These are two robots that live in my home.
01:57 They vacuum the floors and they cut the grass
01:59 every single day,
02:00 which is more than I would do if I actually had time to do these tasks,
02:04 and they probably do it better than I would, too.
02:06 This one actually takes care of my kitty.
02:09 Every single time he uses the box, it cleans it,
02:11 which is not something I'm willing to do,
02:13 and it actually makes his life better as well as mine.
02:16 And while we call these robot products --
02:18 it's a "robot vacuum cleaner, it's a robot lawnmower,
02:21 it's a robot litter box,"
02:22 I think there's actually a bunch of other robots hiding in plain sight
02:27 that have just become so darn useful
02:28 and so darn mundane
02:30 that we call them things like, "dishwasher," right?
02:32 They get new names.
02:34 They don't get called robot anymore
02:35 because they actually serve a purpose in our lives.
02:38 Similarly, a thermostat, right?
02:39 I know my robotics friends out there
02:41 are probably cringing at me calling this a robot,
02:44 but it has a goal.
02:45 Its goal is to make my house 66 degrees Fahrenheit,
02:48 and it senses the world.
02:49 It knows it's a little bit cold,
02:51 it makes a plan and then it acts on the physical world.
02:53 It's robotics.
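Goal, sensing, planning, acting: that loop is small enough to sketch in a few lines. The version below is purely illustrative, with a made-up read_temperature() sensor and heater switch standing in for real hardware; only the 66-degree setpoint comes from the talk.

```python
# A minimal sense-plan-act loop; all hardware calls are hypothetical stand-ins.
import random
import time

SETPOINT_F = 66.0          # the goal: keep the house at 66 degrees Fahrenheit

def read_temperature():
    """Stand-in for a real temperature sensor."""
    return SETPOINT_F + random.uniform(-3.0, 3.0)

def set_heater(on):
    """Stand-in for a real actuator switching the furnace."""
    print("heater", "ON" if on else "OFF")

for _ in range(10):               # run a few cycles of the loop
    temp = read_temperature()     # sense: observe the world
    too_cold = temp < SETPOINT_F  # plan: decide whether to heat
    set_heater(too_cold)          # act: change the physical world
    time.sleep(0.1)
```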
02:55 Even if it might not look like Rosie the Robot,
02:57 it's doing something that's really useful in my life
03:00 so I don't have to take care
03:02 of turning the temperature up and down myself.
03:04 And I think these systems live and work amongst us now,
03:08 and not only are these systems living amongst us
03:10 but you are probably a robot operator, too.
03:13 When you drive your car,
03:14 it feels like you are operating machinery.
03:17 You are also going from point A to point B,
03:19 but your car probably has power steering,
03:22 it probably has automatic braking systems,
03:24 it might have an automatic transmission and maybe even adaptive cruise control.
03:28 And while it might not be a fully autonomous car,
03:31 it has bits of autonomy,
03:32 and they're so useful
03:34 and they make us drive safer,
03:36 and we just sort of feel like they're invisible-in-use, right?
03:39 So when you're driving your car,
03:41 you should just feel like you're going from one place to another.
03:44 It doesn't feel like it's this big thing that you have to deal with and operate
03:48 and use these controls
03:49 because we spent so long learning how to drive
03:51 that they've become extensions of ourselves.
03:54 When you park that car in that tight little garage space,
03:57 you know where your corners are.
03:58 And when you drive a rental car that maybe you haven't driven before,
04:02 it takes some time to get used to your new robot body.
04:05 And this is also true for people who operate other types of robots,
04:09 so I'd like to share with you a few stories about that.
04:12 Dealing with the problem of remote collaboration.
04:14 So, at Willow Garage I had a coworker named Dallas,
04:17 and Dallas looked like this.
04:18 He worked from his home in Indiana in our company in California.
04:22 He was a voice in a box on the table in most of our meetings,
04:25 which was kind of OK except that, you know,
04:28 if we had a really heated debate and we didn't like what he was saying,
04:31 we might just hang up on him.
04:32 (Laughter)
04:33 Then we might have a meeting after that meeting
04:36 and actually make the decisions in the hallway afterwards
04:38 when he wasn't there anymore.
04:40 So that wasn't so great for him.
04:41 And as a robotics company at Willow,
04:43 we had some extra robot body parts laying around,
04:46 so Dallas and his buddy Curt put together this thing,
04:48 which looks kind of like Skype on a stick on wheels,
04:51 which seems like a techy, silly toy,
04:53 but really it's probably one of the most powerful tools
04:56 that I've ever seen made for remote collaboration.
04:59 So now, if I didn't answer Dallas' email question,
05:02 he could literally roll into my office,
05:04 block my doorway and ask me the question again --
05:07 (Laughter)
05:08 until I answered it.
05:09 And I'm not going to turn him off, right? That's kind of rude.
05:12 Not only was it good for these one-on-one communications,
05:15 but also for just showing up at the company all-hands meeting.
05:18 Getting your butt in that chair
05:20 and showing people that you're present and committed to your project
05:23 is a big deal
05:24 and can help remote collaboration a ton.
05:26 We saw this over the period of months and then years,
05:29 not only at our company but at others, too.
05:32 The best thing that can happen with these systems
05:35 is that it starts to feel like you're just there.
05:37 It's just you, it's just your body,
05:39 and so people actually start to give these things personal space.
05:42 So when you're having a stand-up meeting,
05:44 people will stand around the space
05:45 just as they would if you were there in person.
05:48 That's great until there's breakdowns and it's not.
05:50 People, when they first see these robots,
05:52 are like, "Wow, where's the components? There must be a camera over there,"
05:56 and they start poking your face.
05:58 "You're talking too softly, I'm going to turn up your volume,"
06:00 which is like having a coworker walk up to you and say,
06:03 "You're speaking too softly, I'm going to turn up your face."
06:06 That's awkward and not OK,
06:07 and so we end up having to build these new social norms
06:10 around using these systems.
06:12 Similarly, as you start feeling like it's your body,
06:16 you start noticing things like, "Oh, my robot is kind of short."
06:19 Dallas would say things to me -- he was six-foot tall --
06:22 and we would take him via robot to cocktail parties and things like that,
06:26 as you do,
06:27 and the robot was about five-foot tall, which is close to my height.
06:30 And he would tell me,
06:31 "You know, people are not really looking at me.
06:34 I feel like I'm just looking at this sea of shoulders,
06:37 and it's just -- we need a taller robot."
06:39 And I told him,
06:40 "Um, no.
06:41 You get to walk in my shoes for today.
06:43 You get to see what it's like to be on the shorter end of the spectrum."
06:47 And he actually ended up building a lot of empathy for that experience,
06:50 which was kind of great.
06:51 So when he'd come visit in person,
06:53 he no longer stood over me as he was talking to me,
06:56 he would sit down and talk to me eye to eye,
06:58 which was kind of a beautiful thing.
06:59 So we actually decided to look at this in the laboratory
07:02 and see what other kinds of differences things like robot height would make.
07:06 And so half of the people in our study used a shorter robot,
07:09 half of the people in our study used a taller robot
07:11 and we actually found that the exact same person
07:13 who has the exact same body and says the exact same things as someone,
07:17 is more persuasive and perceived as being more credible
07:19 if they're in a taller robot form.
07:21 It makes no rational sense,
07:23 but that's why we study psychology.
07:25 And really, you know, the way that Cliff Nass would put this
07:28 is that we're having to deal with these new technologies
07:31 despite the fact that we have very old brains.
07:33 Human psychology is not changing at the same speed that tech is
07:36 and so we're always playing catch-up,
07:38 trying to make sense of this world
07:40 where these autonomous things are running around.
07:42 Usually, things that talk are people, not machines, right?
07:45 And so we breathe a lot of meaning into things like just height of a machine,
07:50 not a person,
07:51 and attribute that to the person using the system.
07:55 You know, this, I think, is really important
07:57 when you're thinking about robotics.
07:59 It's not so much about reinventing humans,
08:01 it's more about figuring out how we extend ourselves, right?
08:04 And we end up using things in ways that are sort of surprising.
08:07 So these guys can't play pool because the robots don't have arms,
08:11 but they can heckle the guys who are playing pool
08:14 and that can be an important thing for team bonding,
08:17 which is kind of neat.
08:18 People who get really good at operating these systems
08:21 will even do things like make up new games,
08:23 like robot soccer in the middle of the night,
08:25 pushing the trash cans around.
08:26 But not everyone's good.
08:28 A lot of people have trouble operating these systems.
08:30 This is actually a guy who logged into the robot
08:33 and his eyeball was turned 90 degrees to the left.
08:35 He didn't know that,
08:36 so he ended up just bashing around the office,
08:39 running into people's desks, getting super embarrassed,
08:41 laughing about it -- his volume was way too high.
08:44 And this guy here in the image is telling me,
08:46 "We need a robot mute button."
08:48 And by that what he really meant was we don't want it to be so disruptive.
08:51 So as a robotics company,
08:53 we added some obstacle avoidance to the system.
08:56 It got a little laser range finder that could see the obstacles,
08:59 and if I as a robot operator try to, say, run into a chair,
09:02 it wouldn't let me, it would just plan a path around,
09:04 which seems like a good idea.
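One plausible shape for that safety layer is a veto loop sitting between the operator's command and the motors. The sketch below is a guess at that shape, not the actual Willow Garage code: scan_ranges() and drive() are hypothetical stand-ins for the laser range finder and motor interface, and the half-meter margin is invented.

```python
# Hypothetical veto layer: check the operator's command against the laser scan.
STOP_DISTANCE_M = 0.5   # assumed safety margin, not the real robot's value

def scan_ranges():
    """Stand-in for the laser range finder: clearance (m) per direction."""
    return {"left": 2.0, "ahead": 0.3, "right": 1.5}

def drive(direction):
    """Stand-in for the motor interface."""
    print("driving", direction)

def safe_drive(requested):
    """Veto an operator command that would hit something; detour instead."""
    ranges = scan_ranges()
    if ranges[requested] > STOP_DISTANCE_M:
        drive(requested)                      # command is safe: pass it through
    else:
        detour = max(ranges, key=ranges.get)  # plan around: most open direction
        drive(detour)

safe_drive("ahead")   # blocked at 0.3 m, so the robot detours left instead
```

Notice that the veto is unconditional: the operator's intent never overrides the sensor, which is precisely what some people in the study pushed back against.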
09:06 People did hit fewer obstacles using that system, obviously,
09:09 but actually, for some of the people,
09:11 it took them a lot longer to get through our obstacle course,
09:14 and we wanted to know why.
09:17 It turns out that there's this important human dimension --
09:20 a personality dimension called locus of control,
09:22 and people who have a strong internal locus of control,
09:25 they need to be the masters of their own destiny --
09:28 really don't like giving up control to an autonomous system --
09:31 so much so that they will fight the autonomy;
09:34 "If I want to hit that chair, I'm going to hit that chair."
09:37 And so they would actually suffer from having that autonomous assistance,
09:40 which is an important thing for us to know
09:43 as we're building increasingly autonomous, say, cars, right?
09:46 How are different people going to grapple with that loss of control?
09:50 It's going to be different depending on human dimensions.
09:53 We can't treat humans as if we're just one monolithic thing.
09:57 We vary by personality, by culture,
09:59 we even vary by emotional state moment to moment,
10:02 and being able to design these systems,
10:04 these human-robot interaction systems,
10:06 we need to take into account the human dimensions,
10:09 not just the technological ones.
10:11 Along with a sense of control also comes a sense of responsibility.
10:15 And if you were a robot operator using one of these systems,
10:18 this is what the interface would look like.
10:20 It looks a little bit like a video game,
10:22 which can be good because that's very familiar to people,
10:25 but it can also be bad
10:27 because it makes people feel like it's a video game.
10:29 We had a bunch of kids over at Stanford play with the system
10:32 and drive the robot around our office in Menlo Park,
10:34 and the kids started saying things like,
10:36 "10 points if you hit that guy over there. 20 points for that one."
10:40 And they would chase them down the hallway.
10:42 (Laughter)
10:43 I told them, "Um, those are real people.
10:45 They're actually going to bleed and feel pain if you hit them."
10:48 And they'd be like, "OK, got it."
10:50 But five minutes later, they would be like,
10:52 "20 points for that guy over there, he just looks like he needs to get hit."
10:55 It's a little bit like "Ender's Game," right?
10:58 There is a real world on that other side
10:59 and I think it's our responsibility as people designing these interfaces
11:03 to help people remember
11:04 that there's real consequences to their actions
11:06 and to feel a sense of responsibility
11:09 when they're operating these increasingly autonomous things.
11:13 These are kind of a great example
11:16 of experimenting with one possible robotic future,
11:19 and I think it's pretty cool that we can extend ourselves
11:23 and learn about the ways that we extend ourselves
11:25 into these machines
11:26 while at the same time being able to express our humanity
11:29 and our personality.
11:30 We also build empathy for others
11:32 in terms of being shorter, taller, faster, slower,
11:35 and maybe even armless,
11:37 which is kind of neat.
11:38 We also build empathy for the robots themselves.
11:41 This is one of my favorite robots.
11:42 It's called the Tweenbot.
11:44 And this guy has a little flag that says,
11:46 "I'm trying to get to this intersection in Manhattan,"
11:48 and it's cute and rolls forward, that's it.
11:51 It doesn't know how to build a map, it doesn't know how to see the world,
11:55 it just asks for help.
11:56 The nice thing about people
11:57 is that it can actually depend upon the kindness of strangers.
12:00 It did make it across the park to the other side of Manhattan --
12:04 which is pretty great --
12:06 just because people would pick it up and point it in the right direction.
12:09 (Laughter)
12:10 And that's great, right?
12:11 We're trying to build this human-robot world
12:14 in which we can coexist and collaborate with one another,
12:17 and we don't need to be fully autonomous and just do things on our own.
12:21 We actually do things together.
12:22 And to make that happen,
12:24 we actually need help from people like the artists and the designers,
12:27 the policy makers, the legal scholars,
12:29 psychologists, sociologists, anthropologists --
12:31 we need more perspectives in the room
12:33 if we're going to do the thing that Stu Card says we should do,
12:36 which is invent the future that we actually want to live in.
12:40 And I think we can continue to experiment
12:43 with these different robotic futures together,
12:45 and in doing so, we will end up learning a lot more about ourselves.
12:50 Thank you.
12:51 (Applause)
