ABOUT THE SPEAKER
Rebecca Saxe - Cognitive neuroscientist
Rebecca Saxe studies how we think about other people's thoughts. At the Saxelab at MIT, she uses fMRI to identify what happens in our brains when we consider the motives, passions and beliefs of others.

Why you should listen

While still a graduate student, Rebecca Saxe made a breakthrough discovery: There's a specific region in our brain that becomes active when we contemplate the workings of other minds. Now, at MIT's Saxelab, she and her team have been further exploring her grad-school finding, examining how it may help us understand conditions such as autism.

As Saxe delves into the complexities of social cognition, this young scientist is working toward revealing the enigma of human minds interacting.

TEDGlobal 2009

Rebecca Saxe: How we read each other's minds

3,311,612 views

Understanding the motives, beliefs and feelings of loved ones and strangers is a natural talent for humans. But how do we do it? Here, Rebecca Saxe shares lab work that reveals how the brain thinks about other people's thoughts -- and judges their actions.

00:12
Today I'm going to talk to you about the problem of other minds. And the problem I'm going to talk about is not the familiar one from philosophy, which is, "How can we know whether other people have minds?" That is, maybe you have a mind, and everyone else is just a really convincing robot. So that's a problem in philosophy, but for today's purposes I'm going to assume that many people in this audience have a mind, and that I don't have to worry about this.

00:37
There is a second problem that is maybe even more familiar to us as parents and teachers and spouses and novelists, which is, "Why is it so hard to know what somebody else wants or believes?" Or perhaps, more relevantly, "Why is it so hard to change what somebody else wants or believes?" I think novelists put this best. Like Philip Roth, who said, "And yet, what are we to do about this terribly significant business of other people? So ill equipped are we all, to envision one another's interior workings and invisible aims."

01:09
So as a teacher and as a spouse, this is, of course, a problem I confront every day. But as a scientist, I'm interested in a different problem of other minds, and that is the one I'm going to introduce to you today. And that problem is, "How is it so easy to know other minds?" So to start with an illustration, you need almost no information, one snapshot of a stranger, to guess what this woman is thinking, or what this man is.

01:35
And put another way, the crux of the problem is the machine that we use for thinking about other minds, our brain, is made up of pieces, brain cells, that we share with all other animals, with monkeys and mice and even sea slugs. And yet, you put them together in a particular network, and what you get is the capacity to write Romeo and Juliet. Or to say, as Alan Greenspan did, "I know you think you understand what you thought I said, but I'm not sure you realize that what you heard is not what I meant."
(Laughter)

02:06
So, the job of my field of cognitive neuroscience is to stand with these ideas, one in each hand. And to try to understand how you can put together simple units, simple messages over space and time, in a network, and get this amazing human capacity to think about minds. So I'm going to tell you three things about this today. Obviously the whole project here is huge. And I'm going to tell you just our first few steps about the discovery of a special brain region for thinking about other people's thoughts. Some observations on the slow development of this system as we learn how to do this difficult job. And then finally, to show that some of the differences between people, in how we judge others, can be explained by differences in this brain system.

02:51
So first, the first thing I want to tell you is that there is a brain region in the human brain, in your brains, whose job it is to think about other people's thoughts. This is a picture of it. It's called the Right Temporo-Parietal Junction. It's above and behind your right ear. And this is the brain region you used when you saw the pictures I showed you, or when you read Romeo and Juliet or when you tried to understand Alan Greenspan. And you don't use it for solving any other kinds of logical problems. So this brain region is called the Right TPJ. And this picture shows the average activation in a group of what we call typical human adults. They're MIT undergraduates.
(Laughter)

03:29
The second thing I want to say about this brain system is that although we human adults are really good at understanding other minds, we weren't always that way. It takes children a long time to break into the system. I'm going to show you a little bit of that long, extended process. The first thing I'm going to show you is a change between age three and five, as kids learn to understand that somebody else can have beliefs that are different from their own. So I'm going to show you a five-year-old who is getting a standard kind of puzzle that we call the false belief task.

03:59
Rebecca Saxe (Video): This is the first pirate. His name is Ivan. And you know what pirates really like?
Child: What? RS: Pirates really like cheese sandwiches.
Child: Cheese? I love cheese!
RS: Yeah. So Ivan has this cheese sandwich, and he says, "Yum yum yum yum yum! I really love cheese sandwiches." And Ivan puts his sandwich over here, on top of the pirate chest. And Ivan says, "You know what? I need a drink with my lunch." And so Ivan goes to get a drink. And while Ivan is away the wind comes, and it blows the sandwich down onto the grass. And now, here comes the other pirate. This pirate is called Joshua. And Joshua also really loves cheese sandwiches. So Joshua has a cheese sandwich and he says, "Yum yum yum yum yum! I love cheese sandwiches." And he puts his cheese sandwich over here on top of the pirate chest.
Child: So, that one is his.
RS: That one is Joshua's. That's right.
Child: And then his went on the ground.
RS: That's exactly right.
Child: So he won't know which one is his.
RS: Oh. So now Joshua goes off to get a drink. Ivan comes back and he says, "I want my cheese sandwich." So which one do you think Ivan is going to take?
Child: I think he is going to take that one.
RS: Yeah, you think he's going to take that one? All right. Let's see. Oh yeah, you were right. He took that one.

05:19
So that's a five-year-old who clearly understands that other people can have false beliefs and what the consequences are for their actions. Now I'm going to show you a three-year-old who got the same puzzle.
RS: And Ivan says, "I want my cheese sandwich." Which sandwich is he going to take? Do you think he's going to take that one? Let's see what happens. Let's see what he does. Here comes Ivan. And he says, "I want my cheese sandwich." And he takes this one. Uh-oh. Why did he take that one?
Child: His was on the grass.

05:51
So the three-year-old does two things differently. First, he predicts Ivan will take the sandwich that's really his. And second, when he sees Ivan taking the sandwich where he left his, where we would say he's taking that one because he thinks it's his, the three-year-old comes up with another explanation: He's not taking his own sandwich because he doesn't want it, because now it's dirty, on the ground. So that's why he's taking the other sandwich. Now of course, development doesn't end at five. And we can see the continuation of this process of learning to think about other people's thoughts by upping the ante and asking children now, not for an action prediction, but for a moral judgment.

06:30
So first I'm going to show you the three-year-old again.
RS (Video): So is Ivan being mean and naughty for taking Joshua's sandwich?
Child: Yeah.
RS: Should Ivan get in trouble for taking Joshua's sandwich?
Child: Yeah.
So it's maybe not surprising he thinks it was mean of Ivan to take Joshua's sandwich, since he thinks Ivan only took Joshua's sandwich to avoid having to eat his own dirty sandwich. But now I'm going to show you the five-year-old. Remember the five-year-old completely understood why Ivan took Joshua's sandwich.
RS (Video): Was Ivan being mean and naughty for taking Joshua's sandwich?
Child: Um, yeah.
And so, it is not until age seven that we get what looks more like an adult response.
RS (Video): Should Ivan get in trouble for taking Joshua's sandwich?
Child: No, because the wind should get in trouble.
He says the wind should get in trouble for switching the sandwiches.
(Laughter)

07:19
And now what we've started to do in my lab is to put children into the brain scanner and ask what's going on in their brain as they develop this ability to think about other people's thoughts. So the first thing is that in children we see this same brain region, the Right TPJ, being used while children are thinking about other people. But it's not quite like the adult brain. So whereas in the adults, as I told you, this brain region is almost completely specialized -- it does almost nothing else except for thinking about other people's thoughts -- in children it's much less so, when they are age five to eight, the age range of the children I just showed you. And actually if we even look at eight to 11-year-olds, getting into early adolescence, they still don't have quite an adult-like brain region.

08:00
And so, what we can see is that over the course of childhood and even into adolescence, both the cognitive system, our mind's ability to think about other minds, and the brain system that supports it are continuing, slowly, to develop. But of course, as you're probably aware, even in adulthood, people differ from one another in how good they are at thinking of other minds, how often they do it and how accurately. And so what we wanted to know was, could differences among adults in how they think about other people's thoughts be explained in terms of differences in this brain region?

08:32
So, the first thing that we did is we gave adults a version of the pirate problem that we gave to the kids. And I'm going to give that to you now. So Grace and her friend are on a tour of a chemical factory, and they take a break for coffee. And Grace's friend asks for some sugar in her coffee. Grace goes to make the coffee and finds by the coffee a pot containing a white powder, which is sugar. But the powder is labeled "Deadly Poison," so Grace thinks that the powder is a deadly poison. And she puts it in her friend's coffee. And her friend drinks the coffee, and is fine. How many people think it was morally permissible for Grace to put the powder in the coffee? Okay. Good.
(Laughter)

09:15
So we ask people, how much should Grace be blamed in this case, which we call a failed attempt to harm? And we can compare that to another case, where everything in the real world is the same. The powder is still sugar, but what's different is what Grace thinks. Now she thinks the powder is sugar. And perhaps unsurprisingly, if Grace thinks the powder is sugar and puts it in her friend's coffee, people say she deserves no blame at all. Whereas if she thinks the powder was poison, even though it's really sugar, now people say she deserves a lot of blame, even though what happened in the real world was exactly the same. And in fact, they say she deserves more blame in this case, the failed attempt to harm, than in another case, which we call an accident.

09:55
Where Grace thought the powder was sugar, because it was labeled "sugar" and by the coffee machine, but actually the powder was poison. So even though when the powder was poison, the friend drank the coffee and died, people say Grace deserves less blame in that case, when she innocently thought it was sugar, than in the other case, where she thought it was poison and no harm occurred. People, though, disagree a little bit about exactly how much blame Grace should get in the accident case. Some people think she should deserve more blame, and other people less.

10:27
And what I'm going to show you is what happened when we look inside the brains of people while they're making that judgment. So what I'm showing you, from left to right, is how much activity there was in this brain region, and from top to bottom, how much blame people said that Grace deserved. And what you can see is, on the left when there was very little activity in this brain region, people paid little attention to her innocent belief and said she deserved a lot of blame for the accident. Whereas on the right, where there was a lot of activity, people paid a lot more attention to her innocent belief, and said she deserved a lot less blame for causing the accident.

10:59
So that's good, but of course what we'd rather is have a way to interfere with function in this brain region, and see if we could change people's moral judgment. And we do have such a tool. It's called Trans-Cranial Magnetic Stimulation, or TMS. This is a tool that lets us pass a magnetic pulse through somebody's skull, into a small region of their brain, and temporarily disorganize the function of the neurons in that region. So I'm going to show you a demo of this. First, I'm going to show you that this is a magnetic pulse. I'm going to show you what happens when you put a quarter on the machine. When you hear clicks, we're turning the machine on. So now I'm going to apply that same pulse to my brain, to the part of my brain that controls my hand. So there is no physical force, just a magnetic pulse.

11:54
Woman (Video): Ready, Rebecca? RS: Yes.
Okay, so it causes a small involuntary contraction in my hand by putting a magnetic pulse in my brain. And we can use that same pulse, now applied to the RTPJ, to ask if we can change people's moral judgments. So these are the judgments I showed you before, people's normal moral judgments. And then we can apply TMS to the RTPJ and ask how people's judgments change. And the first thing is, people can still do this task overall. So their judgments of the case when everything was fine remain the same. They say she deserves no blame. But in the case of a failed attempt to harm, where Grace thought that it was poison, although it was really sugar, people now say it was more okay, she deserves less blame for putting the powder in the coffee. And in the case of the accident, where she thought that it was sugar, but it was really poison and so she caused a death, people say that it was less okay, she deserves more blame.

12:50
So what I've told you today is that people come, actually, especially well equipped to think about other people's thoughts. We have a special brain system that lets us think about what other people are thinking. This system takes a long time to develop, slowly throughout the course of childhood and into early adolescence. And even in adulthood, differences in this brain region can explain differences among adults in how we think about and judge other people. But I want to give the last word back to the novelists, and to Philip Roth, who ended by saying, "The fact remains that getting people right is not what living is all about anyway. It's getting them wrong that is living. Getting them wrong and wrong and wrong, and then on careful reconsideration, getting them wrong again." Thank you.
(Applause)

13:47
Chris Anderson: So, I have a question. When you start talking about using magnetic pulses to change people's moral judgments, that sounds alarming. (Laughter) Please tell me that you're not taking phone calls from the Pentagon, say.
RS: I'm not. I mean, they're calling, but I'm not taking the call.
(Laughter)

14:06
CA: They really are calling? So then seriously, you must lie awake at night sometimes wondering where this work leads. I mean, you're clearly an incredible human being, but someone could take this knowledge and in some future not-torture chamber, do acts that people here might be worried about.

14:28
RS: Yeah, we worry about this. So, there's a couple of things to say about TMS. One is that you can't be TMSed without knowing it. So it's not a surreptitious technology. It's quite hard, actually, to get those very small changes. The changes I showed you are impressive to me because of what they tell us about the function of the brain, but they're small on the scale of the moral judgments that we actually make. And what we changed was not people's moral judgments when they're deciding what to do, when they're making action choices. We changed their ability to judge other people's actions. And so, I think of what I'm doing not so much as studying the defendant in a criminal trial, but studying the jury.

15:06
CA: Is your work going to lead to any recommendations in education, to perhaps bring up a generation of kids able to make fairer moral judgments?

15:17
RS: That's one of the idealistic hopes. The whole research program here of studying the distinctive parts of the human brain is brand new. Until recently, what we knew about the brain were the things that any other animal's brain could do too, so we could study it in animal models. We knew how brains see, and how they control the body and how they hear and sense. And the whole project of understanding how brains do the uniquely human things -- learn language and abstract concepts, and thinking about other people's thoughts -- that's brand new. And we don't know yet what the implications will be of understanding it.

15:53
CA: So I've got one last question. There is this thing called the hard problem of consciousness, that puzzles a lot of people. The notion that you can understand why a brain works, perhaps. But why does anyone have to feel anything? Why does it seem to require these beings who sense things for us to operate? You're a brilliant young neuroscientist. I mean, what chances do you think there are that at some time in your career, someone, you or someone else, is going to come up with some paradigm shift in understanding what seems an impossible problem?

16:27
RS: I hope they do. And I think they probably won't.
CA: Why?
RS: It's not called the hard problem of consciousness for nothing.
(Laughter)
CA: That's a great answer. Rebecca Saxe, thank you very much. That was fantastic.
(Applause)
