ABOUT THE SPEAKER
Rebecca Saxe - Cognitive neuroscientist
Rebecca Saxe studies how we think about other people's thoughts. At the Saxelab at MIT, she uses fMRI to identify what happens in our brains when we consider the motives, passions and beliefs of others.

Why you should listen

While still a graduate student, Rebecca Saxe made a breakthrough discovery: there's a specific region in our brain that becomes active when we contemplate the workings of other minds. Now, at MIT's Saxelab, she and her team are building on that grad-school finding, exploring how it may help us understand conditions such as autism.

As Saxe delves into the complexities of social cognition, she is working to unravel the enigma of how human minds interact.

TEDGlobal 2009

Rebecca Saxe: How we read each other's minds

3,311,612 views

Sensing the motives and feelings of others is a natural talent for humans. But how do we do it? Here, Rebecca Saxe shares fascinating lab work that uncovers how the brain thinks about other people's thoughts -- and judges their actions.


00:12
Today I'm going to talk to you about the problem of other minds. And the problem I'm going to talk about is not the familiar one from philosophy, which is, "How can we know whether other people have minds?" That is, maybe you have a mind, and everyone else is just a really convincing robot. So that's a problem in philosophy, but for today's purposes I'm going to assume that many people in this audience have a mind, and that I don't have to worry about this.

00:37
There is a second problem that is maybe even more familiar to us as parents and teachers and spouses and novelists, which is, "Why is it so hard to know what somebody else wants or believes?" Or perhaps, more relevantly, "Why is it so hard to change what somebody else wants or believes?"

00:54
I think novelists put this best. Like Philip Roth, who said, "And yet, what are we to do about this terribly significant business of other people? So ill equipped are we all, to envision one another's interior workings and invisible aims."

01:09
So as a teacher and as a spouse, this is, of course, a problem I confront every day. But as a scientist, I'm interested in a different problem of other minds, and that is the one I'm going to introduce to you today. And that problem is, "How is it so easy to know other minds?" So to start with an illustration, you need almost no information, one snapshot of a stranger, to guess what this woman is thinking, or what this man is.

01:35
And put another way, the crux of the problem is that the machine we use for thinking about other minds, our brain, is made up of pieces, brain cells, that we share with all other animals, with monkeys and mice and even sea slugs. And yet, you put them together in a particular network, and what you get is the capacity to write Romeo and Juliet. Or to say, as Alan Greenspan did, "I know you think you understand what you thought I said, but I'm not sure you realize that what you heard is not what I meant." (Laughter)

02:06
So, the job of my field of cognitive neuroscience is to stand with these ideas, one in each hand. And to try to understand how you can put together simple units, simple messages over space and time, in a network, and get this amazing human capacity to think about minds.

02:23
So I'm going to tell you three things about this today. Obviously the whole project here is huge. And I'm going to tell you just our first few steps about the discovery of a special brain region for thinking about other people's thoughts. Some observations on the slow development of this system as we learn how to do this difficult job. And then finally, to show that some of the differences between people, in how we judge others, can be explained by differences in this brain system.

02:51
So first, the first thing I want to tell you is that there is a brain region in the human brain, in your brains, whose job it is to think about other people's thoughts. This is a picture of it. It's called the Right Temporo-Parietal Junction. It's above and behind your right ear. And this is the brain region you used when you saw the pictures I showed you, or when you read Romeo and Juliet, or when you tried to understand Alan Greenspan. And you don't use it for solving any other kinds of logical problems. So this brain region is called the Right TPJ. And this picture shows the average activation in a group of what we call typical human adults. They're MIT undergraduates. (Laughter)

03:29
The second thing I want to say about this brain system is that although we human adults are really good at understanding other minds, we weren't always that way. It takes children a long time to break into the system. I'm going to show you a little bit of that long, extended process. The first thing I'm going to show you is a change between age three and five, as kids learn to understand that somebody else can have beliefs that are different from their own. So I'm going to show you a five-year-old who is getting a standard kind of puzzle that we call the false belief task.

03:59
Rebecca Saxe (Video): This is the first pirate. His name is Ivan. And you know what pirates really like?
Child: What?
RS: Pirates really like cheese sandwiches.
Child: Cheese? I love cheese!
RS: Yeah. So Ivan has this cheese sandwich, and he says, "Yum yum yum yum yum! I really love cheese sandwiches." And Ivan puts his sandwich over here, on top of the pirate chest. And Ivan says, "You know what? I need a drink with my lunch." And so Ivan goes to get a drink. And while Ivan is away the wind comes, and it blows the sandwich down onto the grass. And now, here comes the other pirate. This pirate is called Joshua. And Joshua also really loves cheese sandwiches. So Joshua has a cheese sandwich and he says, "Yum yum yum yum yum! I love cheese sandwiches." And he puts his cheese sandwich over here on top of the pirate chest.
Child: So, that one is his.
RS: That one is Joshua's. That's right.
Child: And then his went on the ground.
RS: That's exactly right.
Child: So he won't know which one is his.
RS: Oh. So now Joshua goes off to get a drink. Ivan comes back and he says, "I want my cheese sandwich." So which one do you think Ivan is going to take?
Child: I think he is going to take that one.
RS: Yeah, you think he's going to take that one? All right. Let's see. Oh yeah, you were right. He took that one.

05:19
So that's a five-year-old who clearly understands that other people can have false beliefs and what the consequences are for their actions. Now I'm going to show you a three-year-old who got the same puzzle.

05:30
RS: And Ivan says, "I want my cheese sandwich." Which sandwich is he going to take? Do you think he's going to take that one? Let's see what happens. Let's see what he does. Here comes Ivan. And he says, "I want my cheese sandwich." And he takes this one. Uh-oh. Why did he take that one?
Child: His was on the grass.

05:51
So the three-year-old does two things differently. First, he predicts Ivan will take the sandwich that's really his. And second, when he sees Ivan taking the sandwich where he left his, where we would say he's taking that one because he thinks it's his, the three-year-old comes up with another explanation: He's not taking his own sandwich because he doesn't want it, because now it's dirty, on the ground. So that's why he's taking the other sandwich.

06:15
Now of course, development doesn't end at five. And we can see the continuation of this process of learning to think about other people's thoughts by upping the ante and asking children now, not for an action prediction, but for a moral judgment. So first I'm going to show you the three-year-old again.
RS (Video): So is Ivan being mean and naughty for taking Joshua's sandwich?
Child: Yeah.
RS: Should Ivan get in trouble for taking Joshua's sandwich?
Child: Yeah.

06:41
So it's maybe not surprising he thinks it was mean of Ivan to take Joshua's sandwich, since he thinks Ivan only took Joshua's sandwich to avoid having to eat his own dirty sandwich. But now I'm going to show you the five-year-old. Remember, the five-year-old completely understood why Ivan took Joshua's sandwich.
RS: Was Ivan being mean and naughty for taking Joshua's sandwich?
Child: Um, yeah.
And so, it is not until age seven that we get what looks more like an adult response.
RS: Should Ivan get in trouble for taking Joshua's sandwich?
Child: No, because the wind should get in trouble.
He says the wind should get in trouble for switching the sandwiches. (Laughter)

07:19
And now what we've started to do in my lab is to put children into the brain scanner and ask what's going on in their brain as they develop this ability to think about other people's thoughts. So the first thing is that in children we see this same brain region, the Right TPJ, being used while children are thinking about other people. But it's not quite like the adult brain. So whereas in the adults, as I told you, this brain region is almost completely specialized -- it does almost nothing else except for thinking about other people's thoughts -- in children it's much less so, when they are age five to eight, the age range of the children I just showed you. And actually if we even look at eight to 11-year-olds, getting into early adolescence, they still don't have quite an adult-like brain region. And so, what we can see is that over the course of childhood and even into adolescence, both the cognitive system, our mind's ability to think about other minds, and the brain system that supports it are continuing, slowly, to develop.

08:14
But of course, as you're probably aware, even in adulthood, people differ from one another in how good they are at thinking of other minds, how often they do it and how accurately. And so what we wanted to know was, could differences among adults in how they think about other people's thoughts be explained in terms of differences in this brain region?

08:32
So, the first thing that we did is we gave adults a version of the pirate problem that we gave to the kids. And I'm going to give that to you now. So Grace and her friend are on a tour of a chemical factory, and they take a break for coffee. And Grace's friend asks for some sugar in her coffee. Grace goes to make the coffee and finds by the coffee a pot containing a white powder, which is sugar. But the powder is labeled "Deadly Poison," so Grace thinks that the powder is a deadly poison. And she puts it in her friend's coffee. And her friend drinks the coffee, and is fine. How many people think it was morally permissible for Grace to put the powder in the coffee? Okay. Good. (Laughter)

09:15
So we ask people, how much should Grace be blamed in this case, which we call a failed attempt to harm? And we can compare that to another case, where everything in the real world is the same. The powder is still sugar, but what's different is what Grace thinks. Now she thinks the powder is sugar. And perhaps unsurprisingly, if Grace thinks the powder is sugar and puts it in her friend's coffee, people say she deserves no blame at all. Whereas if she thinks the powder was poison, even though it's really sugar, now people say she deserves a lot of blame, even though what happened in the real world was exactly the same. And in fact, they say she deserves more blame in this case, the failed attempt to harm, than in another case, which we call an accident, where Grace thought the powder was sugar, because it was labeled "sugar" and by the coffee machine, but actually the powder was poison. So even though when the powder was poison, the friend drank the coffee and died, people say Grace deserves less blame in that case, when she innocently thought it was sugar, than in the other case, where she thought it was poison and no harm occurred.

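To keep the conditions straight, here is a minimal sketch (not part of the talk; the numeric ratings are hypothetical, only their ordering follows what Saxe reports) of the three Grace cases as belief-outcome pairs:

# Illustrative only: the three Grace cases described in the talk, with hypothetical
# blame ratings; the ordering (failed attempt > accident > neutral) follows the talk.
cases = [
    ("neutral",        "believes sugar",  "no harm", 0),  # "she deserves no blame at all"
    ("failed attempt", "believes poison", "no harm", 8),  # "a lot of blame"
    ("accident",       "believes sugar",  "harm",    3),  # less blame than the failed attempt
]

# Print the cases from most to least blamed.
for name, belief, outcome, blame in sorted(cases, key=lambda c: -c[3]):
    print(f"{name:15s} | {belief:15s} | {outcome:8s} | blame ~ {blame}/10")
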
10:17
People, though, disagree a little bit about exactly how much blame Grace should get in the accident case. Some people think she should deserve more blame, and other people less. And what I'm going to show you is what happened when we look inside the brains of people while they're making that judgment. So what I'm showing you, from left to right, is how much activity there was in this brain region, and from top to bottom, how much blame people said that Grace deserved. And what you can see is, on the left, when there was very little activity in this brain region, people paid little attention to her innocent belief and said she deserved a lot of blame for the accident. Whereas on the right, where there was a lot of activity, people paid a lot more attention to her innocent belief, and said she deserved a lot less blame for causing the accident.

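A minimal sketch of the relationship described here, using made-up per-participant numbers (only the direction of the effect, more RTPJ activity going with less blame for the accident, comes from the talk):

import numpy as np

# Hypothetical per-participant values; not real data from the study.
rtpj_activity = np.array([0.1, 0.3, 0.5, 0.7, 0.9])   # arbitrary units, low to high
blame_rating  = np.array([8.5, 7.0, 5.5, 4.0, 2.5])   # blame assigned in the accident case

# Participants with more RTPJ activity assign less blame, so r comes out strongly negative.
r = np.corrcoef(rtpj_activity, blame_rating)[0, 1]
print(f"activity-blame correlation: r = {r:.2f}")
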
10:59
So that's good, but of course what we'd rather is have a way to interfere with function in this brain region, and see if we could change people's moral judgment. And we do have such a tool. It's called Transcranial Magnetic Stimulation, or TMS. This is a tool that lets us pass a magnetic pulse through somebody's skull, into a small region of their brain, and temporarily disorganize the function of the neurons in that region. So I'm going to show you a demo of this. First, I'm going to show you that this is a magnetic pulse. I'm going to show you what happens when you put a quarter on the machine. When you hear clicks, we're turning the machine on. So now I'm going to apply that same pulse to my brain, to the part of my brain that controls my hand. So there is no physical force, just a magnetic pulse.
Woman (Video): Ready, Rebecca?
RS: Yes.
Okay, so it causes a small involuntary contraction in my hand by putting a magnetic pulse in my brain.

12:03
And we can use that same pulse, now applied to the RTPJ, to ask if we can change people's moral judgments. So these are the judgments I showed you before, people's normal moral judgments. And then we can apply TMS to the RTPJ and ask how people's judgments change. And the first thing is, people can still do this task overall. So their judgments of the case when everything was fine remain the same. They say she deserves no blame. But in the case of a failed attempt to harm, where Grace thought that it was poison, although it was really sugar, people now say it was more okay, she deserves less blame for putting the powder in the coffee. And in the case of the accident, where she thought that it was sugar, but it was really poison and so she caused a death, people say that it was less okay, she deserves more blame.

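A minimal before/after sketch of the TMS result as just described, with hypothetical ratings; only the direction of the shifts (judgments leaning more on outcomes and less on beliefs when the RTPJ is disrupted) comes from the talk:

# Hypothetical 0-10 blame ratings before and after TMS to the RTPJ; made up for illustration.
blame = {
    "neutral (thought sugar, no harm)":         (0.0, 0.0),  # unchanged: still no blame
    "failed attempt (thought poison, no harm)": (8.0, 6.5),  # less blame once belief matters less
    "accident (thought sugar, friend died)":    (3.0, 4.5),  # more blame once outcome matters more
}

for case, (before, after) in blame.items():
    print(f"{case:42s} before={before:4.1f}  after TMS={after:4.1f}")
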
12:50
So what I've told you today is that people come, actually, especially well equipped to think about other people's thoughts. We have a special brain system that lets us think about what other people are thinking. This system takes a long time to develop, slowly throughout the course of childhood and into early adolescence. And even in adulthood, differences in this brain region can explain differences among adults in how we think about and judge other people. But I want to give the last word back to the novelists, and to Philip Roth, who ended by saying, "The fact remains that getting people right is not what living is all about anyway. It's getting them wrong that is living. Getting them wrong and wrong and wrong, and then on careful reconsideration, getting them wrong again." Thank you. (Applause)

13:47
Chris Anderson: So, I have a question. When you start talking about using magnetic pulses to change people's moral judgments, that sounds alarming. (Laughter) Please tell me that you're not taking phone calls from the Pentagon, say.
RS: I'm not. I mean, they're calling, but I'm not taking the call. (Laughter)
CA: They really are calling? So then seriously, you must lie awake at night sometimes wondering where this work leads. I mean, you're clearly an incredible human being, but someone could take this knowledge and in some future not-torture chamber, do acts that people here might be worried about.
RS: Yeah, we worry about this. So, there's a couple of things to say about TMS. One is that you can't be TMSed without knowing it. So it's not a surreptitious technology. It's quite hard, actually, to get those very small changes. The changes I showed you are impressive to me because of what they tell us about the function of the brain, but they're small on the scale of the moral judgments that we actually make. And what we changed was not people's moral judgments when they're deciding what to do, when they're making action choices. We changed their ability to judge other people's actions. And so, I think of what I'm doing not so much as studying the defendant in a criminal trial, but studying the jury.

15:06
CA: Is your work going to lead to any recommendations in education, to perhaps bring up a generation of kids able to make fairer moral judgments?
RS: That's one of the idealistic hopes. The whole research program here of studying the distinctive parts of the human brain is brand new. Until recently, what we knew about the brain were the things that any other animal's brain could do too, so we could study it in animal models. We knew how brains see, and how they control the body, and how they hear and sense. And the whole project of understanding how brains do the uniquely human things -- learn language and abstract concepts, and thinking about other people's thoughts -- that's brand new. And we don't know yet what the implications will be of understanding it.

15:53
CA: So I've got one last question. There is this thing called the hard problem of consciousness, that puzzles a lot of people. The notion that you can understand why a brain works, perhaps. But why does anyone have to feel anything? Why does it seem to require these beings who sense things for us to operate? You're a brilliant young neuroscientist. I mean, what chances do you think there are that at some time in your career, someone, you or someone else, is going to come up with some paradigm shift in understanding what seems an impossible problem?
RS: I hope they do. And I think they probably won't.
CA: Why?
RS: It's not called the hard problem of consciousness for nothing. (Laughter)
CA: That's a great answer. Rebecca Saxe, thank you very much. That was fantastic. (Applause)
