ABOUT THE SPEAKER
Laura Schulz - Cognitive scientist
Developmental behavior studies spearheaded by Laura Schulz are changing our notions of how children learn.

Why you should listen

MIT Early Childhood Cognition Lab lead investigator Laura Schulz studies learning in early childhood. Her research bridges computational models of cognitive development and behavioral studies in order to understand the origins of inquiry and discovery.

Working in play labs, children’s museums, and a recently launched citizen science website, Schulz is reshaping how we view young children’s perceptions of the world around them. Some of the surprising results of her research: before the age of four, children expect hidden causes when events happen probabilistically, use simple experiments to distinguish causal hypotheses, and trade off learning from instruction and exploration.

TED2015

Laura Schulz: The surprisingly logical minds of babies

1,888,975 views

How do babies learn so much from so little so quickly? In a fun, experiment-filled talk, cognitive scientist Laura Schulz shows how our young ones make decisions with a surprisingly strong sense of logic, well before they can talk.


00:12 Mark Twain summed up what I take to be
00:14 one of the fundamental problems of cognitive science
00:18 with a single witticism.
00:20 He said, "There's something fascinating about science.
00:23 One gets such wholesale returns of conjecture
00:26 out of such a trifling investment in fact."
00:29 (Laughter)
00:32 Twain meant it as a joke, of course, but he's right:
00:34 There's something fascinating about science.
00:37 From a few bones, we infer the existence of dinosaurs.
00:42 From spectral lines, the composition of nebulae.
00:47 From fruit flies,
00:50 the mechanisms of heredity,
00:53 and from reconstructed images of blood flowing through the brain,
00:57 or in my case, from the behavior of very young children,
01:02 we try to say something about the fundamental mechanisms
01:05 of human cognition.
01:07 In particular, in my lab in the Department of Brain and Cognitive Sciences at MIT,
01:12 I have spent the past decade trying to understand the mystery
01:16 of how children learn so much from so little so quickly.
01:20 Because, it turns out that the fascinating thing about science
01:23 is also a fascinating thing about children,
01:27 which, to put a gentler spin on Mark Twain,
01:29 is precisely their ability to draw rich, abstract inferences
01:34 rapidly and accurately from sparse, noisy data.
01:40 I'm going to give you just two examples today.
01:42 One is about a problem of generalization,
01:45 and the other is about a problem of causal reasoning.
01:47 And although I'm going to talk about work in my lab,
01:50 this work is inspired by and indebted to a field.
01:53 I'm grateful to mentors, colleagues, and collaborators around the world.
01:59 Let me start with the problem of generalization.
02:02 Generalizing from small samples of data is the bread and butter of science.
02:06 We poll a tiny fraction of the electorate
02:09 and we predict the outcome of national elections.
02:12 We see how a handful of patients responds to treatment in a clinical trial,
02:16 and we bring drugs to a national market.
02:19 But this only works if our sample is randomly drawn from the population.
02:23 If our sample is cherry-picked in some way --
02:26 say, we poll only urban voters,
02:28 or say, in our clinical trials for treatments for heart disease,
02:32 we include only men --
02:34 the results may not generalize to the broader population.
02:38 So scientists care whether evidence is randomly sampled or not,
02:42 but what does that have to do with babies?
02:44 Well, babies have to generalize from small samples of data all the time.
02:49 They see a few rubber ducks and learn that they float,
02:52 or a few balls and learn that they bounce.
02:55 And they develop expectations about ducks and balls
02:58 that they're going to extend to rubber ducks and balls
03:01 for the rest of their lives.
03:03 And the kinds of generalizations babies have to make about ducks and balls
03:07 they have to make about almost everything:
03:09 shoes and ships and sealing wax and cabbages and kings.
03:14 So do babies care whether the tiny bit of evidence they see
03:17 is plausibly representative of a larger population?
03:21 Let's find out.
03:23 I'm going to show you two movies,
03:25 one from each of two conditions of an experiment,
03:27 and because you're going to see just two movies,
03:30 you're going to see just two babies,
03:32 and any two babies differ from each other in innumerable ways.
03:36 But these babies, of course, here stand in for groups of babies,
03:39 and the differences you're going to see
03:41 represent average group differences in babies' behavior across conditions.
03:47 In each movie, you're going to see a baby doing maybe
03:49 just exactly what you might expect a baby to do,
03:53 and we can hardly make babies more magical than they already are.
03:58 But to my mind the magical thing,
04:00 and what I want you to pay attention to,
04:02 is the contrast between these two conditions,
04:05 because the only thing that differs between these two movies
04:08 is the statistical evidence the babies are going to observe.
04:13 We're going to show babies a box of blue and yellow balls,
04:16 and my then-graduate student, now colleague at Stanford, Hyowon Gweon,
04:21 is going to pull three blue balls in a row out of this box,
04:24 and when she pulls those balls out, she's going to squeeze them,
04:27 and the balls are going to squeak.
04:29 And if you're a baby, that's like a TED Talk.
04:32 It doesn't get better than that.
04:34 (Laughter)
04:38 But the important point is it's really easy to pull three blue balls in a row
04:42 out of a box of mostly blue balls.
04:44 You could do that with your eyes closed.
04:46 It's plausibly a random sample from this population.
04:49 And if you can reach into a box at random and pull out things that squeak,
04:53 then maybe everything in the box squeaks.
04:56 So maybe babies should expect those yellow balls to squeak as well.
05:00 Now, those yellow balls have funny sticks on the end,
05:02 so babies could do other things with them if they wanted to.
05:05 They could pound them or whack them.
05:07 But let's see what the baby does.
05:12 (Video) Hyowon Gweon: See this? (Ball squeaks)
05:16 Did you see that? (Ball squeaks)
05:20 Cool.
05:24 See this one?
05:26 (Ball squeaks)
05:28 Wow.
05:33 Laura Schulz: Told you. (Laughs)
05:35 (Video) HG: See this one? (Ball squeaks)
05:39 Hey Clara, this one's for you. You can go ahead and play.
05:51 (Laughter)
05:56 LS: I don't even have to talk, right?
05:59 All right, it's nice that babies will generalize properties
06:02 of blue balls to yellow balls,
06:03 and it's impressive that babies can learn from imitating us,
06:06 but we've known those things about babies for a very long time.
06:10 The really interesting question
06:12 is what happens when we show babies exactly the same thing,
06:15 and we can ensure it's exactly the same because we have a secret compartment
06:18 and we actually pull the balls from there,
06:20 but this time, all we change is the apparent population
06:24 from which that evidence was drawn.
06:27 This time, we're going to show babies three blue balls
06:30 pulled out of a box of mostly yellow balls,
06:34 and guess what?
06:35 You [probably won't] randomly draw three blue balls in a row
06:38 out of a box of mostly yellow balls.
06:40 That is not plausibly randomly sampled evidence.
06:44 That evidence suggests that maybe Hyowon was deliberately sampling the blue balls.
06:49 Maybe there's something special about the blue balls.
06:52 Maybe only the blue balls squeak.
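[Editor's note: a minimal sketch of the sampling intuition above, not from the talk. The exact contents of the two boxes are not stated, so the proportions below are illustrative assumptions.]

```python
# Assumed (hypothetical) share of blue balls in each box; the talk gives no exact numbers.
p_blue_in_mostly_blue_box = 0.75
p_blue_in_mostly_yellow_box = 0.25

# Probability of drawing three blue balls in a row, treating draws as roughly
# independent (sampling with replacement from a large box).
print(p_blue_in_mostly_blue_box ** 3)    # ~0.42: quite plausible as a random draw
print(p_blue_in_mostly_yellow_box ** 3)  # ~0.016: unlikely, so the draw looks deliberately sampled
```

Under these assumed mixes, the same three blue draws are roughly 25 times less likely to happen by chance from the mostly yellow box, which is why the evidence reads as cherry-picked rather than representative.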
06:55 Let's see what the baby does.
06:57 (Video) HG: See this? (Ball squeaks)
07:02 See this toy? (Ball squeaks)
07:05 Oh, that was cool. See? (Ball squeaks)
07:10 Now this one's for you to play. You can go ahead and play.
07:18 (Fussing) (Laughter)
07:26 LS: So you just saw two 15-month-old babies
07:29 do entirely different things
07:31 based only on the probability of the sample they observed.
07:35 Let me show you the experimental results.
07:37 On the vertical axis, you'll see the percentage of babies
07:40 who squeezed the ball in each condition,
07:42 and as you'll see, babies are much more likely to generalize the evidence
07:46 when it's plausibly representative of the population
07:49 than when the evidence is clearly cherry-picked.
07:53 And this leads to a fun prediction:
07:55 Suppose you pulled just one blue ball out of the mostly yellow box.
08:00 You [probably won't] pull three blue balls in a row at random out of a yellow box,
08:04 but you could randomly sample just one blue ball.
08:07 That's not an improbable sample.
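[Editor's note: continuing the same hypothetical numbers as the sketch above, a single blue draw from the mostly yellow box is unremarkable, which is what makes the one-ball prediction work.]

```python
# Same illustrative assumption as before: ~25% of the mostly yellow box is blue.
p_blue = 0.25
print(p_blue)       # 0.25: one blue ball is a perfectly plausible random draw
print(p_blue ** 3)  # ~0.016: three in a row is not
```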
08:09 And if you could reach into a box at random
08:11 and pull out something that squeaks, maybe everything in the box squeaks.
08:15 So even though babies are going to see much less evidence for squeaking,
08:20 and have many fewer actions to imitate
08:22 in this one ball condition than in the condition you just saw,
08:25 we predicted that babies themselves would squeeze more,
08:29 and that's exactly what we found.
08:32 So 15-month-old babies, in this respect, like scientists,
08:37 care whether evidence is randomly sampled or not,
08:40 and they use this to develop expectations about the world:
08:43 what squeaks and what doesn't,
08:45 what to explore and what to ignore.
08:50 Let me show you another example now,
08:52 this time about a problem of causal reasoning.
08:55 And it starts with a problem of confounded evidence
08:57 that all of us have,
08:59 which is that we are part of the world.
09:01 And this might not seem like a problem to you, but like most problems,
09:04 it's only a problem when things go wrong.
09:07 Take this baby, for instance.
09:09 Things are going wrong for him.
09:10 He would like to make this toy go, and he can't.
09:13 I'll show you a few-second clip.
09:21 And there's two possibilities, broadly:
09:23 Maybe he's doing something wrong,
09:25 or maybe there's something wrong with the toy.
09:30 So in this next experiment,
09:32 we're going to give babies just a tiny bit of statistical data
09:35 supporting one hypothesis over the other,
09:38 and we're going to see if babies can use that to make different decisions
09:41 about what to do.
09:43 Here's the setup.
09:46 Hyowon is going to try to make the toy go and succeed.
09:49 I am then going to try twice and fail both times,
09:52 and then Hyowon is going to try again and succeed,
09:55 and this roughly sums up my relationship to my graduate students
09:58 in technology across the board.
10:02 But the important point here is it provides a little bit of evidence
10:05 that the problem isn't with the toy, it's with the person.
10:08 Some people can make this toy go,
10:11 and some can't.
10:12 Now, when the baby gets the toy, he's going to have a choice.
10:16 His mom is right there,
10:18 so he can go ahead and hand off the toy and change the person,
10:21 but there's also going to be another toy at the end of that cloth,
10:24 and he can pull the cloth towards him and change the toy.
10:28 So let's see what the baby does.
10:30 (Video) HG: Two, three. Go! (Music)
10:34 LS: One, two, three, go!
10:37 Arthur, I'm going to try again. One, two, three, go!
10:45 HG: Arthur, let me try again, okay?
10:48 One, two, three, go! (Music)
10:53 Look at that. Remember these toys?
10:55 See these toys? Yeah, I'm going to put this one over here,
10:58 and I'm going to give this one to you.
11:00 You can go ahead and play.
11:23 LS: Okay, Laura, but of course, babies love their mommies.
11:27 Of course babies give toys to their mommies
11:30 when they can't make them work.
11:32 So again, the really important question is what happens when we change
11:35 the statistical data ever so slightly.
11:38 This time, babies are going to see the toy work and fail in exactly the same order,
11:42 but we're changing the distribution of evidence.
11:45 This time, Hyowon is going to succeed once and fail once, and so am I.
11:49 And this suggests it doesn't matter who tries this toy, the toy is broken.
11:55 It doesn't work all the time.
11:57 Again, the baby's going to have a choice.
11:59 Her mom is right next to her, so she can change the person,
12:02 and there's going to be another toy at the end of the cloth.
12:05 Let's watch what she does.
12:07 (Video) HG: Two, three, go! (Music)
12:11 Let me try one more time. One, two, three, go!
12:17 Hmm.
12:19 LS: Let me try, Clara.
12:22 One, two, three, go!
12:27 Hmm, let me try again.
12:29 One, two, three, go! (Music)
12:35 HG: I'm going to put this one over here,
12:37 and I'm going to give this one to you.
12:39 You can go ahead and play.
12:58 (Applause)
13:04 LS: Let me show you the experimental results.
13:07 On the vertical axis, you'll see the distribution
13:09 of children's choices in each condition,
13:12 and you'll see that the distribution of the choices children make
13:16 depends on the evidence they observe.
13:19 So in the second year of life,
13:21 babies can use a tiny bit of statistical data
13:24 to decide between two fundamentally different strategies
13:27 for acting in the world:
13:29 asking for help and exploring.
13:33 I've just shown you two laboratory experiments
13:37 out of literally hundreds in the field that make similar points,
13:40 because the really critical point
13:43 is that children's ability to make rich inferences from sparse data
13:48 underlies all the species-specific cultural learning that we do.
13:53 Children learn about new tools from just a few examples.
13:58 They learn new causal relationships from just a few examples.
14:03 They even learn new words, in this case in American Sign Language.
14:08 I want to close with just two points.
14:12 If you've been following my world, the field of brain and cognitive sciences,
14:15 for the past few years,
14:17 three big ideas will have come to your attention.
14:20 The first is that this is the era of the brain.
14:23 And indeed, there have been staggering discoveries in neuroscience:
14:27 localizing functionally specialized regions of cortex,
14:30 turning mouse brains transparent,
14:33 activating neurons with light.
14:36 A second big idea
14:38 is that this is the era of big data and machine learning,
14:43 and machine learning promises to revolutionize our understanding
14:46 of everything from social networks to epidemiology.
14:50 And maybe, as it tackles problems of scene understanding
14:53 and natural language processing,
14:55 to tell us something about human cognition.
14:59 And the final big idea you'll have heard
15:01 is that maybe it's a good idea we're going to know so much about brains
15:05 and have so much access to big data,
15:06 because left to our own devices,
15:09 humans are fallible, we take shortcuts,
15:13 we err, we make mistakes,
15:16 we're biased, and in innumerable ways,
15:20 we get the world wrong.
15:24 I think these are all important stories,
15:27 and they have a lot to tell us about what it means to be human,
15:31 but I want you to note that today I told you a very different story.
15:35 It's a story about minds and not brains,
15:39 and in particular, it's a story about the kinds of computations
15:42 that uniquely human minds can perform,
15:45 which involve rich, structured knowledge and the ability to learn
15:49 from small amounts of data, the evidence of just a few examples.
15:56 And fundamentally, it's a story about how starting as very small children
16:00 and continuing out all the way to the greatest accomplishments
16:04 of our culture,
16:08 we get the world right.
16:12 Folks, human minds do not only learn from small amounts of data.
16:18 Human minds think of altogether new ideas.
16:20 Human minds generate research and discovery,
16:23 and human minds generate art and literature and poetry and theater,
16:29 and human minds take care of other humans:
16:32 our old, our young, our sick.
16:36 We even heal them.
16:39 In the years to come, we're going to see technological innovations
16:42 beyond anything I can even envision,
16:46 but we are very unlikely
16:48 to see anything even approximating the computational power of a human child
16:54 in my lifetime or in yours.
16:58 If we invest in these most powerful learners and their development,
17:03 in babies and children
17:06 and mothers and fathers
17:08 and caregivers and teachers
17:11 the ways we invest in our other most powerful and elegant forms
17:15 of technology, engineering and design,
17:18 we will not just be dreaming of a better future,
17:21 we will be planning for one.
17:23 Thank you very much.
17:25 (Applause)
17:29 Chris Anderson: Laura, thank you. I do actually have a question for you.
17:34 First of all, the research is insane.
17:36 I mean, who would design an experiment like that? (Laughter)
17:41 I've seen that a couple of times,
17:42 and I still don't honestly believe that that can truly be happening,
17:46 but other people have done similar experiments; it checks out.
17:49 The babies really are that genius.
17:50 LS: You know, they look really impressive in our experiments,
17:53 but think about what they look like in real life, right?
17:56 It starts out as a baby.
17:57 Eighteen months later, it's talking to you,
17:59 and babies' first words aren't just things like balls and ducks,
18:02 they're things like "all gone," which refer to disappearance,
18:05 or "uh-oh," which refer to unintentional actions.
18:07 It has to be that powerful.
18:09 It has to be much more powerful than anything I showed you.
18:12 They're figuring out the entire world.
18:14 A four-year-old can talk to you about almost anything.
18:17 (Applause)
18:19 CA: And if I understand you right, the other key point you're making is,
18:22 we've been through these years where there's all this talk
18:25 of how quirky and buggy our minds are,
18:27 that behavioral economics and the whole theories behind that
18:29 that we're not rational agents.
18:31 You're really saying that the bigger story is how extraordinary,
18:35 and there really is genius there that is underappreciated.
18:40 LS: One of my favorite quotes in psychology
18:42 comes from the social psychologist Solomon Asch,
18:45 and he said the fundamental task of psychology is to remove
18:47 the veil of self-evidence from things.
18:50 There are orders of magnitude more decisions you make every day
18:55 that get the world right.
18:56 You know about objects and their properties.
18:58 You know them when they're occluded. You know them in the dark.
19:01 You can walk through rooms.
19:02 You can figure out what other people are thinking. You can talk to them.
19:06 You can navigate space. You know about numbers.
19:08 You know causal relationships. You know about moral reasoning.
19:11 You do this effortlessly, so we don't see it,
19:14 but that is how we get the world right, and it's a remarkable
19:16 and very difficult-to-understand accomplishment.
19:19 CA: I suspect there are people in the audience who have
19:21 this view of accelerating technological power
19:24 who might dispute your statement that never in our lifetimes
19:27 will a computer do what a three-year-old child can do,
19:29 but what's clear is that in any scenario,
19:32 our machines have so much to learn from our toddlers.
19:38 LS: I think so. You'll have some machine learning folks up here.
19:41 I mean, you should never bet against babies or chimpanzees
19:45 or technology as a matter of practice,
19:49 but it's not just a difference in quantity,
19:53 it's a difference in kind.
19:55 We have incredibly powerful computers,
19:57 and they do do amazingly sophisticated things,
20:00 often with very big amounts of data.
20:03 Human minds do, I think, something quite different,
20:05 and I think it's the structured, hierarchical nature of human knowledge
20:09 that remains a real challenge.
20:11 CA: Laura Schulz, wonderful food for thought. Thank you so much.
20:14 LS: Thank you. (Applause)
