TED2015

Laura Schulz: The surprisingly logical minds of babies

Laura Schulz: As sorprendentemente lóxicas mentes dos bebés

Filmed
Views 1,632,838

How do babies learn so much from so little, so quickly? In a fun talk packed with experiments, cognitive scientist Laura Schulz shows how our little ones make decisions with a surprisingly strong sense of logic, long before they can talk.

Laura Schulz - Cognitive scientist
Developmental behavior studies spearheaded by Laura Schulz are changing our notions of how children learn.

Mark Twain resumiu
o que eu considero que é
00:12
Mark Twain summed up
what I take to be
un dos problemas fundamentais
da ciencia cognitiva
00:14
one of the fundamental problems
of cognitive science
cunha sinxela ocorrencia.
00:18
with a single witticism.
Dixo, "A ciencia é fascinante.
00:20
He said, "There's something
fascinating about science.
Conséguense cantidades
masivas de conxecturas
00:23
One gets such wholesale
returns of conjecture
00:26
out of such a trifling
investment in fact."
a partir dun investimento
tan insignificante en feitos.”
00:29
(Laughter)
(Risas)
Twain quería facer unha broma, claro,
pero ten razón:
00:32
Twain meant it as a joke,
of course, but he's right:
A ciencia é fascinante.
00:34
There's something
fascinating about science.
A partir duns cantos ósos, inferimos
a existencia dos dinosauros.
00:37
From a few bones, we infer
the existence of dinosaurs.
Das liñas espectrais,
a composición das nebulosas.
00:42
From spectral lines,
the composition of nebulae.
A partir das moscas da froita,
00:47
From fruit flies,
os mecanismos da herdanza,
00:50
the mechanisms of heredity,
e de imaxes reconstruídas de sangue
fluíndo a través do cerebro,
00:53
and from reconstructed images
of blood flowing through the brain,
ou no meu caso, do comportamento
de nenos moi pequenos,
00:57
or in my case, from the behavior
of very young children,
intentamos dicir algo
sobre os mecanismos fundamentais
01:02
we try to say something about
the fundamental mechanisms
da cognición humana.
01:05
of human cognition.
En concreto, no meu laboratorio no Dpto.
de Cerebro e Ciencias Cognitivas, no MIT,
01:07
In particular, in my lab in the Department
of Brain and Cognitive Sciences at MIT,
pasei a última década
intentando entender o misterio
01:12
I have spent the past decade
trying to understand the mystery
de por que os nenos aprenden tanto,
a partir de tan pouco, e tan rápido.
01:16
of how children learn so much
from so little so quickly.
Porque resulta que o que a ciencia
ten de fascinante
01:20
Because, it turns out that
the fascinating thing about science
téñeno tamén de fascinante os nenos,
01:23
is also a fascinating
thing about children,
e é, dicíndoo de forma máis suave
ca Mark Twain,
01:27
which, to put a gentler
spin on Mark Twain,
precisamente a súa capacidade
de extraer inferencias ricas e abstractas
01:29
is precisely their ability
to draw rich, abstract inferences
de forma rápida e precisa a partir
de datos dispersos e confusos.
01:34
rapidly and accurately
from sparse, noisy data.
Vou dar só dous exemplos hoxe.
01:40
I'm going to give you
just two examples today.
Un deles aborda
un problema de xeneralización,
01:42
One is about a problem of generalization,
e o outro un de razoamento causal.
01:45
and the other is about a problem
of causal reasoning.
E aínda que vou falar
do que facemos no meu laboratorio,
01:47
And although I'm going to talk
about work in my lab,
01:50
this work is inspired by
and indebted to a field.
este traballo está inspirado por un campo
e en débeda con el.
Estoulles agradecida a mentores,
colegas e colaboradores de todo o mundo.
01:53
I'm grateful to mentors, colleagues,
and collaborators around the world.
Quero comezar
co problema de xeneralización.
01:59
Let me start with the problem
of generalization.
Xeneralizar a partir de pequenas mostras
de datos é o pan de cada día da ciencia.
02:02
Generalizing from small samples of data
is the bread and butter of science.
Entrevistamos unha fracción
mínima do electorado
02:06
We poll a tiny fraction of the electorate
e predicimos o resultado
das eleccións nacionais.
02:09
and we predict the outcome
of national elections.
Vemos como un puñado de pacientes
responde a tratamento nun ensaio clínico,
02:12
We see how a handful of patients
responds to treatment in a clinical trial,
e incorporamos fármacos
ao mercado nacional.
02:16
and we bring drugs to a national market.
Pero isto soamente funciona se a mostra
se extrae aleatoriamente da poboación.
02:19
But this only works if our sample
is randomly drawn from the population.
Se a nosa mostra ten algunha manipulación
02:23
If our sample is cherry-picked
in some way --
--por exemplo,
entrevistamos só votantes urbanos,
02:26
say, we poll only urban voters,
02:28
or say, in our clinical trials
for treatments for heart disease,
ou nos nosos ensaios clínicos
de tratamentos para doenzas cardíacas
incluímos só homes--
02:32
we include only men --
os resultados poden
non ser xeneralizables a toda a poboación.
02:34
the results may not generalize
to the broader population.
Por tanto aos científicos impórtalles
se a mostra se recolleu ou non ao chou,
02:38
So scientists care whether evidence
is randomly sampled or not,
pero que ten iso que ver cos bebés?
02:42
but what does that have to do with babies?
Os bebés teñen que xeneralizar seguido
a partir de pequenas mostras de datos.
02:44
Well, babies have to generalize
from small samples of data all the time.
Ven uns poucos parrulos de goma
e aprenden que flotan,
02:49
They see a few rubber ducks
and learn that they float,
ou algunhas pelotas e aprenden que botan.
02:52
or a few balls and learn that they bounce.
E desenvolven expectativas
sobre os parrulos e as pelotas
02:55
And they develop expectations
about ducks and balls
que aplicarán a uns e outras
02:58
that they're going to extend
to rubber ducks and balls
o resto das súas vidas.
03:01
for the rest of their lives.
E os tipos de xeneralizacións
que deben facer sobre parrulos e pelotas,
03:03
And the kinds of generalizations
babies have to make about ducks and balls
deben facelos para case todo:
03:07
they have to make about almost everything:
zapatos e barcos e lacre e verzas e reis.
03:09
shoes and ships and sealing wax
and cabbages and kings.
Entón aos bebés impórtalles
se o pequeno anaco de proba que ven
03:14
So do babies care whether
the tiny bit of evidence they see
representa de forma plausíbel
unha poboación maior?
03:17
is plausibly representative
of a larger population?
Descubrámolo.
03:21
Let's find out.
Vou amosar dous vídeos,
03:23
I'm going to show you two movies,
un por cada suposto dun experimento,
03:25
one from each of two conditions
of an experiment,
e como só se verán dous vídeos,
03:27
and because you're going to see
just two movies,
só se verán dous bebés,
03:30
you're going to see just two babies,
e un par calquera de bebés difire
de calquera outro de innumerábeis formas.
03:32
and any two babies differ from each other
in innumerable ways.
03:36
But these babies, of course,
here stand in for groups of babies,
Pero estes bebés, por suposto,
representan aquí a grupos de bebés,
03:39
and the differences you're going to see
e as diferenzas que se van ver
03:41
represent average group differences
in babies' behavior across conditions.
representan as diferenzas grupais medias
no comportamento dos bebés
en cada suposto.
En cada vídeo verase
un bebé facendo tal vez
03:47
In each movie, you're going to see
a baby doing maybe
xusto o que se agardaría que fixese,
03:49
just exactly what you might
expect a baby to do,
e dificilmente podemos volver
os bebés máis máxicos do que xa son.
03:53
and we can hardly make babies
more magical than they already are.
Pero para min o máxico,
03:58
But to my mind the magical thing,
e ao que quero que se lle preste atención,
04:00
and what I want you to pay attention to,
é o contraste entre estes dous supostos,
04:02
is the contrast between
these two conditions,
porque o único que difire
entre os dous vídeos
04:05
because the only thing
that differs between these two movies
son os datos estatísticos
que os bebés van observar.
04:08
is the statistical evidence
the babies are going to observe.
Imos ensinarlles unha caixa
de bólas azuis e amarelas,
04:13
We're going to show babies
a box of blue and yellow balls,
e a que era a miña estudante graduada,
hoxe compañeira en Stanford, Hyowon Gweon,
04:16
and my then-graduate student,
now colleague at Stanford, Hyowon Gweon,
vai sacar tres bólas azuis
seguidas desta caixa,
04:21
is going to pull three blue balls
in a row out of this box,
e despois de sacalas, vainas apertar,
04:24
and when she pulls those balls out,
she's going to squeeze them,
e as bólas van chiar.
04:27
and the balls are going to squeak.
E se es un bebé,
iso é como unha charla TED.
04:29
And if you're a baby,
that's like a TED Talk.
Non pode haber nada mellor.
04:32
It doesn't get better than that.
(Risas)
04:34
(Laughter)
Pero o importante é que é moi sinxelo
sacar tres bólas azuis seguidas
04:38
But the important point is it's really
easy to pull three blue balls in a row
dunha caixa que ten
sobre todo bólas azuis.
04:42
out of a box of mostly blue balls.
Poderíase facer cos ollos pechados.
04:44
You could do that with your eyes closed.
Pódese admitir que é unha
mostra aleatoria desta poboación.
04:46
It's plausibly a random sample
from this population.
E se podes meter a man aleatoriamente
nunha caixa e sacar cousas que chían,
04:49
And if you can reach into a box at random
and pull out things that squeak,
ao mellor todo o que hai na caixa chía.
04:53
then maybe everything in the box squeaks.
Así que tal vez os bebés deberían esperar
que as bólas amarelas chíen tamén.
04:56
So maybe babies should expect
those yellow balls to squeak as well.
As bólas amarelas teñen
divertidos paus nun extremo,
05:00
Now, those yellow balls
have funny sticks on the end,
que permiten facer con elas
outras cousas se se quere.
05:02
so babies could do other things
with them if they wanted to.
Poderían axitalas ou bater con elas.
05:05
They could pound them or whack them.
Pero vexamos qué fai o bebé.
05:07
But let's see what the baby does.
(Vídeo) Ves isto? (A bóla chía)
05:12
(Video) Hyowon Gweon: See this?
(Ball squeaks)
Viches iso? (A bóla chía)
05:16
Did you see that?
(Ball squeaks)
Xenial.
05:20
Cool.
Ves estoutra?
05:24
See this one?
(A bóla chía)
05:26
(Ball squeaks)
Uaau.
05:28
Wow.
Díxenvolo. (Ri)
05:33
Laura Schulz: Told you. (Laughs)
Viches esta? (A bóla chía)
05:35
(Video) HG: See this one?
(Ball squeaks)
Clara, agora esta é para ti.
Veña, podes collela e xogar.
05:39
Hey Clara, this one's for you.
You can go ahead and play.
(Barullo) (Risas)
05:51
(Fussing)
(Laughter)
LS: Non teño nin que dicir nada, verdade?
05:56
LS: I don't even have to talk, right?
Vale, está ben que os bebés
xeneralicen propiedades
05:59
All right, it's nice that babies
will generalize properties
das bólas azuis ás bólas amarelas.
06:02
of blue balls to yellow balls,
E é impresionante que poidan
aprender imitándonos.
06:03
and it's impressive that babies
can learn from imitating us,
Pero sabemos iso dos bebés
dende hai moito tempo.
06:06
but we've known those things about babies
for a very long time.
A pregunta realmente interesante é
06:10
The really interesting question
que ocorre cando lles amosamos
aos bebés exactamente a mesma cousa,
06:12
is what happens when we show babies
exactly the same thing,
06:15
and we can ensure it's exactly the same
because we have a secret compartment
podemos asegurar que é a mesma
porque temos un compartimento secreto
e en realidade sacamos as bólas del,
06:18
and we actually pull the balls from there,
pero esta vez o que cambiamos
foi a poboación aparente
06:20
but this time, all we change
is the apparent population
da que extraemos as mostras.
06:24
from which that evidence was drawn.
Esta vez amosarémoslles
aos bebés tres bólas azuis
06:27
This time, we're going to show babies
three blue balls
sacadas dunha caixa que ten sobre todo
bólas amarelas,
06:30
pulled out of a box
of mostly yellow balls,
e saben que?
06:34
and guess what?
Non se poden sacar aleatoriamente
tres bólas azuis seguidas
06:35
You [probably won't] randomly draw
three blue balls in a row
dunha caixa que ten sobre todo
bólas amarelas.
06:38
out of a box of mostly yellow balls.
Esa non é unha mostra aleatoria.
06:40
That is not plausibly
randomly sampled evidence.
Esa proba suxire que ao mellor Hyowon
estivo amosando deliberadamente as azuis.
06:44
That evidence suggests that maybe Hyowon
was deliberately sampling the blue balls.
Tal vez as bólas azuis teñen algo especial
06:49
Maybe there's something special
about the blue balls.
Tal vez soamente as bólas azuis chían.
06:52
Maybe only the blue balls squeak.
Vexamos o que fai o bebé.
06:55
Let's see what the baby does.
(Vídeo) Ves isto?
(A bóla chía)
06:57
(Video) HG: See this?
(Ball squeaks)
Ves este xoguete?
(A bóla chía)
07:02
See this toy?
(Ball squeaks)
Oh, que xenial. Ves?
(A bóla chía)
07:05
Oh, that was cool. See?
(Ball squeaks)
Agora esta é para que xogues ti.
Veña, podes xogar.
07:10
Now this one's for you to play.
You can go ahead and play.
(Barullo) (Risas)
07:18
(Fussing)
(Laughter)
LS: Acabades de ver dous
bebés de 15 meses
07:26
LS: So you just saw
two 15-month-old babies
facendo dúas cousas totalmente diferentes
07:29
do entirely different things
07:31
based only on the probability
of the sample they observed.
baseadas só na probabilidade
da mostra que observaron.
Quero ensinar os resultados experimentais.
07:35
Let me show you the experimental results.
No eixe vertical, pódese ver
a porcentaxe de bebés
07:37
On the vertical axis, you'll see
the percentage of babies
que apertaron a bóla en cada suposto,
07:40
who squeezed the ball in each condition,
e como se ve, os bebés tenden
moito máis a xeneralizar a mostra
07:42
and as you'll see, babies are much
more likely to generalize the evidence
cando é representativa da poboación
07:46
when it's plausibly representative
of the population
ca cando está claramente manipulada.
07:49
than when the evidence
is clearly cherry-picked.
E isto lévanos a unha predición curiosa:
07:53
And this leads to a fun prediction:
supoñamos que sacamos só unha bóla azul
07:55
Suppose you pulled just one blue ball
out of the mostly yellow box.
da caixa que ten sobre todo
bólas amarelas.
Non se poderían sacar aleatoriamente
3 bólas azuis seguidas dunha caixa amarela
08:00
You [probably won't] pull three blue balls
in a row at random out of a yellow box,
pero poderíase sacar soamente unha.
08:04
but you could randomly sample
just one blue ball.
Non é unha mostra improbable.
08:07
That's not an improbable sample.
E se se puidese meter a man
ao chou nunha caixa
08:09
And if you could reach into
a box at random
e sacar algo que chía,
tal vez todo o da caixa chíe.
08:11
and pull out something that squeaks,
maybe everything in the box squeaks.
Entón, aínda que os bebés van observar
moitas menos probas para chíos,
08:15
So even though babies are going to see
much less evidence for squeaking,
e contan con moitas menos
accións que imitar
08:20
and have many fewer actions to imitate
neste suposto dunha única bóla
ca no que vimos antes,
08:22
in this one ball condition than in
the condition you just saw,
predicimos que os bebés por si sós
apertarían a bóla máis veces,
08:25
we predicted that babies themselves
would squeeze more,
e iso é exactamente o que atopamos.
08:29
and that's exactly what we found.
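As a rough back-of-the-envelope illustration of the probability argument in this experiment (the box proportions below are assumptions made for the sake of the example, not figures from the study), a few lines of Python make the contrast between the conditions concrete:

from math import comb

def prob_all_blue(blue, yellow, draws):
    """Probability that `draws` balls drawn at random, without replacement,
    from a box holding `blue` blue and `yellow` yellow balls are all blue."""
    return comb(blue, draws) / comb(blue + yellow, draws)

# Mostly blue box: three blue balls in a row is an unsurprising random sample.
print(prob_all_blue(blue=17, yellow=3, draws=3))   # ~0.60

# Mostly yellow box: three blue balls in a row is very unlikely by chance.
print(prob_all_blue(blue=3, yellow=17, draws=3))   # ~0.0009

# The "fun prediction": one blue ball from the mostly yellow box
# is not an improbable random sample.
print(prob_all_blue(blue=3, yellow=17, draws=1))   # 0.15

On this toy calculation, the three-ball draw from the mostly yellow box is the only sample that is hard to explain as random, which is exactly the condition in which the babies stopped generalizing the squeaking to the yellow balls.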
08:32
So 15-month-old babies,
in this respect, like scientists,
Así que aos bebés de 15 meses,
neste sentido, como científicos,
08:37
care whether evidence
is randomly sampled or not,
impórtalles se a proba é
unha mostra representativa ou non,
e usan isto para desenvolver
expectativas sobre o mundo:
08:40
and they use this to develop
expectations about the world:
08:43
what squeaks and what doesn't,
qué chía e qué non,
qué explorar e qué ignorar.
08:45
what to explore and what to ignore.
Agora quero amosar outro exemplo,
08:50
Let me show you another example now,
esta vez sobre un problema
de razoamento causal.
08:52
this time about a problem
of causal reasoning.
E comeza cun problema de proba confusa
08:55
And it starts with a problem
of confounded evidence
que todos temos:
08:57
that all of us have,
o feito de que formamos parte do mundo.
08:59
which is that we are part of the world.
Isto pode non parecer un problema,
pero como a maior parte deles,
09:01
And this might not seem like a problem
to you, but like most problems,
maniféstase só cando as cousas van mal.
09:04
it's only a problem when things go wrong.
Velaquí este bebé, por exemplo.
09:07
Take this baby, for instance.
As cousas están indo mal para el.
09:09
Things are going wrong for him.
Gustaríalle facer funcionar
o seu xoguete, e non pode.
09:10
He would like to make
this toy go, and he can't.
Amosarei un vídeo duns poucos segundos.
09:13
I'll show you a few-second clip.
En xeral, hai dúas posibilidades:
09:21
And there's two possibilities, broadly:
ou el está facendo algo mal,
09:23
Maybe he's doing something wrong,
ou algo non funciona no xoguete.
09:25
or maybe there's something
wrong with the toy.
Así que no seguinte experimento,
09:30
So in this next experiment,
darémoslles aos bebés só
unha mínima porción de datos estatísticos
09:32
we're going to give babies
just a tiny bit of statistical data
que apoian unha das hipóteses
sobre a outra,
09:35
supporting one hypothesis over the other,
e veremos se os bebés poden usar iso
para tomar decisións diferentes
09:38
and we're going to see if babies
can use that to make different decisions
sobre qué facer.
09:41
about what to do.
Velaquí o plan.
09:43
Here's the setup.
Hyowon vai intentar que o xoguete
funcione, e conségueo.
09:46
Hyowon is going to try to make
the toy go and succeed.
Entón eu vou intentalo dúas veces
e fracasar as dúas,
09:49
I am then going to try twice
and fail both times,
despois Hyowon vai intentalo
outra vez e conseguilo,
09:52
and then Hyowon is going
to try again and succeed,
o que resume en xeral a miña relación
cos meus estudantes de posgrao
09:55
and this roughly sums up my relationship
to my graduate students
09:58
in technology across the board.
no que ten que ver coa tecnoloxía.
Pero o importante aquí é
que proporciona algunha proba
10:02
But the important point here is
it provides a little bit of evidence
de que o problema non é o xoguete,
senón a persoa.
10:05
that the problem isn't with the toy,
it's with the person.
Algunhas poden facer
que o xoguete funcione,
10:08
Some people can make this toy go,
e outras non.
10:11
and some can't.
Agora, cando o bebé consegue o xoguete,
vai ter unha elección.
10:12
Now, when the baby gets the toy,
he's going to have a choice.
A súa nai está xusto alí,
10:16
His mom is right there,
polo que pode ir e darlle o xoguete
e cambiar a persoa,
10:18
so he can go ahead and hand off the toy
and change the person,
pero tamén vai haber outro xoguete
no bordo desa tea,
10:21
but there's also going to be
another toy at the end of that cloth,
así que pode tirar da tea cara a el
e cambiar o xoguete.
10:24
and he can pull the cloth towards him
and change the toy.
Vexamos logo qué fai o bebé.
10:28
So let's see what the baby does.
(Vídeo) HG: Dous, tres. Xa!
(Música)
10:30
(Video) HG: Two, three. Go!
(Music)
LS: Un, dous, tres. Xa!
10:34
LS: One, two, three, go!
10:37
Arthur, I'm going to try again.
One, two, three, go!
Arthur, vou intentalo outra vez.
Un, dous, tres. Xa!
HG: Arthur, déixame probar outra vez, si?
10:45
HG: Arthur, let me try again, okay?
10:48
One, two, three, go!
(Music)
Un, dous, tres. Xa! (Música)
10:53
Look at that. Remember these toys?
Mira. Acórdaste destes xoguetes?
10:55
See these toys? Yeah, I'm going
to put this one over here,
Ves estes xoguetes?
Si, vou poñer este por aquí,
e a ti vouche dar este.
10:58
and I'm going to give this one to you.
Veña, xa podes xogar.
11:00
You can go ahead and play.
LS: Vale, Laura, pero claro,
os bebés quérenlles ás súas mamás.
11:23
LS: Okay, Laura, but of course,
babies love their mommies.
Normal que lles dean os xoguetes a ela
11:27
Of course babies give toys
to their mommies
cando non conseguen que funcionen.
11:30
when they can't make them work.
De novo, a pregunta realmente importante
é que ocorre cando cambiamos
11:32
So again, the really important question
is what happens when we change
os datos estatísticos só levemente.
11:35
the statistical data ever so slightly.
Agora, os bebés van ver o xoguete
funcionar e fallar xusto na mesma orde,
11:38
This time, babies are going to see the toy
work and fail in exactly the same order,
pero imos cambiar a distribución da proba.
11:42
but we're changing
the distribution of evidence.
Agora, Hyowon vai conseguilo unha vez
e fracasar outra, e eu tamén.
11:45
This time, Hyowon is going to succeed
once and fail once, and so am I.
O que suxire que non importa
quen proba este xoguete, está roto.
11:49
And this suggests it doesn't matter
who tries this toy, the toy is broken.
Non sempre funciona.
11:55
It doesn't work all the time.
De novo, o bebé
vai ter que tomar unha decisión.
11:57
Again, the baby's going to have a choice.
A súa nai está xusto ao lado,
así que pode cambiar a persoa,
11:59
Her mom is right next to her,
so she can change the person,
e haberá outro xoguete ao final da tea.
12:02
and there's going to be another toy
at the end of the cloth.
Vexamos que fai.
12:05
Let's watch what she does.
HG: Dous, tres, xa!
(Música)
12:07
(Video) HG: Two, three, go!
(Music)
Déixame probar outra vez.
Un, dous, tres, xa!
12:11
Let me try one more time.
One, two, three, go!
Umm.
12:17
Hmm.
LS: Déixame probar a min, Clara.
12:19
LS: Let me try, Clara.
Un, dous, tres, xa!
12:22
One, two, three, go!
Umm, déixame probar outra vez.
12:27
Hmm, let me try again.
Un, dous, tres, xa!
(Música)
12:29
One, two, three, go!
(Music)
HG: Vou poñer este por aquí,
12:35
HG: I'm going
to put this one over here,
e vouche dar este a ti.
12:37
and I'm going to give this one to you.
Veña, xa podes xogar.
12:39
You can go ahead and play.
12:58
(Applause)
(Aplausos)
LS: Amosarei agora
os resultados experimentais.
13:04
LS: Let me show you
the experimental results.
No eixe vertical, vese a distribución
13:07
On the vertical axis,
you'll see the distribution
das eleccións dos nenos
baixo cada suposto,
13:09
of children's choices in each condition,
e vese que a distribución
das eleccións que fan
13:12
and you'll see that the distribution
of the choices children make
depende da proba que observan.
13:16
depends on the evidence they observe.
No segundo ano de idade,
13:19
So in the second year of life,
os bebés poden usar unha fracción
mínima de datos estatísticos
13:21
babies can use a tiny bit
of statistical data
13:24
to decide between two
fundamentally different strategies
para decidir entre dúas estratexias
fundamentalmente diferentes
para actuar no mundo:
13:27
for acting in the world:
pedir axuda e explorar.
13:29
asking for help and exploring.
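To sketch the logic of this second experiment in the same spirit (a minimal illustration under assumed hypotheses and probabilities, not the model the lab actually used), one can compare how well two simple hypotheses explain each pattern of successes and failures:

def likelihood_person(observations):
    """H_person: the toy is fine, but only some people can make it go --
    each agent either always succeeds or always fails."""
    abilities = {}
    for agent, success in observations:
        if agent not in abilities:
            abilities[agent] = success
        elif abilities[agent] != success:
            return 0.0   # the same agent both succeeded and failed: inconsistent
    return 1.0

def likelihood_toy(observations, p_work=0.5):
    """H_toy: the toy itself is unreliable and works with probability p_work,
    no matter who is trying."""
    return p_work ** len(observations)

# Condition A: Hyowon succeeds, Laura fails twice, Hyowon succeeds again.
cond_a = [("Hyowon", True), ("Laura", False), ("Laura", False), ("Hyowon", True)]
# Condition B: Hyowon and Laura each succeed once and fail once.
cond_b = [("Hyowon", True), ("Hyowon", False), ("Laura", False), ("Laura", True)]

for label, obs in [("A", cond_a), ("B", cond_b)]:
    print(label, likelihood_person(obs), likelihood_toy(obs))
# A: 1.0 vs 0.0625 -- the data point to the person, so hand the toy to mom.
# B: 0.0 vs 0.0625 -- the data point to the toy, so pull the cloth and swap toys.

Under these assumed hypotheses, the same four outcomes, merely redistributed across the two agents, flip which explanation fits the data, mirroring the flip in the babies' choices between asking for help and exploring.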
Acabo de amosar
dous experimentos de laboratorio
13:33
I've just shown you
two laboratory experiments
dos literalmente centos neste campo
que chegan a conclusións similares,
13:37
out of literally hundreds in the field
that make similar points,
porque o auténtico punto clave
13:40
because the really critical point
é que a capacidade dos nenos
para facer ricas inferencias
partindo de datos dispersos
13:43
is that children's ability
to make rich inferences from sparse data
serve de base a toda a nosa aprendizaxe
cultural específica como especie.
13:48
underlies all the species-specific
cultural learning that we do.
Os nenos aprenden sobre novas ferramentas
a partir duns poucos exemplos.
13:53
Children learn about new tools
from just a few examples.
Aprenden novas relacións causais
a partir duns poucos exemplos.
13:58
They learn new causal relationships
from just a few examples.
Incluso aprenden palabras novas,
neste caso en lingua de signos americana.
14:03
They even learn new words,
in this case in American Sign Language.
Quero concluír con só dúas cousas.
14:08
I want to close with just two points.
A quen seguise o meu campo
(o do cerebro e as ciencias cognitivas)
14:12
If you've been following my world,
the field of brain and cognitive sciences,
durante os últimos anos,
14:15
for the past few years,
chamaríanlle a atención
tres grandes ideas.
14:17
three big ideas will have come
to your attention.
A primeira é que esta é a era do cerebro.
14:20
The first is that this is
the era of the brain.
14:23
And indeed, there have been
staggering discoveries in neuroscience:
E por suposto, houbo descubrimentos
impresionantes en neurociencia:
localizar rexións do córtex
funcionalmente especializadas,
14:27
localizing functionally specialized
regions of cortex,
facer transparentes os cerebros de ratos,
14:30
turning mouse brains transparent,
activar neuronas con luz.
14:33
activating neurons with light.
14:36
A second big idea
Unha segunda grande idea
é que esta é a era dos datos masivos
e da aprendizaxe automática,
14:38
is that this is the era of big data
and machine learning,
e a aprendizaxe automática promete
revolucionar a nosa comprensión
14:43
and machine learning promises
to revolutionize our understanding
de todo, dende as redes sociais
ata a epidemioloxía.
14:46
of everything from social networks
to epidemiology.
E tal vez, á vez que afronta problemas
de comprensión do contexto
14:50
And maybe, as it tackles problems
of scene understanding
14:53
and natural language processing,
e de procesamento da linguaxe natural,
poida desvelarnos algo
sobre a cognición humana.
14:55
to tell us something
about human cognition.
E a gran idea final que escoitarían
14:59
And the final big idea you'll have heard
é que pode ser boa idea
saber tanto sobre os cerebros
15:01
is that maybe it's a good idea we're going
to know so much about brains
e ter tanto acceso a datos masivos,
15:05
and have so much access to big data,
porque pola nosa conta,
15:06
because left to our own devices,
os humanos somos falíbeis,
buscamos atallos,
15:09
humans are fallible, we take shortcuts,
erramos, temos fallos,
15:13
we err, we make mistakes,
non somos neutrais,
e de formas innumerables,
15:16
we're biased, and in innumerable ways,
15:20
we get the world wrong.
chegamos a ideas falsas sobre o mundo.
Eu creo que todas estas
son historias importantes,
15:24
I think these are all important stories,
e que teñen moito que contarnos
sobre qué significa ser humano,
15:27
and they have a lot to tell us
about what it means to be human,
pero gustaríame destacar
que hoxe contei unha historia moi distinta.
15:31
but I want you to note that today
I told you a very different story.
Unha historia sobre mentes,
non sobre cerebros,
15:35
It's a story about minds and not brains,
e en particular,
sobre o tipo de computación
15:39
and in particular, it's a story
about the kinds of computations
que só as mentes humanas poden realizar,
15:42
that uniquely human minds can perform,
que implican coñecementos ricos
e estruturados e capacidade de aprender
15:45
which involve rich, structured knowledge
and the ability to learn
a partir de pequenas cantidades de datos,
coa proba de só uns poucos exemplos.
15:49
from small amounts of data,
the evidence of just a few examples.
E fundamentalmente, é unha historia
sobre como dende meniños
15:56
And fundamentally, it's a story
about how starting as very small children
e continuando todo o camiño
ata os máis grandes logros
16:00
and continuing out all the way
to the greatest accomplishments
16:04
of our culture,
da nosa cultura,
conseguimos entender ben o mundo.
16:08
we get the world right.
16:12
Folks, human minds do not only learn
from small amounts of data.
Amigos, as mentes humanas non aprenden só
a partir de pequenas cantidades de datos
As mentes humanas pensan
ideas totalmente novas.
16:18
Human minds think
of altogether new ideas.
As mentes humanas xeran
investigación e descubrimento,
16:20
Human minds generate
research and discovery,
e as mentes humanas xeran
arte e literatura e poesía e teatro,
16:23
and human minds generate
art and literature and poetry and theater,
e as mentes humanas
coidan doutros seres humanos:
16:29
and human minds take care of other humans:
os nosos maiores, a nosa mocidade,
os nosos enfermos.
16:32
our old, our young, our sick.
Incluso os curamos.
16:36
We even heal them.
Nos próximos anos,
imos ver innovacións tecnolóxicas
16:39
In the years to come, we're going
to see technological innovations
máis alá do que podo concibir,
16:42
beyond anything I can even envision,
pero hai moi poucas probabilidades
16:46
but we are very unlikely
de que vexamos algo
que se aproxime sequera
16:48
to see anything even approximating
the computational power of a human child
ao poder computacional dun neno humano,
no resto da miña vida ou da vosa.
16:54
in my lifetime or in yours.
Se investimos nestes potentísimos
aprendices e no seu desenvolvemento,
16:58
If we invest in these most powerful
learners and their development,
en bebés e cativos,
17:03
in babies and children
e nais e pais
17:06
and mothers and fathers
e coidadores e profesores
17:08
and caregivers and teachers
do xeito que investimos nas nosas
outras poderosísimas e elegantes formas
17:11
the ways we invest in our other
most powerful and elegant forms
de tecnoloxía, enxeñaría e deseño,
17:15
of technology, engineering and design,
non estaremos simplemente
soñando cun mellor futuro,
17:18
we will not just be dreaming
of a better future,
estaremos planificándoo.
17:21
we will be planning for one.
Moitísimas grazas.
17:23
Thank you very much.
(Aplausos)
17:25
(Applause)
17:29
Chris Anderson: Laura, thank you.
I do actually have a question for you.
Chris Anderson: Grazas, Laura.
Quería facerche unha pregunta.
Antes de nada,
esta investigación é de tolos.
17:34
First of all, the research is insane.
Quen deseñaría
un experimento coma ese? (Risas)
17:36
I mean, who would design
an experiment like that? (Laughter)
Vino unhas cantas veces,
17:41
I've seen that a couple of times,
e sigo sen acabar de crer
que poida estar ocorrendo de verdade,
17:42
and I still don't honestly believe
that that can truly be happening,
pero outras persoas fixeron
experimentos similares; está comprobado.
17:46
but other people have done
similar experiments; it checks out.
17:49
The babies really are that genius.
Os bebés son realmente xenios.
17:50
LS: You know, they look really impressive
in our experiments,
LS: Parecen realmente impresionantes
nos nosos experimentos,
17:53
but think about what they
look like in real life, right?
pero pensa no que fan na vida real, non?
Todo comeza cun bebé.
17:56
It starts out as a baby.
Dezaoito meses despois, estache falando,
17:57
Eighteen months later,
it's talking to you,
e as primeiras palabras dos bebés
non van de pelotas e parrulos,
17:59
and babies' first words aren't just
things like balls and ducks,
son cousas como “non ta”
que se refire á desaparición,
18:02
they're things like "all gone,"
which refer to disappearance,
ou “uh oh”, para referirse
a accións involuntarias.
18:05
or "uh-oh," which refer
to unintentional actions.
Ten que ser así de poderoso.
18:07
It has to be that powerful.
Ten que ser moito máis poderoso
que o que ensinei.
18:09
It has to be much more powerful
than anything I showed you.
Están descifrando o mundo enteiro.
18:12
They're figuring out the entire world.
Un neno de catro anos
pode falarche sobre case todo.
18:14
A four-year-old can talk to you
about almost anything.
(Aplausos)
18:17
(Applause)
18:19
CA: And if I understand you right,
the other key point you're making is,
CA: E se entendo ben,
o outro punto clave que destacas é
que durante estes anos
tivemos todo este debate
18:22
we've been through these years
where there's all this talk
sobre o peculiares e confusas
que son as nosas mentes,
18:25
of how quirky and buggy our minds are,
18:27
that behavioral economics
and the whole theories behind that
coa economía condutual
e teorías enteiras detrás
18:29
that we're not rational agents.
de que non somos axentes racionais.
18:31
You're really saying that the bigger
story is how extraordinary,
E ti estás a dicir que este fenómeno
é extraordinario,
e que en realidade hai xenialidade
que está subestimada.
18:35
and there really is genius there
that is underappreciated.
18:40
LS: One of my favorite
quotes in psychology
Unha das miñas citas favoritas
en psicoloxía
18:42
comes from the social
psychologist Solomon Asch,
é do psicólogo social Solomon Asch,
18:45
and he said the fundamental task
of psychology is to remove
que dixo que
“o cometido fundamental da psicoloxía
é eliminar
o veo de autoevidencia das cousas”.
18:47
the veil of self-evidence from things.
Hai millóns de decisións
que se toman a diario
18:50
There are orders of magnitude
more decisions you make every day
que interpretan ben o mundo.
18:55
that get the world right.
Coñecemos os obxectos
e as súas propiedades.
18:56
You know about objects
and their properties.
Recoñecémolos cando están ocultos.
Recoñecémolos na escuridade.
18:58
You know them when they're occluded.
You know them in the dark.
Camiñamos por cuartos.
19:01
You can walk through rooms.
Podemos percibir o que pensan outros.
Podemos falarlles.
19:02
You can figure out what other people
are thinking. You can talk to them.
Podemos navegar no espazo.
Coñecemos os números.
19:06
You can navigate space.
You know about numbers.
Entendemos as relacións causais.
Entendemos o razoamento moral.
19:08
You know causal relationships.
You know about moral reasoning.
E todo isto sen esforzo ningún,
por iso non nos decatamos,
19:11
You do this effortlessly,
so we don't see it,
pero así interpretamos ben o mundo,
19:14
but that is how we get the world right,
and it's a remarkable
e moi difícil de entender.
19:16
and very difficult-to-understand
accomplishment.
CA: Imaxino que hai persoas no público
que comparten
19:19
CA: I suspect there are people
in the audience who have
esa visión do crecente poder tecnolóxico
19:21
this view of accelerating
technological power
que poderían cuestionar a túa afirmación
de que nunca nas nosas vidas
19:24
who might dispute your statement
that never in our lifetimes
un ordenador fará
o que un neno de tres anos pode facer,
19:27
will a computer do what
a three-year-old child can do,
pero está claro que en calquera situación,
19:29
but what's clear is that in any scenario,
as nosas máquinas teñen
moito que aprender dos nosos cativos.
19:32
our machines have so much to learn
from our toddlers.
19:38
LS: I think so. You'll have some
machine learning folks up here.
LS: Eu tamén o creo. Aquí haberá
partidarios da aprendizaxe automática.
19:41
I mean, you should never bet
against babies or chimpanzees
Nunca deberías apostar
contra os bebés ou os chimpancés
ou contra a tecnoloxía, na práctica.
19:45
or technology as a matter of practice,
19:49
but it's not just
a difference in quantity,
pero non se trata só
dunha diferenza de cantidade,
é unha diferenza cualitativa.
19:53
it's a difference in kind.
Temos ordenadores incriblemente potentes,
19:55
We have incredibly powerful computers,
que fan cousas incriblemente sofisticadas,
19:57
and they do do amazingly
sophisticated things,
por veces con enormes cantidades de datos.
20:00
often with very big amounts of data.
As mentes humanas fan, para min,
algo bastante diferente,
20:03
Human minds do, I think,
something quite different,
e creo que é a natureza estruturada
e xerarquizada do coñecemento humano
20:05
and I think it's the structured,
hierarchical nature of human knowledge
20:09
that remains a real challenge.
o que permanece como
un verdadeiro desafío.
20:11
CA: Laura Schulz, wonderful
food for thought. Thank you so much.
CA: Laura Schulz, un gran tema
para reflexionar. Moitas grazas.
20:14
LS: Thank you.
(Applause)
Grazas
(Aplausos)


About the speaker:

Laura Schulz - Cognitive scientist
Developmental behavior studies spearheaded by Laura Schulz are changing our notions of how children learn.

Why you should listen

MIT Early Childhood Cognition Lab lead investigator Laura Schulz studies learning in early childhood. Her research bridges computational models of cognitive development and behavioral studies in order to understand the origins of inquiry and discovery.

Working in play labs, children’s museums, and a recently launched citizen science website, Schulz is reshaping how we view young children’s perceptions of the world around them. Some of the surprising results of her research: before the age of four, children expect hidden causes when events happen probabilistically, use simple experiments to distinguish causal hypotheses, and trade off learning from instruction and exploration.
