TED2015

Laura Schulz: The surprisingly logical minds of babies

Laura Schulz: Iznenađujuće logični umovi beba

Views 1,632,838

How do babies learn so much so quickly from so little data? In this fun, experiment-filled talk, cognitive scientist Laura Schulz shows how our youngest make decisions with a surprisingly powerful sense of logic, long before they can talk.

Laura Schulz - Cognitive scientist
Developmental behavior studies spearheaded by Laura Schulz are changing our notions of how children learn.

Mark Twain summed up
what I take to be
Mark Twain sažeo je nešto što je za mene
00:12
one of the fundamental problems
of cognitive science
jedan od temeljnih
problema kognitivne znanosti
00:14
with a single witticism.
u jednoj duhovitoj rečenici.
00:18
He said, "There's something
fascinating about science.
Rekao je: "Ima nešto
fascinantno u znanosti.
00:20
One gets such wholesale
returns of conjecture
Dobivamo ogromnu dobit
u obliku pretpostavki
00:23
out of such a trifling
investment in fact."
skromnim ulaganjem u činjenice."
00:26
(Laughter)
(Smijeh)
00:29
Twain meant it as a joke,
of course, but he's right:
Twain je to zamislio
kao šalu, ali bio je u pravu:
00:32
There's something
fascinating about science.
Ima nešto fascinantno u znanosti.
00:34
From a few bones, we infer
the existence of dinosaurs.
Na temelju nekoliko kostiju,
zaključujemo o postojanju dinosaura.
00:37
From spectral lines,
the composition of nebulae.
Iz spektralnih linija zaključujemo
o sastavu svemirskih nebula.
00:42
From fruit flies,
Od vinskih mušica
00:47
the mechanisms of heredity,
mehanizme nasljeđivanja,
00:50
and from reconstructed images
of blood flowing through the brain,
a iz rekonstruiranih slika
protoka krvi kroz mozak,
00:53
or in my case, from the behavior
of very young children,
ili, u mom slučaju,
iz ponašanja vrlo male djece,
00:57
we try to say something about
the fundamental mechanisms
pokušavamo reći nešto
o osnovnim mehanizmima
01:02
of human cognition.
ljudske spoznaje.
01:05
In particular, in my lab in the Department
of Brain and Cognitive Sciences at MIT,
U svojem laboratoriju na odsjeku
za mozak i kognitivnu znanost na MIT-u
01:07
I have spent the past decade
trying to understand the mystery
provela sam proteklo desetljeće
pokušavajući razumjeti misterij
01:12
of how children learn so much
from so little so quickly.
o tome kako djeca nauče toliko puno,
tako brzo, iz tako malo podataka.
01:16
Because, it turns out that
the fascinating thing about science
Naime, ispada da je ono
što je fascinantno kod znanosti,
01:20
is also a fascinating
thing about children,
fascinantno i kod male djece,
01:23
which, to put a gentler
spin on Mark Twain,
a to je, ako ublažimo izjavu Marka Twaina,
01:27
is precisely their ability
to draw rich, abstract inferences
upravo njihova sposobnost
da izvlače bogate, apstraktne zaključke
01:29
rapidly and accurately
from sparse, noisy data.
brzo i točno, iz malobrojnih
i nepreciznih podataka.
01:34
I'm going to give you
just two examples today.
Danas ću vam dati samo dva primjera.
01:40
One is about a problem of generalization,
Jedan se tiče problema generalizacije,
01:42
and the other is about a problem
of causal reasoning.
a drugi problema
uzročnog zaključivanja.
01:45
And although I'm going to talk
about work in my lab,
Iako ću govoriti o radu
mog laboratorija,
01:47
this work is inspired by
and indebted to a field.
taj je rad inspiriran i dužan čitavom
jednom polju istraživanja.
01:50
I'm grateful to mentors, colleagues,
and collaborators around the world.
Zahvalna sam svojim mentorima,
kolegama i suradnicima diljem svijeta.
01:53
Let me start with the problem
of generalization.
Počet ću od problema generalizacije.
01:59
Generalizing from small samples of data
is the bread and butter of science.
Generaliziranje na temelju malih uzoraka
osnovno je sredstvo znanstvenog rada.
02:02
We poll a tiny fraction of the electorate
Ispitamo maleni dio biračkog tijela
02:06
and we predict the outcome
of national elections.
i predviđamo ishod državnih izbora.
02:09
We see how a handful of patients
responds to treatment in a clinical trial,
Vidimo kako šačica pacijenata reagira
na tretman u kliničkom ispitivanju
02:12
and we bring drugs to a national market.
i te lijekove dovedemo
na nacionalno tržište.
02:16
But this only works if our sample
is randomly drawn from the population.
To je učinkovito jedino ako je naš uzorak
iz populacije izabran slučajem.
02:19
If our sample is cherry-picked
in some way --
Ako je naš uzorak na neki način probran --
02:23
say, we poll only urban voters,
npr. ispitivanjem samo glasača iz gradova
02:26
or say, in our clinical trials
for treatments for heart disease,
ili ako u klinička istraživanja
o liječenju srčanih bolesti
02:28
we include only men --
uključimo samo muškarce --
02:32
the results may not generalize
to the broader population.
rezultati se možda neće moći
primijeniti na širu populaciju.
02:34
So scientists care whether evidence
is randomly sampled or not,
Dakle, znanstvenicima je važno
temelje li se dokazi na slučajnom uzorku,
02:38
but what does that have to do with babies?
ali kakve to veze ima s bebama?
02:42
Well, babies have to generalize
from small samples of data all the time.
Bebe stalno moraju generalizirati
na temelju malih uzoraka podataka.
02:44
They see a few rubber ducks
and learn that they float,
Vide nekoliko gumenih patkica
te zaključe da one plutaju
02:49
or a few balls and learn that they bounce.
ili vide nekoliko lopti
i zaključe da one odskakuju.
02:52
And they develop expectations
about ducks and balls
Razvijaju očekivanja o patkicama i loptama
02:55
that they're going to extend
to rubber ducks and balls
koja će onda primjenjivati
na gumene patkice i lopte
02:58
for the rest of their lives.
do kraja svojih života.
03:01
And the kinds of generalizations
babies have to make about ducks and balls
Takve generalizacije kakve bebe moraju
činiti o patkicama i loptama,
03:03
they have to make about almost everything:
moraju činiti o gotovo svemu:
03:07
shoes and ships and sealing wax
and cabbages and kings.
o cipelama, brodovima,
pečatnom vosku, kupusima i kraljevima.
03:09
So do babies care whether
the tiny bit of evidence they see
Je li bebama važno je li
maleni vidljivi uzorak dokaza
03:14
is plausibly representative
of a larger population?
reprezentativan za veću populaciju?
03:17
Let's find out.
Saznajmo!
03:21
I'm going to show you two movies,
Pokazat ću vam dva filmića,
03:23
one from each of two conditions
of an experiment,
po jedan iz svakog
od uvjeta u eksperimentu,
03:25
and because you're going to see
just two movies,
a budući da ćete vidjeti samo dva filmića,
03:27
you're going to see just two babies,
vidjet ćete i samo dvije bebe,
03:30
and any two babies differ from each other
in innumerable ways.
a sve se bebe međusobno
razlikuju na nebrojene načine,
03:32
But these babies, of course,
here stand in for groups of babies,
ali ove bebe, dakako,
predstavljaju skupine beba,
03:36
and the differences you're going to see
a razlike koje ćete vidjeti
03:39
represent average group differences
in babies' behavior across conditions.
predstavljaju prosječne razlike među
grupama beba u različitim uvjetima.
03:41
In each movie, you're going to see
a baby doing maybe
U svakom filmiću
vidjet ćete bebu koja radi
03:47
just exactly what you might
expect a baby to do,
možda upravo ono što se
od bebe očekuje da radi,
03:49
and we can hardly make babies
more magical than they already are.
a njih teško možemo učiniti
čarobnijima nego što one to već jesu.
03:53
But to my mind the magical thing,
Ali meni je čarobno to,
03:58
and what I want you to pay attention to,
i na što želim da obratite pažnju,
04:00
is the contrast between
these two conditions,
kontrast između ovih dvaju
eksperimentalnih uvjeta
04:02
because the only thing
that differs between these two movies
jer se ova dva filmića razlikuju samo po
04:05
is the statistical evidence
the babies are going to observe.
statističkim dokazima
koje će bebe promatrati.
04:08
We're going to show babies
a box of blue and yellow balls,
Pokazat ćemo im
kutiju plavih i žutih loptica.
04:13
and my then-graduate student,
now colleague at Stanford, Hyowon Gweon,
Moja tadašnja doktorandica, a sada
kolegica sa Stanforda Hyowon Gweon,
04:16
is going to pull three blue balls
in a row out of this box,
zaredom će iz ove kutije
izvući tri plave loptice.
04:21
and when she pulls those balls out,
she's going to squeeze them,
Kad izvuče loptice, stisnut će ih,
04:24
and the balls are going to squeak.
a one će zasvirati.
04:27
And if you're a baby,
that's like a TED Talk.
Za bebu je to poput TED-predavanja.
04:29
It doesn't get better than that.
Nema boljega!
04:32
(Laughter)
(Smijeh)
04:34
But the important point is it's really
easy to pull three blue balls in a row
Ono što je važno jest da je lako
izvući tri plave loptice zaredom
04:38
out of a box of mostly blue balls.
iz kutije u kojoj su
uglavnom plave loptice.
04:42
You could do that with your eyes closed.
Možete to učiniti i zatvorenih očiju.
04:44
It's plausibly a random sample
from this population.
To može predstavljati
slučajni uzorak iz ove populacije.
04:46
And if you can reach into a box at random
and pull out things that squeak,
A ako možete slučajnim izborom
iz kutije izvući stvari koje sviraju,
04:49
then maybe everything in the box squeaks.
onda možda sve što je u kutiji svira,
04:53
So maybe babies should expect
those yellow balls to squeak as well.
pa bi bebe možda mogle očekivati
da će i žute loptice svirati.
04:56
Now, those yellow balls
have funny sticks on the end,
Žute loptice imaju
smiješne štapiće na krajevima,
05:00
so babies could do other things
with them if they wanted to.
tako da bi bebe ako žele mogle
s njima raditi i druge stvari.
05:02
They could pound them or whack them.
Mogle bi ih udarati ili lupati.
05:05
But let's see what the baby does.
Ali idemo vidjeti što beba radi.
05:07
(Video) Hyowon Gweon: See this?
(Ball squeaks)
Hyowon Gweon: Vidiš ovo?
(Loptica svira)
05:12
Did you see that?
(Ball squeaks)
Jesi li to vidjela?
(Loptica svira)
05:16
Cool.
Fora.
05:20
See this one?
Vidiš ovu?
05:24
(Ball squeaks)
(Loptica svira)
05:26
Wow.
Opa.
05:28
Laura Schulz: Told you. (Laughs)
Laura Schulz: Rekla sam vam. (Smijeh)
05:33
(Video) HG: See this one?
(Ball squeaks)
HG: Vidiš ovu?
(Loptica svira)
05:35
Hey Clara, this one's for you.
You can go ahead and play.
Hej, Clara, ova je za tebe.
Možeš se igrati njome.
05:39
(Laughter)
(Smijeh)
05:51
LS: I don't even have to talk, right?
LS: Ni ne moram govoriti, zar ne?
05:56
All right, it's nice that babies
will generalize properties
U redu, zgodno je da
bebe generaliziraju osobine
05:59
of blue balls to yellow balls,
plavih loptica na žute loptice
06:02
and it's impressive that babies
can learn from imitating us,
i dojmljivo je da mogu
učiti imitirajući nas,
06:03
but we've known those things about babies
for a very long time.
ali to o bebama već dugo znamo.
06:06
The really interesting question
Ono što je zaista zanimljivo proučiti
06:10
is what happens when we show babies
exactly the same thing,
jest to što se dogodi kad bebi
pokažemo potpuno istu stvar,
06:12
and we can ensure it's exactly the same
because we have a secret compartment
a sigurni smo da je potpuno ista
jer imamo tajni pretinac
06:15
and we actually pull the balls from there,
iz kojeg vučemo loptice,
06:18
but this time, all we change
is the apparent population
ali ovog puta promijenimo
jedino prividnu populaciju
06:20
from which that evidence was drawn.
iz koje se vuku dokazi.
06:24
This time, we're going to show babies
three blue balls
Ovog puta bebi ćemo
pokazati tri plave loptice
06:27
pulled out of a box
of mostly yellow balls,
izvučene iz kutije u kojoj
je većina žutih loptica,
06:30
and guess what?
i znate što?
06:34
You [probably won't] randomly draw
three blue balls in a row
Vjerojatno nećete slučajno izvući
tri plave loptice zaredom
06:35
out of a box of mostly yellow balls.
iz kutije u kojoj je većina žutih loptica.
06:38
That is not plausibly
randomly sampled evidence.
To nije uvjerljiv slučajan uzorak.
06:40
That evidence suggests that maybe Hyowon
was deliberately sampling the blue balls.
Ti dokazi upućuju na to da je Hyowon
možda namjerno uzorkovala plave loptice.
06:44
Maybe there's something special
about the blue balls.
Možda ima nešto posebno
u tim plavim lopticama.
06:49
Maybe only the blue balls squeak.
Možda samo plave loptice sviraju.
06:52
Let's see what the baby does.
Pogledajmo što beba radi.
06:55
(Video) HG: See this?
(Ball squeaks)
HG: Vidiš ovo?
(Loptica svira)
06:57
See this toy?
(Ball squeaks)
Vidiš ovu igračku?
(Loptica svira)
07:02
Oh, that was cool. See?
(Ball squeaks)
O, to je bilo fora! Vidiš?
(Loptica svira)
07:05
Now this one's for you to play.
You can go ahead and play.
A ova je tebi za igru.
Možeš se sada poigrati.
07:10
(Fussing)
(Laughter)
(Smijeh)
07:18
LS: So you just saw
two 15-month-old babies
LS: Dakle, vidjeli ste dvije
petnaestomjesečne bebe
07:26
do entirely different things
koje su se potpuno drugačije ponašale
07:29
based only on the probability
of the sample they observed.
samo na temelju vjerojatnosti
promatranog uzorka.
07:31
Let me show you the experimental results.
Pokazat ću vam rezultate eksperimenta.
07:35
On the vertical axis, you'll see
the percentage of babies
Na okomitoj osi vidite postotak beba
07:37
who squeezed the ball in each condition,
koje su stisnule lopticu
u svakom slučaju,
07:40
and as you'll see, babies are much
more likely to generalize the evidence
i kao što možete vidjeti, bebe su
sklonije generalizirati dokaze
07:42
when it's plausibly representative
of the population
koji uvjerljivo predstavljaju populaciju
07:46
than when the evidence
is clearly cherry-picked.
nego one koji su očito probrani.
07:49
And this leads to a fun prediction:
To dovodi do zanimljive pretpostavke:
07:53
Suppose you pulled just one blue ball
out of the mostly yellow box.
zamislite da izvučemo samo jednu plavu
lopticu iz kutije u kojoj je većina žutih.
07:55
You [probably won't] pull three blue balls
in a row at random out of a yellow box,
Vjerojatno nećete iz takve kutije
slučajno zaredom izvući tri plave loptice,
08:00
but you could randomly sample
just one blue ball.
ali mogli biste slučajno
izvući samo jednu plavu.
08:04
That's not an improbable sample.
To nije nevjerojatan uzorak.
08:07
And if you could reach into
a box at random
Kad biste mogli slučajno iz kutije izvući
08:09
and pull out something that squeaks,
maybe everything in the box squeaks.
nešto što svira, možda sve u njoj svira.
08:11
So even though babies are going to see
much less evidence for squeaking,
Iako bebe vide puno manje
dokaza u korist sviranja,
08:15
and have many fewer actions to imitate
i imaju manje postupaka
koje mogu imitirati,
08:20
in this one ball condition than in
the condition you just saw,
u slučaju s jednom lopticom nego
u slučaju koji ste sada vidjeli,
08:22
we predicted that babies themselves
would squeeze more,
predvidjeli smo
da će same bebe češće stiskati,
08:25
and that's exactly what we found.
a upravo se to i dogodilo.
08:29
So 15-month-old babies,
in this respect, like scientists,
U ovom slučaju petnaestomjesečne
bebe, kao i znanstvenike,
08:32
care whether evidence
is randomly sampled or not,
zanima jesu li dokazi
slučajno izabrani ili nisu
08:37
and they use this to develop
expectations about the world:
i to koriste kako bi stvorile
očekivanja o svijetu:
08:40
what squeaks and what doesn't,
o tome što svira, a što ne,
08:43
what to explore and what to ignore.
što istraživati, a što ignorirati.
08:45
Let me show you another example now,
Sada ću vam pokazati još jedan primjer,
08:50
this time about a problem
of causal reasoning.
ovog puta o problemu
uzročnog zaključivanja,
08:52
And it starts with a problem
of confounded evidence
a počinje s problemom zbunjujućih dokaza
08:55
that all of us have,
s kojim se svi suočavamo,
08:57
which is that we are part of the world.
a to je da smo dio svijeta.
08:59
And this might not seem like a problem
to you, but like most problems,
Vama se ovo možda ne čini
kao problem, ali kao i većina problema,
09:01
it's only a problem when things go wrong.
problematično je samo
kad stvari pođu po zlu.
09:04
Take this baby, for instance.
Uzmimo npr. ovu bebu.
09:07
Things are going wrong for him.
Njemu stvari polaze po zlu.
09:09
He would like to make
this toy go, and he can't.
Htio bi pokrenuti
ovu igračku, ali ne može.
09:10
I'll show you a few-second clip.
Pokazat ću vam filmić od nekoliko sekundi.
09:13
And there's two possibilities, broadly:
Općenito gledajući,
imamo dvije opcije.
09:21
Maybe he's doing something wrong,
Možda on nešto krivo radi
09:23
or maybe there's something
wrong with the toy.
ili možda nešto nije u redu s igračkom.
09:25
So in this next experiment,
U sljedećem eksperimentu
09:30
we're going to give babies
just a tiny bit of statistical data
bebama ćemo dati samo
mrvicu statističkih podataka
09:32
supporting one hypothesis over the other,
u prilog jednoj hipotezi nasuprot drugoj
09:35
and we're going to see if babies
can use that to make different decisions
i vidjet ćemo mogu li
na temelju toga donositi različite odluke
09:38
about what to do.
o tome što trebaju učiniti.
09:41
Here's the setup.
Evo kako smo to postavili.
09:43
Hyowon is going to try to make
the toy go and succeed.
Hyowon će pokušati pokrenuti
igračku i u tome će i uspjeti.
09:46
I am then going to try twice
and fail both times,
Ja ću onda dvaput pokušati
i oba ću puta doživjeti neuspjeh,
09:49
and then Hyowon is going
to try again and succeed,
a onda će Hyowon
ponovno pokušati i uspjeti.
09:52
and this roughly sums up my relationship
to my graduate students
To ukratko opisuje i moj odnos
sa svim mojim doktorandima
09:55
in technology across the board.
iz tehnologije.
09:58
But the important point here is
it provides a little bit of evidence
Ali ono što je važno jest
da dajemo mrvicu dokaza
10:02
that the problem isn't with the toy,
it's with the person.
da problem nije u igrački, nego u osobi.
10:05
Some people can make this toy go,
Neki ljudi mogu pokrenuti ovu igračku,
10:08
and some can't.
a drugi ne.
10:11
Now, when the baby gets the toy,
he's going to have a choice.
Kad beba dobije igračku, moći će birati.
10:12
His mom is right there,
Tamo mu je mama,
10:16
so he can go ahead and hand off the toy
and change the person,
pa može proslijediti igračku
i promijeniti osobu,
10:18
but there's also going to be
another toy at the end of that cloth,
ali malo dalje bit će i druga igračka
10:21
and he can pull the cloth towards him
and change the toy.
za kojom može posegnuti
te promijeniti igračku.
10:24
So let's see what the baby does.
Pogledajmo što beba radi.
10:28
(Video) HG: Two, three. Go!
(Music)
HG: Dva, tri. Kreni!
(Glazba)
10:30
LS: One, two, three, go!
LS: Jedan, dva, tri, kreni!
10:34
Arthur, I'm going to try again.
One, two, three, go!
Arthure, probat ću ponovo.
Jedan, dva, tri, kreni!
10:37
HG: Arthur, let me try again, okay?
HG: Arthure, daj da ja
opet probam, može?
10:45
One, two, three, go!
(Music)
Jedan, dva, tri, kreni!
(Glazba)
10:48
Look at that. Remember these toys?
Vidi ovo. Sjećaš se ovih igračaka?
10:53
See these toys? Yeah, I'm going
to put this one over here,
Vidiš ove igračke?
Da, ovu ću staviti ovamo,
10:55
and I'm going to give this one to you.
a ovu ću dati tebi.
10:58
You can go ahead and play.
Sada se možeš igrati.
11:00
LS: Okay, Laura, but of course,
babies love their mommies.
LS: U redu, Laura,
ali bebe vole svoje mame.
11:23
Of course babies give toys
to their mommies
Naravno da bebe daju igračke svojim mamama
11:27
when they can't make them work.
kad ih ne mogu pokrenuti.
11:30
So again, the really important question
is what happens when we change
Ponovno, ono što je zbilja
važno jest što se događa
11:32
the statistical data ever so slightly.
kad neznatno izmijenimo
statističke podatke.
11:35
This time, babies are going to see the toy
work and fail in exactly the same order,
Ovog će puta bebe vidjeti da igračka
radi i ne radi potpuno istim redoslijedom,
11:38
but we're changing
the distribution of evidence.
ali promijenit ćemo raspored dokaza.
11:42
This time, Hyowon is going to succeed
once and fail once, and so am I.
Ovog će puta Hyowon jednom
uspjeti i jednom ne, kao i ja,
11:45
And this suggests it doesn't matter
who tries this toy, the toy is broken.
što ukazuje na to da bez obzira na to
tko pokuša, igračka je pokvarena.
11:49
It doesn't work all the time.
Ne radi stalno.
11:55
Again, the baby's going to have a choice.
Beba će ponovno moći birati.
11:57
Her mom is right next to her,
so she can change the person,
Mama je odmah pokraj nje,
pa može promijeniti osobu,
11:59
and there's going to be another toy
at the end of the cloth.
a na kraju krpe bit će još jedna igračka.
12:02
Let's watch what she does.
Pogledajmo što će učiniti.
12:05
(Video) HG: Two, three, go!
(Music)
HG: Dva, tri, kreni!
(Glazba)
12:07
Let me try one more time.
One, two, three, go!
Daj da još jednom probam.
Jedan, dva, tri, kreni!
12:11
Hmm.
Hmmm.
12:17
LS: Let me try, Clara.
LS: Daj da ja probam, Clara.
12:19
One, two, three, go!
Jedan, dva, tri, kreni!
12:22
Hmm, let me try again.
Hmm, daj da opet probam.
12:27
One, two, three, go!
(Music)
Jedan, dva, tri, kreni!
(Glazba)
12:29
HG: I'm going
to put this one over here,
HG: Ovu ću staviti ovamo,
12:35
and I'm going to give this one to you.
a ovu ću dati tebi.
12:37
You can go ahead and play.
Možeš se igrati.
12:39
(Applause)
(Pljesak)
12:58
LS: Let me show you
the experimental results.
LS: Pokazat ću vam rezultate eksperimenta.
13:04
On the vertical axis,
you'll see the distribution
Na okomitoj osi vidite raspodjelu
13:07
of children's choices in each condition,
dječjih izbora u svakom od slučajeva
13:09
and you'll see that the distribution
of the choices children make
i kao što vidite,
raspodjela dječjih odluka
13:12
depends on the evidence they observe.
ovisi o dokazima koje promatraju.
13:16
So in the second year of life,
U drugoj godini života
13:19
babies can use a tiny bit
of statistical data
bebe mogu koristiti
djelić statističkih podataka
13:21
to decide between two
fundamentally different strategies
kako bi odlučile između dviju
temeljno različitih strategija
13:24
for acting in the world:
pristupa svijetu:
13:27
asking for help and exploring.
traženje pomoći i istraživanje.
13:29
I've just shown you
two laboratory experiments
Upravo sam vam pokazala
dva laboratorijska eksperimenta
13:33
out of literally hundreds in the field
that make similar points,
među stotinama eksperimenata na ovom
području koji ukazuju na slične zaključke
13:37
because the really critical point
jer ono što je zaista ključno
13:40
is that children's ability
to make rich inferences from sparse data
jest da dječja sposobnost da donose
bogate zaključke iz malo podataka
13:43
underlies all the species-specific
cultural learning that we do.
temelj je svakog kulturološkog učenja
specifičnog za našu vrstu.
13:48
Children learn about new tools
from just a few examples.
Djeca uče o novim alatima
iz tek nekoliko primjera.
13:53
They learn new causal relationships
from just a few examples.
Uče nove kauzalne odnose
iz tek nekoliko primjera.
13:58
They even learn new words,
in this case in American Sign Language.
Uče čak i nove riječi, u ovom
slučaju na američkom znakovnom jeziku.
14:03
I want to close with just two points.
Želim završiti s dva zaključka.
14:08
If you've been following my world,
the field of brain and cognitive sciences,
Ako ste pratili područje
mozga i kognitivnih znanosti
14:12
for the past few years,
zadnjih nekoliko godina,
14:15
three big ideas will have come
to your attention.
primijetili ste tri velike ideje.
14:17
The first is that this is
the era of the brain.
Prva - ovo je era mozga.
14:20
And indeed, there have been
staggering discoveries in neuroscience:
U neuroznanosti se uistinu
dogodio niz zapanjujućih otkrića:
14:23
localizing functionally specialized
regions of cortex,
lokalizacija funkcionalno
specijaliziranih regija korteksa,
14:27
turning mouse brains transparent,
transparentnost mišjih mozgova,
14:30
activating neurons with light.
aktiviranje neurona pomoću svjetlosti.
14:33
A second big idea
Druga velika ideja
14:36
is that this is the era of big data
and machine learning,
jest da je ovo era velike
količine podataka i strojnog učenja,
14:38
and machine learning promises
to revolutionize our understanding
a strojno učenje obećava
revoluciju našeg razumijevanja
14:43
of everything from social networks
to epidemiology.
svega od društvenih
mreža do epidemiologije.
14:46
And maybe, as it tackles problems
of scene understanding
Budući da se bavi problemima
razumijevanja scene
14:50
and natural language processing,
i obrade prirodnog jezika,
14:53
to tell us something
about human cognition.
možda nam može nešto
reći o ljudskoj spoznaji.
14:55
And the final big idea you'll have heard
Posljednja velika ideja
14:59
is that maybe it's a good idea we're going
to know so much about brains
jest da je možda dobra stvar
to što ćemo znati toliko o mozgovima
15:01
and have so much access to big data,
i imati pristup tolikoj bazi podataka
15:05
because left to our own devices,
jer kad su prepušteni sami sebi,
15:06
humans are fallible, we take shortcuts,
ljudi su pogrešivi, koriste prečace,
15:09
we err, we make mistakes,
griješe, čine pogreške,
15:13
we're biased, and in innumerable ways,
pristrani su i na nebrojene načine
15:16
we get the world wrong.
pogrešno shvaćaju svijet.
15:20
I think these are all important stories,
Mislim da su sve ovo važne priče
15:24
and they have a lot to tell us
about what it means to be human,
i mogu nam puno toga reći
o tome što to znači biti čovjek,
15:27
but I want you to note that today
I told you a very different story.
ali želim da primijetite da sam vam
danas ispričala posve drugačiju priču,
15:31
It's a story about minds and not brains,
priču o umovima, a ne o mozgovima.
15:35
and in particular, it's a story
about the kinds of computations
Točnije, to je priča o vrstama računanja
15:39
that uniquely human minds can perform,
koje samo ljudski umovi mogu izvršiti,
15:42
which involve rich, structured knowledge
and the ability to learn
a koji uključuju bogato
strukturirano znanje i sposobnost učenja
15:45
from small amounts of data,
the evidence of just a few examples.
na temelju vrlo malo podataka,
dokaza od svega nekoliko primjera.
15:49
And fundamentally, it's a story
about how starting as very small children
To je u osnovi priča o tome kako se
započevši svoj put kao sasvim mala djeca
15:56
and continuing out all the way
to the greatest accomplishments
razvijamo do najvećih postignuća
16:00
of our culture,
naše kulture
16:04
we get the world right.
i svijet shvaćamo ispravno.
16:08
Folks, human minds do not only learn
from small amounts of data.
Ljudi, naši umovi ne uče
samo iz malih količina podataka.
16:12
Human minds think
of altogether new ideas.
Ljudski umovi stvaraju sasvim nove ideje,
16:18
Human minds generate
research and discovery,
pokreću istraživanja i otkrića,
16:20
and human minds generate
art and literature and poetry and theater,
stvaraju umjetnost, književnost,
poeziju i kazalište
16:23
and human minds take care of other humans:
te brinu za druge ljude:
16:29
our old, our young, our sick.
naše stare, naše mlade, naše bolesne.
16:32
We even heal them.
Liječimo ih.
16:36
In the years to come, we're going
to see technological innovations
U predstojećim godinama svjedočit ćemo
tehnološkim inovacijama
16:39
beyond anything I can even envision,
iznad svih mojih očekivanja,
16:42
but we are very unlikely
ali teško da ćemo
16:46
to see anything even approximating
the computational power of a human child
svjedočiti ičemu što je
približno djetetovoj moći računanja
16:48
in my lifetime or in yours.
za mog ili vašeg života.
16:54
If we invest in these most powerful
learners and their development,
Uložimo li u ove najmoćnije
učenike i u njihov razvoj,
16:58
in babies and children
u bebe i djecu,
17:03
and mothers and fathers
majke i očeve,
17:06
and caregivers and teachers
skrbnike i nastavnike
17:08
the ways we invest in our other
most powerful and elegant forms
onoliko koliko ulažemo u druge
najmoćnije i najelegantnije oblike
17:11
of technology, engineering and design,
tehnologije, inženjeringa i dizajna,
17:15
we will not just be dreaming
of a better future,
nećemo samo sanjati o boljoj budućnosti
17:18
we will be planning for one.
već ćemo je i planirati.
17:21
Thank you very much.
Puno vam hvala.
17:23
(Applause)
(Pljesak)
17:25
Chris Anderson: Laura, thank you.
I do actually have a question for you.
Chris Anderson: Laura,
hvala ti, ali imam pitanje za tebe.
17:29
First of all, the research is insane.
Kao prvo, istraživanje je preludo.
17:34
I mean, who would design
an experiment like that? (Laughter)
Mislim, tko bi uopće osmislio
takav eksperiment? (Smijeh)
17:36
I've seen that a couple of times,
Nekoliko sam to puta vidio,
17:41
and I still don't honestly believe
that that can truly be happening,
i još uvijek ne vjerujem
da se to stvarno događa,
17:42
but other people have done
similar experiments; it checks out.
ali drugi su ljudi provodili
slične eksperimente - drži vodu.
17:46
The babies really are that genius.
Bebe uistinu jesu geniji.
17:49
LS: You know, they look really impressive
in our experiments,
LS: U našim eksperimentima
doista izgledaju impresivno,
17:50
but think about what they
look like in real life, right?
ali razmisli o tome kako
izgledaju u stvarnom životu.
17:53
It starts out as a baby.
Počinje kao beba,
17:56
Eighteen months later,
it's talking to you,
osamnaest mjeseci kasnije priča s vama,
17:57
and babies' first words aren't just
things like balls and ducks,
a njihove prve riječi nisu samo
stvari poput loptica i patkica
17:59
they're things like "all gone,"
which refer to disappearance,
već i stvari poput "nema više",
koji se odnosi na nestajanje.
18:02
or "uh-oh," which refer
to unintentional actions.
Ili "uh-oh" koji se odnosi
na nenamjerne radnje.
18:05
It has to be that powerful.
Mora biti toliko moćno.
18:07
It has to be much more powerful
than anything I showed you.
Mora biti moćnije od svega
što sam vam pokazala.
18:09
They're figuring out the entire world.
Pokušavaju shvatiti cijeli svijet.
18:12
A four-year-old can talk to you
about almost anything.
Četverogodišnjak s vama može
razgovarati gotovo o bilo čemu.
18:14
(Applause)
(Pljesak)
18:17
CA: And if I understand you right,
the other key point you're making is,
CA: A ako sam vas dobro razumio,
vaš drugi ključni argument jest
18:19
we've been through these years
where there's all this talk
da se godinama priča o tome
18:22
of how quirky and buggy our minds are,
koliko su naši umovi čudni i grešni,
18:25
that behavioral economics
and the whole theories behind that
da bihevioralna ekonomija
i teorije o tome govore
18:27
that we're not rational agents.
da nismo racionalni agenti.
18:29
You're really saying that the bigger
story is how extraordinary,
Vi zapravo govorite o većoj priči
o tome koliko smo nevjerojatni
18:31
and there really is genius there
that is underappreciated.
i da se tu negdje zaista
krije podcijenjeni genij.
18:35
LS: One of my favorite
quotes in psychology
LS: Jedan od najdražih
mi psiholoških citata
18:40
comes from the social
psychologist Solomon Asch,
dolazi od socijalnog psihologa
Solomona Ascha
18:42
and he said the fundamental task
of psychology is to remove
koji je rekao da je temeljna
zadaća psihologije ukloniti
18:45
the veil of self-evidence from things.
veo očiglednosti sa stvari.
18:47
There are orders of magnitude
more decisions you make every day
Svakog dana donosite neusporedivo
više odluka
18:50
that get the world right.
kojima svijet shvaćate ispravno.
18:55
You know about objects
and their properties.
Znate o objektima i njihovim svojstvima.
18:56
You know them when they're occluded.
You know them in the dark.
Prepoznajete ih kad su skriveni,
prepoznajete ih u mraku.
18:58
You can walk through rooms.
Prolazite sobama.
19:01
You can figure out what other people
are thinking. You can talk to them.
Možete shvatiti što drugi ljudi
misle, možete razgovarati s njima.
19:02
You can navigate space.
You know about numbers.
Možete se kretati prostorom,
znate za brojeve.
19:06
You know causal relationships.
You know about moral reasoning.
Znate o uzročnim vezama,
znate o moralnom zaključivanju.
19:08
You do this effortlessly,
so we don't see it,
To činite bez napora,
pa to ni ne primjećujemo,
19:11
but that is how we get the world right,
and it's a remarkable
ali tako ispravno shvaćamo
svijet i to je nevjerojatno
19:14
and very difficult-to-understand
accomplishment.
i teško razumljivo postignuće.
19:16
CA: I suspect there are people
in the audience who have
CA: Vjerujem da u publici ima ljudi
19:19
this view of accelerating
technological power
koji vjeruju u ubrzanje tehnološke moći
19:21
who might dispute your statement
that never in our lifetimes
i koji bi mogli osporavati vašu
izjavu da nikad za naših života
19:24
will a computer do what
a three-year-old child can do,
računalo neće moći raditi
ono što trogodišnjak može,
19:27
but what's clear is that in any scenario,
ali jasno je da u bilo kojem scenariju
19:29
our machines have so much to learn
from our toddlers.
naši strojevi mogu puno toga
naučiti od naše male djece.
19:32
LS: I think so. You'll have some
machine learning folks up here.
LS: Mislim da da. Ovdje ćete imati i
ljude koji se bave strojnim učenjem,
19:38
I mean, you should never bet
against babies or chimpanzees
ali nikad se ne biste trebali kladiti
protiv djece ili čimpanzi
19:41
or technology as a matter of practice,
ili tehnologije općenito,
19:45
but it's not just
a difference in quantity,
ali ne radi se tu samo
o razlici u kvantiteti,
19:49
it's a difference in kind.
postoji i razlika u vrsti.
19:53
We have incredibly powerful computers,
Imamo nevjerojatno moćna računala
19:55
and they do do amazingly
sophisticated things,
koja rade nevjerojatno
sofisticirane stvari,
19:57
often with very big amounts of data.
često s velikim količinama podataka.
20:00
Human minds do, I think,
something quite different,
Ljudski umovi čine nešto sasvim drugačije
20:03
and I think it's the structured,
hierarchical nature of human knowledge
i mislim da strukturirana,
hijerarhijska narav ljudskog znanja
20:05
that remains a real challenge.
nastavlja predstavljati pravi izazov.
20:09
CA: Laura Schulz, wonderful
food for thought. Thank you so much.
CA: Laura Schulz, sjajan poticaj
za razmišljanje. Puno vam hvala.
20:11
LS: Thank you.
(Applause)
LS: Hvala vama.
(Pljesak)
20:14
Translated by Anja Kolobarić
Reviewed by Ivan Stamenkovic


About the speaker:

Laura Schulz - Cognitive scientist
Developmental behavior studies spearheaded by Laura Schulz are changing our notions of how children learn.

Why you should listen

MIT Early Childhood Cognition Lab lead investigator Laura Schulz studies learning in early childhood. Her research bridges computational models of cognitive development and behavioral studies in order to understand the origins of inquiry and discovery.

Working in play labs, children’s museums, and a recently launched citizen science website, Schulz is reshaping how we view young children’s perceptions of the world around them. Some of the surprising results of her research: before the age of four, children expect hidden causes when events happen probabilistically, use simple experiments to distinguish causal hypotheses, and trade off learning from instruction and exploration.
