ABOUT THE SPEAKER
Marvin Minsky - AI pioneer
Marvin Minsky is one of the great pioneers of artificial intelligence -- and of using computing metaphors to understand the human mind. His contributions to mathematics, robotics and computational linguistics are legendary and far-reaching.

Why you should listen

Marvin Minsky is the superstar-elder of artificial intelligence, one of the most productive and important cognitive scientists of the century, and the leading proponent of the Society of Mind theory. Articulated in his 1985 book of the same name, Minsky's theory says intelligence is not born of any single mechanism, but emerges from the interaction of many independent agents. The book's sequel, The Emotion Machine (2006), says similar activity also accounts for feelings, goals, emotions and conscious thoughts.

Minsky also pioneered advances in mathematics, computational linguistics, optics, robotics and telepresence. He built SNARC, the first neural network simulator, some of the first visual scanners, and the first LOGO "turtle." From his headquarters at MIT's Media Lab and the AI Lab (which he helped found), he continues to work on, as he says, "imparting to machines the human capacity for commonsense reasoning."

TED2003

Marvin Minsky: Health and the human mind

606,909 views

Listen closely -- Marvin Minsky's sly, eclectic and delightfully spontaneous talk on health, overpopulation and the human mind is loaded with subtleties: wit, wisdom and a pinch of shrewd advice (or are they deadpan jokes?).

00:18
If you ask people about what part of psychology do they think is hard, and you say, "Well, what about thinking and emotions?" most people will say, "Emotions are terribly hard. They're incredibly complex. They can't -- I have no idea of how they work. But thinking is really very straightforward: it's just sort of some kind of logical reasoning, or something. But that's not the hard part."
00:45
So here's a list of problems that come up. One nice problem is, what do we do about health? The other day, I was reading something, and the person said probably the largest single cause of disease is handshaking in the West. And there was a little study about people who don't handshake, and comparing them with ones who do handshake. And I haven't the foggiest idea of where you find the ones that don't handshake, because they must be hiding. And the people who avoid that have 30 percent less infectious disease or something. Or maybe it was 31 and a quarter percent. So if you really want to solve the problem of epidemics and so forth, let's start with that. And since I got that idea, I've had to shake hundreds of hands. And I think the only way to avoid it is to have some horrible visible disease, and then you don't have to explain.
01:48
Education: how do we improve education? Well, the single best way is to get them to understand that what they're being told is a whole lot of nonsense. And then, of course, you have to do something about how to moderate that, so that anybody can -- so they'll listen to you.
02:06
Pollution, energy shortage, environmental diversity, poverty. How do we make stable societies? Longevity. Okay, there're lots of problems to worry about.

02:17
Anyway, the question I think people should talk about -- and it's absolutely taboo -- is, how many people should there be? And I think it should be about 100 million or maybe 500 million. And then notice that a great many of these problems disappear. If you had 100 million people properly spread out, then if there's some garbage, you throw it away, preferably where you can't see it, and it will rot. Or you throw it into the ocean and some fish will benefit from it. The problem is, how many people should there be? And it's a sort of choice we have to make.
03:01
Most people are about 60 inches high or more, and there's these cube laws. So if you make them this big, by using nanotechnology, I suppose --

03:11
(Laughter)

03:12
-- then you could have a thousand times as many. That would solve the problem, but I don't see anybody doing any research on making people smaller.
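To make the "cube laws" arithmetic concrete, here is a back-of-the-envelope sketch (the factor of ten and the resource reading are illustrative assumptions; the talk only says "this big"): if linear height shrinks by a factor k, body volume -- and, roughly, per-person resource demand -- shrinks by k^3, so about k^3 times as many people could fit on the same planet.

```python
# Back-of-the-envelope cube-law arithmetic (illustrative assumptions only).
ORIGINAL_HEIGHT_IN = 60   # "about 60 inches high or more"
SHRINK_FACTOR = 10        # assumed: make people one tenth as tall

new_height_in = ORIGINAL_HEIGHT_IN / SHRINK_FACTOR
# Volume (and, roughly, food and energy demand) scales with the cube of linear size.
volume_ratio = SHRINK_FACTOR ** 3

print(f"New height: {new_height_in} inches")
print(f"Each person needs about 1/{volume_ratio} of the resources,")
print(f"so roughly {volume_ratio}x as many people fit in the same footprint.")
```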
03:19
Now, it's nice to reduce the population, but a lot of people want to have children. And there's one solution that's probably only a few years off. You know you have 46 chromosomes. If you're lucky, you've got 23 from each parent. Sometimes you get an extra one or drop one out, but -- so you can skip the grandparent and great-grandparent stage and go right to the great-great-grandparent. And you have 46 people and you give them a scanner, or whatever you need, and they look at their chromosomes and each of them says which one he likes best, or she -- no reason to have just two sexes any more, even. So each child has 46 parents, and I suppose you could let each group of 46 parents have 15 children. Wouldn't that be enough? And then the children would get plenty of support, and nurturing, and mentoring, and the world population would decline very rapidly and everybody would be totally happy.
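A toy sketch of the 46-parent scheme as described, with the scanner-and-judgment step replaced by a random pick (the random choice and the labels are illustrative assumptions): each of 46 parents contributes the one chromosome they like best, and each group of 46 parents raises 15 children.

```python
import random

CHROMOSOMES_PER_PERSON = 46   # "You know you have 46 chromosomes."

def favorite_chromosome(parent_id: int) -> str:
    """Stand-in for the scanner step: each parent picks the one chromosome
    of theirs they like best (here simply chosen at random)."""
    pick = random.randrange(CHROMOSOMES_PER_PERSON)
    return f"parent{parent_id}-chr{pick}"

def make_child(parent_ids: list[int]) -> list[str]:
    """Each child gets exactly one chromosome from each of its 46 parents."""
    assert len(parent_ids) == CHROMOSOMES_PER_PERSON
    return [favorite_chromosome(p) for p in parent_ids]

group = list(range(CHROMOSOMES_PER_PERSON))         # one group of 46 parents
children = [make_child(group) for _ in range(15)]   # "have 15 children"
print(len(children), "children, each with", len(children[0]), "chromosomes")
```

With 15 children per group of 46 parents, each generation is roughly a third the size of the one before, which is why the population in this scheme would "decline very rapidly."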
04:21
Timesharing is a little further off in the future. And there's this great novel that Arthur Clarke wrote twice, called "Against the Fall of Night" and "The City and the Stars." They're both wonderful and largely the same, except that computers happened in between. And Arthur was looking at this old book, and he said, "Well, that was wrong. The future must have some computers." So in the second version of it, there are 100 billion or 1,000 billion people on Earth, but they're all stored on hard disks or floppies, or whatever they have in the future. And you let a few million of them out at a time. A person comes out, they live for a thousand years doing whatever they do, and then, when it's time to go back for a billion years -- or a million, I forget, the numbers don't matter -- but there really aren't very many people on Earth at a time. And you get to think about yourself and your memories, and before you go back into suspension, you edit your memories and you change your personality and so forth.

05:30
The plot of the book is that there's not enough diversity, so that the people who designed the city make sure that every now and then an entirely new person is created. And in the novel, a particular one named Alvin is created. And he says, maybe this isn't the best way, and wrecks the whole system.
05:53
I don't think the solutions that I proposed are good enough or smart enough. I think the big problem is that we're not smart enough to understand which of the problems we're facing are good enough. Therefore, we have to build super intelligent machines like HAL. As you remember, at some point in the book for "2001," HAL realizes that the universe is too big, and grand, and profound for those really stupid astronauts. If you contrast HAL's behavior with the triviality of the people on the spaceship, you can see what's written between the lines.

06:31
Well, what are we going to do about that? We could get smarter. I think that we're pretty smart, as compared to chimpanzees, but we're not smart enough to deal with the colossal problems that we face, either in abstract mathematics or in figuring out economies, or balancing the world around.
06:52
So one thing we can do is live longer. And nobody knows how hard that is, but we'll probably find out in a few years. You see, there's two forks in the road. We know that people live twice as long as chimpanzees almost, and nobody lives more than 120 years, for reasons that aren't very well understood. But lots of people now live to 90 or 100, unless they shake hands too much or something like that. And so maybe if we lived 200 years, we could accumulate enough skills and knowledge to solve some problems. So that's one way of going about it. And as I said, we don't know how hard that is. It might be -- after all, most other mammals live half as long as the chimpanzee, so we're sort of three and a half or four times, have four times the longevity of most mammals. And in the case of the primates, we have almost the same genes. We only differ from chimpanzees, in the present state of knowledge, which is absolute hogwash, maybe by just a few hundred genes. What I think is that the gene counters don't know what they're doing yet. And whatever you do, don't read anything about genetics that's published within your lifetime, or something.

08:12
(Laughter)

08:15
The stuff has a very short half-life, same with brain science. And so it might be that if we just fix four or five genes, we can live 200 years. Or it might be that it's just 30 or 40, and I doubt that it's several hundred.
08:32
So this is something that people will be discussing and lots of ethicists -- you know, an ethicist is somebody who sees something wrong with whatever you have in mind.

08:42
(Laughter)

08:45
And it's very hard to find an ethicist who considers any change worth making, because he says, what about the consequences? And, of course, we're not responsible for the consequences of what we're doing now, are we? Like all this complaint about clones. And yet two random people will mate and have this child, and both of them have some pretty rotten genes, and the child is likely to come out to be average. Which, by chimpanzee standards, is very good indeed.
09:19
If we do have longevity, then we'll have to face the population growth problem anyway. Because if people live 200 or 1,000 years, then we can't let them have a child more than about once every 200 or 1,000 years. And so there won't be any workforce. And one of the things Laurie Garrett pointed out, and others have, is that a society that doesn't have people of working age is in real trouble. And things are going to get worse, because there's nobody to educate the children or to feed the old.
09:53
And when I'm talking about a long lifetime, of course, I don't want somebody who's 200 years old to be like our image of what a 200-year-old is -- which is dead, actually.

10:05
You know, there's about 400 different parts of the brain which seem to have different functions. Nobody knows how most of them work in detail, but we do know that there're lots of different things in there. And they don't always work together. I like Freud's theory that most of them are cancelling each other out. And so if you think of yourself as a sort of city with a hundred resources, then, when you're afraid, for example, you may discard your long-range goals, but you may think deeply and focus on exactly how to achieve that particular goal. You throw everything else away. You become a monomaniac -- all you care about is not stepping out on that platform. And when you're hungry, food becomes more attractive, and so forth. So I see emotions as highly evolved subsets of your capability. Emotion is not something added to thought. An emotional state is what you get when you remove 100 or 200 of your normally available resources. So thinking of emotions as the opposite of -- as something less than thinking is immensely productive. And I hope, in the next few years, to show that this will lead to smart machines.
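A minimal sketch of the "emotional state as removed resources" idea in this passage. The specific resource names, and which ones each state suppresses, are invented for illustration; the point is only that an emotion is modeled by switching resources off, not by adding a new ingredient.

```python
# Toy model: an emotional state is what remains after some normally
# available mental resources are switched off. Resource names are invented.
ALL_RESOURCES = {
    "long_range_planning", "self_reflection", "social_reasoning",
    "threat_focus", "memory_search", "language", "humor",
}

# Each state suppresses some resources rather than adding anything new.
SUPPRESSED_BY = {
    "fear":   {"long_range_planning", "humor", "self_reflection"},
    "hunger": {"self_reflection", "humor"},
    "calm":   set(),
}

def active_resources(state: str) -> set:
    """An emotion is a subset of the full capability, not an extra layer on top."""
    return ALL_RESOURCES - SUPPRESSED_BY[state]

print(sorted(active_resources("fear")))   # the focused, "monomaniac" mode
print(sorted(active_resources("calm")))   # everything available
```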
11:19
And I guess I better skip all the rest of this, which are some details on how we might make those smart machines and --

11:27
(Laughter)

11:32
-- and the main idea is in fact that the core of a really smart machine is one that recognizes that a certain kind of problem is facing you. This is a problem of such and such a type, and therefore there's a certain way or ways of thinking that are good for that problem. So I think the future, main problem of psychology is to classify types of predicaments, types of situations, types of obstacles and also to classify available and possible ways to think and pair them up. So you see, it's almost like a Pavlovian -- we lost the first hundred years of psychology by really trivial theories, where you say, how do people learn how to react to a situation? What I'm saying is, after we go through a lot of levels, including designing a huge, messy system with thousands of ports, we'll end up again with the central problem of psychology. Saying, not what are the situations, but what are the kinds of problems and what are the kinds of strategies, how do you learn them, how do you connect them up, how does a really creative person invent a new way of thinking out of the available resources and so forth.
12:48
So, I think in the next 20 years, if we can get rid of all of the traditional approaches to artificial intelligence, like neural nets and genetic algorithms and rule-based systems, and just turn our sights a little bit higher to say, can we make a system that can use all those things for the right kind of problem? Some problems are good for neural nets; we know that others, neural nets are hopeless on them. Genetic algorithms are great for certain things; I suspect I know what they're bad at, and I won't tell you.

13:19
(Laughter)
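A minimal sketch of the higher-level system Minsky is pointing at: first recognize what kind of problem you are facing, then dispatch to whichever method is good for that kind of problem. The problem categories and the mapping below are illustrative assumptions, not a real classifier or a claim about which method truly wins where.

```python
# Sketch of a "right method for the right kind of problem" dispatcher.
# The categories and the mapping are illustrative assumptions.
METHOD_FOR = {
    "noisy_pattern_recognition":  "neural_net",
    "open_ended_search":          "genetic_algorithm",
    "crisp_well_specified_rules": "rule_based_system",
}

def choose_method(problem_type: str) -> str:
    """Recognize the kind of problem, then pick a way of thinking suited to it;
    unknown kinds are where a new way of thinking would have to be invented."""
    return METHOD_FOR.get(problem_type, "invent_a_new_way_of_thinking")

for kind in ("noisy_pattern_recognition", "crisp_well_specified_rules", "something_novel"):
    print(kind, "->", choose_method(kind))
```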
13:20
Thank you.

13:22
(Applause)

Data provided by TED.
