ABOUT THE SPEAKER
Dan Dennett - Philosopher, cognitive scientist
Dan Dennett thinks that human consciousness and free will are the result of physical processes.

Why you should listen

One of our most important living philosophers, Dan Dennett is best known for his provocative and controversial arguments that human consciousness and free will are the result of physical processes in the brain. He argues that the brain's computational circuitry fools us into thinking we know more than we do, and that what we call consciousness — isn't. His 2003 book "Freedom Evolves" explores how our brains evolved to give us -- and only us -- the kind of freedom that matters, while 2006's "Breaking the Spell" examines belief through the lens of biology.

This mind-shifting perspective on the mind itself has distinguished Dennett's career as a philosopher and cognitive scientist. And while the philosophy community has never quite known what to make of Dennett (he defies easy categorization, and refuses to affiliate himself with accepted schools of thought), his computational approach to understanding the brain has made him, as Edge's John Brockman writes, “the philosopher of choice of the AI community.”

“It's tempting to say that Dennett has never met a robot he didn't like, and that what he likes most about them is that they are philosophical experiments,” Harry Blume wrote in the Atlantic Monthly in 1998. “To the question of whether machines can attain high-order intelligence, Dennett makes this provocative answer: ‘The best reason for believing that robots might some day become conscious is that we human beings are conscious, and we are a sort of robot ourselves.'"

In recent years, Dennett has become outspoken in his atheism, and his 2006 book Breaking the Spell calls for religion to be studied through the scientific lens of evolutionary biology. Dennett regards religion as a natural -- rather than supernatural -- phenomenon, and urges schools to break the taboo against empirical examination of religion. He argues that religion's influence over human behavior is precisely what makes gaining a rational understanding of it so necessary: “If we don't understand religion, we're going to miss our chance to improve the world in the 21st century.”

Dennett's landmark books include The Mind's I, co-edited with Douglas Hofstadter, Consciousness Explained, and Darwin's Dangerous Idea. Read an excerpt from his 2013 book, Intuition Pumps, in the Guardian.

TED2009

Dan Dennett: Cute, sexy, sweet, funny

3,553,924 views

Why are babies cute? Why is cake sweet? Philosopher Dan Dennett has answers you wouldn't expect, as he reveals evolution's counterintuitive reasoning about cute, sweet and sexy things (plus a new theory from Matthew Hurley on why jokes are funny).

00:12
I’m going around the world giving talks about Darwin, and usually what I’m talking about is Darwin’s strange inversion of reasoning. Now that title, that phrase, comes from a critic, an early critic, and this is a passage that I just love, and would like to read for you.

00:29
"In the theory with which we have to deal, Absolute Ignorance is the artificer; so that we may enunciate as the fundamental principle of the whole system, that, in order to make a perfect and beautiful machine, it is not requisite to know how to make it. This proposition will be found on careful examination to express, in condensed form, the essential purport of the Theory, and to express in a few words all Mr. Darwin’s meaning; who, by a strange inversion of reasoning, seems to think Absolute Ignorance fully qualified to take the place of Absolute Wisdom in the achievements of creative skill."

01:10
Exactly. Exactly. And it is a strange inversion.

01:17
A creationist pamphlet has this wonderful page in it: "Test Two: Do you know of any building that didn’t have a builder? Yes/No. Do you know of any painting that didn’t have a painter? Yes/No. Do you know of any car that didn’t have a maker? Yes/No. If you answered 'Yes' for any of the above, give details."

01:39
A-ha! I mean, it really is a strange inversion of reasoning. You would have thought it stands to reason that design requires an intelligent designer. But Darwin shows that it’s just false.

01:55
Today, though, I’m going to talk about Darwin’s other strange inversion, which is equally puzzling at first, but in some ways just as important.

02:06
It stands to reason that we love chocolate cake because it is sweet. Guys go for girls like this because they are sexy. We adore babies because they’re so cute. And, of course, we are amused by jokes because they are funny. This is all backwards. It is. And Darwin shows us why.

02:39
Let’s start with sweet. Our sweet tooth is basically an evolved sugar detector, because sugar is high energy, and it’s just been wired up to the preferer, to put it very crudely, and that’s why we like sugar. Honey is sweet because we like it, not "we like it because honey is sweet." There’s nothing intrinsically sweet about honey. If you looked at glucose molecules till you were blind, you wouldn’t see why they tasted sweet. You have to look in our brains to understand why they’re sweet. So if you think first there was sweetness, and then we evolved to like sweetness, you’ve got it backwards; that’s just wrong. It’s the other way round. Sweetness was born with the wiring which evolved.

03:33
And there’s nothing intrinsically sexy about these young ladies. And it’s a good thing that there isn’t, because if there were, then Mother Nature would have a problem: How on earth do you get chimps to mate? Now you might think, ah, there’s a solution: hallucinations. That would be one way of doing it, but there’s a quicker way. Just wire the chimps up to love that look, and apparently they do. That’s all there is to it.

04:16
Over six million years, we and the chimps evolved our different ways. We became bald-bodied, oddly enough; for one reason or another, they didn’t. If we hadn’t, then probably this would be the height of sexiness.

04:39
Our sweet tooth is an evolved and instinctual preference for high-energy food. It wasn’t designed for chocolate cake. Chocolate cake is a supernormal stimulus. The term is owed to Niko Tinbergen, who did his famous experiments with gulls, where he found that that orange spot on the gull’s beak -- if he made a bigger, oranger spot, the gull chicks would peck at it even harder. It was a hyperstimulus for them, and they loved it. What we see with, say, chocolate cake is it’s a supernormal stimulus to tweak our design wiring. And there are lots of supernormal stimuli; chocolate cake is one. There's lots of supernormal stimuli for sexiness. And there's even supernormal stimuli for cuteness. Here’s a pretty good example.

05:26
It’s important that we love babies, and that we not be put off by, say, messy diapers. So babies have to attract our affection and our nurturing, and they do. And, by the way, a recent study shows that mothers prefer the smell of the dirty diapers of their own baby. So nature works on many levels here. But now, if babies didn’t look the way they do -- if babies looked like this, that’s what we would find adorable, that’s what we would find -- we would think, oh my goodness, do I ever want to hug that. This is the strange inversion.

06:04
Well now, finally, what about funny? My answer is, it’s the same story, the same story. This is the hard one, the one that isn’t obvious. That’s why I leave it to the end. And I won’t be able to say too much about it. But you have to think evolutionarily, you have to think, what hard job that has to be done -- it’s dirty work, somebody’s got to do it -- is so important to give us such a powerful, inbuilt reward for it when we succeed. Now, I think we've found the answer -- I and a few of my colleagues. It’s a neural system that’s wired up to reward the brain for doing a grubby clerical job. Our bumper sticker for this view is that this is the joy of debugging. Now I’m not going to have time to spell it all out, but I’ll just say that only some kinds of debugging get the reward. And what we’re doing is we’re using humor as a sort of neuroscientific probe by switching humor on and off, by turning the knob on a joke -- now it’s not funny ... oh, now it’s funnier ... now we’ll turn a little bit more ... now it’s not funny -- in this way, we can actually learn something about the architecture of the brain, the functional architecture of the brain.

07:25
Matthew Hurley is the first author of this. We call it the Hurley Model. He’s a computer scientist, Reginald Adams a psychologist, and there I am, and we’re putting this together into a book. Thank you very much.