ABOUT THE SPEAKER
Dan Ariely - Behavioral economist
The dismal science of economics is not as firmly grounded in actual behavior as was once supposed. In "Predictably Irrational," Dan Ariely told us why.

Why you should listen

Dan Ariely is a professor of psychology and behavioral economics at Duke University and a founding member of the Center for Advanced Hindsight. He is the author of the bestsellers Predictably Irrational, The Upside of Irrationality, and The Honest Truth About Dishonesty -- as well as the TED Book Payoff: The Hidden Logic that Shapes Our Motivations.

Through his research and his (often amusing and unorthodox) experiments, he questions the forces that influence human behavior and the irrational ways in which we often all behave.

More profile about the speaker
Dan Ariely | Speaker | TED.com
TED2011

Dan Ariely: Beware conflicts of interest


1,284,831 views

In this short talk, psychologist Dan Ariely tells two personal stories that explore conflict of interest in science: how the pursuit of knowledge and insight can be affected, consciously or not, by shortsighted personal goals. When we think about the big questions, he reminds us, we should beware our thoroughly, all-too-human brains.

00:16 So, I was in the hospital for a long time. And a few years after I left, I went back, and the chairman of the burn department was very excited to see me -- said, "Dan, I have a fantastic new treatment for you." I was very excited. I walked with him to his office. And he explained to me that, when I shave, I have little black dots on the left side of my face where the hair is, but on the right side of my face I was badly burned so I have no hair, and this creates lack of symmetry. And what's the brilliant idea he had? He was going to tattoo little black dots on the right side of my face and make me look very symmetric.

00:51 It sounded interesting. He asked me to go and shave. Let me tell you, this was a strange way to shave, because I thought about it and I realized that the way I was shaving then would be the way I would shave for the rest of my life -- because I had to keep the width the same. When I got back to his office, I wasn't really sure. I said, "Can I see some evidence for this?" So he showed me some pictures of little cheeks with little black dots -- not very informative. I said, "What happens when I grow older and my hair becomes white? What would happen then?" "Oh, don't worry about it," he said. "We have lasers; we can whiten it out." But I was still concerned, so I said, "You know what, I'm not going to do it."

01:30 And then came one of the biggest guilt trips of my life. This is coming from a Jewish guy, all right, so that means a lot.

(Laughter)

01:39 And he said, "Dan, what's wrong with you? Do you enjoy looking non-symmetric? Do you have some kind of perverted pleasure from this? Do women feel pity for you and have sex with you more frequently?" None of those happened.

01:58 And this was very surprising to me, because I've gone through many treatments -- there were many treatments I decided not to do -- and I never got this guilt trip to this extent. But I decided not to have this treatment. And I went to his deputy and asked him, "What was going on? Where was this guilt trip coming from?" And he explained that they have done this procedure on two patients already, and they need the third patient for a paper they were writing.

(Laughter)

02:21 Now you probably think that this guy's a schmuck. Right, that's what he seems like. But let me give you a different perspective on the same story. A few years ago, I was running some of my own experiments in the lab. And when we run experiments, we usually hope that one group will behave differently than another. So we had one group that I hoped their performance would be very high, another group that I thought their performance would be very low, and when I got the results, that's what we got -- I was very happy -- aside from one person. There was one person in the group that was supposed to have very high performance that was actually performing terribly. And he pulled the whole mean down, destroying my statistical significance of the test.

02:59 So I looked carefully at this guy. He was 20-some years older than anybody else in the sample. And I remembered that the old and drunken guy came one day to the lab wanting to make some easy cash and this was the guy. "Fantastic!" I thought. "Let's throw him out. Who would ever include a drunken guy in a sample?" But a couple of days later, we thought about it with my students, and we said, "What would have happened if this drunken guy was not in that condition? What would have happened if he was in the other group? Would we have thrown him out then?" We probably wouldn't have looked at the data at all, and if we did look at the data, we'd probably have said, "Fantastic! What a smart guy who is performing this low," because he would have pulled the mean of the group lower, giving us even stronger statistical results than we could. So we decided not to throw the guy out and to rerun the experiment.
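To see the effect this story turns on in numbers, here is a minimal sketch with made-up scores (the data, group sizes, and the use of a two-sample t-test are illustrative assumptions, not Ariely's actual analysis): one low outlier in the high-performing group collapses both the gap between the group means and the significance of the test; dropping him restores both; and the very same score placed in the low group would have made the result look even stronger.

```python
# Minimal sketch of how one outlier moves a group mean and a t-test.
# All numbers are invented for illustration; this is not Ariely's data.
from statistics import mean
from scipy import stats

high = [70, 72, 74, 71, 73, 75, 72, 10]  # "high" group; 10 is the outlier
low = [60, 62, 58, 65, 61, 63, 59, 64]   # "low" group

# Outlier included: the high-group mean is dragged down toward the low
# group's, and the two-sample t-test finds no significant difference.
print(mean(high), mean(low))              # ~64.6 vs 61.5
print(stats.ttest_ind(high, low).pvalue)  # large p, not significant

# Outlier excluded: the expected gap reappears and p becomes tiny --
# which is exactly why throwing him out was so tempting.
print(stats.ttest_ind(high[:-1], low).pvalue)

# The same score in the LOW group instead pulls that mean further down
# and makes the result look even stronger. Would we have dropped it then?
print(stats.ttest_ind(high[:-1], low + [10]).pvalue)
```

The tempting deletion and the hypothetical alternative differ only in which hypothesis the outlier happens to hurt, which is precisely the conflict of interest the story illustrates.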
03:44 But you know, these stories, and lots of other experiments that we've done on conflicts of interest, basically kind of bring two points to the foreground for me. The first one is that in life we encounter many people who, in some way or another, try to tattoo our faces. They just have the incentives that get them to be blinded to reality and give us advice that is inherently biased. And I'm sure that it's something that we all recognize, and we see that it happens. Maybe we don't recognize it every time, but we understand that it happens. The most difficult thing, of course, is to recognize that sometimes we too are blinded by our own incentives. And that's a much, much more difficult lesson to take into account. Because we don't see how conflicts of interest work on us.

04:29 When I was doing these experiments, in my mind, I was helping science. I was eliminating the data to get the true pattern of the data to shine through. I wasn't doing something bad. In my mind, I was actually a knight trying to help science move along. But this was not the case. I was actually interfering with the process with lots of good intentions. And I think the real challenge is to figure out where are the cases in our lives where conflicts of interest work on us, and try not to trust our own intuition to overcome it, but to try to do things that prevent us from falling prey to these behaviors, because we can create lots of undesirable circumstances.

05:05 I do want to leave you with one positive thought. I mean, this is all very depressing, right -- people have conflicts of interest, we don't see it, and so on. The positive perspective, I think, of all of this is that, if we do understand when we go wrong, if we understand the deep mechanisms of why we fail and where we fail, we can actually hope to fix things. And that, I think, is the hope. Thank you very much.

(Applause)


