ABOUT THE SPEAKER
Dan Ariely - Behavioral economist
The dismal science of economics is not as firmly grounded in actual behavior as was once supposed. In "Predictably Irrational," Dan Ariely told us why.

Why you should listen

Dan Ariely is a professor of psychology and behavioral economics at Duke University and a founding member of the Center for Advanced Hindsight. He is the author of the bestsellers Predictably Irrational, The Upside of Irrationality, and The Honest Truth About Dishonesty -- as well as the TED Book Payoff: The Hidden Logic That Shapes Our Motivations.

Through his research and his (often amusing and unorthodox) experiments, he questions the forces that influence human behavior and the irrational ways in which we often all behave.

EG 2008

Dan Ariely: Are we in control of our own decisions?


6,706,559 views

Behavioral economist Dan Ariely, the author of *Predictably Irrational*, uses classic visual illusions and his own counterintuitive (and sometimes shocking) research findings to show how we're not as rational as we think when we make decisions.


00:16
I'll tell you a little bit about irrational behavior.
00:19
Not yours, of course -- other people's.
00:21
(Laughter)
00:22
So after being at MIT for a few years,
00:26
I realized that writing academic papers is not that exciting.
00:30
You know, I don't know how many of those you read,
00:32
but it's not fun to read and often not fun to write --
00:35
even worse to write.
00:37
So I decided to try and write something more fun.
00:40
And I came up with an idea that I would write a cookbook.
00:44
And the title for my cookbook was going to be
00:46
"Dining Without Crumbs: The Art of Eating Over the Sink."
00:49
(Laughter)
00:51
And it was going to be a look at life through the kitchen.
00:54
And I was quite excited about this. I was going to talk
00:56
a little bit about research, a little bit about the kitchen.
00:59
You know, we do so much in the kitchen I thought this would be interesting.
01:02
And I wrote a couple of chapters.
01:04
And I took it to MIT Press and they said,
01:06
"Cute, but not for us. Go and find somebody else."
01:10
I tried other people and everybody said the same thing,
01:12
"Cute. Not for us."
01:15
Until somebody said,
01:18
"Look, if you're serious about this,
01:20
you first have to write a book about your research. You have to publish something,
01:23
and then you'll get the opportunity to write something else.
01:25
If you really want to do it you have to do it."
01:27
So I said, "You know, I really don't want to write about my research.
01:30
I do this all day long. I want to write something else.
01:32
Something a bit more free, less constrained."
01:35
And this person was very forceful and said,
01:38
"Look. That's the only way you'll ever do it."
01:40
So I said, "Okay, if I have to do it -- "
01:43
I had a sabbatical. I said, "I'll write about my research
01:46
if there is no other way. And then I'll get to do my cookbook."
01:48
So I wrote a book on my research.
01:51
And it turned out to be quite fun in two ways.
01:54
First of all, I enjoyed writing.
01:57
But the more interesting thing was that
01:59
I started learning from people.
02:01
It's a fantastic time to write,
02:03
because there is so much feedback you can get from people.
02:05
People write to me about their personal experience,
02:08
and about their examples, and where they disagree,
02:10
and nuances.
02:12
And even being here -- I mean the last few days,
02:14
I've seen real heights of obsessive behavior
02:17
I never thought about.
02:19
(Laughter)
02:20
Which I think is just fascinating.
02:22
I will tell you a little bit about irrational behavior.
02:25
And I want to start by giving you some examples of visual illusion
02:28
as a metaphor for rationality.
02:30
So think about these two tables.
02:32
And you must have seen this illusion.
02:34
If I asked you what's longer, the vertical line on the table on the left,
02:37
or the horizontal line on the table on the right?
02:40
Which one seems longer?
02:43
Can anybody see anything but the left one being longer?
02:46
No, right? It's impossible.
02:48
But the nice thing about visual illusions is we can easily demonstrate mistakes.
02:51
So I can put some lines on; it doesn't help.
02:54
I can animate the lines.
02:56
And to the extent you believe I didn't shrink the lines,
02:58
which I didn't, I've proven to you that your eyes were deceiving you.
03:03
Now, the interesting thing about this
03:05
is when I take the lines away,
03:07
it's as if you haven't learned anything in the last minute.
03:09
(Laughter)
03:12
You can't look at this and say, "Okay, now I see reality as it is."
03:15
Right? It's impossible to overcome this
03:17
sense that this is indeed longer.
03:20
Our intuition is really fooling us in a repeatable, predictable, consistent way.
03:23
And there is almost nothing we can do about it,
03:26
aside from taking a ruler and starting to measure it.
03:29
Here is another one -- this is one of my favorite illusions.
03:32
What color do you see the top arrow pointing to?
03:35
Brown. Thank you.
03:37
The bottom one? Yellow.
03:39
Turns out they're identical.
03:41
Can anybody see them as identical?
03:43
Very, very hard.
03:45
I can cover the rest of the cube up.
03:47
And if I cover the rest of the cube you can see that they are identical.
03:50
And if you don't believe me you can get the slide later
03:52
and do some arts and crafts and see that they're identical.
03:55
But again it's the same story:
03:57
if we take the background away,
03:59
the illusion comes back. Right.
04:01
There is no way for us not to see this illusion.
04:04
I guess if you're colorblind, you may not be able to see it.
04:07
I want you to think about illusion as a metaphor.
04:10
Vision is one of the best things we do.
04:12
We have a huge part of our brain dedicated to vision --
04:14
bigger than dedicated to anything else.
04:16
We do vision more hours of the day than we do anything else.
04:20
And we are evolutionarily designed to do vision.
04:22
And if we have these predictable, repeatable mistakes in vision,
04:25
which we're so good at,
04:27
what's the chance that we don't make even more mistakes
04:29
in something we're not as good at --
04:31
for example, financial decision making:
04:33
(Laughter)
04:35
something we don't have an evolutionary reason to do,
04:37
we don't have a specialized part of the brain,
04:39
and we don't do that many hours of the day.
04:41
And the argument is, in those cases,
04:44
it might be that we actually make many more mistakes
04:48
and, worse, not have an easy way to see them.
04:51
Because in visual illusions we can easily demonstrate the mistakes;
04:54
in cognitive illusions it's much, much harder
04:56
to demonstrate the mistakes to people.
04:58
So I want to show you some cognitive illusions,
05:01
or decision-making illusions, in the same way.
05:04
And this is one of my favorite plots in social sciences.
05:07
It's from a paper by Johnson and Goldstein.
05:11
And it basically shows
05:13
the percentage of people who indicated
05:15
they would be interested in donating their organs.
05:19
And these are different countries in Europe. And you basically
05:21
see two types of countries:
05:23
countries on the right, that seem to be giving a lot;
05:25
and countries on the left, that seem to be giving very little,
05:28
or much less.
05:30
The question is, why? Why do some countries give a lot
05:32
and some countries give a little?
05:34
When you ask people this question,
05:36
they usually think that it has to be something about culture.
05:38
Right? How much do you care about people?
05:40
Giving your organs to somebody else
05:42
is probably about how much you care about society, how linked you are.
05:45
Or maybe it is about religion.
05:47
But, if you look at this plot,
05:49
you can see that countries that we think about as very similar
05:52
actually exhibit very different behavior.
05:55
For example, Sweden is all the way on the right,
05:57
and Denmark, which we think is culturally very similar,
06:00
is all the way on the left.
06:02
Germany is on the left. And Austria is on the right.
06:06
The Netherlands is on the left. And Belgium is on the right.
06:09
And finally, depending on your particular version
06:12
of European similarity,
06:14
you can think about the U.K. and France as either similar culturally or not.
06:19
But it turns out that in organ donation they are very different.
06:23
By the way, the Netherlands is an interesting story.
06:25
You see, the Netherlands is kind of the biggest of the small group.
06:30
Turns out that they got to 28 percent
06:33
after mailing every household in the country a letter
06:36
begging people to join this organ donation program.
06:39
You know the expression, "Begging only gets you so far"?
06:42
It's 28 percent in organ donation.
06:45
(Laughter)
06:47
But whatever the countries on the right are doing,
06:49
they are doing a much better job than begging.
06:51
So what are they doing?
06:53
Turns out the secret has to do with a form at the DMV.
06:56
And here is the story.
06:58
The countries on the left have a form at the DMV
07:00
that looks something like this:
07:02
Check the box below if you want to participate
07:04
in the organ donor program.
07:06
And what happens?
07:08
People don't check, and they don't join.
07:11
The countries on the right, the ones that give a lot,
07:13
have a slightly different form.
07:15
It says check the box below if you don't want to participate.
07:18
Interestingly enough, when people get this,
07:20
they again don't check -- but now they join.
07:23
(Laughter)
07:26
Now think about what this means.
07:29
We wake up in the morning and we feel we make decisions.
07:33
We wake up in the morning and we open the closet
07:35
and we feel that we decide what to wear.
07:37
And we open the refrigerator and we feel that we decide what to eat.
07:40
What this is actually saying is that
07:42
many of these decisions are not residing within us.
07:44
They are residing in the person who is designing that form.
07:47
When you walk into the DMV,
07:50
the person who designed the form will have a huge influence
07:52
on what you'll end up doing.
07:54
Now it's also very hard to intuit these results. Think about it for yourself.
07:58
How many of you believe
08:00
that if you went to renew your license tomorrow,
08:02
and you went to the DMV,
08:04
and you encountered one of these forms,
08:06
that it would actually change your own behavior?
08:09
Very, very hard to think that it would influence us.
08:11
We can say, "Oh, these funny Europeans, of course it would influence them."
08:13
But when it comes to us,
08:16
we have such a feeling that we are in the driver's seat,
08:18
we have such a feeling that we are in control,
08:20
and we are making the decision,
08:22
that it's very hard to even accept
08:24
the idea that we actually have
08:26
an illusion of making a decision, rather than an actual decision.
08:30
Now, you might say,
08:32
"These are decisions we don't care about."
08:35
In fact, by definition, these are decisions
08:37
about something that will happen to us after we die.
08:39
How could we care about something less
08:42
than something that happens after we die?
08:44
So a standard economist, someone who believes in rationality,
08:47
would say, "You know what? The cost of lifting the pencil
08:50
and marking a V is higher than the possible
08:52
benefit of the decision,
08:54
so that's why we get this effect."
08:56
But, in fact, it's not because it's easy.
08:59
It's not because it's trivial. It's not because we don't care.
09:02
It's the opposite. It's because we care.
09:05
It's difficult and it's complex.
09:07
And it's so complex that we don't know what to do.
09:09
And because we have no idea what to do,
09:11
we just pick whatever it was that was chosen for us.
09:15
I'll give you one more example of this.
09:17
This is from a paper by Redelmeier and Shafir.
09:20
And they said, "Well, this effect also happens to experts,
09:23
people who are well-paid, experts in their decisions,
09:26
who do it a lot."
09:28
And they basically took a group of physicians.
09:30
And they presented to them a case study of a patient.
09:32
Here is a patient. He is a 67-year-old farmer.
09:36
He's been suffering from right hip pain for a while.
09:38
And then they said to the physician,
09:40
"You decided a few weeks ago
09:42
that nothing is working for this patient.
09:44
All these medications, nothing seems to be working.
09:46
So you refer the patient to hip replacement therapy.
09:49
Hip replacement. Okay?"
09:51
So the patient is on a path to have his hip replaced.
09:54
And then they said to half the physicians,
09:56
"Yesterday you reviewed the patient's case
09:58
and you realized that you forgot to try one medication.
10:01
You did not try ibuprofen.
10:04
What do you do? Do you pull the patient back and try ibuprofen?
10:07
Or do you let them go and have hip replacement?"
10:10
Well, the good news is that most physicians in this case
10:12
decided to pull the patient back and try the ibuprofen.
10:15
Very good for the physicians.
10:17
To the other group of physicians, they said,
10:19
"Yesterday when you reviewed the case,
10:21
you discovered there were two medications you didn't try out yet,
10:23
ibuprofen and piroxicam."
10:26
And they said, "You have two medications you didn't try out yet. What do you do?
10:29
Do you let them go? Or do you pull them back?
10:31
And if you pull them back do you try ibuprofen or piroxicam? Which one?"
244
615000
3000
10:34
Now think of it. This decision
245
618000
2000
Agora, pensen nisto:
Esta decisión facilita deixar
que o paciente continúe coa operación,
10:36
makes it as easy to let the patient continue with hip replacement.
246
620000
3000
10:39
But pulling them back, all of the sudden becomes more complex.
247
623000
3000
porque anulala fai que,
de repente, todo se complique.
10:42
There is one more decision.
248
626000
2000
Hai unha decisión máis que tomar.
10:44
What happens now?
249
628000
2000
Que pasa agora?
A maioría dos médicos escollen
que o paciente continúe
10:46
Majority of the physicians now choose to let the patient go
250
630000
3000
10:49
to hip replacement.
251
633000
2000
coa substitución de cadeira.
Por certo, espero que isto os preocupe...
10:51
I hope this worries you, by the way --
252
635000
2000
10:53
(Laughter)
253
637000
1000
(Risas)
10:54
when you go to see your physician.
254
638000
2000
cando teñan que ir ao seu médico.
10:56
The thing is is that no physician would ever say,
255
640000
3000
A cuestión é que ningún médico diría,
10:59
"Piroxicam, ibuprofen, hip replacement.
256
643000
2000
"Piroxicam, ibuprofeno, substitución
de cadeira... Adiante coa substitución".
11:01
Let's go for hip replacement."
257
645000
2000
11:03
But the moment you set this as the default
258
647000
3000
Pero no momento en que consideras
esta opción como por defecto,
11:06
it has a huge power over whatever people end up doing.
259
650000
4000
ten un enorme poder
sobre o que a xente termina por facer.
11:10
I'll give you a couple of more examples on irrational decision-making.
260
654000
3000
Dareilles un par de exemplos máis
sobre a toma irracional de decisións.
11:13
Imagine I give you a choice.
261
657000
2000
Imaxinen que lles deixo escoller:
11:15
Do you want to go for a weekend to Rome?
262
659000
2000
Queres ir a unha fin de semana a Roma,
con todos os gastos pagos...
11:17
All expenses paid:
263
661000
2000
hotel, transporte, mantenza,
almorzo continental... todo.
11:19
hotel, transportation, food, breakfast,
264
663000
2000
11:21
a continental breakfast, everything.
265
665000
2000
11:23
Or a weekend in Paris?
266
667000
2000
Ou unha fin de semana en París?
Fin de semana en París, fin de semana
en Roma... son cousas diferentes.
11:25
Now, a weekend in Paris, a weekend in Rome, these are different things;
267
669000
3000
11:28
they have different food, different culture, different art.
268
672000
2000
Teñen diferente comida,
diferente cultura, arte diferente.
11:30
Now imagine I added a choice to the set
269
674000
2000
Imaxinen que engado unha opción
que ninguén quixera.
11:32
that nobody wanted.
270
676000
2000
11:34
Imagine I said, "A weekend in Rome,
271
678000
2000
Imaxinen que digo: "Fin de semana en Roma,
11:36
a weekend in Paris, or having your car stolen?"
272
680000
3000
fin de semana en París,
ou que che rouben o coche?
11:39
(Laughter)
273
683000
3000
(Risas)
11:42
It's a funny idea, because why would having your car stolen,
274
686000
3000
É unha idea graciosa, porque,
que valor ten o roubo do coche
11:45
in this set, influence anything?
275
689000
2000
neste grupo, inflúe en algo?
11:47
(Laughter)
276
691000
2000
(Risas)
11:49
But what if the option to have your car stolen
277
693000
3000
Pero, que pasaría se a opción
de roubar o coche non fora exactamente así.
11:52
was not exactly like this.
278
696000
2000
Que tal se fora unha viaxe a Roma,
con todos os gastos pagos,
11:54
What if it was a trip to Rome, all expenses paid,
279
698000
2000
11:56
transportation, breakfast,
280
700000
2000
transporte, almorzo,
11:58
but doesn't include coffee in the morning.
281
702000
3000
pero non inclúe café polas mañás?
12:01
If you want coffee you have to pay for it yourself. It's two euros 50.
282
705000
3000
Se queres café, telo que pagar ti,
e custa 2,50 euros.
12:04
Now in some ways,
283
708000
3000
(Risas)
Dalgún xeito,
12:07
given that you can have Rome with coffee,
284
711000
2000
dado que podes ter Roma con café,
12:09
why would you possibly want Rome without coffee?
285
713000
3000
por que quererías Roma sen café?
É como que che rouben o coche.
É unha opción inferior.
12:12
It's like having your car stolen. It's an inferior option.
286
716000
3000
12:15
But guess what happened. The moment you add Rome without coffee,
287
719000
2000
Pero, saben que pasa?
No momento en que engadimos
Roma sen café,
12:17
Rome with coffee becomes more popular. And people choose it.
288
721000
5000
Roma con café vólvese máis popular,
e a xente escólleo.
12:22
The fact that you have Rome without coffee
289
726000
3000
O feito de ter Roma sen café,
12:25
makes Rome with coffee look superior,
290
729000
2000
fai que Roma con café pareza superior,
12:27
and not just to Rome without coffee -- even superior to Paris.
291
731000
3000
e non só a Roma sen café,
senón incluso superior a París.
12:30
(Laughter)
292
734000
4000
(Risas)
12:34
Here are two examples of this principle.
293
738000
2000
Aquí teño dous exemplos deste principio.
12:36
This was an ad from The Economist a few years ago
294
740000
3000
Este era un anuncio de The Economist
de hai algúns anos
12:39
that gave us three choices.
295
743000
2000
que nos daba tres opcións:
12:41
An online subscription for 59 dollars.
296
745000
3000
Unha subscrición en liña por 59 dólares,
12:44
A print subscription for 125.
297
748000
4000
unha subscrición da edición impresa
por 125 dólares,
ou podías ter as dúas por 125.
12:48
Or you could get both for 125.
298
752000
2000
12:50
(Laughter)
299
754000
2000
(Risas)
12:52
Now I looked at this and I called up The Economist.
300
756000
2000
Cando vin isto, chamei a The Economist,
12:54
And I tried to figure out what they were thinking.
301
758000
3000
trataba de comprender
que estaban pensando.
12:57
And they passed me from one person to another to another,
302
761000
3000
Pasáronme dunha persoa a outra, e outra,
13:00
until eventually I got to a person who was in charge of the website.
303
764000
4000
ata que, finalmente,
pasáronme coa persoa a cargo da web,
13:04
And I called them up. And they went to check what was going on.
304
768000
3000
e aviseinos, e foron comprobar
que estaba a ocorrer.
13:07
The next thing I know, the ad is gone. And no explanation.
305
771000
4000
O seguinte que souben foi que o anuncio
desaparecera, sen explicacións.
13:11
So I decided to do the experiment
306
775000
2000
Así que decidín facer un experimento
13:13
that I would have loved The Economist to do with me.
307
777000
3000
que me encantaría
que The Economist fixera comigo.
13:16
I took this and I gave it to 100 MIT students.
308
780000
2000
Collín o anuncio e déillelo
a 100 estudantes do MIT.
13:18
I said, "What would you choose?"
309
782000
2000
Dixen: "Cal escollerías?"
13:20
These are the market share. Most people wanted the combo deal.
310
784000
4000
Estes son os resultados:
a maioría escolleu a opción combinada.
Afortunadamente, ninguén quixo
a opción dominada.
13:24
Thankfully nobody wanted the dominated option.
311
788000
2000
13:26
That means our students can read.
312
790000
2000
Isto significa que
os nosos estudantes saben ler.
13:28
(Laughter)
313
792000
1000
(Risas)
13:29
But now if you have an option that nobody wants,
314
793000
3000
Pero agora, se tes unha opción
que ninguén quere,
13:32
you can take it off. Right?
315
796000
2000
pódela retirar, non si?
13:34
So I printed another version of this,
316
798000
2000
Así que imprimín
outra versión disto,
13:36
where I eliminated the middle option.
317
800000
2000
na que eliminei a opción media.
Déillela a outros 100 estudantes.
E isto foi o que pasou:
13:38
I gave it to another 100 students. Here is what happens.
318
802000
3000
13:41
Now the most popular option became the least popular.
319
805000
3000
Agora a opción máis popular
volveuse a menos popular,
13:44
And the least popular became the most popular.
320
808000
3000
e a menos popular
volveuse a máis popular.
13:47
What was happening was the option that was useless,
321
811000
4000
O que estaba a pasar era
que a opción que era inútil,
13:51
in the middle, was useless in the sense that nobody wanted it.
322
815000
4000
a opción media, era inútil
porque ninguén a quería.
Pero non era inútil no sentido de que
lle axudaba á xente a descubrir
13:55
But it wasn't useless in the sense that it helped people figure out
323
819000
2000
13:57
what they wanted.
324
821000
2000
que era o que quería.
13:59
In fact, relative to the option in the middle,
325
823000
3000
De feito, con respecto á opción media,
14:02
which was get only the print for 125,
326
826000
4000
que era ter só a versión impresa por 125,
14:06
the print and web for 125 looked like a fantastic deal.
327
830000
4000
a versión impresa máis web por 125
parecía unha opción fantástica.
14:10
And as a consequence, people chose it.
328
834000
2000
E, en consecuencia, a xente escollíaa.
14:12
The general idea here, by the way,
329
836000
2000
A idea xeral aquí é
14:14
is that we actually don't know our preferences that well.
330
838000
2000
que en realidade non coñecemos
tan ben as nosas preferencias.
14:16
And because we don't know our preferences that well
331
840000
2000
E como non as coñecemos tan ben,
14:18
we're susceptible to all of these influences from the external forces:
332
842000
4000
somos susceptibles
ao influxo de forzas externas:
14:22
the defaults, the particular options that are presented to us, and so on.
333
846000
4000
os valores por defecto,
as opcións presentadas, e cousas así.
14:26
One more example of this.
334
850000
2000
Outro exemplo disto.
14:28
People believe that when we deal with physical attraction,
335
852000
3000
A xente cre que,
cando falamos de atracción física,
14:31
we see somebody, and we know immediately whether we like them or not,
336
855000
3000
vemos a alguén e, inmediatamente,
sabemos se nos gusta ou non,
14:34
attracted or not.
337
858000
2000
se nos atrae ou non.
14:36
Which is why we have these four-minute dates.
338
860000
2000
Por iso se fan
esas citas de catro minutos.
14:38
So I decided to do this experiment with people.
339
862000
3000
Así que decidín facer
un experimento con persoas.
14:41
I'll show you graphic images of people -- not real people.
340
865000
2000
Amosareilles imaxes, non de persoas reais,
aínda que a proba foi con persoas.
14:43
The experiment was with people.
341
867000
2000
14:45
I showed some people a picture of Tom, and a picture of Jerry.
342
869000
3000
Eu mostraba unha foto de Tom
e unha foto de Jerry.
14:48
I said "Who do you want to date? Tom or Jerry?"
343
872000
3000
E preguntaba: "Con quen queres saír?
Tom ou Jerry?"
14:51
But for half the people I added an ugly version of Jerry.
344
875000
4000
Pero, para a metade da xente,
eu engadía unha versión fea de Jerry.
Collía o Photoshop e facía a Jerry
lixeiramente menos atractivo.
14:55
I took Photoshop and I made Jerry slightly less attractive.
345
879000
5000
15:00
(Laughter)
346
884000
1000
(Risas)
15:01
The other people, I added an ugly version of Tom.
347
885000
4000
Para as outras persoas, engadía
unha versión fea de Tom.
15:05
And the question was, will ugly Jerry and ugly Tom
348
889000
3000
E a pregunta era, o Jerry feo e o Tom feo
15:08
help their respective, more attractive brothers?
349
892000
4000
axudarán aos seus respectivos
irmáns, máis atractivos?
A resposta foi un contundente si.
15:12
The answer was absolutely yes.
350
896000
2000
Cando o Jerry feo aparecía,
o outro Jerry era popular.
15:14
When ugly Jerry was around, Jerry was popular.
351
898000
2000
15:16
When ugly Tom was around, Tom was popular.
352
900000
2000
Cando o Tom feo aparecía,
o outro Tom era popular.
15:18
(Laughter)
353
902000
2000
(Risas)
15:20
This of course has two very clear implications
354
904000
2000
Isto ten, claro,
dúas implicacións evidentes
15:22
for life in general.
355
906000
4000
para a vida en xeral.
Cando saes de bares,
con quen queres ir?
15:26
If you ever go bar hopping, who do you want to take with you?
356
910000
3000
15:29
(Laughter)
357
913000
6000
(Risas)
Queres unha versión
lixeiramente máis fea ca ti.
15:35
You want a slightly uglier version of yourself.
358
919000
3000
15:38
(Laughter)
359
922000
2000
(Risas)
15:40
Similar. Similar ... but slightly uglier.
360
924000
2000
Parecida, pero lixeiramente máis fea.
15:42
(Laughter)
361
926000
2000
(Risas)
O segundo punto, por suposto, é
que se alguén te convida a ir de bares,
15:44
The second point, of course, is that
362
928000
2000
15:46
if somebody else invites you, you know how they think about you.
363
930000
3000
xa sabes o que pensa de ti.
15:49
(Laughter)
364
933000
3000
(Risas)
15:52
Now you're getting it.
365
936000
2000
Agora si que o entenderon.
15:54
What is the general point?
366
938000
2000
Cal é a idea xeral?
15:56
The general point is that when we think about economics we have
367
940000
2000
A idea xeral é que,
cando pensamos en economía, temos
esta bonita visión da natureza humana.
15:58
this beautiful view of human nature.
368
942000
3000
16:01
"What a piece of work is man! How noble in reason!"
369
945000
2000
"Que obra mestra é o home! Que nobre na razón!"
16:03
We have this view of ourselves, of others.
370
947000
3000
Temos esta visión de nós mesmos,
dos outros.
16:06
The behavioral economics perspective
371
950000
2000
A perspectiva da economía condutual é
lixeiramente menos xenerosa coa xente.
16:08
is slightly less generous to people.
372
952000
3000
16:11
In fact in medical terms, that's our view.
373
955000
3000
De feito, en termos médicos,
esa é a nosa perspectiva.
16:14
(Laughter)
374
958000
6000
(Risas)
Pero hai un lado positivo.
16:20
But there is a silver lining.
375
964000
2000
16:22
The silver lining is, I think,
376
966000
2000
Eu penso que o lado positivo
é, en certo modo, a razón pola cal a economía
condutual é interesante e emocionante.
16:24
kind of the reason that behavioral economics is interesting and exciting.
377
968000
4000
16:28
Are we Superman? Or are we Homer Simpson?
378
972000
2000
Somos Superman, ou somos Homer Simpson?
16:30
When it comes to building the physical world,
379
974000
4000
Cando se trata de construír
o mundo físico,
16:34
we kind of understand our limitations.
380
978000
2000
parece que entendemos
as nosas limitacións.
16:36
We build steps. And we build these things
381
980000
2000
Construímos chanzos.
E construímos esas cousas
que non todos poden usar, obviamente.
16:38
that not everybody can use obviously.
382
982000
3000
(Risas)
16:41
(Laughter)
383
985000
1000
16:42
We understand our limitations,
384
986000
2000
Entendemos as nosas limitacións,
16:44
and we build around it.
385
988000
2000
e construímos arredor delas.
Pero, por algunha razón,
cando se trata do mundo mental,
16:46
But for some reason when it comes to the mental world,
386
990000
2000
16:48
when we design things like healthcare and retirement and stock markets,
387
992000
4000
ao deseñar cousas como sistemas de saúde,
xubilación e mercados bolsistas,
16:52
we somehow forget the idea that we are limited.
388
996000
2000
dalgún xeito, esquecemos
as nosas limitacións.
16:54
I think that if we understood our cognitive limitations
389
998000
3000
Creo que, se entendéramos
as nosas limitacións cognitivas
16:57
in the same way that we understand our physical limitations,
390
1001000
2000
do mesmo xeito en que entendemos
as limitacións físicas,
16:59
even though they don't stare us in the face in the same way,
391
1003000
2000
aínda que non sexan tan evidentes
coma estas últimas,
17:01
we could design a better world.
392
1005000
3000
poderíamos deseñar un mundo mellor,
e esa, creo,
17:04
And that, I think, is the hope of this thing.
393
1008000
2000
é a esperanza disto.
17:06
Thank you very much.
394
1010000
2000
Moitísimas grazas.
17:08
(Applause)
395
1012000
8000
(Aplausos)
