ABOUT THE SPEAKER
Dan Ariely - Behavioral economist
The dismal science of economics is not as firmly grounded in actual behavior as was once supposed. In "Predictably Irrational," Dan Ariely told us why.

Why you should listen

Dan Ariely is a professor of psychology and behavioral economics at Duke University and a founding member of the Center for Advanced Hindsight. He is the author of the bestsellers Predictably Irrational, The Upside of Irrationality, and The Honest Truth About Dishonesty -- as well as the TED Book Payoff: The Hidden Logic that Shapes Our Motivations.

Through his research and his (often amusing and unorthodox) experiments, he questions the forces that influence human behavior and the irrational ways in which we all often behave.

TED2009

Dan Ariely: Our buggy moral code

3,509,395 views

Behavioral economist Dan Ariely studies the bugs in our moral code: the hidden reasons we think it's OK to cheat or steal (sometimes). Clever studies help make his point that we're predictably irrational -- and can be influenced in ways we can't grasp.


00:16
I want to talk to you today a little bit about predictable irrationality. And my interest in irrational behavior started many years ago in the hospital. I was burned very badly.

00:32
And if you spend a lot of time in hospital, you'll see a lot of types of irrationalities. And the one that particularly bothered me in the burn department was the process by which the nurses took the bandage off me.

00:48
Now, you must have all taken a Band-Aid off at some point, and you must have wondered what's the right approach. Do you rip it off quickly -- short duration but high intensity -- or do you take your Band-Aid off slowly -- you take a long time, but each second is not as painful -- which one of those is the right approach?
01:06
The nurses in my department thought that the right approach was the ripping one, so they would grab hold and they would rip, and they would grab hold and they would rip. And because I had 70 percent of my body burned, it would take about an hour. And as you can imagine, I hated that moment of ripping with incredible intensity.

01:26
And I would try to reason with them and say, "Why don't we try something else? Why don't we take it a little longer -- maybe two hours instead of an hour -- and have less of this intensity?"
01:36
And the nurses told me two things. They told me that they had the right model of the patient -- that they knew what was the right thing to do to minimize my pain -- and they also told me that the word patient doesn't mean to make suggestions or to interfere or ... This is not just in Hebrew, by the way. It's in every language I've had experience with so far.

01:56
And, you know, there's not much -- there wasn't much I could do, and they kept on doing what they were doing.
02:03
And about three years later, when I left the hospital, I started studying at the university. And one of the most interesting lessons I learned was that there is an experimental method: if you have a question, you can create a replica of this question in some abstract way, and you can try to examine this question and maybe learn something about the world. So that's what I did.
02:25
I was still interested in this question of how do you take bandages off burn patients. So originally I didn't have much money, so I went to a hardware store and I bought a carpenter's vice. And I would bring people to the lab and I would put their finger in it, and I would crunch it a little bit.

02:41
(Laughter)

02:43
And I would crunch it for long periods and short periods, and pain that went up and pain that went down, and with breaks and without breaks -- all kinds of versions of pain. And when I finished hurting people a little bit, I would ask them, so, how painful was this? Or, how painful was this? Or, if you had to choose between the last two, which one would you choose?

03:00
(Laughter)

03:03
I kept on doing this for a while.

03:06
(Laughter)
03:08
And then, like all good academic projects, I got more funding. I moved to sounds, electrical shocks -- I even had a pain suit with which I could get people to feel much more pain.

03:19
But at the end of this process, what I learned was that the nurses were wrong. Here were wonderful people with good intentions and plenty of experience, and nevertheless they were getting things wrong predictably all the time.
03:35
It turns out that because we don't encode duration in the way that we encode intensity, I would have had less pain if the duration had been longer and the intensity lower. It turns out it would have been better to start with my face, which was much more painful, and move toward my legs, giving me a trend of improvement over time -- that would have also been less painful. And it also turns out that it would have been good to give me breaks in the middle to kind of recuperate from the pain. All of these would have been great things to do, and my nurses had no idea.
04:04
And from that point on I started thinking, are the nurses the only people in the world who get things wrong in this particular decision, or is it a more general case? And it turns out it's a more general case -- there are a lot of mistakes we make.

04:16
And I want to give you one example of one of these irrationalities, and I want to talk to you about cheating. And the reason I picked cheating is because it's interesting, but also it tells us something, I think, about the stock market situation we're in.
04:31
So, my interest in cheating started when Enron came on the scene -- exploded all of a sudden -- and I started thinking about what is happening here. Is it the case that there were a few bad apples who are capable of doing these things, or are we talking about a more endemic situation, where many people are actually capable of behaving this way?

04:49
So, like we usually do, I decided to do a simple experiment. And here's how it went. If you were in the experiment, I would pass you a sheet of paper with 20 simple math problems that everybody could solve, but I wouldn't give you enough time. When the five minutes were over, I would say, "Pass me the sheets of paper, and I'll pay you a dollar per question." People did this. I would pay people four dollars for their task -- on average people would solve four problems.
05:14
Other people I would tempt to cheat. I would pass them their sheet of paper. When the five minutes were over, I would say, "Please shred the piece of paper. Put the little pieces in your pocket or in your backpack, and tell me how many questions you got correctly." People now solved seven questions on average.

05:30
Now, it wasn't as if there were a few bad apples -- a few people who cheated a lot. Instead, what we saw was a lot of people who each cheated a little bit.
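The claim here is about the shape of the distribution, not just the averages (seven claimed versus four actually solved). A minimal simulation sketch of the two hypotheses, with invented parameters, both tuned to produce roughly the same average claim:

    import random

    random.seed(0)
    N = 1000       # hypothetical number of participants
    BASELINE = 4   # problems actually solved on average, per the talk

    def few_bad_apples():
        # ~20% of people claim all 20 problems; everyone else is honest.
        return [20 if random.random() < 0.2 else BASELINE for _ in range(N)]

    def many_small_cheaters():
        # Nearly everyone inflates their score by a couple of problems.
        return [BASELINE + random.choice([2, 3, 4]) for _ in range(N)]

    for name, scores in [("few bad apples", few_bad_apples()),
                         ("many small cheaters", many_small_cheaters())]:
        mean = sum(scores) / N
        share = sum(s > BASELINE for s in scores) / N
        print(f"{name}: mean claim ~{mean:.1f}, share cheating ~{share:.0%}")

Both models land near a mean claim of seven, but the first implies a small minority of large cheaters while the second implies that most people cheat a little -- and the second is what the shredder condition looked like.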
05:41
Now, in economic theory, cheating is a very simple cost-benefit analysis. You say, what's the probability of being caught? How much do I stand to gain from cheating? And how much punishment would I get if I get caught? And you weigh these options out -- you do the simple cost-benefit analysis, and you decide whether it's worthwhile to commit the crime or not.
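This is the standard rational model of crime from economics (in the spirit of Gary Becker's cost-benefit account, though the talk doesn't name it). As a rough sketch, with illustrative numbers that are not from the talk, the whole theory fits in one expected-value comparison:

    def expected_value_of_cheating(gain, p_caught, punishment):
        """Simple cost-benefit model: you keep the gain if you are not
        caught, and pay the punishment if you are."""
        return (1 - p_caught) * gain - p_caught * punishment

    # Illustrative numbers only: cheat whenever the expectation is positive.
    ev = expected_value_of_cheating(gain=10.0, p_caught=0.05, punishment=50.0)
    print("cheat" if ev > 0 else "stay honest")  # prints "cheat" (EV = 7.0)

On this model, cheating should rise with the payoff and fall with the probability of being caught -- and those are exactly the two knobs the experiments below turn, finding almost no effect from either.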
06:01
So, we tried to test this. For some people, we varied how much money they could get away with -- how much money they could steal. We paid them 10 cents per correct question, 50 cents, a dollar, five dollars, 10 dollars per correct question. You would expect that as the amount of money on the table increases, people would cheat more, but in fact it wasn't the case. We still got a lot of people cheating by stealing a little bit.

06:24
What about the probability of being caught? Some people shredded half the sheet of paper, so there was some evidence left. Some people shredded the whole sheet of paper. Some people shredded everything, went out of the room, and paid themselves from the bowl of money that had over 100 dollars. You would expect that as the probability of being caught goes down, people would cheat more, but again, this was not the case. Again, a lot of people cheated just by a little bit, and they were insensitive to these economic incentives.
06:50
So we said, "If people are not sensitive to the economic rational theory explanations, to these forces, what could be going on?" And we thought maybe what is happening is that there are two forces. On the one hand, we all want to look at ourselves in the mirror and feel good about ourselves, so we don't want to cheat. On the other hand, we can cheat a little bit, and still feel good about ourselves. So, maybe what is happening is that there's a level of cheating we can't go over, but we can still benefit from cheating at a low degree, as long as it doesn't change our impressions about ourselves. We call this the personal fudge factor.
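One way to make this two-force idea concrete is a threshold model: claim more than you earned, but only by an amount small enough that you can still think of yourself as honest. A toy sketch under that assumption (the threshold value is invented, not measured):

    def claimed_score(actual_solved, max_score, fudge_threshold=2):
        """Toy 'fudge factor' model: inflate the score, but cap the
        inflation at whatever self-image will tolerate."""
        pure_self_interest = max_score  # what plain cost-benefit predicts
        fudged = actual_solved + fudge_threshold
        return min(fudged, pure_self_interest)

    print(claimed_score(actual_solved=4, max_score=20))  # prints 6, not 20

In this picture, the interventions that follow move the threshold: moral reminders shrink it toward zero, and distance from cash stretches it.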
07:25
Now, how would you test a personal fudge factor? Initially we said, what can we do to shrink the fudge factor? So, we got people to the lab, and we said, "We have two tasks for you today." First, we asked half the people to recall either 10 books they read in high school, or to recall the Ten Commandments, and then we tempted them with cheating. Turns out the people who tried to recall the Ten Commandments -- and in our sample nobody could recall all of the Ten Commandments -- but those people who tried to recall the Ten Commandments, given the opportunity to cheat, did not cheat at all.

07:58
It wasn't that the more religious people -- the people who remembered more of the Commandments -- cheated less, and the less religious people -- the people who couldn't remember almost any Commandments -- cheated more. The moment people thought about trying to recall the Ten Commandments, they stopped cheating. In fact, even when we gave self-declared atheists the task of swearing on the Bible and gave them a chance to cheat, they didn't cheat at all.

08:21
Now, the Ten Commandments is something that is hard to bring into the education system, so we said, "Why don't we get people to sign the honor code?" So, we got people to sign, "I understand that this short survey falls under the MIT Honor Code." Then they shredded it. No cheating whatsoever. And this is particularly interesting, because MIT doesn't have an honor code.

08:39
(Laughter)
08:44
So, all this was about decreasing the fudge factor. What about increasing the fudge factor? The first experiment -- I walked around MIT and I distributed six-packs of Cokes in the refrigerators -- these were common refrigerators for the undergrads. And I came back to measure what we technically call the half-lifetime of Coke -- how long does it last in the refrigerators? As you can expect, it doesn't last very long; people take it. In contrast, I took a plate with six one-dollar bills, and I left those plates in the same refrigerators. No bill ever disappeared.
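"Half-lifetime" here borrows the exponential-decay idea from physics. Purely to illustrate the borrowed term (the talk reports no actual counts, so the observation below is invented), a half-life can be estimated from how many Cokes remain after some time:

    import math

    def half_life(hours_elapsed, n_start, n_left):
        """Half-life under an assumed exponential decay of the Coke supply."""
        decay_rate = math.log(n_start / n_left) / hours_elapsed
        return math.log(2) / decay_rate

    # Invented observation: 6 Cokes at the start, 2 left a day later.
    print(f"~{half_life(24, 6, 2):.0f} hours")  # ~15 hours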
09:16
Now, this is not a good social science experiment, so to do it better I did the same experiment as I described to you before. A third of the people we passed the sheet to gave it back to us. A third of the people we passed it to shredded it, came to us and said, "Mr. Experimenter, I solved X problems. Give me X dollars." A third of the people, when they finished shredding the piece of paper, came to us and said, "Mr. Experimenter, I solved X problems. Give me X tokens." We did not pay them with dollars; we paid them with something else. And then they took the something else, they walked 12 feet to the side, and exchanged it for dollars.

09:53
Think about the following intuition. How bad would you feel about taking a pencil from work home, compared to how bad you would feel about taking 10 cents from a petty cash box? These things feel very different. Would being a step removed from cash for a few seconds, by being paid in tokens, make a difference? Our subjects doubled their cheating. I'll tell you what I think about this and the stock market in a minute.
10:18
But this did not solve the big problem I had with Enron yet, because in Enron, there's also a social element. People see each other behaving. In fact, every day when we open the news we see examples of people cheating. What does this do to us?

10:33
So, we did another experiment. We got a big group of students to be in the experiment, and we prepaid them. So everybody got an envelope with all the money for the experiment, and we told them that at the end, we would ask them to pay us back the money they didn't make. OK? The same thing happens. When we give people the opportunity to cheat, they cheat. They cheat just by a little bit, all the same. But in this experiment we also hired an acting student. This acting student stood up after 30 seconds and said, "I solved everything. What do I do now?" And the experimenter said, "If you've finished everything, go home. That's it. The task is finished." So, now we had a student -- an acting student -- that was a part of the group. Nobody knew it was an actor. And he clearly cheated in a very, very serious way.

11:20
What would happen to the other people in the group? Would they cheat more, or would they cheat less? Here is what happens. It turns out it depends on what kind of sweatshirt they're wearing. Here is the thing. We ran this at Carnegie Mellon, in Pittsburgh. And in Pittsburgh there are two big universities, Carnegie Mellon and the University of Pittsburgh. All of the subjects sitting in the experiment were Carnegie Mellon students. When the actor who was getting up was a Carnegie Mellon student -- he was actually a Carnegie Mellon student -- and was a part of their group, cheating went up. But when he was wearing a University of Pittsburgh sweatshirt, cheating went down.

12:02
(Laughter)
12:05
Now, this is important, because remember, the moment the student stood up, it made it clear to everybody that they could get away with cheating, because the experimenter said, "You've finished everything. Go home," and they went with the money. So it wasn't so much about the probability of being caught again. It was about the norms for cheating. If somebody from our in-group cheats and we see them cheating, we feel it's more appropriate, as a group, to behave this way. But if it's somebody from another group, these terrible people -- I mean, not terrible in this sense -- but somebody we don't want to associate ourselves with, from another university, another group, all of a sudden people's awareness of honesty goes up -- a little bit like the Ten Commandments experiment -- and people cheat even less.
12:47
So, what have we learned from this about cheating? We've learned that a lot of people can cheat. They cheat just by a little bit. When we remind people about their morality, they cheat less. When we get a bigger distance from cheating -- from the object of money, for example -- people cheat more. And when we see cheating around us, particularly if it's a part of our in-group, cheating goes up.

13:14
Now, if we think about this in terms of the stock market, think about what happens. What happens in a situation when you create something where you pay people a lot of money to see reality in a slightly distorted way? Would they not be able to see it this way? Of course they would. What happens when you do other things, like you remove things from money? You call them stock, or stock options, derivatives, mortgage-backed securities. Could it be that with those more distant things -- it's not a token for one second, it's something that is many steps removed from money for a much longer time -- could it be that people will cheat even more? And what happens to the social environment when people see other people behave around them? I think all of those forces worked in a very bad way in the stock market.
13:59
More generally, I want to tell you something about behavioral economics. We have many intuitions in our life, and the point is that many of these intuitions are wrong. The question is, are we going to test those intuitions? We can think about how we're going to test these intuitions in our private life, in our business life, and most particularly when it comes to policy, when we think about things like No Child Left Behind, when you create new stock markets, when you create other policies -- taxation, health care and so on.

14:31
And the difficulty of testing our intuition was the big lesson I learned when I went back to the nurses to talk to them. So I went back to talk to them and tell them what I found out about removing bandages. And I learned two interesting things. One was that my favorite nurse, Ettie, told me that I did not take her pain into consideration. She said, "Of course, you know, it was very painful for you. But think about me as a nurse, taking, removing the bandages of somebody I liked, and had to do it repeatedly over a long period of time. Creating so much torture was not something that was good for me, too." And she said maybe part of the reason was that it was difficult for her.

15:07
But it was actually more interesting than that, because she said, "I did not think that your intuition was right. I felt my intuition was correct."
15:16
So, if you think about all of your intuitions, it's very hard to believe that your intuition is wrong. And she said, given the fact that she thought her intuition was right, it was very difficult for her to accept doing a difficult experiment to try and check whether she was wrong.

15:34
But in fact, this is the situation we're all in all the time. We have very strong intuitions about all kinds of things -- our own ability, how the economy works, how we should pay school teachers. But unless we start testing those intuitions, we're not going to do better. And just think about how much better my life would have been if these nurses had been willing to check their intuitions, and how much better everything would have been if we just started doing more systematic experimentation on our intuitions.

16:01
Thank you very much.
