ABOUT THE SPEAKER
Alexander Wagner - Economist
Alexander Wagner balances two passions: the thrill of seeking knowledge about fundamentals of human behavior for knowledge's sake, and the desire to apply insights in the real world and to improve the workings of markets and organizations.

Why you should listen

Alexander Wagner has discovered that to most people, what matters is not only how much money they receive but also whether they behaved honestly to receive that money. As Swiss Finance Institute professor at the University of Zurich's Department of Banking and Finance, Wagner has taught corporate finance to thousands of eager students and hundreds of motivated executives, and he has helped shape governance systems of companies large and small. His recent research deals with how investors perceive managerial words and deeds … and with the stock market implications of the Trump election.

TEDxZurich

Alexander Wagner: What really motivates people to be honest in business

1,661,864 views

Each year, one in seven large corporations commits fraud. Why? To find out, Alexander Wagner takes us inside the economics, ethics and psychology of doing the right thing. Join him for an introspective journey down the slippery slopes of deception as he helps us understand why people behave the way they do.


00:12
How many companies have you interacted with today?

00:17
Well, you got up in the morning, took a shower, washed your hair, used a hair dryer, ate breakfast -- ate cereals, fruit, yogurt, whatever -- had coffee -- tea. You took public transport to come here, or maybe used your private car. You interacted with the company that you work for or that you own. You interacted with your clients, your customers, and so on and so forth. I'm pretty sure there are at least seven companies you've interacted with today.
00:49
Let me tell you a stunning statistic. One out of seven large, public corporations commits fraud every year.

01:00
This is a US academic study that looks at US companies -- I have no reason to believe that it's different in Europe. This is a study that looks at both detected and undetected fraud using statistical methods. This is not petty fraud. These frauds cost the shareholders of these companies, and therefore society, on the order of 380 billion dollars per year.

01:24
We can all think of some examples, right? The car industry's secrets aren't quite so secret anymore. Fraud has become a feature, not a bug, of the financial services industry. That's not me who's claiming that; that's the president of the American Finance Association, who stated that in his presidential address. That's a huge problem if you think about, especially, an economy like Switzerland, which relies so much on the trust put into its financial industry.
01:56
On the other hand, there are six out of seven companies who actually remain honest despite all temptations to start engaging in fraud. There are whistle-blowers like Michael Woodford, who blew the whistle on Olympus. These whistle-blowers risk their careers, their friendships, to bring out the truth about their companies. There are journalists like Anna Politkovskaya who risk even their lives to report human rights violations. She got killed -- every year, around 100 journalists get killed because of their conviction to bring out the truth.

02:32
So in my talk today, I want to share with you some insights I've obtained and learned in the last 10 years of conducting research in this. I'm a researcher, a scientist working with economists, financial economists, ethicists, neuroscientists, lawyers and others, trying to understand what makes humans tick, and how we can address this issue of fraud in corporations and therefore contribute to the improvement of the world.
02:59
I want to start by sharing with you two very distinct visions of how people behave. First, meet Adam Smith, founding father of modern economics. His basic idea was that if everybody behaves in their own self-interests, that's good for everybody in the end. Self-interest isn't a narrowly defined concept just for your immediate utility. It has a long-run implication.

03:25
Let's think about that. Think about this dog here. That might be us. There's this temptation -- I apologize to all vegetarians, but -- (Laughter) Dogs do like the bratwurst. (Laughter) Now, the straight-up, self-interested move here is to go for that. So my friend Adam here might jump up, get the sausage and thereby ruin all this beautiful tableware.
03:52
But that's not what Adam Smith meant. He didn't mean disregard all consequences -- to the contrary. He would have thought, well, there may be negative consequences, for example, the owner might be angry with the dog and the dog, anticipating that, might not behave in this way. That might be us, weighing the benefits and costs of our actions.

04:14
How does that play out? Well, many of you, I'm sure, have in your companies, especially if it's a large company, a code of conduct. And then if you behave according to that code of conduct, that improves your chances of getting a bonus payment. And on the other hand, if you disregard it, then there are higher chances of not getting your bonus or its being diminished. In other words, this is a very economic motivation of trying to get people to be more honest, or more aligned with the corporation's principles.
04:46
Similarly, reputation is a very powerful economic force, right? We try to build a reputation, maybe for being honest, because then people trust us more in the future.

04:57
Right? Adam Smith talked about the baker who's not producing good bread out of his benevolence for those people who consume the bread, but because he wants to sell more future bread.

05:12
In my research at the University of Zurich, we find, for example, that Swiss banks that get caught up in the media -- in the context of tax evasion or tax fraud, say -- and receive bad media coverage lose net new money in the future and therefore make lower profits. That's a very powerful reputational force.
05:34
Benefits and costs.

05:37
Here's another viewpoint of the world. Meet Immanuel Kant, 18th-century German philosopher superstar. He developed this notion that independent of the consequences, some actions are just right and some are just wrong. It's just wrong to lie, for example. So, meet my friend Immanuel here. He knows that the sausage is very tasty, but he's going to turn away because he's a good dog. He knows it's wrong to jump up and risk ruining all this beautiful tableware.
06:12
If you believe that people are motivated like that, then all the stuff about incentives, all the stuff about code of conduct and bonus systems and so on, doesn't make a whole lot of sense. People are motivated by different values perhaps.

06:27
So, what are people actually motivated by? These two gentlemen here have perfect hairdos, but they give us very different views of the world.

06:37
What do we do with this? Well, I'm an economist and we conduct so-called experiments to address this issue. We strip away facts which are confusing in reality. Reality is so rich, there is so much going on, it's almost impossible to know what drives people's behavior really.

06:55
So let's do a little experiment together. Imagine the following situation.
07:02
You're in a room alone, not like here. There's a five-franc coin like the one I'm holding up right now in front of you. Here are your instructions: toss the coin four times, and then on a computer terminal in front of you, enter the number of times tails came up.

07:23
This is the situation. Here's the rub. For every time that you announce that you had a tails throw, you get paid five francs. So if you say I had two tails throws, you get paid 10 francs. If you say you had zero, you get paid zero francs. If you say, "I had four tails throws," then you get paid 20 francs. It's anonymous, nobody's watching what you're doing, and you get paid that money anonymously.
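
To make the stakes concrete, here is a minimal sketch of the payout rule just described. Only the five-francs-per-reported-tail rule comes from the talk; the function and variable names and the expected-value arithmetic are added for illustration.

```python
# Illustrative sketch of the payout rule described above (not the study's code):
# each reported tails throw pays 5 francs, so a report of k tails pays 5 * k francs.

def payout(reported_tails: int) -> int:
    """Francs paid out for a reported number of tails (0 to 4)."""
    return 5 * reported_tails

print(payout(0), payout(2), payout(4))  # 0, 10, 20 francs

# A fair coin yields 2 tails in four tosses on average, so truthful reporting
# pays about 10 francs in expectation, while always claiming four tails pays 20.
# That gap is the temptation built into the design.
```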
07:49
I've got two questions for you. (Laughter) You know what's coming now, right?

07:56
First, how would you behave in that situation? The second, look to your left and look to your right -- (Laughter) and think about how the person sitting next to you might behave in that situation.

08:08
We did this experiment for real. We did it at the Manifesta art exhibition that took place here in Zurich recently, not with students in the lab at the university but with the real population, like you guys.

08:22
First, a quick reminder of stats. If I throw the coin four times and it's a fair coin, then the probability that it comes up four times tails is 6.25 percent. And I hope you can intuitively see that the probability that all four of them are tails is much lower than if two of them are tails, right? Here are the specific numbers.
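
The fair-coin numbers behind that statement follow directly from the binomial distribution. The short sketch below reproduces them; only the 6.25 percent figure is quoted in the talk, and the function name is mine.

```python
# Probability of exactly k tails in four tosses of a fair coin (binomial distribution).
from math import comb

def prob_tails(k: int, n: int = 4, p: float = 0.5) -> float:
    """P(exactly k tails in n independent tosses, tails probability p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

for k in range(5):
    print(f"P({k} tails) = {prob_tails(k):.4f}")
# Prints 0.0625, 0.2500, 0.3750, 0.2500, 0.0625: under honest reporting,
# only about 6.25 percent of participants should claim four tails.
```

That benchmark is what makes the result reported next -- around 30 to 35 percent claiming four tails -- so striking.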
08:46
Here's what happened. People did this experiment for real. Around 30 to 35 percent of people said, "Well, I had four tails throws." That's extremely unlikely. (Laughter)

09:01
But the really amazing thing here, perhaps to an economist, is there are around 65 percent of people who did not say "I had four tails throws," even though in that situation, nobody's watching you, the only consequence that's in place is you get more money if you say four than less.

09:22
You leave 20 francs on the table by announcing zero. I don't know whether the other people all were honest or whether they also said a little bit higher or lower than what they did because it's anonymous. We only observed the distribution.

09:35
But what I can tell you -- and here's another coin toss. There you go, it's tails. (Laughter) Don't check, OK? (Laughter)

09:45
What I can tell you is that not everybody behaved like Adam Smith would have predicted.

09:52
So what does that leave us with? Well, it seems people are motivated by certain intrinsic values, and in our research, we look at this.
10:01
We look at the idea that people have so-called protected values. A protected value isn't just any value. A protected value is a value where you're willing to pay a price to uphold that value. You're willing to pay a price to withstand the temptation to give in. And the consequence is you feel better if you earn money in a way that's consistent with your values.

10:29
Let me show you this again in the metaphor of our beloved dog here. If we succeed in getting the sausage without violating our values, then the sausage tastes better. That's what our research shows.

10:42
If, on the other hand, we do so -- if we get the sausage and in doing so we actually violate values, we value the sausage less.

10:53
Quantitatively, that's quite powerful. We can measure these protected values, for example, by a survey measure -- a simple, nine-item survey that's quite predictive in these experiments. If you think about the average of the population, there's a distribution around it -- people are different, we all are different. People whose set of protected values is one standard deviation above the average discount money they receive by lying by about 25 percent. That means a dollar received when lying is worth to them only 75 cents, without any incentives you put in place for them to behave honestly. It's their intrinsic motivation.
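
As a worked version of that last figure (using only the 25 percent discount quoted in the talk; the symbol is mine), the subjective value of a dishonestly earned dollar for such a person is:

```latex
v_{\text{lying}} = (1 - 0.25) \times \$1.00 = \$0.75
```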
11:39
By the way, I'm not a moral authority. I'm not saying I have all these beautiful values, right? But I'm interested in how people behave and how we can leverage that richness in human nature to actually improve the workings of our organizations.

11:54
So there are two very, very different visions here. On the one hand, you can appeal to benefits and costs and try to get people to behave according to them. On the other hand, you can select people who have the values and the desirable characteristics, of course -- competencies that go in line with your organization.

12:16
I do not yet know where these protected values really come from. Is it nurture or is it nature? What I can tell you is that the distribution looks pretty similar for men and women. It looks pretty similar for those who had studied economics or those who had studied psychology. It looks even pretty similar around different age categories among adults. But I don't know yet how this develops over a lifetime. That will be the subject of future research.

12:49
The idea I want to leave you with is it's all right to appeal to incentives. I'm an economist; I certainly believe in the fact that incentives work. But do think about selecting the right people rather than having people and then putting incentives in place. Selecting the right people with the right values may go a long way to saving a lot of trouble and a lot of money in your organizations. In other words, it will pay off to put people first.

13:22
Thank you.

13:23
(Applause)
