ABOUT THE SPEAKER
Alex Edmans - Finance professor, editor
Alex Edmans uses rigorous academic research to influence real-life business practices -- in particular, how companies can pursue purpose as well as profit.

Why you should listen

Alex Edmans is professor of finance at London Business School and managing editor of the Review of Finance, the leading academic finance journal in Europe. He is an expert in corporate governance, executive compensation, corporate social responsibility and behavioral economics.

Edmans has a unique combination of deep academic rigor and practical business experience. He's particularly passionate about translating complex academic research into practical ideas that can then be applied to real-life problems. He has spoken at the World Economic Forum in Davos, at the World Bank Distinguished Speaker Series and in the UK House of Commons. Edmans is heavily involved in the ongoing reform of corporate governance, in particular to ensure that both the diagnosis of problems and suggested solutions are based on rigorous evidence rather than anecdote. He was appointed by the UK government to study the effect of share buybacks on executive pay and investment. Edmans also serves on the Steering Group of The Purposeful Company, which aims to embed purpose into the heart of business, and on Royal London Asset Management's Responsible Investment Advisory Committee.
 
Edmans has been interviewed by Bloomberg, BBC, CNBC, CNN, ESPN, Fox, ITV, NPR, Reuters, Sky News and Sky Sports, and has written for the Wall Street Journal, Financial Times and Harvard Business Review. He runs a blog, Access to Finance, that makes academic research accessible to a general audience, and was appointed Mercers' School Memorial Professor of Business by Gresham College, to give free lectures to the public. Edmans was previously a tenured professor at Wharton, where he won 14 teaching awards in six years. At LBS, he won the Excellence in Teaching award, LBS's highest teaching accolade.

TEDxLondonBusinessSchool

Alex Edmans: What to trust in a "post-truth" world

1,695,337 views

Only if you are truly open to the possibility of being wrong can you ever learn, says researcher Alex Edmans. In an insightful talk, he explores how confirmation bias -- the tendency to only accept information that supports your personal beliefs -- can lead you astray on social media, in politics and beyond, and offers three practical tools for finding evidence you can actually trust. (Hint: appoint someone to be the devil's advocate in your life.)


00:13
Belle Gibson was a happy young Australian. She lived in Perth, and she loved skateboarding. But in 2009, Belle learned that she had brain cancer and four months to live. Two months of chemo and radiotherapy had no effect. But Belle was determined. She'd been a fighter her whole life. From age six, she had to cook for her brother, who had autism, and her mother, who had multiple sclerosis. Her father was out of the picture. So Belle fought, with exercise, with meditation and by ditching meat for fruit and vegetables. And she made a complete recovery.

00:50
Belle's story went viral. It was tweeted, blogged about, shared and reached millions of people. It showed the benefits of shunning traditional medicine for diet and exercise. In August 2013, Belle launched a healthy eating app, The Whole Pantry, downloaded 200,000 times in the first month.

01:13
But Belle's story was a lie. Belle never had cancer.

01:19
People shared her story without ever checking if it was true. This is a classic example of confirmation bias. We accept a story uncritically if it confirms what we'd like to be true. And we reject any story that contradicts it. How often do we see this in the stories that we share and we ignore? In politics, in business, in health advice.

01:47
The Oxford Dictionary's word of 2016 was "post-truth." And the recognition that we now live in a post-truth world has led to a much-needed emphasis on checking the facts. But the punch line of my talk is that just checking the facts is not enough.

02:04
Even if Belle's story were true, it would be just as irrelevant. Why? Well, let's look at one of the most fundamental techniques in statistics. It's called Bayesian inference. And the very simple version is this: We care about "does the data support the theory?" Does the data increase our belief that the theory is true? But instead, we end up asking, "Is the data consistent with the theory?" But being consistent with the theory does not mean that the data supports the theory. Why? Because of a crucial but forgotten third term -- the data could also be consistent with rival theories. But due to confirmation bias, we never consider the rival theories, because we're so protective of our own pet theory.

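To make the speaker's point concrete, here is a minimal sketch of the two-hypothesis form of Bayes' rule (the probabilities are invented for illustration and are not from the talk): a story that is merely consistent with your theory barely shifts belief if it is almost as likely under the rival theories.

```python
# Minimal two-hypothesis Bayes rule (illustrative numbers only, not from the talk).
# A story "consistent with" a theory only supports it if it is *less* likely
# under the rival theories.

def posterior(prior, p_data_given_theory, p_data_given_rival):
    """Posterior belief in the theory after seeing the data."""
    numerator = prior * p_data_given_theory
    return numerator / (numerator + (1 - prior) * p_data_given_rival)

prior = 0.01  # assumed prior belief that diet alone cures cancer

# One viral story like Belle's is nearly as likely under the rival explanation
# (misdiagnosis plus selective sharing) as under the pet theory:
print(posterior(prior, p_data_given_theory=0.9, p_data_given_rival=0.8))   # ~0.011, barely moves

# Data that would be rare under the rivals is what actually shifts belief:
print(posterior(prior, p_data_given_theory=0.9, p_data_given_rival=0.05))  # ~0.15, a real update
```
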
02:58
Now, let's look at this for Belle's story. Well, we care about: Does Belle's story support the theory that diet cures cancer? But instead, we end up asking, "Is Belle's story consistent with diet curing cancer?" And the answer is yes. If diet did cure cancer, we'd see stories like Belle's. But even if diet did not cure cancer, we'd still see stories like Belle's: a single story in which a patient apparently self-cured, just due to being misdiagnosed in the first place. Just like, even if smoking was bad for your health, you'd still see one smoker who lived until 100.

03:42
(Laughter)

03:44
Just like, even if education was good for your income, you'd still see one multimillionaire who didn't go to university.

03:51
(Laughter)

4984
03:56
So the biggest problem with Belle's story
is not that it was false.
66
224056
3911
03:59
It's that it's only one story.
67
227991
2531
04:03
There might be thousands of other stories
where diet alone failed,
68
231094
4381
04:07
but we never hear about them.
69
235499
1934
04:10
We share the outlier cases
because they are new,
70
238141
3896
04:14
and therefore they are news.
71
242061
1867
04:16
We never share the ordinary cases.
72
244657
2476
04:19
They're too ordinary,
they're what normally happens.
73
247157
3213
04:23
And that's the true
99 percent that we ignore.
74
251125
3095
04:26
Just like in society, you can't just
listen to the one percent,
75
254244
2968
04:29
the outliers,
76
257236
1158
04:30
and ignore the 99 percent, the ordinary.
77
258418
2666
04:34
Because that's the second example
of confirmation bias.
78
262022
3254
04:37
We accept a fact as data.
79
265300
2769
04:41
The biggest problem is not
that we live in a post-truth world;
80
269038
3968
04:45
it's that we live in a post-data world.
81
273030
3769
04:49
We prefer a single story to tons of data.
82
277792
3744
04:54
Now, stories are powerful,
they're vivid, they bring it to life.
83
282752
3016
04:57
They tell you to start
every talk with a story.
84
285792
2222
05:00
I did.
85
288038
1150
05:01
But a single story
is meaningless and misleading
86
289696
4754
05:06
unless it's backed up by large-scale data.
87
294474
2849
05:11
But even if we had large-scale data, that might still not be enough, because it could still be consistent with rival theories. Let me explain.

05:22
A classic study by psychologist Peter Wason gives you a set of three numbers and asks you to think of the rule that generated them. So if you're given two, four, six, what's the rule? Well, most people would think it's successive even numbers. How would you test it? Well, you'd propose other sets of successive even numbers: 4, 6, 8 or 12, 14, 16. And Peter would say these sets also work. But knowing that these sets also work, knowing that perhaps hundreds of sets of successive even numbers also work, tells you nothing, because this is still consistent with rival theories. Perhaps the rule is any three even numbers. Or any three increasing numbers.

06:14
And that's the third example of confirmation bias: accepting data as evidence, even if it's consistent with rival theories. Data is just a collection of facts. Evidence is data that supports one theory and rules out others.

06:34
So the best way to support your theory is actually to try to disprove it, to play devil's advocate. So test something like 4, 12, 26. If you got a yes to that, that would disprove your theory of successive even numbers. Yet this test is powerful, because if you got a no, it would rule out "any three even numbers" and "any three increasing numbers." It would rule out the rival theories, but not rule out yours. But most people are too afraid of testing the 4, 12, 26, because they don't want to get a yes and prove their pet theory to be wrong.

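The logic of that test can be spelled out directly. Here is a small sketch of the 2-4-6 task, using the three candidate rules named above (my own illustration, not code from the talk):

```python
# Three candidate rules for the Wason 2-4-6 task.
successive_even = lambda a, b, c: a % 2 == 0 and b == a + 2 and c == b + 2
any_three_even  = lambda a, b, c: a % 2 == 0 and b % 2 == 0 and c % 2 == 0
any_increasing  = lambda a, b, c: a < b < c

theories = {
    "successive even numbers": successive_even,
    "any three even numbers": any_three_even,
    "any three increasing numbers": any_increasing,
}

# Confirming tests vs. the devil's-advocate test from the talk.
for triple in [(4, 6, 8), (12, 14, 16), (4, 12, 26)]:
    print(triple, {name: rule(*triple) for name, rule in theories.items()})

# (4, 6, 8) and (12, 14, 16) satisfy all three rules, so a "yes" tells you nothing.
# (4, 12, 26) separates them: a "no" would rule out both rival theories,
# and a "yes" would disprove "successive even numbers" -- either answer teaches you something.
```
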
07:16
Confirmation bias is not only about failing to search for new data, but it's also about misinterpreting data once you receive it. And this applies outside the lab to important, real-world problems. Indeed, Thomas Edison famously said, "I have not failed, I have found 10,000 ways that won't work." Finding out that you're wrong is the only way to find out what's right.

07:46
Say you're a university admissions director, and your theory is that only students with good grades from rich families do well. So you only let in such students. And they do well. But that's also consistent with the rival theory: perhaps all students with good grades do well, rich or poor. But you never test that theory, because you never let in poor students, because you don't want to be proven wrong.

08:14
So, what have we learned? A story is not fact, because it may not be true. A fact is not data; it may not be representative if it's only one data point. And data is not evidence -- it may not be supportive if it's consistent with rival theories.

08:36
So, what do you do? When you're at the inflection points of life, deciding on a strategy for your business, a parenting technique for your child or a regimen for your health, how do you ensure that you don't have a story but you have evidence? Let me give you three tips.

08:58
The first is to actively seek other viewpoints. Read and listen to people you flagrantly disagree with. Ninety percent of what they say may be wrong, in your view. But what if 10 percent is right? As Aristotle said, "The mark of an educated man is the ability to entertain a thought without necessarily accepting it." Surround yourself with people who challenge you, and create a culture that actively encourages dissent.

09:31
Some banks suffered from groupthink, where staff were too afraid to challenge management's lending decisions, contributing to the financial crisis. In a meeting, appoint someone to be devil's advocate against your pet idea. And don't just hear another viewpoint -- listen to it, as well. As psychologist Stephen Covey said, "Listen with the intent to understand, not the intent to reply." A dissenting viewpoint is something to learn from, not to argue against.

10:07
Which takes us to the other forgotten terms in Bayesian inference. Because data allows you to learn, but learning is only relative to a starting point. If you started with complete certainty that your pet theory must be true, then your view won't change -- regardless of what data you see. Only if you are truly open to the possibility of being wrong can you ever learn.

10:35
As Leo Tolstoy wrote, "The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already. But the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already."

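In the same spirit as the earlier sketch, here is what a dogmatic prior does to the two-hypothesis Bayes rule (again, illustrative numbers only):

```python
# With a prior of 1, no data can move you -- the "starting point" the speaker describes.

def posterior(prior, p_data_given_theory, p_data_given_rival):
    numerator = prior * p_data_given_theory
    return numerator / (numerator + (1 - prior) * p_data_given_rival)

for prior in (1.0, 0.99, 0.5):
    # data that is ten times more likely under the rival theory
    print(prior, "->", round(posterior(prior, 0.05, 0.5), 3))

# 1.0  -> 1.0    complete certainty: the view never changes, whatever the data
# 0.99 -> 0.908  near-certainty: it barely budges
# 0.5  -> 0.091  genuine openness: the data can actually change your mind
```
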
10:56
Tip number two is "listen to experts." Now, that's perhaps the most unpopular advice that I could give you.

11:04
(Laughter)

11:05
British politician Michael Gove famously said that people in this country have had enough of experts. A recent poll showed that more people would trust their hairdresser --

11:17
(Laughter)

11:19
or the man on the street than they would leaders of businesses, the health service and even charities. So we respect a teeth-whitening formula discovered by a mom, or we listen to an actress's view on vaccination. We like people who tell it like it is, who go with their gut, and we call them authentic. But gut feel can only get you so far. Gut feel would tell you never to give water to a baby with diarrhea, because it would just flow out the other end. Expertise tells you otherwise.

11:53
You'd never trust your surgery to the man on the street. You'd want an expert who spent years doing surgery and knows the best techniques. But that should apply to every major decision. Politics, business, health advice require expertise, just like surgery.

12:16
So then, why are experts so mistrusted? Well, one reason is they're seen as out of touch. A millionaire CEO couldn't possibly speak for the man on the street. But true expertise is founded on evidence. And evidence stands up for the man on the street and against the elites, because evidence forces you to prove it. Evidence prevents the elites from imposing their own view without proof.

12:49
A second reason why experts are not trusted is that different experts say different things. For every expert who claimed that leaving the EU would be bad for Britain, another expert claimed it would be good. Half of these so-called experts will be wrong. And I have to admit that most papers written by experts are wrong, or at best, make claims that the evidence doesn't actually support. So we can't just take an expert's word for it.

13:18
In November 2016, a study on executive pay hit national headlines, even though none of the newspapers that covered the study had even seen it. It wasn't even out yet. They just took the author's word for it, just like with Belle.

13:38
Nor does it mean that we can just handpick any study that happens to support our viewpoint -- that would, again, be confirmation bias. Nor does it mean that if seven studies show A and three show B, that A must be true. What matters is the quality, and not the quantity, of expertise.

13:57
So we should do two things. First, we should critically examine the credentials of the authors, just like you'd critically examine the credentials of a potential surgeon. Are they truly experts in the matter, or do they have a vested interest? Second, we should pay particular attention to papers published in the top academic journals.

14:24
Now, academics are often accused of being detached from the real world. But this detachment gives you years to spend on a study: to really nail down a result, to rule out those rival theories, and to distinguish correlation from causation. And academic journals involve peer review, where a paper is rigorously scrutinized --

14:45
(Laughter)

14:47
by the world's leading minds. The better the journal, the higher the standard. The most elite journals reject 95 percent of papers.

14:59
Now, academic evidence is not everything. Real-world experience is critical, also. And peer review is not perfect; mistakes are made. But it's better to go with something checked than something unchecked. If we latch onto a study because we like the findings, without considering who it's by or whether it's even been vetted, there is a massive chance that that study is misleading.

15:26
And those of us who claim to be experts should recognize the limitations of our analysis. Very rarely is it possible to prove or predict something with certainty, yet it's so tempting to make a sweeping, unqualified statement. It's easier to turn into a headline or to be tweeted in 140 characters. But even evidence may not be proof. It may not be universal; it may not apply in every setting. So don't say, "Red wine causes longer life," when the evidence is only that red wine is correlated with longer life -- and only then in people who exercise as well.

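To see why the wording matters, here is a toy simulation (all numbers invented) in which exercise drives both wine drinking and lifespan, so wine is correlated with longer life even though, by construction, it causes nothing:

```python
import random
random.seed(0)

# Exercise (the confounder) raises both the chance of drinking wine and lifespan.
people = []
for _ in range(10_000):
    exercises = random.random() < 0.5
    drinks_wine = random.random() < (0.6 if exercises else 0.3)
    lifespan = 80 + (5 if exercises else 0) + random.gauss(0, 3)
    people.append((drinks_wine, exercises, lifespan))

def avg_lifespan(rows):
    return sum(r[2] for r in rows) / len(rows)

wine    = [p for p in people if p[0]]
no_wine = [p for p in people if not p[0]]
print("wine vs no wine:", round(avg_lifespan(wine) - avg_lifespan(no_wine), 2))  # ~ +1.5 years

# Condition on the confounder and the apparent effect disappears:
for ex in (True, False):
    w  = [p for p in people if p[0] and p[1] == ex]
    nw = [p for p in people if not p[0] and p[1] == ex]
    print("exercise =", ex, ":", round(avg_lifespan(w) - avg_lifespan(nw), 2))    # ~ 0
```
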
16:11
Tip number three is "pause before sharing anything." The Hippocratic oath says, "First, do no harm." What we share is potentially contagious, so be very careful about what we spread. Our goal should not be to get likes or retweets. Otherwise, we only share the consensus; we don't challenge anyone's thinking. Otherwise, we only share what sounds good, regardless of whether it's evidence.

16:42
Instead, we should ask the following: If it's a story, is it true? If it's true, is it backed up by large-scale evidence? If it is, who is it by, what are their credentials? Is it published, how rigorous is the journal? And ask yourself the million-dollar question: If the same study was written by the same authors with the same credentials but found the opposite results, would you still be willing to believe it and to share it?

17:13
Treating any problem -- a nation's economic problem or an individual's health problem -- is difficult. So we must ensure that we have the very best evidence to guide us. Only if it's true can it be fact. Only if it's representative can it be data. Only if it's supportive can it be evidence. And only with evidence can we move from a post-truth world to a pro-truth world.

17:44
Thank you very much.

17:45
(Applause)


Data provided by TED.
