ABOUT THE SPEAKER
Noreena Hertz - Economist
Noreena Hertz looks at global culture -- financial and otherwise -- using an approach that combines traditional economic analysis with foreign policy trends, psychology, behavioural economics, anthropology, history and sociology.

Why you should listen

For more than two decades, Noreena Hertz’s economic predictions have been accurate and ahead of the curve. In her book The Silent Takeover, Hertz predicted that unregulated markets and massive financial institutions would have serious global consequences, while her 2005 book IOU: The Debt Threat predicted the 2008 financial crisis.

An influential economist on the international stage, Hertz also played a key role in the development of (RED), an innovative commercial model to raise money for people with AIDS in Africa, having inspired Bono (co-founder of the project) with her writings.

Her work is considered to provide a much-needed blueprint for rethinking economics and corporate strategy. She is the Duisenberg Professor of Globalization, Sustainability and Finance, based at the Duisenberg School of Finance, RSM, Erasmus University and the University of Cambridge. She is also a Fellow of University College London.

TEDSalon London 2010

Noreena Hertz: How to use experts -- and when not to

951,198 views

We make important decisions every day -- and we often rely on experts to help us decide. But, says economist Noreena Hertz, relying too much on experts can be limiting and even dangerous. She calls for us to start democratizing expertise -- to listen not only to "surgeons and CEOs, but also to shop staff."


00:15
It's Monday morning. In Washington, the president of the United States is sitting in the Oval Office, assessing whether or not to strike Al Qaeda in Yemen. At Number 10 Downing Street, David Cameron is trying to work out whether to cut more public sector jobs in order to stave off a double-dip recession. In Madrid, Maria Gonzalez is standing at the door, listening to her baby crying and crying, trying to work out whether she should let it cry until it falls asleep or pick it up and hold it. And I am sitting by my father's bedside in hospital, trying to work out whether I should let him drink the one-and-a-half-liter bottle of water that his doctors just came in and said, "You must make him drink today" -- my father's been nil by mouth for a week -- or whether, by giving him this bottle, I might actually kill him.
01:23
We face momentous decisions with important consequences throughout our lives, and we have strategies for dealing with these decisions. We talk things over with our friends, we scour the Internet, we search through books. But still, even in this age of Google and TripAdvisor and Amazon Recommends, it's still experts that we rely upon most -- especially when the stakes are high and the decision really matters. Because in a world of data deluge and extreme complexity, we believe that experts are more able to process information than we can -- that they are able to come to better conclusions than we could come to on our own. And in an age that is sometimes nowadays frightening or confusing, we feel reassured by the almost parental-like authority of experts who tell us so clearly what it is we can and cannot do.
02:38
But I believe that this is a big problem, a problem with potentially dangerous consequences for us as a society, as a culture and as individuals. It's not that experts have not massively contributed to the world -- of course they have. The problem lies with us: we've become addicted to experts. We've become addicted to their certainty, their assuredness, their definitiveness, and in the process, we have ceded our responsibility, substituting our intellect and our intelligence for their supposed words of wisdom. We've surrendered our power, trading off our discomfort with uncertainty for the illusion of certainty that they provide.
03:36
This is no exaggeration. In a recent experiment, a group of adults had their brains scanned in an MRI machine as they were listening to experts speak. The results were quite extraordinary. As they listened to the experts' voices, the independent decision-making parts of their brains switched off. It literally flat-lined. And they listened to whatever the experts said and took their advice, however right or wrong.
04:12
But experts do get things wrong. Did you know that studies show that doctors misdiagnose four times out of 10? Did you know that if you file your tax returns yourself, you're statistically more likely to be filing them correctly than if you get a tax adviser to do it for you? And then there's, of course, the example that we're all too aware of: financial experts getting it so wrong that we're living through the worst recession since the 1930s.
04:52
For the sake of our health, our wealth and our collective security, it's imperative that we keep the independent decision-making parts of our brains switched on. And I'm saying this as an economist who, over the past few years, has focused my research on what it is we think and who it is we trust and why, but also -- and I'm aware of the irony here -- as an expert myself, as a professor, as somebody who advises prime ministers, heads of big companies, international organizations, but an expert who believes that the role of experts needs to change, that we need to become more open-minded, more democratic and be more open to people rebelling against our points of view.
05:51
So in order to help you understand where I'm coming from, let me bring you into my world, the world of experts. Now there are, of course, exceptions, wonderful, civilization-enhancing exceptions. But what my research has shown me is that experts tend on the whole to form very rigid camps, that within these camps a dominant perspective emerges that often silences opposition, that experts move with the prevailing winds, often hero-worshipping their own gurus. Alan Greenspan's proclamations that the years of economic growth would go on and on went unchallenged by his peers, until after the crisis, of course.
06:51
You see, we also learn that experts are located, are governed, by the social and cultural norms of their times -- whether it be the doctors in Victorian England, say, who sent women to asylums for expressing sexual desire, or the psychiatrists in the United States who, up until 1973, were still categorizing homosexuality as a mental illness.
07:27
And what all this means is that paradigms take far too long to shift, that complexity and nuance are ignored, and also that money talks -- because we've all seen the evidence of pharmaceutical companies funding studies of drugs that conveniently leave out their worst side effects, or studies funded by food companies of their new products, massively exaggerating the health benefits of the products they're about to bring to market. One study showed that food companies typically exaggerated seven times more than an independent study.
08:15
And we've also got to be aware that experts, of course, also make mistakes. They make mistakes every single day -- mistakes born out of carelessness. A recent study in the Archives of Surgery reported surgeons removing healthy ovaries, operating on the wrong side of the brain, carrying out procedures on the wrong hand, elbow, eye, foot, and also mistakes born out of thinking errors. A common thinking error of radiologists, for example -- when they look at CT scans -- is that they're overly influenced by whatever it is that the referring physician has said that he suspects the patient's problem to be. So if a radiologist is looking at the scan of a patient with suspected pneumonia, say, what happens is that, if they see evidence of pneumonia on the scan, they literally stop looking at it -- thereby missing the tumor sitting three inches below on the patient's lungs.
09:31
I've shared with you so far some insights into the world of experts. These are, of course, not the only insights I could share, but I hope they give you a clear sense at least of why we need to stop kowtowing to them, why we need to rebel and why we need to switch our independent decision-making capabilities on. But how can we do this? Well, for the sake of time, I want to focus on just three strategies.
10:06
First, we've got to be ready and willing to take experts on and dispense with this notion of them as modern-day apostles. This doesn't mean having to get a Ph.D. in every single subject, you'll be relieved to hear. But it does mean persisting in the face of their inevitable annoyance when, for example, we want them to explain things to us in language that we can actually understand. Why was it that, when I had an operation, my doctor said to me, "Beware, Ms. Hertz, of hyperpyrexia," when he could have just as easily said, "Watch out for a high fever"?
10:53
You see, being ready to take experts on is also about being willing to dig behind their graphs, their equations, their forecasts, their prophecies, and being armed with the questions to do that -- questions like: What are the assumptions that underpin this? What is the evidence upon which this is based? What has your investigation focused on? And what has it ignored?
11:24
It recently came out that experts trialing drugs before they come to market typically trial drugs first primarily on male animals, and then primarily on men. It seems that they've somehow overlooked the fact that over half the world's population are women. And women have drawn the short medical straw, because it now turns out that many of these drugs don't work nearly as well on women as they do on men -- and the drugs that do work well work so well that they're actively harmful for women to take. Being a rebel is about recognizing that experts' assumptions and their methodologies can easily be flawed.
12:17
Second, we need to create the space for what I call "managed dissent." If we are to shift paradigms, if we are to make breakthroughs, if we are to destroy myths, we need to create an environment in which expert ideas are battling it out, in which we're bringing new, diverse, discordant, heretical views into the discussion, fearlessly, in the knowledge that progress comes about not only from the creation of ideas, but also from their destruction, and in the knowledge that surrounding ourselves with divergent, discordant, heretical views, as all the research now shows us, actually makes us smarter.
13:13
Encouraging dissent is a rebellious notion because it goes against our very instincts, which are to surround ourselves with opinions and advice that we already believe or want to be true. And that's why I talk about the need to actively manage dissent.
13:35
Google CEO Eric Schmidt is a practical practitioner of this philosophy. In meetings, he looks out for the person in the room -- arms crossed, looking a bit bemused -- and draws them into the discussion, trying to see if they indeed are the person with a different opinion, so that they have dissent within the room. Managing dissent is about recognizing the value of disagreement, discord and difference.
14:10
But we need to go even further. We need to fundamentally redefine who it is that experts are. The conventional notion is that experts are people with advanced degrees, fancy titles, diplomas, best-selling books -- high-status individuals. But just imagine if we were to junk this notion of expertise as some sort of elite cadre and instead embrace the notion of democratized expertise -- whereby expertise was not just the preserve of surgeons and CEOs, but also shop-girls -- yeah.
15:01
Best Buy, the consumer electronics company, gets all its employees -- the cleaners, the shop assistants, the people in the back office, not just its forecasting team -- to place bets, yes bets, on things like whether or not a product is going to sell well before Christmas, on whether customers' new ideas are going to be or should be taken on by the company, on whether a project will come in on time. By leveraging and by embracing the expertise within the company, Best Buy was able to discover, for example, that the store that it was going to open in China -- its big, grand store -- was not going to open on time. Because when it asked its staff, all its staff, to place their bets on whether they thought the store would open on time or not, a group from the finance department placed all their chips on that not happening. It turned out that they were aware, as no one else within the company was, of a technological blip that neither the forecasting experts nor the experts on the ground in China were even aware of.
16:25
The strategies that I have discussed this evening -- embracing dissent, taking experts on, democratizing expertise, rebellious strategies -- are strategies that I think would serve us all well to embrace as we try to deal with the challenges of these very confusing, complex, difficult times. For if we keep the independent decision-making parts of our brains switched on, if we challenge experts, if we're skeptical, if we devolve authority, if we are rebellious, but also if we become much more comfortable with nuance, uncertainty and doubt, and if we allow our experts to express themselves using those terms too, we will set ourselves up much better for the challenges of the 21st century. For now, more than ever, is not the time to be blindly following, blindly accepting, blindly trusting. Now is the time to face the world with eyes wide open -- yes, using experts to help us figure things out, for sure -- I don't want to completely do myself out of a job here -- but being aware of their limitations and, of course, also our own.
18:05
Thank you.

(Applause)
