ABOUT THE SPEAKER
Ben Goldacre - Debunker
Ben Goldacre unpicks dodgy scientific claims made by scaremongering journalists, dubious government reports, pharmaceutical corporations, PR companies and quacks.

Why you should listen

"It was the MMR story that finally made me crack," begins the Bad Science manifesto, referring to the sensationalized -- and now-refuted -- link between vaccines and autism. With that sentence Ben Goldacre fired the starting shot of a crusade waged from the pages of The Guardian from 2003 to 2011, on an addictive Twitter feed, and in bestselling books, including Bad Science and his latest, Bad Pharma, which puts the $600 billion global pharmaceutical industry under the microscope. What he reveals is a fascinating, terrifying mess.

Goldacre was trained in medicine at Oxford and London, and works as an academic in epidemiology. Helped along by this inexhaustible supply of material, he also travels the speaking circuit, promoting skepticism and nerdish curiosity with fire, wit, fast delivery and a lovable kind of exasperation. (He might even convince you that real science, sober reporting and reason are going to win in the end.)

As he writes, "If you're a journalist who misrepresents science for the sake of a headline, a politician more interested in spin than evidence, or an advertiser who loves pictures of molecules in little white coats, then beware: your days are numbered."


Ben Goldacre | Speaker | TED.com
TEDGlobal 2011

Ben Goldacre: Battling bad science

2,713,579 views

Every day there are news reports of new health advice, but how can you know if they're right? Doctor and epidemiologist Ben Goldacre shows us, at high speed, the ways evidence can be distorted, from the blindingly obvious nutrition claims to the very subtle tricks of the pharmaceutical industry.


00:15
So I'm a doctor, but I kind of slipped sideways into research, and now I'm an epidemiologist. And nobody really knows what epidemiology is. Epidemiology is the science of how we know in the real world if something is good for you or bad for you. And it's best understood through example, as the science of those crazy, wacky newspaper headlines.
00:34
And these are just some of the examples. These are from the Daily Mail. Every country in the world has a newspaper like this. It has this bizarre, ongoing philosophical project of dividing all the inanimate objects in the world into the ones that either cause or prevent cancer. So here are some of the things they said cause cancer recently: divorce, Wi-Fi, toiletries and coffee. Here are some of the things they say prevent cancer: crusts, red pepper, licorice and coffee. So already you can see there are contradictions. Coffee both causes and prevents cancer. And as you start to read on, you can see that maybe there's some kind of political valence behind some of this. So for women, housework prevents breast cancer, but for men, shopping could make you impotent. So we know that we need to start unpicking the science behind this.
01:15
And what I hope to show is that unpicking dodgy claims, unpicking the evidence behind dodgy claims, isn't a kind of nasty, carping activity; it's socially useful, but it's also an extremely valuable explanatory tool. Because real science is all about critically appraising the evidence for somebody else's position. That's what happens in academic journals. That's what happens at academic conferences. The Q&A session after a postdoc presents data is often a bloodbath. And nobody minds that. We actively welcome it. It's like a consenting intellectual S&M activity.
01:47
So what I'm going to show you is all of the main things, all of the main features of my discipline -- evidence-based medicine. And I will talk you through all of these and demonstrate how they work, exclusively using examples of people getting stuff wrong.
02:02
So we'll start with the absolute weakest form of evidence known to man, and that is authority. In science, we don't care how many letters you have after your name. In science, we want to know what your reasons are for believing something. How do you know that something is good for us or bad for us? But we're also unimpressed by authority, because it's so easy to contrive. This is somebody called Dr. Gillian McKeith Ph.D., or, to give her full medical title, Gillian McKeith.

02:26
(Laughter)

02:29
Again, every country has somebody like this. She is our TV diet guru. She has had five massive series of prime-time television, giving out very lavish and exotic health advice. She, it turns out, has a non-accredited correspondence-course Ph.D. from somewhere in America. She also boasts that she's a certified professional member of the American Association of Nutritional Consultants, which sounds very glamorous and exciting. You get a certificate and everything. This one belongs to my dead cat Hetti. She was a horrible cat. You just go to the website, fill out the form, give them $60, and it arrives in the post. Now that's not the only reason that we think this person is an idiot. She also goes and says things like, you should eat lots of dark green leaves, because they contain lots of chlorophyll, and that will really oxygenate your blood. And anybody who's done school biology remembers that chlorophyll and chloroplasts only make oxygen in sunlight, and it's quite dark in your bowels after you've eaten spinach.
03:15
Next, we need proper science, proper evidence. So, "Red wine can help prevent breast cancer." This is a headline from the Daily Telegraph in the U.K. "A glass of red wine a day could help prevent breast cancer." So you go and find this paper, and what you find is it is a real piece of science. It is a description of the changes in one enzyme when you drip a chemical extracted from some red grape skin onto some cancer cells in a dish on a bench in a laboratory somewhere. And that's a really useful thing to describe in a scientific paper, but on the question of your own personal risk of getting breast cancer if you drink red wine, it tells you absolutely bugger all. Actually, it turns out that your risk of breast cancer actually increases slightly with every amount of alcohol that you drink. So what we want are studies in real human people.
04:00
And here's another example. This is from Britain's leading diet and nutritionist in the Daily Mirror, which is our second biggest-selling newspaper. "An Australian study in 2001 found that olive oil, in combination with fruits, vegetables and pulses, offers measurable protection against skin wrinklings." And then they give you advice: "If you eat olive oil and vegetables, you'll have fewer skin wrinkles." And they very helpfully tell you how to go and find the paper. So you go and find the paper, and what you find is an observational study. Obviously nobody has been able to go back to 1930, get all the people born in one maternity unit, and half of them eat lots of fruit and veg and olive oil, and then half of them eat McDonald's, and then we see how many wrinkles you've got later. You have to take a snapshot of how people are now. And what you find is, of course, people who eat veg and olive oil have fewer skin wrinkles. But that's because people who eat fruit and veg and olive oil, they're freaks, they're not normal, they're like you; they come to events like this. They are posh, they're wealthy, they're less likely to have outdoor jobs, they're less likely to do manual labor, they have better social support, they're less likely to smoke -- so for a whole host of fascinating, interlocking social, political and cultural reasons, they are less likely to have skin wrinkles. That doesn't mean that it's the vegetables or the olive oil.

05:05
(Laughter)
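The confounding described here can be sketched in a few lines of simulation. All numbers below are hypothetical, invented purely for illustration: a latent "affluence" variable drives both the olive-oil eating and the wrinkle risk, while the diet itself has no causal effect at all -- yet a naive snapshot comparison still shows eaters with fewer wrinkles.

```python
import random

random.seed(0)

# Hypothetical illustration of confounding: affluence drives both diet and
# wrinkle risk; the diet itself does nothing. All probabilities are made up.
people = []
for _ in range(10_000):
    affluent = random.random() < 0.5
    eats_olive_oil = random.random() < (0.7 if affluent else 0.2)
    # Wrinkle risk depends only on affluence (outdoor work, smoking, etc.),
    # not on diet.
    wrinkles = random.random() < (0.2 if affluent else 0.6)
    people.append((eats_olive_oil, wrinkles))

def wrinkle_rate(group):
    return sum(w for _, w in group) / len(group)

eaters = [p for p in people if p[0]]
others = [p for p in people if not p[0]]
print(f"wrinkle rate, olive-oil eaters: {wrinkle_rate(eaters):.2f}")
print(f"wrinkle rate, everyone else:    {wrinkle_rate(others):.2f}")
# The eaters show markedly fewer wrinkles despite the diet having zero effect.
```

The observed gap comes entirely from the eaters being disproportionately affluent, which is exactly why a snapshot cannot answer a causal question.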
05:07
So ideally what you want to do is a trial. And everybody thinks they're very familiar with the idea of a trial. Trials are very old. The first trial was in the Bible -- Daniel 1:12. It's very straightforward -- you take a bunch of people, you split them in half, you treat one group one way, you treat the other group the other way, and a little while later, you follow them up and see what happened to each of them. So I'm going to tell you about one trial, which is probably the most well-reported trial in the U.K. news media over the past decade. And this is the trial of fish oil pills. And the claim was that fish oil pills improve school performance and behavior in mainstream children. And they said, "We've done a trial. All the previous trials were positive, and we know this one's gonna be too." That should always ring alarm bells. Because if you already know the answer to your trial, you shouldn't be doing one. Either you've rigged it by design, or you've got enough data so there's no need to randomize people anymore. So this is what they were going to do in their trial: they were taking 3,000 children, they were going to give them all these huge fish oil pills, six of them a day, and then a year later, they were going to measure their school exam performance and compare their school exam performance against what they predicted their exam performance would have been if they hadn't had the pills.

06:08
Now can anybody spot a flaw in this design? And no professors of clinical trial methodology are allowed to answer this question. So there's no control; there's no control group. But that sounds really techie. That's a technical term. The kids got the pills, and then their performance improved. What else could it possibly be if it wasn't the pills? They got older. We all develop over time.
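The flaw in the fish-oil design can be made concrete with a toy simulation. In this hypothetical model (all numbers invented), exam scores drift upward with age regardless of any pill, so an uncontrolled "before vs. after" comparison always shows improvement, while a randomized comparison against a control group correctly shows the pill doing nothing.

```python
import random

random.seed(1)

# Hypothetical sketch: every child's score rises with age anyway, so
# "before vs. after" without a control group is not evidence for the pill.
def exam_score(age_months, noise=5.0):
    return 50 + 0.3 * age_months + random.gauss(0, noise)

children = [random.randint(96, 132) for _ in range(1_000)]  # ages in months

# Uncontrolled design: same children, one year later, all given pills.
before = [exam_score(a) for a in children]
after = [exam_score(a + 12) for a in children]
print(f"mean before: {sum(before)/len(before):.1f}")
print(f"mean after:  {sum(after)/len(after):.1f}")  # higher -- but only age

# Controlled design: randomize to pill vs. no pill; both groups mature.
random.shuffle(children)
pill, control = children[:500], children[500:]
pill_scores = [exam_score(a + 12) for a in pill]      # pill does nothing here
control_scores = [exam_score(a + 12) for a in control]
print(f"pill group:    {sum(pill_scores)/len(pill_scores):.1f}")
print(f"control group: {sum(control_scores)/len(control_scores):.1f}")
```

The randomized comparison isolates the pill's effect because maturation hits both arms equally, which is the whole point of the control group.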
06:30
And of course, also there's the placebo effect. The placebo effect is one of the most fascinating things in the whole of medicine. It's not just about taking a pill, and your performance and your pain getting better. It's about our beliefs and expectations. It's about the cultural meaning of a treatment. And this has been demonstrated in a whole raft of fascinating studies comparing one kind of placebo against another. So we know, for example, that two sugar pills a day are a more effective treatment for getting rid of gastric ulcers than one sugar pill. Two sugar pills a day beats one sugar pill a day. And that's an outrageous and ridiculous finding, but it's true. We know from three different studies on three different types of pain that a saltwater injection is a more effective treatment for pain than taking a sugar pill, taking a dummy pill that has no medicine in it -- not because the injection or the pills do anything physically to the body, but because an injection feels like a much more dramatic intervention. So we know that our beliefs and expectations can be manipulated, which is why we do trials where we control against a placebo -- where one half of the people get the real treatment and the other half get placebo. But that's not enough.
07:28
What I've just shown you are examples of the very simple and straightforward ways that journalists and food supplement pill peddlers and naturopaths can distort evidence for their own purposes. What I find really fascinating is that the pharmaceutical industry uses exactly the same kinds of tricks and devices, but slightly more sophisticated versions of them, in order to distort the evidence that they give to doctors and patients, and which we use to make vitally important decisions.
07:53
So firstly, trials against placebo: everybody thinks they know that a trial should be a comparison of your new drug against placebo. But actually in a lot of situations that's wrong. Because often we already have a very good treatment that is currently available, so we don't want to know that your alternative new treatment is better than nothing. We want to know that it's better than the best currently available treatment that we have. And yet, repeatedly, you consistently see people doing trials still against placebo. And you can get a license to bring your drug to market with only data showing that it's better than nothing, which is useless for a doctor like me trying to make a decision.
08:23
But that's not the only way you can rig your data. You can also rig your data by making the thing you compare your new drug against really rubbish. You can give the competing drug in too low a dose, so that people aren't properly treated. You can give the competing drug in too high a dose, so that people get side effects. And this is exactly what happened with antipsychotic medication for schizophrenia. Twenty years ago, a new generation of antipsychotic drugs was brought in, and the promise was that they would have fewer side effects. So people set about doing trials of these new drugs against the old drugs, but they gave the old drugs in ridiculously high doses -- 20 milligrams a day of haloperidol. And it's a foregone conclusion, if you give a drug at that high a dose, that it will have more side effects and that your new drug will look better.

09:04
Ten years ago, history repeated itself, interestingly, when risperidone, which was the first of the new-generation antipsychotic drugs, came off patent, so anybody could make copies. Everybody wanted to show that their drug was better than risperidone, so you see a bunch of trials comparing new antipsychotic drugs against risperidone at eight milligrams a day. Again, not an insane dose, not an illegal dose, but very much at the high end of normal. And so you're bound to make your new drug look better.
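The comparator-rigging trick above can be sketched numerically. In this hypothetical model (the dose-response curve and all rates are invented for illustration, not taken from any real trial), side effects rise with dose, so overdosing the old drug guarantees the new one looks safer.

```python
import random

random.seed(4)

# Hypothetical sketch of comparator rigging: side-effect risk rises with
# dose, so an overdosed comparator makes the new drug "win". The linear
# dose-risk curve and all doses here are made up for illustration.
def side_effect_rate(dose_mg, per_mg_risk=0.03, n=500):
    prob = min(1.0, per_mg_risk * dose_mg)
    return sum(random.random() < prob for _ in range(n)) / n

new_drug = side_effect_rate(dose_mg=5)            # fixed, sensible dose
fair_comparator = side_effect_rate(dose_mg=5)     # old drug, sensible dose
rigged_comparator = side_effect_rate(dose_mg=20)  # old drug, overdosed

print(f"new drug:         {new_drug:.2f}")
print(f"old drug, fair:   {fair_comparator:.2f}")
print(f"old drug, rigged: {rigged_comparator:.2f}")
# Against the rigged arm the new drug looks much safer; against a fairly
# dosed comparator, there is nothing to choose between them.
```

The rigged trial is not technically fraudulent -- the dose is legal -- but the comparison was decided before a single patient was enrolled.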
09:26
And so it's no surprise that overall, industry-funded trials are four times more likely to give a positive result than independently sponsored trials. But -- and it's a big but --

09:39
(Laughter)

09:41
it turns out, when you look at the methods used by industry-funded trials, that they're actually better than independently sponsored trials. And yet, they always manage to get the result that they want.
09:53
So how does this work? How can we explain this strange phenomenon? Well, it turns out that what happens is the negative data goes missing in action; it's withheld from doctors and patients. And this is the most important aspect of the whole story. It's at the top of the pyramid of evidence. We need to have all of the data on a particular treatment to know whether or not it really is effective. And there are two different ways that you can spot whether some data has gone missing in action. You can use statistics, or you can use stories.
10:20
I personally prefer statistics, so that's what I'm going to do first. This is something called a funnel plot. And a funnel plot is a very clever way of spotting if small negative trials have disappeared, have gone missing in action. So this is a graph of all of the trials that have been done on a particular treatment. And as you go up towards the top of the graph, what you see is each dot is a trial. And as you go up, those are the bigger trials, so they've got less error in them. So they're less likely to be randomly false positives, randomly false negatives. So they all cluster together. The big trials are closer to the true answer. Then as you go further down at the bottom, what you can see is, over on this side, the spurious false negatives, and over on this side, the spurious false positives. If there is publication bias, if small negative trials have gone missing in action, you can see it on one of these graphs. So you can see here that the small negative trials that should be on the bottom left have disappeared. This is a graph demonstrating the presence of publication bias in studies of publication bias. And I think that's the funniest epidemiology joke that you will ever hear.
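The funnel-plot logic lends itself to a short simulation. This is a minimal sketch under invented assumptions: 400 trials of a treatment whose true effect is zero, where estimation error shrinks with trial size, and where the small negative trials are then "lost", exactly the bottom-left gap the funnel plot exposes.

```python
import random
import statistics

random.seed(2)

# Hypothetical funnel-plot sketch: simulate trials of a treatment whose true
# effect is zero, then suppress the small negative ones. All parameters
# (trial sizes, the n < 200 "small" cutoff) are made up for illustration.
trials = []
for _ in range(400):
    n = random.randint(20, 2000)       # trial size
    se = 1 / n ** 0.5                  # standard error shrinks as trials grow
    effect = random.gauss(0.0, se)     # true effect is exactly zero
    trials.append((n, effect))

# Publication bias: small negative trials go missing in action.
published = [(n, e) for n, e in trials if not (n < 200 and e < 0)]

def mean_effect(ts, small):
    return statistics.mean(e for n, e in ts if (n < 200) == small)

# In the full data, small and large trials both scatter around zero.
print(f"all trials -- small: {mean_effect(trials, True):+.3f}, "
      f"large: {mean_effect(trials, False):+.3f}")
# After suppression, the published small trials look spuriously positive:
# the bottom-left of the funnel has disappeared.
print(f"published  -- small: {mean_effect(published, True):+.3f}, "
      f"large: {mean_effect(published, False):+.3f}")
```

Plotting effect against trial size for `published` would show the asymmetric funnel: a cloud that should be symmetric around zero, with its lower-left corner missing.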
11:14
That's how you can prove it statistically, but what about stories? Well, they're heinous, they really are. This is a drug called reboxetine. This is a drug that I myself have prescribed to patients. And I'm a very nerdy doctor. I hope I try to go out of my way to try and read and understand all the literature. I read the trials on this. They were all positive. They were all well-conducted. I found no flaw. Unfortunately, it turned out that many of these trials were withheld. In fact, 76 percent of all of the trials that were done on this drug were withheld from doctors and patients. Now if you think about it, if I tossed a coin a hundred times, and I'm allowed to withhold from you the answers half the time, then I can convince you that I have a coin with two heads. If we remove half of the data, we can never know what the true effect size of these medicines is.
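The coin analogy is trivially simulable. A minimal sketch: toss a fair coin 100 times, report only the tosses the "sponsor" chooses to disclose, and the reported record is indistinguishable from a two-headed coin.

```python
import random

random.seed(3)

# Goldacre's coin analogy: toss a fair coin 100 times, but let me choose
# which results to report. Withholding works exactly like hiding trials.
tosses = [random.choice(["H", "T"]) for _ in range(100)]
reported = [t for t in tosses if t == "H"]  # quietly withhold every tails

print(f"true heads rate:     {tosses.count('H') / len(tosses):.2f}")
print(f"reported heads rate: {reported.count('H') / len(reported):.2f}")
# The reported record "proves" a two-headed coin; the full record shows a
# perfectly ordinary fair coin.
```

No single reported toss is false; the lie lives entirely in the selection, which is why only access to all of the data can reveal the true effect size.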
12:01
And this is not an isolated story. Around half of all of the trial data on antidepressants has been withheld, but it goes way beyond that. The Nordic Cochrane Group were trying to get a hold of the data on that to bring it all together. The Cochrane groups are an international nonprofit collaboration that produces systematic reviews of all of the data that has ever been shown. And they need to have access to all of the trial data. But the companies withheld that data from them, and so did the European Medicines Agency -- for three years. This is a problem that is currently lacking a solution.
12:32
And to show how big it goes, this is a drug called Tamiflu, which governments around the world have spent billions and billions of dollars on. And they spend that money on the promise that this is a drug which will reduce the rate of complications with flu. We already have the data showing that it reduces the duration of your flu by a few hours. But I don't really care about that. Governments don't care about that. I'm very sorry if you have the flu, I know it's horrible, but we're not going to spend billions of dollars trying to reduce the duration of your flu symptoms by half a day. We prescribe these drugs, we stockpile them for emergencies on the understanding that they will reduce the number of complications, which means pneumonia and which means death. The infectious diseases Cochrane Group, which is based in Italy, has been trying to get the full data in a usable form out of the drug companies so that they can make a full decision about whether this drug is effective or not, and they've not been able to get that information.
13:23
This is undoubtedly the single biggest ethical problem facing medicine today. We cannot make decisions in the absence of all of the information.

13:37
So it's a little bit difficult from there to spin in some kind of positive conclusion. But I would say this: I think that sunlight is the best disinfectant. All of these things are happening in plain sight, and they're all protected by a force field of tediousness. And I think, with all of the problems in science, one of the best things that we can do is to lift up the lid, finger around in the mechanics and peer in. Thank you very much.

14:11
(Applause)
