ABOUT THE SPEAKER
Ben Goldacre - Debunker
Ben Goldacre unpicks dodgy scientific claims made by scaremongering journalists, dubious government reports, pharmaceutical corporations, PR companies and quacks.

Why you should listen

"It was the MMR story that finally made me crack," begins the Bad Science manifesto, referring to the sensationalized -- and now-refuted -- link between vaccines and autism. With that sentence Ben Goldacre fired the starting shot of a crusade waged from the pages of The Guardian from 2003 to 2011, on an addicitve Twitter feed, and in bestselling books, including Bad Science and his latest, Bad Pharma, which puts the $600 billion global pharmaceutical industry under the microscope. What he reveals is a fascinating, terrifying mess.

Goldacre was trained in medicine at Oxford and London, and works as an academic in epidemiology. Helped along by this inexhaustible supply of material, he also travels the speaking circuit, promoting skepticism and nerdish curiosity with fire, wit, fast delivery and a lovable kind of exasperation. (He might even convince you that real science, sober reporting and reason are going to win in the end.)

As he writes, "If you're a journalist who misrepresents science for the sake of a headline, a politician more interested in spin than evidence, or an advertiser who loves pictures of molecules in little white coats, then beware: your days are numbered."

TEDMED 2012

Ben Goldacre: What doctors don't know about the drugs they prescribe

Filmed:
2,501,600 views

When a new drug gets tested, the results of the trials should be published for the rest of the medical world -- except much of the time, negative or inconclusive findings go unreported, leaving doctors and researchers in the dark. In this impassioned talk, Ben Goldacre explains why these unreported instances of negative data are especially misleading and dangerous.


00:16
Hi. So, this chap here, he thinks he can tell you the future. His name is Nostradamus, although here the Sun have made him look a little bit like Sean Connery. (Laughter) And like most of you, I suspect, I don't really believe that people can see into the future. I don't believe in precognition, and every now and then, you hear that somebody has been able to predict something that happened in the future, and that's probably because it was a fluke, and we only hear about the flukes and about the freaks. We don't hear about all the times that people got stuff wrong. Now we expect that to happen with silly stories about precognition, but the problem is, we have exactly the same problem in academia and in medicine, and in this environment, it costs lives.
01:00
So firstly, thinking just about precognition, as it turns out, just last year a researcher called Daryl Bem conducted a piece of research where he found evidence of precognitive powers in undergraduate students, and this was published in a peer-reviewed academic journal, and most of the people who read this just said, "Okay, well, fair enough, but I think that's a fluke, that's a freak, because I know that if I did a study where I found no evidence that undergraduate students had precognitive powers, it probably wouldn't get published in a journal." And in fact, we know that that's true, because several different groups of research scientists tried to replicate the findings of this precognition study, and when they submitted it to the exact same journal, the journal said, "No, we're not interested in publishing replication. We're not interested in your negative data." So this is already evidence of how, in the academic literature, we will see a biased sample of the true picture of all of the scientific studies that have been conducted.
01:57
But it doesn't just happen in the dry academic field of psychology. It also happens in, for example, cancer research. So in March, 2012, just one month ago, some researchers reported in the journal Nature how they had tried to replicate 53 different basic science studies looking at potential treatment targets in cancer, and out of those 53 studies, they were only able to successfully replicate six. Forty-seven out of those 53 were unreplicable. And they say in their discussion that this is very likely because freaks get published. People will do lots and lots and lots of different studies, and the occasions when it works they will publish, and the ones where it doesn't work they won't. And their first recommendation of how to fix this problem, because it is a problem, because it sends us all down blind alleys, their first recommendation of how to fix this problem is to make it easier to publish negative results in science, and to change the incentives so that scientists are encouraged to post more of their negative results in public.
03:02
But it doesn't just happen in the very dry world of preclinical basic science cancer research. It also happens in the very real flesh and blood of academic medicine. So in 1980, some researchers did a study on a drug called lorcainide, and this was an anti-arrhythmic drug, a drug that suppresses abnormal heart rhythms, and the idea was, after people have had a heart attack, they're quite likely to have abnormal heart rhythms, so if we give them a drug that suppresses abnormal heart rhythms, this will increase the chances of them surviving. Early on in its development, they did a very small trial, just under a hundred patients. Fifty patients got lorcainide, and of those patients, 10 died. Another 50 patients got a dummy placebo sugar pill with no active ingredient, and only one of them died. So they rightly regarded this drug as a failure, and its commercial development was stopped, and because its commercial development was stopped, this trial was never published.
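[Editor's note: even in a trial this small, the death counts carry a clear signal. As a rough check, not part of the talk, here is a minimal sketch that runs a standard Fisher's exact test on the 2x2 table of deaths, assuming SciPy is available.]

```python
# Sketch: could 10/50 deaths on lorcainide vs. 1/50 on placebo be chance?
# The counts are the ones given in the talk; the test is our addition.
from scipy.stats import fisher_exact

#             died  survived
lorcainide = [10, 40]
placebo    = [1,  49]

odds_ratio, p_value = fisher_exact([lorcainide, placebo])
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.4f}")
# The sample odds ratio is about 12 and p comes out around 0.008,
# well below the conventional 0.05 threshold -- which is why the
# researchers' original "effect of chance" reading did not hold up.
```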
03:59
Unfortunately, over the course of the next five, 10 years, other companies had the same idea about drugs that would prevent arrhythmias in people who have had heart attacks. These drugs were brought to market. They were prescribed very widely because heart attacks are a very common thing, and it took so long for us to find out that these drugs also caused an increased rate of death that before we detected that safety signal, over 100,000 people died unnecessarily in America from the prescription of anti-arrhythmic drugs.

04:35
Now actually, in 1993, the researchers who did that 1980 study, that early study, published a mea culpa, an apology to the scientific community, in which they said, "When we carried out our study in 1980, we thought that the increased death rate that occurred in the lorcainide group was an effect of chance." The development of lorcainide was abandoned for commercial reasons, and this study was never published; it's now a good example of publication bias. That's the technical term for the phenomenon where unflattering data gets lost, gets unpublished, is left missing in action, and they say the results described here "might have provided an early warning of trouble ahead."
05:15
Now these are stories from basic science. These are stories from 20, 30 years ago. The academic publishing environment is very different now. There are academic journals like "Trials," the open access journal, which will publish any trial conducted in humans regardless of whether it has a positive or a negative result. But this problem of negative results that go missing in action is still very prevalent. In fact it's so prevalent that it cuts to the core of evidence-based medicine.
05:49
So this is a drug called reboxetine, and this is a drug that I myself have prescribed. It's an antidepressant. And I'm a very nerdy doctor, so I read all of the studies that I could on this drug. I read the one study that was published that showed that reboxetine was better than placebo, and I read the other three studies that were published that showed that reboxetine was just as good as any other antidepressant, and because this patient hadn't done well on those other antidepressants, I thought, well, reboxetine is just as good. It's one to try. But it turned out that I was misled. In fact, seven trials were conducted comparing reboxetine against a dummy placebo sugar pill. One of them was positive and that was published, but six of them were negative and they were left unpublished. Three trials were published comparing reboxetine against other antidepressants in which reboxetine was just as good, and they were published, but three times as many patients' worth of data was collected which showed that reboxetine was worse than those other treatments, and those trials were not published. I felt misled.
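[Editor's note: the gap between what was run and what a reading doctor could see is easy to tabulate. A small illustrative tally, using the trial counts given in the talk; the script itself is our addition.]

```python
# Reboxetine: trials conducted vs. trials published, per the talk.
# Note: the unfavourable comparative data was measured in patients
# ("three times as many patients' worth"), not in trial counts.
conducted = {"vs placebo, positive": 1,
             "vs placebo, negative": 6,
             "vs other antidepressants, favourable": 3}
published = {"vs placebo, positive": 1,
             "vs placebo, negative": 0,
             "vs other antidepressants, favourable": 3}

for arm in conducted:
    print(f"{arm:38s} conducted={conducted[arm]}  published={published[arm]}")

total_placebo = conducted["vs placebo, positive"] + conducted["vs placebo, negative"]
print(f"placebo trials that were positive: {1/total_placebo:.0%} in reality, "
      f"100% in the published record")
```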
06:49
Now you might say, well, that's an extremely unusual example, and I wouldn't want to be guilty of the same kind of cherry-picking and selective referencing that I'm accusing other people of. But it turns out that this phenomenon of publication bias has actually been very, very well studied. So here is one example of how you approach it. The classic model is, you get a bunch of studies where you know that they've been conducted and completed, and then you go and see if they've been published anywhere in the academic literature. So this took all of the trials that had ever been conducted on antidepressants that were approved over a 15-year period by the FDA. They took all of the trials which were submitted to the FDA as part of the approval package. So that's not all of the trials that were ever conducted on these drugs, because we can never know if we have those, but it is the ones that were conducted in order to get the marketing authorization. And then they went to see if these trials had been published in the peer-reviewed academic literature. And this is what they found.

07:37
It was pretty much a 50-50 split. Half of these trials were positive, half of them were negative, in reality. But when they went to look for these trials in the peer-reviewed academic literature, what they found was a very different picture. Only three of the negative trials were published, but all but one of the positive trials were published. Now if we just flick back and forth between those two, you can see what a staggering difference there was between reality and what doctors, patients, commissioners of health services, and academics were able to see in the peer-reviewed academic literature. We were misled, and this is a systematic flaw in the core of medicine.
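[Editor's note: the size of that distortion is simple arithmetic. The talk gives the 50-50 split and the publication pattern but not the exact trial counts, so the totals below are illustrative assumptions.]

```python
# Back-of-the-envelope version of the FDA-registry comparison above.
# Assumed totals: 80 trials, split "pretty much 50-50" as in the talk.
pos_conducted, neg_conducted = 40, 40
pos_published = pos_conducted - 1   # "all but one of the positive trials"
neg_published = 3                   # "only three of the negative trials"

reality  = pos_conducted / (pos_conducted + neg_conducted)
journals = pos_published / (pos_published + neg_published)
print(f"share of trials that look positive in reality:  {reality:.0%}")   # 50%
print(f"share that look positive in the journals:       {journals:.0%}")  # ~93%
```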
08:23
In fact, there have been so many studies conducted on publication bias now, over a hundred, that they've been collected in a systematic review, published in 2010, that took every single study on publication bias that they could find. Publication bias affects every field of medicine. About half of all trials, on average, go missing in action, and we know that positive findings are around twice as likely to be published as negative findings. This is a cancer at the core of evidence-based medicine.

08:54
If I flipped a coin 100 times but then withheld the results from you from half of those tosses, I could make it look as if I had a coin that always came up heads. But that wouldn't mean that I had a two-headed coin. That would mean that I was a chancer and you were an idiot for letting me get away with it. (Laughter) But this is exactly what we blindly tolerate in the whole of evidence-based medicine.
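[Editor's note: the coin analogy is easy to make concrete. A minimal simulation, not from the talk: flip a fair coin and "publish" only the heads.]

```python
# Flip a fair coin 100 times, then report only the tosses that came up heads.
import random

random.seed(0)  # reproducible
flips = [random.choice("HT") for _ in range(100)]
published = [f for f in flips if f == "H"]  # the withheld half: all the tails

print(f"all 100 flips:   {flips.count('H')} heads, {flips.count('T')} tails")
print(f"published flips: {len(published)} heads, 0 tails")
# The published record looks like a two-headed coin,
# even though the coin itself is perfectly fair.
```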
09:19
And to me, this is research misconduct. If I conducted one study and I withheld half of the data points from that one study, you would rightly accuse me, essentially, of research fraud. And yet, for some reason, if somebody conducts 10 studies but only publishes the five that give the result that they want, we don't consider that to be research misconduct. And when that responsibility is diffused between a whole network of researchers, academics, industry sponsors, journal editors, for some reason we find it more acceptable, but the effect on patients is damning. And this is happening right now, today.
10:03
This is a drug called Tamiflu. Tamiflu is a drug which governments around the world have spent billions and billions of dollars on stockpiling, and we've stockpiled Tamiflu in panic, in the belief that it will reduce the rate of complications of influenza. Complications is a medical euphemism for pneumonia and death. (Laughter) Now when the Cochrane systematic reviewers were trying to collect together all of the data from all of the trials that had ever been conducted on whether Tamiflu actually did this or not, they found that several of those trials were unpublished. The results were unavailable to them. And when they started obtaining the writeups of those trials through various different means, through Freedom of Information Act requests, through harassing various different organizations, what they found was inconsistent. And when they tried to get a hold of the clinical study reports, the 10,000-page long documents that have the best possible rendition of the information, they were told they weren't allowed to have them. And if you want to read the full correspondence and the excuses and the explanations given by the drug company, you can see that written up in this week's edition of PLOS Medicine.
11:15
And the most staggering thing of all of this, to me, is that not only is this a problem, not only do we recognize that this is a problem, but we've had to suffer fake fixes. We've had people pretend that this is a problem that's been fixed. First of all, we had trials registers, and everybody said, oh, it's okay. We'll get everyone to register their trials, they'll post the protocol, they'll say what they're going to do before they do it, and then afterwards we'll be able to check and see if all the trials which have been conducted and completed have been published. But people didn't bother to use those registers. And so then the International Committee of Medical Journal Editors came along, and they said, oh, well, we will hold the line. We won't publish any journals, we won't publish any trials, unless they've been registered before they began. But they didn't hold the line. In 2008, a study was conducted which showed that half of all trials published by journals edited by members of the ICMJE weren't properly registered, and a quarter of them weren't registered at all. And then finally, the FDA Amendments Act was passed a couple of years ago, saying that everybody who conducts a trial must post the results of that trial within one year. And in the BMJ, in the first edition of January, 2012, you can see a study which looks to see if people kept to that ruling, and it turns out that only one in five have done so.
12:30
This is a disaster. We cannot know the true effects of the medicines that we prescribe if we do not have access to all of the information. And this is not a difficult problem to fix. We need to force people to publish all trials conducted in humans, including the older trials, because the FDA Amendments Act only asks that you publish the trials conducted after 2008, and I don't know what world it is in which we're only practicing medicine on the basis of trials that completed in the past two years. We need to publish all trials in humans, including the older trials, for all drugs in current use, and you need to tell everyone you know that this is a problem and that it has not been fixed.

13:17
Thank you very much. (Applause)
Translated by Joseph Geni
Reviewed by Morton Bast
