ABOUT THE SPEAKER
Ben Goldacre - Debunker
Ben Goldacre unpicks dodgy scientific claims made by scaremongering journalists, dubious government reports, pharmaceutical corporations, PR companies and quacks.

Why you should listen

"It was the MMR story that finally made me crack," begins the Bad Science manifesto, referring to the sensationalized -- and now-refuted -- link between vaccines and autism. With that sentence Ben Goldacre fired the starting shot of a crusade waged from the pages of The Guardian from 2003 to 2011, on an addicitve Twitter feed, and in bestselling books, including Bad Science and his latest, Bad Pharma, which puts the $600 billion global pharmaceutical industry under the microscope. What he reveals is a fascinating, terrifying mess.

Goldacre was trained in medicine at Oxford and London, and works as an academic in epidemiology. Helped along by this inexhaustible supply of material, he also travels the speaking circuit, promoting skepticism and nerdish curiosity with fire, wit, fast delivery and a lovable kind of exasperation. (He might even convince you that real science, sober reporting and reason are going to win in the end.)

As he writes, "If you're a journalist who misrepresents science for the sake of a headline, a politician more interested in spin than evidence, or an advertiser who loves pictures of molecules in little white coats, then beware: your days are numbered."

TEDGlobal 2011

Ben Goldacre: Battling bad science

Ben Goldacre: Combater a mala ciencia

2,713,579 views

Todos os días aparecen noticias con novas recomendacións sobre saúde, pero como podemos saber se son correctas? O doutor e epidemiólogo Ben Goldacre amósanos, a alta velocidade, as formas en que a evidencia se pode distorsionar, desde as afirmacións máis obvias sobre a nutrición ata os trucos máis sutís da industria farmacéutica.


00:15
So I'm a doctor, but I kind of slipped sideways into research,
0
0
3000
Eu son médico, pero
inclineime pola investigación
00:18
and now I'm an epidemiologist.
1
3000
2000
e agora son epidemiólogo.
E ninguén sabe de certo
que é a epidemioloxía.
00:20
And nobody really knows what epidemiology is.
2
5000
2000
00:22
Epidemiology is the science of how we know in the real world
3
7000
3000
A epidemioloxía é a ciencia
que estuda como saber no mundo real
00:25
if something is good for you or bad for you.
4
10000
2000
se algo é bo ou malo para nós.
00:27
And it's best understood through example
5
12000
2000
Enténdese mellor a través
dun exemplo:
00:29
as the science of those crazy, wacky newspaper headlines.
6
14000
5000
é a ciencia deses titulares
tolos, absurdos, dos xornais.
00:34
And these are just some of the examples.
7
19000
2000
Imos ver algúns exemplos.
00:36
These are from the Daily Mail. Every country in the world has a newspaper like this.
8
21000
3000
Este é do Daily Mail. Todos os países
teñen un xornal coma este.
00:39
It has this bizarre, ongoing philosophical project
9
24000
3000
Ten o estraño proxecto filosófico
00:42
of dividing all the inanimate objects in the world
10
27000
2000
de dividir os obxectos
inanimados do mundo
00:44
into the ones that either cause or prevent cancer.
11
29000
3000
nos que causan cancro
e nos que o preveñen.
00:47
So here are some of the things they said cause cancer recently:
12
32000
2000
Segundo eles, causan cancro:
o divorcio, a rede sen fíos,
os artigos de aseo e o café.
00:49
divorce, Wi-Fi, toiletries and coffee.
13
34000
2000
00:51
Here are some of the things they say prevent cancer:
14
36000
2000
E preveñen o cancro:
00:53
crusts, red pepper, licorice and coffee.
15
38000
2000
a codia, o pemento vermello,
a regalicia e o café.
00:55
So already you can see there are contradictions.
16
40000
2000
Como podedes ver,
hai contradicións.
00:57
Coffee both causes and prevents cancer.
17
42000
2000
O café causa cancro e á vez preveno.
00:59
And as you start to read on, you can see
18
44000
2000
E cando comezades a ler vedes
01:01
that maybe there's some kind of political valence behind some of this.
19
46000
3000
que quizais haxa algún tipo
de interese político por tras.
Para as mulleres, o traballo doméstico
prevén o cancro de mama
01:04
So for women, housework prevents breast cancer,
20
49000
2000
01:06
but for men, shopping could make you impotent.
21
51000
3000
pero para os homes,
comprar pode facelos impotentes.
01:09
So we know that we need to start
22
54000
3000
Así que hai que comezar
01:12
unpicking the science behind this.
23
57000
3000
descifrando a ciencia que hai aí detrás.
01:15
And what I hope to show
24
60000
2000
E espero demostrar
01:17
is that unpicking dodgy claims,
25
62000
2000
que examinar afirmacións tan arriscadas,
01:19
unpicking the evidence behind dodgy claims,
26
64000
2000
examinar a evidencia que subxace a elas
01:21
isn't a kind of nasty carping activity;
27
66000
3000
non é rosmar con mala intención;
01:24
it's socially useful,
28
69000
2000
socialmente é útil,
01:26
but it's also an extremely valuable
29
71000
2000
pero ademais é unha ferramenta explicativa
01:28
explanatory tool.
30
73000
2000
extremadamente valiosa.
01:30
Because real science is all about
31
75000
2000
Porque a ciencia verdadeira consiste
01:32
critically appraising the evidence for somebody else's position.
32
77000
2000
na avaliación crítica das probas
que avalan unha postura.
01:34
That's what happens in academic journals.
33
79000
2000
Así se fai nas publicacións académicas.
01:36
That's what happens at academic conferences.
34
81000
2000
E nas reunións académicas.
01:38
The Q&A session after a postdoc presents data
35
83000
2000
As preguntas que seguen
unha presentación de datos
01:40
is often a blood bath.
36
85000
2000
adoitan ser un baño de sangue.
01:42
And nobody minds that. We actively welcome it.
37
87000
2000
E a ninguén lle importa. Gústanos.
01:44
It's like a consenting intellectual S&M activity.
38
89000
3000
É como unha actividade intelectual
sadomasoquista consensuada.
01:47
So what I'm going to show you
39
92000
2000
Así que vou amosarvos
01:49
is all of the main things,
40
94000
2000
as partes principais,
01:51
all of the main features of my discipline --
41
96000
2000
as características principais
da miña disciplina:
01:53
evidence-based medicine.
42
98000
2000
a medicina baseada en evidencias.
01:55
And I will talk you through all of these
43
100000
2000
Explicaréivolas
01:57
and demonstrate how they work,
44
102000
2000
e demostrareivos como funcionan
01:59
exclusively using examples of people getting stuff wrong.
45
104000
3000
usando exclusivamente exemplos
de xente que o fai mal.
02:02
So we'll start with the absolute weakest form of evidence known to man,
46
107000
3000
Así que comezaremos coa forma máis
débil de evidencia coñecida polo home:
02:05
and that is authority.
47
110000
2000
a autoridade.
02:07
In science, we don't care how many letters you have after your name.
48
112000
3000
Na ciencia, non nos importa
cantos títulos tes.
02:10
In science, we want to know what your reasons are for believing something.
49
115000
3000
O que queremos é saber
as túas razóns para crer en algo.
02:13
How do you know that something is good for us
50
118000
2000
Como sabes que algo é bo
02:15
or bad for us?
51
120000
2000
ou malo para nós?
02:17
But we're also unimpressed by authority,
52
122000
2000
Non nos impresiona
a autoridade,
02:19
because it's so easy to contrive.
53
124000
2000
porque é tan fácil inventala...
02:21
This is somebody called Dr. Gillian McKeith Ph.D,
54
126000
2000
Esta é a Dra. Gillian McKeith
02:23
or, to give her full medical title, Gillian McKeith.
55
128000
3000
ou, para darlle o seu título médico
completo, Gillian McKeith.
02:26
(Laughter)
56
131000
3000
(Risos)
02:29
Again, every country has somebody like this.
57
134000
2000
En fin, todos os países teñen alguén así.
02:31
She is our TV diet guru.
58
136000
2000
É a nosa gurú mediática das dietas.
02:33
She has five massive series of prime-time television,
59
138000
3000
Ten cinco programas de televisión
en horario de máxima audiencia
02:36
giving out very lavish and exotic health advice.
60
141000
3000
nos que dá consellos moi
exóticos sobre saúde.
02:39
She, it turns out, has a non-accredited correspondence course Ph.D.
61
144000
3000
Resulta que ten un doutoramento
non oficial por correspondencia,
02:42
from somewhere in America.
62
147000
2000
de algures nos Estados Unidos.
02:44
She also boasts that she's a certified professional member
63
149000
2000
Tamén se gaba de ser membro certificado
02:46
of the American Association of Nutritional Consultants,
64
151000
2000
da Asociación Americana de
Consultores Nutricionais,
02:48
which sounds very glamorous and exciting.
65
153000
2000
que soa moi glamuroso e
fascinante.
02:50
You get a certificate and everything.
66
155000
2000
Danche un certificado e todo.
Este é de Hetti, a miña defunta gata.
Unha gata horrible.
02:52
This one belongs to my dead cat Hetti. She was a horrible cat.
67
157000
2000
02:54
You just go to the website, fill out the form,
68
159000
2000
Vas á web, enches o impreso,
02:56
give them $60, and it arrives in the post.
69
161000
2000
dáslles 60 dólares e chégache por correo.
02:58
Now that's not the only reason that we think this person is an idiot.
70
163000
2000
Pero esta non é a única razón
para pensar que é idiota.
03:00
She also goes and says things like,
71
165000
2000
Tamén vai e di cousas como
03:02
you should eat lots of dark green leaves,
72
167000
2000
que deberiamos comer
moita verdura verde escura
03:04
because they contain lots of chlorophyll, and that will really oxygenate your blood.
73
169000
2000
porque ten moita clorofila
e iso osixena o sangue.
03:06
And anybody who's done school biology remembers
74
171000
2000
Calquera que estudara bioloxía
na escola lembra
03:08
that chlorophyll and chloroplasts
75
173000
2000
que a clorofila e os cloroplastos
03:10
only make oxygen in sunlight,
76
175000
2000
só producen osíxeno á luz do día
03:12
and it's quite dark in your bowels after you've eaten spinach.
77
177000
3000
e os intestinos están bastante escuros
despois de comer espinacas.
03:15
Next, we need proper science, proper evidence.
78
180000
3000
Seguimos. Precisamos ciencia e
evidencia axeitadas.
03:18
So, "Red wine can help prevent breast cancer."
79
183000
2000
"O viño tinto axuda a previr
o cancro de mama."
03:20
This is a headline from the Daily Telegraph in the U.K.
80
185000
2000
Este é un titular do Daily Telegraph
do Reino Unido.
03:22
"A glass of red wine a day could help prevent breast cancer."
81
187000
3000
"Un vaso de viño tinto ao día
axuda a previr o cancro de mama."
03:25
So you go and find this paper, and what you find
82
190000
2000
Pero buscamos o artigo e descubrimos
03:27
is it is a real piece of science.
83
192000
2000
que é un verdadeiro artigo científico.
03:29
It is a description of the changes in one enzyme
84
194000
3000
É a descrición dos cambios nun enzima
03:32
when you drip a chemical extracted from some red grape skin
85
197000
3000
ao colocar unha gota dun produto químico
extraído da tona da uva tinta
03:35
onto some cancer cells
86
200000
2000
en células cancerosas
03:37
in a dish on a bench in a laboratory somewhere.
87
202000
3000
nunha mesa de laboratorio en algures.
03:40
And that's a really useful thing to describe
88
205000
2000
E esa é unha cousa moi útil para describir
03:42
in a scientific paper,
89
207000
2000
nunha publicación científica,
03:44
but on the question of your own personal risk
90
209000
2000
pero en canto ao risco persoal
03:46
of getting breast cancer if you drink red wine,
91
211000
2000
de ter cancro de mama se
bebemos viño tinto,
03:48
it tells you absolutely bugger all.
92
213000
2000
non di nada.
03:50
Actually, it turns out that your risk of breast cancer
93
215000
2000
En realidade, resulta que o risco
de cancro de mama
03:52
actually increases slightly
94
217000
2000
aumenta un pouco
03:54
with every amount of alcohol that you drink.
95
219000
2000
canto máis alcol bebemos.
03:56
So what we want is studies in real human people.
96
221000
4000
Queremos estudos feitos
en xente real.
04:00
And here's another example.
97
225000
2000
Aquí hai outro exemplo
04:02
This is from Britain's leading diet and nutritionist in the Daily Mirror,
98
227000
3000
do destacado nutricionista británico
do Daily Mirror,
04:05
which is our second biggest selling newspaper.
99
230000
2000
o noso segundo xornal máis vendido:
04:07
"An Australian study in 2001
100
232000
2000
"Un estudo australiano do 2001 atopou
04:09
found that olive oil in combination with fruits, vegetables and pulses
101
234000
2000
que o aceite de oliva combinado
con froitas, vexetais e legumes
04:11
offers measurable protection against skin wrinklings."
102
236000
2000
protexe contra as engurras da pel."
04:13
And then they give you advice:
103
238000
2000
E despois dános un consello:
04:15
"If you eat olive oil and vegetables, you'll have fewer skin wrinkles."
104
240000
2000
"Se tomamos aceite de oliva e vexetais
teremos menos engurras."
04:17
And they very helpfully tell you how to go and find the paper.
105
242000
2000
E dinos como atopar a publicación.
04:19
So you go and find the paper, and what you find is an observational study.
106
244000
3000
Así que buscas o artigo e o que atopas
é un estudo observacional.
04:22
Obviously nobody has been able
107
247000
2000
Obviamente, ninguén puido ir a 1930,
04:24
to go back to 1930,
108
249000
2000
04:26
get all the people born in one maternity unit,
109
251000
3000
coller os bebés dunha maternidade,
facer que a metade comese moita
froita, verduras e aceite de oliva
04:29
and half of them eat lots of fruit and veg and olive oil,
110
254000
2000
04:31
and then half of them eat McDonald's,
111
256000
2000
e que a outra metade comese McDonald's
04:33
and then we see how many wrinkles you've got later.
112
258000
2000
e despois analizar as súas engurras.
04:35
You have to take a snapshot of how people are now.
113
260000
2000
Hai que facer unha mostraxe
de como son as persoas agora.
04:37
And what you find is, of course,
114
262000
2000
E o que atopas, claro,
04:39
people who eat veg and olive oil have fewer skin wrinkles.
115
264000
3000
é que a xente que come verduras e
aceite de oliva ten menos engurras.
04:42
But that's because people who eat fruit and veg and olive oil,
116
267000
3000
Pero iso é porque a xente que come
froita e aceite de oliva
04:45
they're freaks, they're not normal, they're like you;
117
270000
3000
é rara, non é normal, é coma vós;
04:48
they come to events like this.
118
273000
2000
veñen a eventos coma este.
04:50
They are posh, they're wealthy, they're less likely to have outdoor jobs,
119
275000
3000
Son elegantes, son ricos,
traballan menos ao aire libre,
04:53
they're less likely to do manual labor,
120
278000
2000
fan menos traballos manuais,
04:55
they have better social support, they're less likely to smoke --
121
280000
2000
teñen máis apoio social, fuman menos...
04:57
so for a whole host of fascinating, interlocking
122
282000
2000
así que por unha chea
de fascinantes razóns,
04:59
social, political and cultural reasons,
123
284000
2000
sociais, políticas
e culturais entrelazadas,
05:01
they are less likely to have skin wrinkles.
124
286000
2000
é menos probable que
teñan engurras na pel.
05:03
That doesn't mean that it's the vegetables or the olive oil.
125
288000
2000
Iso non significa que sexa polos
vexetais e o aceite de oliva.
05:05
(Laughter)
126
290000
2000
05:07
So ideally what you want to do is a trial.
127
292000
3000
(Risos)
Entón, o ideal sería facer un ensaio.
05:10
And everybody thinks they're very familiar with the idea of a trial.
128
295000
2000
Todo o mundo cre que sabe que é un ensaio.
05:12
Trials are very old. The first trial was in the Bible -- Daniel 1:12.
129
297000
3000
Os ensaios son moi antigos.
O primeiro está na Biblia, Daniel 1:12.
05:15
It's very straightforward -- you take a bunch of people, you split them in half,
130
300000
2000
É moi fácil, cóllese un grupo
de xente, divídese en dous,
05:17
you treat one group one way, you treat the other group the other way,
131
302000
2000
trátase un grupo dun xeito,
e o outro, doutro,
05:19
and a little while later, you follow them up
132
304000
2000
e despois, fáiselles un seguimento
05:21
and see what happened to each of them.
133
306000
2000
para ver que ocorre con cada un.
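A minimal sketch of the trial design described above, with invented participants and made-up recovery rates (nothing here comes from an actual study): randomize, split in half, treat each group differently, then follow up and compare.

    # Illustrative toy randomized trial; all numbers are invented.
    import random

    random.seed(0)
    people = list(range(100))            # 100 hypothetical participants
    random.shuffle(people)               # randomize before splitting in half
    treatment, control = people[:50], people[50:]

    def follow_up(group, recovery_rate):
        # Pretend follow-up: count how many in the group recovered.
        return sum(random.random() < recovery_rate for _ in group)

    recovered_t = follow_up(treatment, 0.60)   # hypothetical treatment effect
    recovered_c = follow_up(control, 0.45)     # hypothetical baseline
    print(f"treatment: {recovered_t}/50 recovered, control: {recovered_c}/50 recovered")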
05:23
So I'm going to tell you about one trial,
134
308000
2000
Así que vou falarvos dun ensaio
05:25
which is probably the most well-reported trial
135
310000
2000
que foi probablemente o máis popular
05:27
in the U.K. news media over the past decade.
136
312000
2000
nos medios británicos na pasada década.
05:29
And this is the trial of fish oil pills.
137
314000
2000
É o ensaio das pílulas de aceite de peixe.
05:31
And the claim was fish oil pills improve school performance and behavior
138
316000
2000
Dicíase que melloraban
o rendemento escolar
05:33
in mainstream children.
139
318000
2000
e o comportamento en nenos comúns.
05:35
And they said, "We've done a trial.
140
320000
2000
Dixeron: "Fixemos un ensaio.
05:37
All the previous trials were positive, and we know this one's gonna be too."
141
322000
2000
Os anteriores foron positivos
e sabemos que este tamén."
05:39
That should always ring alarm bells.
142
324000
2000
Iso sempre debería
facer soar unha alarma.
05:41
Because if you already know the answer to your trial, you shouldn't be doing one.
143
326000
3000
Porque se xa se sabe a resposta
do ensaio non se debería facer.
05:44
Either you've rigged it by design,
144
329000
2000
Ou está manipulado o deseño
05:46
or you've got enough data so there's no need to randomize people anymore.
145
331000
3000
ou xa hai datos dabondo, así que
non é preciso probalo en máis persoas.
05:49
So this is what they were going to do in their trial.
146
334000
3000
Isto é o que ían facer no seu ensaio.
05:52
They were taking 3,000 children,
147
337000
2000
Ían coller 3 000 nenos,
05:54
they were going to give them all these huge fish oil pills,
148
339000
2000
ían darlles unhas enormes
pílulas de aceite de peixe,
05:56
six of them a day,
149
341000
2000
seis ao día,
05:58
and then a year later, they were going to measure their school exam performance
150
343000
3000
e un ano máis tarde, ían medir
o rendemento escolar en exames
06:01
and compare their school exam performance
151
346000
2000
e comparar ese rendemento
06:03
against what they predicted their exam performance would have been
152
348000
2000
co que eles calculaban que terían
06:05
if they hadn't had the pills.
153
350000
3000
se non tomaran as pílulas.
06:08
Now can anybody spot a flaw in this design?
154
353000
3000
Pode ver alguén o punto fraco no deseño?
06:11
And no professors of clinical trial methodology
155
356000
3000
Se sodes profesores de metodoloxía
de ensaios clínicos
06:14
are allowed to answer this question.
156
359000
2000
non respondades esta cuestión.
06:16
So there's no control; there's no control group.
157
361000
2000
Non hai control;
non hai ningún grupo de control.
06:18
But that sounds really techie.
158
363000
2000
Pero isto soa moi técnico.
06:20
That's a technical term.
159
365000
2000
É un termo técnico.
Os rapaces tomaron as pílulas e
o seu rendemento mellorou.
06:22
The kids got the pills, and then their performance improved.
160
367000
2000
06:24
What else could it possibly be if it wasn't the pills?
161
369000
3000
Que outra cousa podería ser
senón as pílulas?
06:27
They got older. We all develop over time.
162
372000
3000
Medraron. Todos evolucionamos co tempo.
06:30
And of course, also there's the placebo effect.
163
375000
2000
Por suposto, tamén está o efecto placebo,
06:32
The placebo effect is one of the most fascinating things in the whole of medicine.
164
377000
2000
que é unha das cousas máis
fascinantes en medicina.
06:34
It's not just about taking a pill, and your performance and your pain getting better.
165
379000
3000
E non se trata só de tomar unha pílula e
que mellore o rendemento e a dor.
06:37
It's about our beliefs and expectations.
166
382000
2000
Trátase das nosas crenzas e expectativas.
06:39
It's about the cultural meaning of a treatment.
167
384000
2000
Do significado cultural dun tratamento.
06:41
And this has been demonstrated in a whole raft of fascinating studies
168
386000
3000
E isto demostrouse nunha chea
de estudos fascinantes
06:44
comparing one kind of placebo against another.
169
389000
3000
que comparaban un tipo de placebo
con outro.
06:47
So we know, for example, that two sugar pills a day
170
392000
2000
Sabemos, por exemplo,
que 2 pílulas de azucre ao día
06:49
are a more effective treatment for getting rid of gastric ulcers
171
394000
2000
son máis efectivas para
as úlceras gástricas
06:51
than one sugar pill.
172
396000
2000
que unha soa pílula de azucre.
06:53
Two sugar pills a day beats one sugar pill a day.
173
398000
2000
2 pílulas ao día son mellores que unha.
06:55
And that's an outrageous and ridiculous finding, but it's true.
174
400000
3000
Ese é un achado estraño
e ridículo, pero verdadeiro.
06:58
We know from three different studies on three different types of pain
175
403000
2000
Sabemos por 3 estudos diferentes
sobre 3 tipos de dor
07:00
that a saltwater injection is a more effective treatment for pain
176
405000
3000
que unha inxección de auga salgada é
máis efectiva para a dor
07:03
than taking a sugar pill, taking a dummy pill that has no medicine in it --
177
408000
4000
que tomar unha pílula de azucre,
unha pílula que non contén nada,
07:07
not because the injection or the pills do anything physically to the body,
178
412000
3000
non porque unha nin outra fagan
algo fisicamente no corpo,
07:10
but because an injection feels like a much more dramatic intervention.
179
415000
3000
senón porque a inxección parece unha
intervención máis drástica.
07:13
So we know that our beliefs and expectations
180
418000
2000
Sabemos que as nosas
crenzas e expectativas
07:15
can be manipulated,
181
420000
2000
poden ser manipuladas,
07:17
which is why we do trials
182
422000
2000
por iso facemos ensaios
07:19
where we control against a placebo --
183
424000
2000
onde comparamos un control
cun placebo:
07:21
where one half of the people get the real treatment
184
426000
2000
a metade da xente recibe o
tratamento real
07:23
and the other half get placebo.
185
428000
2000
e a outra metade, o placebo.
07:25
But that's not enough.
186
430000
3000
Pero non abonda.
07:28
What I've just shown you are examples of the very simple and straightforward ways
187
433000
3000
O que vos amosei son exemplos das
formas sinxelas e directas
07:31
that journalists and food supplement pill peddlers
188
436000
2000
en que xornalistas e vendedores
de suplementos dietéticos
07:33
and naturopaths
189
438000
2000
e naturópatas
07:35
can distort evidence for their own purposes.
190
440000
3000
terxiversan as probas
a favor dos seus propios intereses.
O que é realmente fascinante
07:38
What I find really fascinating
191
443000
2000
07:40
is that the pharmaceutical industry
192
445000
2000
é que a industria farmacéutica
usa exactamente os mesmos
trucos e instrumentos
07:42
uses exactly the same kinds of tricks and devices,
193
447000
2000
07:44
but slightly more sophisticated versions of them,
194
449000
3000
pero con versións lixeiramente
máis sofisticadas
07:47
in order to distort the evidence that they give to doctors and patients,
195
452000
3000
para terxiversar as probas
que lles dan a médicos e pacientes
07:50
and which we use to make vitally important decisions.
196
455000
3000
e que usamos para tomar decisións vitais.
07:53
So firstly, trials against placebo:
197
458000
2000
En primeiro lugar, o ensaio con placebos:
07:55
everybody thinks they know that a trial should be
198
460000
2000
todo o mundo pensa que
un ensaio debería ser
07:57
a comparison of your new drug against placebo.
199
462000
2000
unha comparación entre
un novo fármaco e un placebo.
07:59
But actually in a lot of situations that's wrong.
200
464000
2000
Pero, realmente, moitas veces non é así
08:01
Because often we already have a very good treatment that is currently available,
201
466000
3000
porque a miúdo xa existe
un bo tratamento
08:04
so we don't want to know that your alternative new treatment
202
469000
2000
así que non queremos saber se
o novo tratamento alternativo
08:06
is better than nothing.
203
471000
2000
é mellor que nada.
08:08
We want to know that it's better than the best currently available treatment that we have.
204
473000
3000
Queremos saber se supera
o mellor dos tratamentos que xa temos.
08:11
And yet, repeatedly, you consistently see people doing trials
205
476000
3000
E aínda así, de xeito repetido vemos xente
08:14
still against placebo.
206
479000
2000
que fai ensaios contra o placebo.
E pódese obter licenza para sacar
un fármaco ao mercado
08:16
And you can get license to bring your drug to market
207
481000
2000
08:18
with only data showing that it's better than nothing,
208
483000
2000
só cuns datos que digan
que é mellor que nada,
08:20
which is useless for a doctor like me trying to make a decision.
209
485000
3000
algo inútil para un médico coma min
que intenta tomar unha decisión.
08:23
But that's not the only way you can rig your data.
210
488000
2000
Pero non é a única forma
de manipular os datos.
08:25
You can also rig your data
211
490000
2000
Tamén pode facerse
comparando o novo fármaco
08:27
by making the thing you compare your new drug against
212
492000
2000
08:29
really rubbish.
213
494000
2000
con algo inútil.
08:31
You can give the competing drug in too low a dose,
214
496000
2000
Pódese dar o fármaco competidor
en dose moi baixa
08:33
so that people aren't properly treated.
215
498000
2000
para que a xente non teña
o tratamento adecuado.
08:35
You can give the competing drug in too high a dose,
216
500000
2000
Ou nunha dose moi alta
08:37
so that people get side effects.
217
502000
2000
para que produza efectos secundarios.
08:39
And this is exactly what happened
218
504000
2000
Isto foi talmente o que ocorreu
08:41
with antipsychotic medication for schizophrenia.
219
506000
2000
cos antipsicóticos para a esquizofrenia.
08:43
20 years ago, a new generation of antipsychotic drugs were brought in
220
508000
3000
Hai 20 anos, apareceu unha
nova xeración de antipsicóticos
08:46
and the promise was that they would have fewer side effects.
221
511000
3000
coa promesa de que terían
menos efectos secundarios.
08:49
So people set about doing trials of these new drugs
222
514000
2000
Así que se comezaron a facer
ensaios con eles
08:51
against the old drugs,
223
516000
2000
comparándoos cos vellos
08:53
but they gave the old drugs in ridiculously high doses --
224
518000
2000
pero dábanse doses ridiculamente
altas dos vellos fármacos
08:55
20 milligrams a day of haloperidol.
225
520000
2000
-20 mg ao día de haloperidol.
08:57
And it's a foregone conclusion,
226
522000
2000
E é unha conclusión evidente
08:59
if you give a drug at that high a dose,
227
524000
2000
que se se dá un fármaco
nunha dose tan alta
09:01
that it will have more side effects and that your new drug will look better.
228
526000
3000
terá máis efectos secundarios e
o novo parecerá mellor.
09:04
10 years ago, history repeated itself, interestingly,
229
529000
2000
Hai 10 anos, repetiuse a historia
09:06
when risperidone, which was the first of the new-generation antipsychotic drugs,
230
531000
3000
cando a risperidona, o primeiro
antipsicótico da nova xeración,
09:09
came off copyright, so anybody could make copies.
231
534000
3000
xa non tiña dereitos de autor,
e podía reproducirse libremente.
09:12
Everybody wanted to show that their drug was better than risperidone,
232
537000
2000
Todos querían probar que o seu
fármaco era mellor ca ela
09:14
so you see a bunch of trials comparing new antipsychotic drugs
233
539000
3000
así que houbo moitos ensaios
que comparaban os novos antipsicóticos
09:17
against risperidone at eight milligrams a day.
234
542000
2000
con 8 mg por día de risperidona.
09:19
Again, not an insane dose, not an illegal dose,
235
544000
2000
Outra vez, non é unha dose absurda,
non é ilegal,
09:21
but very much at the high end of normal.
236
546000
2000
pero está moi cerca de superar o normal.
09:23
And so you're bound to make your new drug look better.
237
548000
3000
Así, seguro que o novo fármaco
parecerá mellor.
09:26
And so it's no surprise that overall,
238
551000
3000
Polo tanto, non sorprende que, en xeral,
09:29
industry-funded trials
239
554000
2000
os ensaios financiados pola industria
teñan 4 veces máis probabilidades
de dar un resultado positivo
09:31
are four times more likely to give a positive result
240
556000
2000
09:33
than independently sponsored trials.
241
558000
3000
que os independentes.
09:36
But -- and it's a big but --
242
561000
3000
Pero -e este é un pero moi grande-
09:39
(Laughter)
243
564000
2000
(Risos)
09:41
it turns out,
244
566000
2000
resulta que
cando observas os métodos usados
en ensaios financiados pola industria
09:43
when you look at the methods used by industry-funded trials,
245
568000
3000
09:46
that they're actually better
246
571000
2000
ves que son realmente mellores
09:48
than independently sponsored trials.
247
573000
2000
que os independentes.
09:50
And yet, they always manage to get the result that they want.
248
575000
3000
E, aínda así, sempre conseguen os
resultados que queren.
09:53
So how does this work?
249
578000
2000
Entón..., como pode ser?
(Risos)
09:55
How can we explain this strange phenomenon?
250
580000
3000
Como podemos explicar
este estraño fenómeno?
09:58
Well it turns out that what happens
251
583000
2000
Ben, pois o que ocorre
10:00
is the negative data goes missing in action;
252
585000
2000
é que os datos negativos
pérdense en combate;
10:02
it's withheld from doctors and patients.
253
587000
2000
non se revelan a médicos e pacientes.
10:04
And this is the most important aspect of the whole story.
254
589000
2000
Este é o aspecto
máis importante da historia.
10:06
It's at the top of the pyramid of evidence.
255
591000
2000
Está no cume da pirámide de probas.
10:08
We need to have all of the data on a particular treatment
256
593000
3000
Precisamos ter todos os datos
dun tratamento en concreto
10:11
to know whether or not it really is effective.
257
596000
2000
para saber se é efectivo ou non.
10:13
And there are two different ways that you can spot
258
598000
2000
E hai dous modos de ver
10:15
whether some data has gone missing in action.
259
600000
2000
se algúns datos se perderon en combate.
10:17
You can use statistics, or you can use stories.
260
602000
3000
Pódense usar estatísticas ou historias.
Eu prefiro as estatísticas,
así que empezarei por elas.
10:20
I personally prefer statistics, so that's what I'm going to do first.
261
605000
2000
10:22
This is something called a funnel plot.
262
607000
2000
Isto é unha gráfica de funil.
10:24
And a funnel plot is a very clever way of spotting
263
609000
2000
É unha moi boa forma de identificar
10:26
if small negative trials have disappeared, have gone missing in action.
264
611000
3000
se pequenos ensaios negativos
desapareceron en combate.
10:29
So this is a graph of all of the trials
265
614000
2000
Esta é unha gráfica de todas as probas
10:31
that have been done on a particular treatment.
266
616000
2000
que se fixeron dun tratamento concreto.
10:33
And as you go up towards the top of the graph,
267
618000
2000
Ao observar a parte superior da gráfica
10:35
what you see is each dot is a trial.
268
620000
2000
vese que cada punto é un ensaio.
10:37
And as you go up, those are the bigger trials, so they've got less error in them.
269
622000
3000
E ao subir, estes son os ensaios
máis grandes, con menos erros.
10:40
So they're less likely to be randomly false positives, randomly false negatives.
270
625000
3000
É menos probable que dean
falsos positivos ou falsos negativos.
10:43
So they all cluster together.
271
628000
2000
Así que todos se agrupan.
Os grandes ensaios están
máis cerca da resposta real.
10:45
The big trials are closer to the true answer.
272
630000
2000
10:47
Then as you go further down at the bottom,
273
632000
2000
Cando imos cara ao fondo,
10:49
what you can see is, over on this side, the spurious false negatives,
274
634000
3000
o que vemos é, neste lado,
os falsos negativos espurios
10:52
and over on this side, the spurious false positives.
275
637000
2000
e, neste lado, os falsos
positivos espurios.
10:54
If there is publication bias,
276
639000
2000
Se hai un nesgo na publicación
10:56
if small negative trials have gone missing in action,
277
641000
3000
se os pequenos ensaios negativos
desapareceron,
10:59
you can see it on one of these graphs.
278
644000
2000
pódese ver nunha destas gráficas.
Aquí pódese ver que os
pequenos ensaios negativos
11:01
So you can see here that the small negative trials
279
646000
2000
11:03
that should be on the bottom left have disappeared.
280
648000
2000
que deberían estar
abaixo á esquerda desapareceron.
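A minimal simulation, under assumed trial sizes and a true effect of zero, of why a funnel plot exposes this: small trials scatter widely around the truth, large trials cluster tightly, and if the small negative ones never get published the bottom-left corner of the plot empties out and the published average drifts upward.

    # Hypothetical simulation of publication bias; sizes and effects are invented.
    import random, statistics

    random.seed(1)
    trials = []
    for n in [20, 20, 20, 20, 50, 50, 100, 100, 500, 1000]:   # assumed trial sizes
        se = 1 / n ** 0.5                  # bigger trial, smaller standard error
        estimate = random.gauss(0.0, se)   # true effect is zero
        trials.append((n, estimate))

    # Small negative trials "go missing in action"; big trials always appear.
    published = [(n, e) for n, e in trials if n >= 100 or e > 0]

    print("mean effect, all trials:      ", round(statistics.mean(e for _, e in trials), 3))
    print("mean effect, published trials:", round(statistics.mean(e for _, e in published), 3))
    # Plotting estimate (x) against trial size (y) for the published set
    # leaves an empty bottom-left corner; that asymmetry is what a funnel plot shows.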
11:05
This is a graph demonstrating the presence of publication bias
281
650000
3000
Esta gráfica demostra a presenza
de nesgos na publicación
11:08
in studies of publication bias.
282
653000
2000
en estudos sobre nesgos nas publicacións.
11:10
And I think that's the funniest epidemiology joke
283
655000
2000
E penso que esta é a broma epidemiolóxica
11:12
that you will ever hear.
284
657000
2000
máis graciosa que escoitastes.
11:14
That's how you can prove it statistically,
285
659000
2000
Así é como se proba estatisticamente, pero
11:16
but what about stories?
286
661000
2000
que pasa coas historias?
11:18
Well they're heinous, they really are.
287
663000
2000
Ben, son odiosas, abofé que si.
11:20
This is a drug called reboxetine.
288
665000
2000
Hai un fármaco chamado reboxetina.
É un medicamento que eu mesmo
lles prescribín a pacientes.
11:22
This is a drug that I myself have prescribed to patients.
289
667000
2000
11:24
And I'm a very nerdy doctor.
290
669000
2000
E son un médico moi aplicado.
11:26
I hope I go out of my way to try and read and understand all the literature.
291
671000
3000
Fago o posible por intentar ler
e entender a bibliografía.
11:29
I read the trials on this. They were all positive. They were all well-conducted.
292
674000
3000
Lin os ensaios sobre este fármaco.
Todos positivos. Todos ben dirixidos.
11:32
I found no flaw.
293
677000
2000
Non atopei puntos febles.
11:34
Unfortunately, it turned out,
294
679000
2000
Por desgraza, resultou
11:36
that many of these trials were withheld.
295
681000
2000
que moitos deses ensaios,
11:38
In fact, 76 percent
296
683000
2000
en realidade, o 76 %
11:40
of all of the trials that were done on this drug
297
685000
2000
dos que se fixeron con este fármaco
11:42
were withheld from doctors and patients.
298
687000
2000
ocultáronselles a médicos e pacientes.
11:44
Now if you think about it,
299
689000
2000
Se o pensades,
11:46
if I tossed a coin a hundred times,
300
691000
2000
se eu tiro unha moeda ao ar cen veces,
11:48
and I'm allowed to withhold from you
301
693000
2000
e se me permite ocultar
11:50
the answers half the times,
302
695000
2000
o resultado a metade das veces,
11:52
then I can convince you
303
697000
2000
podería convencervos
de que teño unha moeda de dúas caras.
11:54
that I have a coin with two heads.
304
699000
2000
11:56
If we remove half of the data,
305
701000
2000
Se eliminamos a metade dos datos,
11:58
we can never know what the true effect size of these medicines is.
306
703000
3000
nunca poderemos saber
os verdadeiros efectos dese fármaco.
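A small sketch of the coin example, assuming a fair coin and a made-up rule that only heads get reported: withholding half the results makes a fair coin look double-headed, just as withholding negative trials inflates a drug's apparent effect.

    # Hypothetical selective-reporting demo; the coin is fair by construction.
    import random

    random.seed(42)
    tosses = [random.choice(["heads", "tails"]) for _ in range(100)]
    reported = [t for t in tosses if t == "heads"]    # withhold every tails result

    print("true fraction of heads:    ", tosses.count("heads") / len(tosses))
    print("reported fraction of heads:", reported.count("heads") / len(reported))  # always 1.0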
12:01
And this is not an isolated story.
307
706000
2000
E isto non é unha historia illada.
12:03
Around half of all of the trial data on antidepressants has been withheld,
308
708000
4000
Case a metade da información de
ensaios con antidepresivos está oculta
12:07
but it goes way beyond that.
309
712000
2000
pero isto vai máis alá.
12:09
The Nordic Cochrane Group were trying to get a hold of the data on that
310
714000
2000
O Nordic Cochrane Group intentou
acceder a esa información
12:11
to bring it all together.
311
716000
2000
para agrupala.
Os Cochrane Groups son unha alianza
internacional sen fin de lucro
12:13
The Cochrane Groups are an international nonprofit collaboration
312
718000
3000
12:16
that produce systematic reviews of all of the data that has ever been shown.
313
721000
3000
que fai revisións sistemáticas de
todos os datos que aparecen.
12:19
And they need to have access to all of the trial data.
314
724000
3000
E precisan ter acceso a
todos os datos dos ensaios.
12:22
But the companies withheld that data from them,
315
727000
3000
Pero as compañías ocúltanlles
esta información,
12:25
and so did the European Medicines Agency
316
730000
2000
tal como fixo tamén a Axencia
Europea de Medicamentos
12:27
for three years.
317
732000
2000
durante 3 anos.
12:29
This is a problem that is currently lacking a solution.
318
734000
3000
Este é un problema actualmente
sen solución.
12:32
And to show how big it goes, this is a drug called Tamiflu,
319
737000
3000
E para amosarvos o seu tamaño,
velaquí un fármaco chamado Tamiflu,
12:35
which governments around the world
320
740000
2000
en que os gobernos do mundo
12:37
have spent billions and billions of dollars on.
321
742000
2000
gastaron miles e miles
de millóns de dólares.
12:39
And they spend that money on the promise
322
744000
2000
E gastáronos coa promesa
12:41
that this is a drug which will reduce the rate
323
746000
2000
de que era un fármaco que reduciría a taxa
12:43
of complications with flu.
324
748000
2000
de complicacións da gripe.
12:45
We already have the data
325
750000
2000
Xa temos os datos
12:47
showing that it reduces the duration of your flu by a few hours.
326
752000
2000
que din que reduce a duración
da gripe nunhas horas.
12:49
But I don't really care about that. Governments don't care about that.
327
754000
2000
Pero a min éme igual.
E aos gobernos tamén.
12:51
I'm very sorry if you have the flu, I know it's horrible,
328
756000
3000
Sinto que teñades a gripe,
sei que é horrible,
12:54
but we're not going to spend billions of dollars
329
759000
2000
pero non imos gastar
miles de millóns de dólares
12:56
trying to reduce the duration of your flu symptoms
330
761000
2000
para intentar reducir a duración
dos síntomas da túa gripe
12:58
by half a day.
331
763000
2000
en medio día.
13:00
We prescribe these drugs, we stockpile them for emergencies
332
765000
2000
Prescribimos eses fármacos,
acumulámolos para emerxencias
13:02
on the understanding that they will reduce the number of complications,
333
767000
2000
pensando que reducirían
o número de complicacións,
13:04
which means pneumonia and which means death.
334
769000
3000
é dicir, pneumonía e morte.
13:07
The infectious diseases Cochrane Group, which are based in Italy,
335
772000
3000
O Cochrane Group de enfermidades
infecciosas, con sede en Italia,
13:10
has been trying to get
336
775000
2000
intentou obter
13:12
the full data in a usable form out of the drug companies
337
777000
3000
das compañías farmacéuticas
todos os datos nun formato usable
13:15
so that they can make a full decision
338
780000
3000
para poder decidir de forma concluínte
13:18
about whether this drug is effective or not,
339
783000
2000
se o fármaco é efectivo ou non
13:20
and they've not been able to get that information.
340
785000
3000
e non foron capaces de
conseguir esa información.
13:23
This is undoubtedly
341
788000
2000
Este é, sen dúbida,
13:25
the single biggest ethical problem
342
790000
3000
o problema ético máis grande
13:28
facing medicine today.
343
793000
2000
con que bate a medicina hoxe en día.
13:30
We cannot make decisions
344
795000
3000
Non podemos tomar decisións
13:33
in the absence of all of the information.
345
798000
4000
ao non termos toda a información.
13:37
So it's a little bit difficult from there
346
802000
3000
Así que é un pouco difícil
13:40
to spin in some kind of positive conclusion.
347
805000
4000
extraer algún tipo de conclusión positiva.
13:44
But I would say this:
348
809000
4000
Pero eu diría isto:
13:48
I think that sunlight
349
813000
3000
creo que a luz do sol
13:51
is the best disinfectant.
350
816000
2000
é o mellor desinfectante.
13:53
All of these things are happening in plain sight,
351
818000
3000
Todas estas cousas están
ocorrendo diante dos nosos ollos,
13:56
and they're all protected
352
821000
2000
e todas están protexidas
13:58
by a force field of tediousness.
353
823000
3000
por un campo de forza de tedio.
14:01
And I think, with all of the problems in science,
354
826000
2000
E penso que, con todos os
problemas da ciencia,
14:03
one of the best things that we can do
355
828000
2000
unha das mellores cousas
que podemos facer
14:05
is to lift up the lid,
356
830000
2000
é levantar a tapa,
14:07
finger around in the mechanics and peer in.
357
832000
2000
remexer nos mecanismos e osmar.
14:09
Thank you very much.
358
834000
2000
Moitas grazas.
14:11
(Applause)
359
836000
3000
(Aplausos)
Translated by Carme Paz
Reviewed by Xusto Rodriguez
