ABOUT THE SPEAKER
Ben Goldacre - Debunker
Ben Goldacre unpicks dodgy scientific claims made by scaremongering journalists, dubious government reports, pharmaceutical corporations, PR companies and quacks.

Why you should listen

"It was the MMR story that finally made me crack," begins the Bad Science manifesto, referring to the sensationalized -- and now-refuted -- link between vaccines and autism. With that sentence Ben Goldacre fired the starting shot of a crusade waged from the pages of The Guardian from 2003 to 2011, on an addicitve Twitter feed, and in bestselling books, including Bad Science and his latest, Bad Pharma, which puts the $600 billion global pharmaceutical industry under the microscope. What he reveals is a fascinating, terrifying mess.

Goldacre was trained in medicine at Oxford and London, and works as an academic in epidemiology. Helped along by this inexhaustible supply of material, he also travels the speaking circuit, promoting skepticism and nerdish curiosity with fire, wit, fast delivery and a lovable kind of exasperation. (He might even convince you that real science, sober reporting and reason are going to win in the end.)

As he writes, "If you're a journalist who misrepresents science for the sake of a headline, a politician more interested in spin than evidence, or an advertiser who loves pictures of molecules in little white coats, then beware: your days are numbered."

TEDGlobal 2011

Ben Goldacre: Battling bad science

2,713,579 views

Every day there are news reports of new health advice, but how can you know if they're right? Doctor and epidemiologist Ben Goldacre shows us, at high speed, the ways evidence can be distorted, from the blindingly obvious nutrition claims to the subtle tricks of the pharmaceutical industry.

00:15
So I'm a doctor, but I kind of slipped sideways into research, and now I'm an epidemiologist. And nobody really knows what epidemiology is. Epidemiology is the science of how we know in the real world if something is good for you or bad for you. And it's best understood through example as the science of those crazy, wacky newspaper headlines.

00:34
And these are just some of the examples. These are from the Daily Mail. Every country in the world has a newspaper like this. It has this bizarre, ongoing philosophical project of dividing all the inanimate objects in the world into the ones that either cause or prevent cancer. So here are some of the things they said cause cancer recently: divorce, Wi-Fi, toiletries and coffee. Here are some of the things they say prevent cancer: crusts, red pepper, licorice and coffee. So already you can see there are contradictions. Coffee both causes and prevents cancer. And as you start to read on, you can see that maybe there's some kind of political valence behind some of this. So for women, housework prevents breast cancer, but for men, shopping could make you impotent.

01:09
So we know that we need to start unpicking the science behind this. And what I hope to show is that unpicking dodgy claims, unpicking the evidence behind dodgy claims, isn't a kind of nasty, carping activity; it's socially useful, but it's also an extremely valuable explanatory tool. Because real science is all about critically appraising the evidence for somebody else's position. That's what happens in academic journals. That's what happens at academic conferences. The Q&A session after a postdoc presents data is often a blood bath. And nobody minds that. We actively welcome it. It's like a consenting intellectual S&M activity.

01:47
So what I'm going to show you is all of the main things, all of the main features of my discipline -- evidence-based medicine. And I will talk you through all of these and demonstrate how they work, exclusively using examples of people getting stuff wrong.

02:02
So we'll start with the absolute weakest form of evidence known to man, and that is authority. In science, we don't care how many letters you have after your name. In science, we want to know what your reasons are for believing something. How do you know that something is good for us or bad for us? But we're also unimpressed by authority, because it's so easy to contrive.

02:21
This is somebody called Dr. Gillian McKeith Ph.D., or, to give her full medical title, Gillian McKeith. (Laughter) Again, every country has somebody like this. She is our TV diet guru. She has massive five series of prime-time television, giving out very lavish and exotic health advice. She, it turns out, has a non-accredited correspondence course Ph.D. from somewhere in America. She also boasts that she's a certified professional member of the American Association of Nutritional Consultants, which sounds very glamorous and exciting. You get a certificate and everything. This one belongs to my dead cat Hetti. She was a horrible cat. You just go to the website, fill out the form, give them $60, and it arrives in the post.

02:58
Now that's not the only reason that we think this person is an idiot. She also goes and says things like, you should eat lots of dark green leaves, because they contain lots of chlorophyll, and that will really oxygenate your blood. And anybody who's done school biology remembers that chlorophyll and chloroplasts only make oxygen in sunlight, and it's quite dark in your bowels after you've eaten spinach.

03:15
Next, we need proper science, proper evidence. So, "Red wine can help prevent breast cancer." This is a headline from the Daily Telegraph in the U.K. "A glass of red wine a day could help prevent breast cancer." So you go and find this paper, and what you find is it is a real piece of science. It is a description of the changes in one enzyme when you drip a chemical extracted from some red grape skin onto some cancer cells in a dish on a bench in a laboratory somewhere. And that's a really useful thing to describe in a scientific paper, but on the question of your own personal risk of getting breast cancer if you drink red wine, it tells you absolutely bugger all. Actually, it turns out that your risk of breast cancer actually increases slightly with every amount of alcohol that you drink.

03:56
So what we want is studies in real human people. And here's another example. This is from Britain's leading diet and nutritionist in the Daily Mirror, which is our second biggest selling newspaper. "An Australian study in 2001 found that olive oil in combination with fruits, vegetables and pulses offers measurable protection against skin wrinklings." And then they give you advice: "If you eat olive oil and vegetables, you'll have fewer skin wrinkles." And they very helpfully tell you how to go and find the paper. So you go and find the paper, and what you find is an observational study. Obviously nobody has been able to go back to 1930, get all the people born in one maternity unit, and half of them eat lots of fruit and veg and olive oil, and then half of them eat McDonald's, and then we see how many wrinkles you've got later. You have to take a snapshot of how people are now. And what you find is, of course, people who eat veg and olive oil have fewer skin wrinkles. But that's because people who eat fruit and veg and olive oil, they're freaks, they're not normal, they're like you; they come to events like this. They are posh, they're wealthy, they're less likely to have outdoor jobs, they're less likely to do manual labor, they have better social support, they're less likely to smoke -- so for a whole host of fascinating, interlocking social, political and cultural reasons, they are less likely to have skin wrinkles. That doesn't mean that it's the vegetables or the olive oil. (Laughter)
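(A minimal sketch of that confounding in Python -- the probabilities and wrinkle counts here are invented for illustration, and diet deliberately has no causal effect at all:)

```python
import random

random.seed(42)

def person():
    affluent = random.random() < 0.5
    # Affluent people are more likely to eat veg and olive oil...
    eats_veg = random.random() < (0.8 if affluent else 0.2)
    # ...and, for all those interlocking social reasons, have fewer
    # wrinkles. Note that diet never enters this line at all.
    wrinkles = random.gauss(30 if affluent else 50, 10)
    return eats_veg, wrinkles

sample = [person() for _ in range(100_000)]
veg = [w for eats, w in sample if eats]
no_veg = [w for eats, w in sample if not eats]

# The snapshot comparison still "finds" that veg eaters have fewer
# wrinkles, purely because affluence drives both variables.
print(sum(veg) / len(veg), sum(no_veg) / len(no_veg))  # ~34 vs ~46
```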
05:07
So ideally what you want to do is a trial. And everybody thinks they're very familiar with the idea of a trial. Trials are very old. The first trial was in the Bible -- Daniel 1:12. It's very straightforward -- you take a bunch of people, you split them in half, you treat one group one way, you treat the other group the other way, and a little while later, you follow them up and see what happened to each of them.
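(A minimal sketch of that procedure in Python -- not from the talk; the group sizes and the outcome model are made up:)

```python
import random
import statistics

random.seed(1)

people = list(range(200))
random.shuffle(people)                           # randomize the order...
treated, untreated = people[:100], people[100:]  # ...and split them in half

# Hypothetical outcome model: everyone scores around 10,
# and the treatment adds a benefit of about 1 point.
def outcome(is_treated):
    return random.gauss(10, 2) + (1.0 if is_treated else 0.0)

results_treated = [outcome(True) for _ in treated]
results_untreated = [outcome(False) for _ in untreated]

# Follow them up: the difference between the two groups
# estimates the effect of the treatment.
print(statistics.mean(results_treated) - statistics.mean(results_untreated))
```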
05:23
So I'm going to tell you about one trial, which is probably the most well-reported trial in the U.K. news media over the past decade. And this is the trial of fish oil pills. And the claim was fish oil pills improve school performance and behavior in mainstream children. And they said, "We've done a trial. All the previous trials were positive, and we know this one's gonna be too." That should always ring alarm bells. Because if you already know the answer to your trial, you shouldn't be doing one. Either you've rigged it by design, or you've got enough data so there's no need to randomize people anymore.

05:49
So this is what they were going to do in their trial. They were taking 3,000 children, they were going to give them all these huge fish oil pills, six of them a day, and then a year later, they were going to measure their school exam performance and compare their school exam performance against what they predicted their exam performance would have been if they hadn't had the pills.

06:08
Now can anybody spot a flaw in this design? And no professors of clinical trial methodology are allowed to answer this question. So there's no control; there's no control group. But that sounds really techie. That's a technical term. The kids got the pills, and then their performance improved. What else could it possibly be if it wasn't the pills? They got older. We all develop over time.
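(As a rough sketch of why that matters -- with assumed numbers, not the real trial's data -- here is a simulation in which the pills do nothing and scores rise anyway:)

```python
import random
import statistics

random.seed(7)

# Assumed model: exam scores drift upward with age, pills or no pills.
def exam_score(age):
    return random.gauss(50 + 5 * age, 8)

ages = [10] * 3000                           # 3,000 children, all aged 10
before = [exam_score(a) for a in ages]       # scores at the start
after = [exam_score(a + 1) for a in ages]    # a year later, "on the pills"

# Without a control group, this maturation effect is indistinguishable
# from a pill effect: scores went up, so the pills appear to "work."
print(statistics.mean(after) - statistics.mean(before))  # ~5 points
```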
06:30
And of course, also there's the placebo effect. The placebo effect is one of the most fascinating things in the whole of medicine. It's not just about taking a pill, and your performance and your pain getting better. It's about our beliefs and expectations. It's about the cultural meaning of a treatment. And this has been demonstrated in a whole raft of fascinating studies comparing one kind of placebo against another. So we know, for example, that two sugar pills a day are a more effective treatment for getting rid of gastric ulcers than one sugar pill. Two sugar pills a day beats one sugar pill a day. And that's an outrageous and ridiculous finding, but it's true. We know from three different studies on three different types of pain that a saltwater injection is a more effective treatment for pain than taking a sugar pill, taking a dummy pill that has no medicine in it -- not because the injection or the pills do anything physically to the body, but because an injection feels like a much more dramatic intervention. So we know that our beliefs and expectations can be manipulated, which is why we do trials where we control against a placebo -- where one half of the people get the real treatment and the other half get placebo. But that's not enough.

07:28
What I've just shown you are examples of the very simple and straightforward ways that journalists and food supplement pill peddlers and naturopaths can distort evidence for their own purposes. What I find really fascinating is that the pharmaceutical industry uses exactly the same kinds of tricks and devices, but slightly more sophisticated versions of them, in order to distort the evidence that they give to doctors and patients, and which we use to make vitally important decisions.

07:53
So firstly, trials against placebo: everybody thinks they know that a trial should be a comparison of your new drug against placebo. But actually in a lot of situations that's wrong. Because often we already have a very good treatment that is currently available, so we don't want to know that your alternative new treatment is better than nothing. We want to know that it's better than the best currently available treatment that we have. And yet, repeatedly, you consistently see people doing trials still against placebo. And you can get license to bring your drug to market with only data showing that it's better than nothing, which is useless for a doctor like me trying to make a decision.

08:23
But that's not the only way you can rig your data. You can also rig your data by making the thing you compare your new drug against really rubbish. You can give the competing drug in too low a dose, so that people aren't properly treated. You can give the competing drug in too high a dose, so that people get side effects. And this is exactly what happened with antipsychotic medication for schizophrenia. 20 years ago, a new generation of antipsychotic drugs were brought in and the promise was that they would have fewer side effects. So people set about doing trials of these new drugs against the old drugs, but they gave the old drugs in ridiculously high doses -- 20 milligrams a day of haloperidol. And it's a foregone conclusion, if you give a drug at that high a dose, that it will have more side effects and that your new drug will look better.

09:04
10 years ago, history repeated itself, interestingly, when risperidone, which was the first of the new-generation antipsychotic drugs, came off copyright, so anybody could make copies. Everybody wanted to show that their drug was better than risperidone, so you see a bunch of trials comparing new antipsychotic drugs against risperidone at eight milligrams a day. Again, not an insane dose, not an illegal dose, but very much at the high end of normal. And so you're bound to make your new drug look better.

09:26
And so it's no surprise that overall, industry-funded trials are four times more likely to give a positive result than independently sponsored trials. But -- and it's a big but -- (Laughter) it turns out, when you look at the methods used by industry-funded trials, that they're actually better than independently sponsored trials. And yet, they always manage to get the result that they want.

09:53
So how does this work? How can we explain this strange phenomenon? Well it turns out that what happens is the negative data goes missing in action; it's withheld from doctors and patients. And this is the most important aspect of the whole story. It's at the top of the pyramid of evidence. We need to have all of the data on a particular treatment to know whether or not it really is effective. And there are two different ways that you can spot whether some data has gone missing in action. You can use statistics, or you can use stories. I personally prefer statistics, so that's what I'm going to do first.

10:22
This is something called a funnel plot. And a funnel plot is a very clever way of spotting if small negative trials have disappeared, have gone missing in action. So this is a graph of all of the trials that have been done on a particular treatment. And as you go up towards the top of the graph, what you see is each dot is a trial. And as you go up, those are the bigger trials, so they've got less error in them. So they're less likely to be randomly false positives, randomly false negatives. So they all cluster together. The big trials are closer to the true answer. Then as you go further down at the bottom, what you can see is, over on this side, the spurious false negatives, and over on this side, the spurious false positives. If there is publication bias, if small negative trials have gone missing in action, you can see it on one of these graphs. So you can see here that the small negative trials that should be on the bottom left have disappeared. This is a graph demonstrating the presence of publication bias in studies of publication bias. And I think that's the funniest epidemiology joke that you will ever hear.
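(For the statistically inclined, a sketch of such a funnel plot in Python -- simulated trials, assuming numpy and matplotlib are available; the treatment here truly does nothing, and the small negative trials are suppressed:)

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

true_effect = 0.0                            # a treatment that does nothing
sizes = rng.integers(20, 2000, size=400)     # 400 trials of varying size
std_err = 1.0 / np.sqrt(sizes)               # bigger trials, less error
estimates = rng.normal(true_effect, std_err) # estimates funnel in as size grows

# Publication bias: small trials with unflattering results go missing,
# which empties out the bottom-left corner of the funnel.
published = (sizes > 500) | (estimates > 0)

plt.scatter(estimates[published], sizes[published], s=8)
plt.axvline(true_effect, linestyle="--")
plt.xlabel("estimated treatment effect")
plt.ylabel("trial size")
plt.title("Funnel plot with small negative trials missing in action")
plt.show()
```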
11:14
That's how you can prove it statistically, but what about stories? Well they're heinous, they really are. This is a drug called reboxetine. This is a drug that I myself have prescribed to patients. And I'm a very nerdy doctor. I hope I try to go out of my way to try and read and understand all the literature. I read the trials on this. They were all positive. They were all well-conducted. I found no flaw. Unfortunately, it turned out, that many of these trials were withheld. In fact, 76 percent of all of the trials that were done on this drug were withheld from doctors and patients.

11:44
Now if you think about it, if I tossed a coin a hundred times, and I'm allowed to withhold from you the answers half the times, then I can convince you that I have a coin with two heads. If we remove half of the data, we can never know what the true effect size of these medicines is.
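(A tiny sketch of that coin trick in Python -- hypothetical numbers mirroring the talk:)

```python
import random

random.seed(0)

flips = [random.random() < 0.5 for _ in range(100)]  # a fair coin, 100 tosses

# Withhold roughly half the outcomes -- conveniently, all the tails --
# and report only what is left.
reported = [f for f in flips if f]

print(f"tosses reported: {len(reported)} of {len(flips)}")
print(f"heads among reported tosses: {sum(reported) / len(reported):.0%}")
# 100% heads: on the published record, this fair coin has two heads.
```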
12:01
And this is not an isolated story. Around half of all of the trial data on antidepressants has been withheld, but it goes way beyond that. The Nordic Cochrane Group were trying to get a hold of the data on that to bring it all together. The Cochrane Groups are an international nonprofit collaboration that produce systematic reviews of all of the data that has ever been shown. And they need to have access to all of the trial data. But the companies withheld that data from them, and so did the European Medicines Agency for three years. This is a problem that is currently lacking a solution.

12:32
And to show how big it goes, this is a drug called Tamiflu, which governments around the world have spent billions and billions of dollars on. And they spend that money on the promise that this is a drug which will reduce the rate of complications with flu. We already have the data showing that it reduces the duration of your flu by a few hours. But I don't really care about that. Governments don't care about that. I'm very sorry if you have the flu, I know it's horrible, but we're not going to spend billions of dollars trying to reduce the duration of your flu symptoms by half a day. We prescribe these drugs, we stockpile them for emergencies on the understanding that they will reduce the number of complications, which means pneumonia and which means death. The infectious diseases Cochrane Group, which are based in Italy, has been trying to get the full data in a usable form out of the drug companies so that they can make a full decision about whether this drug is effective or not, and they've not been able to get that information.

13:23
This is undoubtedly the single biggest ethical problem facing medicine today. We cannot make decisions in the absence of all of the information.

13:37
So it's a little bit difficult from there to spin in some kind of positive conclusion. But I would say this: I think that sunlight is the best disinfectant. All of these things are happening in plain sight, and they're all protected by a force field of tediousness. And I think, with all of the problems in science, one of the best things that we can do is to lift up the lid, finger around in the mechanics and peer in. Thank you very much.

14:11
(Applause)