ABOUT THE SPEAKER
Martin Rees - Astrophysicist
Lord Martin Rees, one of the world's most eminent astronomers, is an emeritus professor of cosmology and astrophysics at the University of Cambridge and the UK's Astronomer Royal. He is one of our key thinkers on the future of humanity in the cosmos.

Why you should listen

Lord Martin Rees has issued a clarion call for humanity. His 2004 book, ominously titled Our Final Hour, catalogues the threats facing the human race in a 21st century dominated by unprecedented and accelerating scientific change. He calls on scientists and nonscientists alike to take steps that will ensure our survival as a species.

One of the world's leading astronomers, Rees is an emeritus professor of cosmology and astrophysics at Cambridge, and UK Astronomer Royal. Author of more than 500 research papers on cosmological topics ranging from black holes to quantum physics to the Big Bang, Rees has received countless awards for his scientific contributions. But equally significant has been his devotion to explaining the complexities of science for a general audience, in books like Before the Beginning and Our Cosmic Habitat.

TED2014

Martin Rees: Can we prevent the end of the world?


1,283,785 views

A post-apocalyptic Earth, emptied of humans, seems like the stuff of science fiction TV and films. But in this short, surprising talk, Lord Martin Rees asks us to think about our real existential risks: natural and human-made threats that could wipe out humanity. As a concerned member of the human race, he asks: "What's the worst that could happen?"


00:12
Ten years ago, I wrote a book which I entitled
00:14
"Our Final Century?" Question mark.
00:17
My publishers cut out the question mark. (Laughter)
00:21
The American publishers changed our title
00:23
to "Our Final Hour."
00:27
Americans like instant gratification and the reverse.
00:30
(Laughter)
00:32
And my theme was this:
00:34
Our Earth has existed for 45 million centuries,
00:38
but this one is special
00:40
it's the first where one species, ours,
00:43
has the planet's future in its hands.
00:46
Over nearly all of Earth's history,
00:48
threats have come from nature
00:50
disease, earthquakes, asteroids and so forth
00:53
but from now on, the worst dangers come from us.
00:59
And it's now not just the nuclear threat;
01:02
in our interconnected world,
01:04
network breakdowns can cascade globally;
01:07
air travel can spread pandemics worldwide within days;
01:11
and social media can spread panic and rumor
01:14
literally at the speed of light.
01:17
We fret too much about minor hazards
01:21
improbable air crashes, carcinogens in food,
01:25
low radiation doses, and so forth
01:27
but we and our political masters
01:30
are in denial about catastrophic scenarios.
01:34
The worst have thankfully not yet happened.
01:37
Indeed, they probably won't.
01:39
But if an event is potentially devastating,
01:42
it's worth paying a substantial premium
01:45
to safeguard against it, even if it's unlikely,
01:49
just as we take out fire insurance on our house.
01:54
And as science offers greater power and promise,
01:59
the downside gets scarier too.
02:02
We get ever more vulnerable.
02:05
Within a few decades,
02:06
millions will have the capability
02:09
to misuse rapidly advancing biotech,
02:12
just as they misuse cybertech today.
02:15
Freeman Dyson, in a TED Talk,
02:19
foresaw that children will design and create new organisms
02:22
just as routinely as his generation played with chemistry sets.
02:27
Well, this may be on the science fiction fringe,
02:29
but were even part of his scenario to come about,
02:32
our ecology and even our species
02:35
would surely not survive long unscathed.
02:39
For instance, there are some eco-extremists
02:43
who think that it would be better for the planet,
02:45
for Gaia, if there were far fewer humans.
02:49
What happens when such people have mastered
02:52
synthetic biology techniques
02:54
that will be widespread by 2050?
02:57
And by then, other science fiction nightmares
03:00
may transition to reality:
03:01
dumb robots going rogue,
03:03
or a network that develops a mind of its own
03:06
threatens us all.
03:08
Well, can we guard against such risks by regulation?
03:12
We must surely try, but these enterprises
03:14
are so competitive, so globalized,
03:18
and so driven by commercial pressure,
03:20
that anything that can be done will be done somewhere,
03:23
whatever the regulations say.
03:25
It's like the drug laws — we try to regulate, but can't.
03:28
And the global village will have its village idiots,
03:31
and they'll have a global range.
03:35
So as I said in my book,
03:37
we'll have a bumpy ride through this century.
03:40
There may be setbacks to our society
03:44
indeed, a 50 percent chance of a severe setback.
03:48
But are there conceivable events
03:51
that could be even worse,
03:53
events that could snuff out all life?
03:56
When a new particle accelerator came online,
03:59
some people anxiously asked,
04:01
could it destroy the Earth or, even worse,
04:03
rip apart the fabric of space?
04:06
Well luckily, reassurance could be offered.
04:09
I and others pointed out that nature
04:11
has done the same experiments
04:13
zillions of times already,
04:16
via cosmic ray collisions.
04:17
But scientists should surely be precautionary
04:20
about experiments that generate conditions
04:23
without precedent in the natural world.
04:25
Biologists should avoid release of potentially devastating
04:29
genetically modified pathogens.
04:32
And by the way, our special aversion
04:35
to the risk of truly existential disasters
04:39
depends on a philosophical and ethical question,
04:42
and it's this:
04:44
Consider two scenarios.
04:46
Scenario A wipes out 90 percent of humanity.
04:51
Scenario B wipes out 100 percent.
04:55
How much worse is B than A?
04:58
Some would say 10 percent worse.
05:01
The body count is 10 percent higher.
05:04
But I claim that B is incomparably worse.
05:07
As an astronomer, I can't believe
05:10
that humans are the end of the story.
05:12
It is five billion years before the sun flares up,
05:15
and the universe may go on forever,
05:18
so post-human evolution,
05:20
here on Earth and far beyond,
05:23
could be as prolonged as the Darwinian process
05:25
that's led to us, and even more wonderful.
05:29
And indeed, future evolution will happen much faster,
05:31
on a technological timescale,
05:33
not a natural selection timescale.
05:36
So we surely, in view of those immense stakes,
05:40
shouldn't accept even a one in a billion risk
05:43
that human extinction would foreclose
05:46
this immense potential.
05:48
Some scenarios that have been envisaged
05:50
may indeed be science fiction,
05:51
but others may be disquietingly real.
05:55
It's an important maxim that the unfamiliar
05:58
is not the same as the improbable,
06:00
and in fact, that's why we at Cambridge University
06:03
are setting up a center to study how to mitigate
06:06
these existential risks.
06:08
It seems it's worthwhile just for a few people
06:11
to think about these potential disasters.
06:14
And we need all the help we can get from others,
06:17
because we are stewards of a precious
06:19
pale blue dot in a vast cosmos,
06:23
a planet with 50 million centuries ahead of it.
06:26
And so let's not jeopardize that future.
06:29
And I'd like to finish with a quote
06:30
from a great scientist called Peter Medawar.
06:34
I quote, "The bells that toll for mankind
06:37
are like the bells of Alpine cattle.
06:40
They are attached to our own necks,
06:42
and it must be our fault if they do not make
06:45
a tuneful and melodious sound."
06:47
Thank you very much.
06:49
(Applause)
Translated by Miley Liu
Reviewed by Xingyi Ouyang 歐陽杏儀

