ABOUT THE SPEAKER
Martin Rees - Astrophysicist
Lord Martin Rees, one of the world's most eminent astronomers, is an emeritus professor of cosmology and astrophysics at the University of Cambridge and the UK's Astronomer Royal. He is one of our key thinkers on the future of humanity in the cosmos.

Why you should listen

Lord Martin Rees has issued a clarion call for humanity. His 2004 book, ominously titled Our Final Hour, catalogues the threats facing the human race in a 21st century dominated by unprecedented and accelerating scientific change. He calls on scientists and nonscientists alike to take steps that will ensure our survival as a species.

One of the world's leading astronomers, Rees is an emeritus professor of cosmology and astrophysics at Cambridge, and UK Astronomer Royal. Author of more than 500 research papers on cosmological topics ranging from black holes to quantum physics to the Big Bang, Rees has received countless awards for his scientific contributions. But equally significant has been his devotion to explaining the complexities of science for a general audience, in books like Before the Beginning and Our Cosmic Habitat.

TED2014

Martin Rees: Can we prevent the end of the world?

1,283,785 views

A post-apocalyptic Earth, emptied of humans, seems like something out of science fiction TV and film. But in this short, imaginative talk, Lord Martin Rees guides us through the real existential risks to humanity: threats, both natural and human-made, that could wipe our species out entirely. As a member of the human race with a stake in its fate, he asks: "What's the worst that could possibly happen?"

00:12
Ten years ago, I wrote a book which I entitled
00:14
"Our Final Century?" Question mark.
00:17
My publishers cut out the question mark. (Laughter)
00:21
The American publishers changed our title
00:23
to "Our Final Hour."
00:27
Americans like instant gratification and the reverse.
00:30
(Laughter)
00:32
And my theme was this:
00:34
Our Earth has existed for 45 million centuries,
00:38
but this one is special
00:40
it's the first where one species, ours,
00:43
has the planet's future in its hands.
00:46
Over nearly all of Earth's history,
00:48
threats have come from nature
00:50
disease, earthquakes, asteroids and so forth
00:53
but from now on, the worst dangers come from us.
00:59
And it's now not just the nuclear threat;
01:02
in our interconnected world,
01:04
network breakdowns can cascade globally;
01:07
air travel can spread pandemics worldwide within days;
01:11
and social media can spread panic and rumor
01:14
literally at the speed of light.
01:17
We fret too much about minor hazards
01:21
improbable air crashes, carcinogens in food,
01:25
low radiation doses, and so forth
01:27
but we and our political masters
01:30
are in denial about catastrophic scenarios.
01:34
The worst have thankfully not yet happened.
01:37
Indeed, they probably won't.
01:39
But if an event is potentially devastating,
01:42
it's worth paying a substantial premium
01:45
to safeguard against it, even if it's unlikely,
01:49
just as we take out fire insurance on our house.
01:54
And as science offers greater power and promise,
01:59
the downside gets scarier too.
02:02
We get ever more vulnerable.
02:05
Within a few decades,
02:06
millions will have the capability
02:09
to misuse rapidly advancing biotech,
02:12
just as they misuse cybertech today.
02:15
Freeman Dyson, in a TED Talk,
02:19
foresaw that children will design and create new organisms
02:22
just as routinely as his generation played with chemistry sets.
02:27
Well, this may be on the science fiction fringe,
02:29
but were even part of his scenario to come about,
02:32
our ecology and even our species
02:35
would surely not survive long unscathed.
02:39
For instance, there are some eco-extremists
02:43
who think that it would be better for the planet,
02:45
for Gaia, if there were far fewer humans.
02:49
What happens when such people have mastered
02:52
synthetic biology techniques
02:54
that will be widespread by 2050?
02:57
And by then, other science fiction nightmares
03:00
may transition to reality:
03:01
dumb robots going rogue,
03:03
or a network that develops a mind of its own
03:06
threatens us all.
03:08
Well, can we guard against such risks by regulation?
03:12
We must surely try, but these enterprises
03:14
are so competitive, so globalized,
03:18
and so driven by commercial pressure,
03:20
that anything that can be done will be done somewhere,
03:23
whatever the regulations say.
03:25
It's like the drug laws — we try to regulate, but can't.
03:28
And the global village will have its village idiots,
03:31
and they'll have a global range.
03:35
So as I said in my book,
03:37
we'll have a bumpy ride through this century.
03:40
There may be setbacks to our society
03:44
indeed, a 50 percent chance of a severe setback.
03:48
But are there conceivable events
03:51
that could be even worse,
03:53
events that could snuff out all life?
03:56
When a new particle accelerator came online,
03:59
some people anxiously asked,
04:01
could it destroy the Earth or, even worse,
04:03
rip apart the fabric of space?
04:06
Well luckily, reassurance could be offered.
04:09
I and others pointed out that nature
04:11
has done the same experiments
04:13
zillions of times already,
04:16
via cosmic ray collisions.
04:17
But scientists should surely be precautionary
04:20
about experiments that generate conditions
04:23
without precedent in the natural world.
04:25
Biologists should avoid release of potentially devastating
04:29
genetically modified pathogens.
04:32
And by the way, our special aversion
04:35
to the risk of truly existential disasters
04:39
depends on a philosophical and ethical question,
04:42
and it's this:
04:44
Consider two scenarios.
04:46
Scenario A wipes out 90 percent of humanity.
04:51
Scenario B wipes out 100 percent.
04:55
How much worse is B than A?
04:58
Some would say 10 percent worse.
05:01
The body count is 10 percent higher.
05:04
But I claim that B is incomparably worse.
05:07
As an astronomer, I can't believe
05:10
that humans are the end of the story.
05:12
It is five billion years before the sun flares up,
05:15
and the universe may go on forever,
05:18
so post-human evolution,
05:20
here on Earth and far beyond,
05:23
could be as prolonged as the Darwinian process
05:25
that's led to us, and even more wonderful.
05:29
And indeed, future evolution will happen much faster,
05:31
on a technological timescale,
05:33
not a natural selection timescale.
05:36
So we surely, in view of those immense stakes,
05:40
shouldn't accept even a one in a billion risk
05:43
that human extinction would foreclose
05:46
this immense potential.
05:48
Some scenarios that have been envisaged
05:50
may indeed be science fiction,
05:51
but others may be disquietingly real.
05:55
It's an important maxim that the unfamiliar
05:58
is not the same as the improbable,
06:00
and in fact, that's why we at Cambridge University
06:03
are setting up a center to study how to mitigate
06:06
these existential risks.
06:08
It seems it's worthwhile just for a few people
06:11
to think about these potential disasters.
06:14
And we need all the help we can get from others,
06:17
because we are stewards of a precious
06:19
pale blue dot in a vast cosmos,
06:23
a planet with 50 million centuries ahead of it.
06:26
And so let's not jeopardize that future.
06:29
And I'd like to finish with a quote
06:30
from a great scientist called Peter Medawar.
06:34
I quote, "The bells that toll for mankind
06:37
are like the bells of Alpine cattle.
06:40
They are attached to our own necks,
06:42
and it must be our fault if they do not make
06:45
a tuneful and melodious sound."
06:47
Thank you very much.
06:49
(Applause)
Translated by Qingqing Mao
Reviewed by Melody Zhou
