ABOUT THE SPEAKER
Martin Rees - Astrophysicist
Lord Martin Rees, one of the world's most eminent astronomers, is an emeritus professor of cosmology and astrophysics at the University of Cambridge and the UK's Astronomer Royal. He is one of our key thinkers on the future of humanity in the cosmos.

Why you should listen

Lord Martin Rees has issued a clarion call for humanity. His 2004 book, ominously titled Our Final Hour, catalogues the threats facing the human race in a 21st century dominated by unprecedented and accelerating scientific change. He calls on scientists and nonscientists alike to take steps that will ensure our survival as a species.

One of the world's leading astronomers, Rees is an emeritus professor of cosmology and astrophysics at Cambridge, and UK Astronomer Royal. Author of more than 500 research papers on cosmological topics ranging from black holes to quantum physics to the Big Bang, Rees has received countless awards for his scientific contributions. But equally significant has been his devotion to explaining the complexities of science for a general audience, in books like Before the Beginning and Our Cosmic Habitat.

TED2014

Martin Rees: Can we prevent the end of the world?

Filmed: 2014
1,283,785 views

A post-apocalyptic Earth, emptied of humans, seems like the stuff of science fiction TV and movies. But in this short, surprising talk, Lord Martin Rees asks us to think about our real existential risks — natural and human-made threats that could wipe out humanity. As a concerned member of the human race, he asks: What's the worst thing that could possibly happen?


00:12
Ten years ago, I wrote a book which I entitled "Our Final Century?" Question mark. My publishers cut out the question mark. (Laughter) The American publishers changed our title to "Our Final Hour." Americans like instant gratification and the reverse. (Laughter)

00:32
And my theme was this: Our Earth has existed for 45 million centuries, but this one is special — it's the first where one species, ours, has the planet's future in its hands. Over nearly all of Earth's history, threats have come from nature — disease, earthquakes, asteroids and so forth — but from now on, the worst dangers come from us.

00:59
And it's now not just the nuclear threat; in our interconnected world, network breakdowns can cascade globally; air travel can spread pandemics worldwide within days; and social media can spread panic and rumor literally at the speed of light.

01:17
We fret too much about minor hazards — improbable air crashes, carcinogens in food, low radiation doses, and so forth — but we and our political masters are in denial about catastrophic scenarios. The worst have thankfully not yet happened. Indeed, they probably won't. But if an event is potentially devastating, it's worth paying a substantial premium to safeguard against it, even if it's unlikely, just as we take out fire insurance on our house.
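[The insurance analogy is, at bottom, expected-value arithmetic: a small probability multiplied by a large enough loss can still dominate the decision. A minimal sketch of that calculation in Python, where all the numbers are illustrative assumptions rather than figures from the talk:]

# Expected-loss arithmetic behind the fire-insurance analogy.
# All numbers below are illustrative assumptions, not figures from the talk.

def expected_loss(probability: float, loss: float) -> float:
    """Expected cost of an event: chance it happens times the damage if it does."""
    return probability * loss

# A house fire: unlikely in any given year, but costly enough that a premium
# is worth paying.
fire = expected_loss(probability=0.002, loss=300_000)
print(f"Expected annual fire loss: ${fire:,.0f}")  # a premium below ~$600/yr is rational

# A catastrophe: far less likely still, but the loss is so much larger that
# the expected loss can justify a "substantial premium" in precautions.
catastrophe = expected_loss(probability=1e-6, loss=1e12)
print(f"Expected loss from a one-in-a-million catastrophe: ${catastrophe:,.0f}")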
01:54
And as science offers greater power and promise, the downside gets scarier too. We get ever more vulnerable. Within a few decades, millions will have the capability to misuse rapidly advancing biotech, just as they misuse cybertech today. Freeman Dyson, in a TED Talk, foresaw that children will design and create new organisms just as routinely as his generation played with chemistry sets.

02:27
Well, this may be on the science fiction fringe, but were even part of his scenario to come about, our ecology and even our species would surely not survive long unscathed. For instance, there are some eco-extremists who think that it would be better for the planet, for Gaia, if there were far fewer humans. What happens when such people have mastered synthetic biology techniques that will be widespread by 2050? And by then, other science fiction nightmares may transition to reality: dumb robots going rogue, or a network that develops a mind of its own could threaten us all.

03:08
Well, can we guard against such risks by regulation? We must surely try, but these enterprises are so competitive, so globalized, and so driven by commercial pressure, that anything that can be done will be done somewhere, whatever the regulations say. It's like the drug laws — we try to regulate, but can't. And the global village will have its village idiots, and they'll have a global range.

03:35
So as I said in my book, we'll have a bumpy ride through this century. There may be setbacks to our society — indeed, a 50 percent chance of a severe setback. But are there conceivable events that could be even worse, events that could snuff out all life?

03:56
When a new particle accelerator came online, some people anxiously asked, could it destroy the Earth or, even worse, rip apart the fabric of space? Well luckily, reassurance could be offered. I and others pointed out that nature has done the same experiments zillions of times already, via cosmic ray collisions. But scientists should surely be precautionary about experiments that generate conditions without precedent in the natural world. Biologists should avoid release of potentially devastating genetically modified pathogens.

04:32
And by the way, our special aversion to the risk of truly existential disasters depends on a philosophical and ethical question, and it's this: Consider two scenarios. Scenario A wipes out 90 percent of humanity. Scenario B wipes out 100 percent. How much worse is B than A? Some would say 10 percent worse. The body count is 10 percent higher. But I claim that B is incomparably worse.

05:07
As an astronomer, I can't believe that humans are the end of the story. It is five billion years before the sun flares up, and the universe may go on forever, so post-human evolution, here on Earth and far beyond, could be as prolonged as the Darwinian process that's led to us, and even more wonderful. And indeed, future evolution will happen much faster, on a technological timescale, not a natural selection timescale. So we surely, in view of those immense stakes, shouldn't accept even a one in a billion risk that human extinction would foreclose this immense potential.
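[Rees's claim that B is "incomparably worse" can be made concrete with back-of-envelope arithmetic: if extinction forecloses every future generation, the loss in scenario B is not 10 percent more deaths but the whole human future. A sketch under assumed placeholder numbers, borrowing only the "50 million centuries" figure from the talk itself:]

# Back-of-envelope comparison of scenario A (90% die) and scenario B (100% die).
# The per-century population figure is an illustrative assumption, not from the talk.

current_population = 8e9

# Scenario A: 90 percent of humanity dies, but survivors can repopulate,
# so the long human future is (eventually) preserved.
deaths_a = 0.9 * current_population

# Scenario B: extinction. Beyond the present deaths, every potential future
# person is foreclosed. Assume 10 billion people per century over the
# 50 million centuries Rees says the planet has ahead of it.
future_people_foreclosed = 10e9 * 50e6
deaths_b = current_population + future_people_foreclosed

print(f"Scenario A loss: {deaths_a:.2e} lives")
print(f"Scenario B loss: {deaths_b:.2e} lives")
print(f"By this count, B is ~{deaths_b / deaths_a:,.0f}x worse, not 10% worse")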
05:48
Some scenarios that have been envisaged may indeed be science fiction, but others may be disquietingly real. It's an important maxim that the unfamiliar is not the same as the improbable, and in fact, that's why we at Cambridge University are setting up a center to study how to mitigate these existential risks. It seems it's worthwhile just for a few people to think about these potential disasters. And we need all the help we can get from others, because we are stewards of a precious pale blue dot in a vast cosmos, a planet with 50 million centuries ahead of it. And so let's not jeopardize that future.

06:29
And I'd like to finish with a quote from a great scientist called Peter Medawar. I quote, "The bells that toll for mankind are like the bells of Alpine cattle. They are attached to our own necks, and it must be our fault if they do not make a tuneful and melodious sound."

06:47
Thank you very much.

06:49
(Applause)
