TED2014

Martin Rees: Can we prevent the end of the world?

A post-apocalyptic Earth, emptied of humans, seems like the stuff of science fiction TV and movies. But in this short, surprising talk, Lord Martin Rees asks us to think about our real existential risks — natural and human-made threats that could wipe out humanity. As a concerned member of the human race, he asks: What’s the worst thing that could possibly happen?


Ten years ago, I wrote a book which I entitled
00:12
"Our Final Century?" Question mark.
00:14
My publishers cut out the question mark. (Laughter)
00:17
The American publishers changed our title
00:21
to "Our Final Hour."
00:23
Americans like instant gratification and the reverse.
00:27
(Laughter)
00:30
And my theme was this:
00:32
Our Earth has existed for 45 million centuries,
00:34
but this one is special —
00:38
it's the first where one species, ours,
00:40
has the planet's future in its hands.
00:43
Over nearly all of Earth's history,
00:46
threats have come from nature —
00:48
disease, earthquakes, asteroids and so forth —
00:50
but from now on, the worst dangers come from us.
00:53
And it's now not just the nuclear threat;
00:59
in our interconnected world,
01:02
network breakdowns can cascade globally;
01:04
air travel can spread pandemics
worldwide within days;
01:07
and social media can spread panic and rumor
01:11
literally at the speed of light.
01:14
We fret too much about minor hazards —
01:17
improbable air crashes, carcinogens in food,
01:21
low radiation doses, and so forth —
01:25
but we and our political masters
01:27
are in denial about catastrophic scenarios.
01:30
The worst have thankfully not yet happened.
01:34
Indeed, they probably won't.
01:37
But if an event is potentially devastating,
01:39
it's worth paying a substantial premium
01:42
to safeguard against it, even if it's unlikely,
01:45
just as we take out fire insurance on our house.
01:49
And as science offers greater power and promise,
01:54
the downside gets scarier too.
01:59
We get ever more vulnerable.
02:02
Within a few decades,
02:05
millions will have the capability
02:06
to misuse rapidly advancing biotech,
02:09
just as they misuse cybertech today.
02:12
Freeman Dyson, in a TED Talk,
02:15
foresaw that children will design
and create new organisms
02:19
just as routinely as his generation
played with chemistry sets.
02:22
Well, this may be on the science fiction fringe,
02:27
but were even part of his scenario to come about,
02:29
our ecology and even our species
02:32
would surely not survive long unscathed.
02:35
For instance, there are some eco-extremists
02:39
who think that it would be better for the planet,
02:43
for Gaia, if there were far fewer humans.
02:45
What happens when such people have mastered
02:49
synthetic biology techniques
02:52
that will be widespread by 2050?
02:54
And by then, other science fiction nightmares
02:57
may transition to reality:
03:00
dumb robots going rogue,
03:01
or a network that develops a mind of its own
03:03
and threatens us all.
03:06
Well, can we guard against such risks by regulation?
03:08
We must surely try, but these enterprises
03:12
are so competitive, so globalized,
03:14
and so driven by commercial pressure,
03:18
that anything that can be done
will be done somewhere,
03:20
whatever the regulations say.
03:23
It's like the drug laws — we try to regulate, but can't.
03:25
And the global village will have its village idiots,
03:28
and they'll have a global range.
03:31
So as I said in my book,
03:35
we'll have a bumpy ride through this century.
03:37
There may be setbacks to our society —
03:40
indeed, a 50 percent chance of a severe setback.
03:44
But are there conceivable events
03:48
that could be even worse,
03:51
events that could snuff out all life?
03:53
When a new particle accelerator came online,
03:56
some people anxiously asked,
03:59
could it destroy the Earth or, even worse,
04:01
rip apart the fabric of space?
04:03
Well luckily, reassurance could be offered.
04:06
I and others pointed out that nature
04:09
has done the same experiments
04:11
zillions of times already,
04:13
via cosmic ray collisions.
04:16
But scientists should surely be precautionary
04:17
about experiments that generate conditions
04:20
without precedent in the natural world.
04:23
Biologists should avoid release
of potentially devastating
04:25
genetically modified pathogens.
04:29
And by the way, our special aversion
04:32
to the risk of truly existential disasters
04:35
depends on a philosophical and ethical question,
04:39
and it's this:
04:42
Consider two scenarios.
04:44
Scenario A wipes out 90 percent of humanity.
04:46
Scenario B wipes out 100 percent.
04:51
How much worse is B than A?
04:55
Some would say 10 percent worse.
04:58
The body count is 10 percent higher.
05:01
But I claim that B is incomparably worse.
05:04
As an astronomer, I can't believe
05:07
that humans are the end of the story.
05:10
It is five billion years before the sun flares up,
05:12
and the universe may go on forever,
05:15
so post-human evolution,
05:18
here on Earth and far beyond,
05:20
could be as prolonged as the Darwinian process
05:23
that's led to us, and even more wonderful.
05:25
And indeed, future evolution
will happen much faster,
05:29
on a technological timescale,
05:31
not a natural selection timescale.
05:33
So we surely, in view of those immense stakes,
05:36
shouldn't accept even a one in a billion risk
05:40
that human extinction would foreclose
05:43
this immense potential.
05:46
Some scenarios that have been envisaged
05:48
may indeed be science fiction,
05:50
but others may be disquietingly real.
05:51
It's an important maxim that the unfamiliar
05:55
is not the same as the improbable,
05:58
and in fact, that's why we at Cambridge University
06:00
are setting up a center to study how to mitigate
06:03
these existential risks.
06:06
It seems it's worthwhile just for a few people
06:08
to think about these potential disasters.
06:11
And we need all the help we can get from others,
06:14
because we are stewards of a precious
06:17
pale blue dot in a vast cosmos,
06:19
a planet with 50 million centuries ahead of it.
06:23
And so let's not jeopardize that future.
06:26
And I'd like to finish with a quote
06:29
from a great scientist called Peter Medawar.
06:30
I quote, "The bells that toll for mankind
06:34
are like the bells of Alpine cattle.
06:37
They are attached to our own necks,
06:40
and it must be our fault if they do not make
06:42
a tuneful and melodious sound."
06:45
Thank you very much.
06:47
(Applause)
06:49


About the Speaker:

Martin Rees - Astrophysicist
Lord Martin Rees, one of the world's most eminent astronomers, is an emeritus professor of cosmology and astrophysics at the University of Cambridge and the UK's Astronomer Royal. He is one of our key thinkers on the future of humanity in the cosmos.

Why you should listen

Lord Martin Rees has issued a clarion call for humanity. His 2004 book, ominously titled Our Final Hour, catalogues the threats facing the human race in a 21st century dominated by unprecedented and accelerating scientific change. He calls on scientists and nonscientists alike to take steps that will ensure our survival as a species.

One of the world's leading astronomers, Rees is an emeritus professor of cosmology and astrophysics at Cambridge, and UK Astronomer Royal. Author of more than 500 research papers on cosmological topics ranging from black holes to quantum physics to the Big Bang, Rees has received countless awards for his scientific contributions. But equally significant has been his devotion to explaining the complexities of science for a general audience, in books like Before the Beginning and Our Cosmic Habitat.
