TED Dialogues

Yuval Noah Harari: Nationalism vs. globalism: the new political divide

February 15, 2017

How do we make sense of today's political divisions? In a wide-ranging conversation full of insight, historian Yuval Harari places our current turmoil in a broader context, against the ongoing disruption of our technology, climate, media -- even our notion of what humanity is for. This is the first of a series of TED Dialogues, seeking a thoughtful response to escalating political divisiveness. Make time (just over an hour) for this fascinating discussion between Harari and TED curator Chris Anderson.

Yuval Noah Harari - Historian
In his book "Sapiens: A Brief History of Humankind," Yuval Noah Harari asks: "What made Homo sapiens the most successful species on the planet?"

Chris Anderson: Hello.
Welcome to this TED Dialogues.
00:12
It's the first of a series
that's going to be done
00:16
in response to the current
political upheaval.
00:20
I don't know about you;
00:24
I've become quite concerned about
the growing divisiveness in this country
00:25
and in the world.
00:28
No one's listening to each other. Right?
00:30
They aren't.
00:33
I mean, it feels like we need
a different kind of conversation,
00:34
one that's based on -- I don't know,
on reason, listening, on understanding,
00:38
on a broader context.
00:44
That's at least what we're going to try
in these TED Dialogues,
00:46
starting today.
00:49
And we couldn't have anyone with us
00:50
whom I'd be more excited to have kick this off.
00:53
This is a mind right here that thinks
pretty much like no one else
00:56
on the planet, I would hasten to say.
01:00
I'm serious.
01:02
(Yuval Noah Harari laughs)
01:03
I'm serious.
01:04
He synthesizes history
with underlying ideas
01:05
in a way that kind of takes
your breath away.
01:10
So, some of you will know
this book, "Sapiens."
01:12
Has anyone here read "Sapiens"?
01:16
(Applause)
01:18
I mean, I could not put it down.
01:19
The way that he tells the story of mankind
01:22
through big ideas that really make you
think differently --
01:26
it's kind of amazing.
01:30
And here's the follow-up,
01:31
which I think is being published
in the US next week.
01:33
YNH: Yeah, next week.
01:36
CA: "Homo Deus."
01:37
Now, this is the history
of the next hundred years.
01:38
I've had a chance to read it.
01:42
It's extremely dramatic,
01:44
and I daresay, for some people,
quite alarming.
01:46
It's a must-read.
01:51
And honestly, we couldn't have
someone better to help
01:52
make sense of what on Earth
is happening in the world right now.
01:58
So a warm welcome, please,
to Yuval Noah Harari.
02:02
(Applause)
02:06
It's great to be joined by our friends
on Facebook and around the Web.
02:14
Hello, Facebook.
02:18
And all of you, as I start
asking questions of Yuval,
02:20
come up with your own questions,
02:24
and not necessarily about
the political scandal du jour,
02:25
but about the broader understanding
of: Where are we heading?
02:28
You ready? OK, we're going to go.
02:34
So here we are, Yuval:
02:36
New York City, 2017,
there's a new president in power,
02:37
and shock waves rippling around the world.
02:41
What on Earth is happening?
02:44
YNH: I think the basic thing that happened
02:46
is that we have lost our story.
02:49
Humans think in stories,
02:51
and we try to make sense of the world
by telling stories.
02:54
And for the last few decades,
02:58
we had a very simple
and very attractive story
02:59
about what's happening in the world.
03:02
And the story said that,
oh, what's happening is
03:04
that the economy is being globalized,
03:07
politics is being liberalized,
03:10
and the combination of the two
will create paradise on Earth,
03:12
and we just need to keep on
globalizing the economy
03:16
and liberalizing the political system,
03:19
and everything will be wonderful.
03:21
And 2016 is the moment
03:23
when a very large segment,
even of the Western world,
03:25
stopped believing in this story.
03:29
For good or bad reasons --
it doesn't matter.
03:32
People stopped believing in the story,
03:34
and when you don't have a story,
you don't understand what's happening.
03:36
CA: Part of you believes that that story
was actually a very effective story.
03:41
It worked.
03:45
YNH: To some extent, yes.
03:46
According to some measurements,
03:47
we are now in the best time ever
03:49
for humankind.
03:52
Today, for the first time in history,
03:53
more people die from eating too much
than from eating too little,
03:56
which is an amazing achievement.
04:00
(Laughter)
04:02
Also for the first time in history,
04:05
more people die from old age
than from infectious diseases,
04:07
and violence is also down.
04:11
For the first time in history,
04:13
more people commit suicide
than are killed by crime and terrorism
04:15
and war put together.
04:20
Statistically, you are
your own worst enemy.
04:22
At least, of all the people in the world,
04:26
you are most likely
to be killed by yourself --
04:28
(Laughter)
04:32
which is, again,
very good news, compared --
04:33
(Laughter)
04:36
compared to the level of violence
that we saw in previous eras.
04:38
CA: But this process
of connecting the world
04:42
ended up with a large group of people
kind of feeling left out,
04:44
and they've reacted.
04:48
And so we have this bombshell
04:50
that's sort of ripping
through the whole system.
04:52
I mean, what do you make
of what's happened?
04:54
It feels like the old way
that people thought of politics,
04:57
the left-right divide,
has been blown up and replaced.
05:01
How should we think of this?
05:04
YNH: Yeah, the old 20th-century
political model of left versus right
05:05
is now largely irrelevant,
05:10
and the real divide today
is between global and national,
05:11
global or local.
05:16
And you see it again all over the world
05:18
that this is now the main struggle.
05:21
We probably need completely
new political models
05:23
and completely new ways
of thinking about politics.
05:26
In essence, what you can say
is that we now have global ecology,
05:32
we have a global economy
but we have national politics,
05:37
and this doesn't work together.
05:41
This makes the political
system ineffective,
05:43
because it has no control
over the forces that shape our life.
05:45
And you have basically two solutions
to this imbalance:
05:49
either de-globalize the economy
and turn it back into a national economy,
05:52
or globalize the political system.
05:57
CA: So some, I guess
many liberals out there
06:00
view Trump and his government
as kind of irredeemably bad,
06:05
just awful in every way.
06:11
Do you see any underlying narrative
or political philosophy in there
06:14
that is at least worth understanding?
06:21
How would you articulate that philosophy?
06:22
Is it just the philosophy of nationalism?
06:24
YNH: I think the underlying
feeling or idea
06:28
is that the political system --
something is broken there.
06:33
It doesn't empower
the ordinary person anymore.
06:38
It doesn't care so much
about the ordinary person anymore,
06:41
and I think this diagnosis
of the political disease is correct.
06:45
With regard to the answers,
I am far less certain.
06:50
I think what we are seeing
is the immediate human reaction:
06:53
if something doesn't work, let's go back.
06:57
And you see it all over the world,
06:59
that almost nobody
in the political system today
07:01
has any future-oriented vision
of where humankind is going.
07:05
Almost everywhere,
you see retrograde vision:
07:09
"Let's make America great again,"
07:13
like it was great -- I don't know --
in the '50s, in the '80s, sometime,
07:15
let's go back there.
07:18
And you go to Russia
a hundred years after Lenin,
07:19
Putin's vision for the future
07:24
is basically, ah, let's go back
to the Tsarist empire.
07:26
And in Israel, where I come from,
07:29
the hottest political vision
of the present is:
07:32
"Let's build the temple again."
07:35
So let's go 2,000 years back.
07:37
So people are thinking that
sometime in the past we lost our way,
07:40
and it's like
you've lost your way in a city,
07:45
and you say OK, let's go back
to the point where I felt secure
07:48
and start again.
07:51
I don't think this can work,
07:53
but for a lot of people,
this is their gut instinct.
07:54
CA: But why couldn't it work?
07:57
"America First" is a very
appealing slogan in many ways.
07:59
Patriotism is, in many ways,
a very noble thing.
08:03
It's played a role
in promoting cooperation
08:07
among large numbers of people.
08:09
Why couldn't you have a world
organized in countries,
08:11
all of which put themselves first?
08:15
YNH: For many centuries,
even thousands of years,
08:19
patriotism worked quite well.
08:22
Of course, it led to wars and so forth,
08:25
but we shouldn't focus
too much on the bad.
08:27
There are also many,
many positive things about patriotism,
08:30
and the ability to have
a large number of people
08:33
care about each other,
08:37
sympathize with one another,
08:39
and come together for collective action.
08:40
If you go back to the first nations,
08:43
so, thousands of years ago,
08:46
the people who lived along
the Yellow River in China --
08:48
it was many, many different tribes
08:51
and they all depended on the river
for survival and for prosperity,
08:54
but all of them also suffered
from periodical floods
08:58
and periodical droughts.
09:03
And no tribe could really do
anything about it,
09:04
because each of them controlled
just a tiny section of the river.
09:07
And then in a long
and complicated process,
09:11
the tribes coalesced together
to form the Chinese nation,
09:14
which controlled the entire Yellow River
09:18
and had the ability to bring
hundreds of thousands of people together
09:21
to build dams and canals
and regulate the river
09:26
and prevent the worst floods and droughts
09:31
and raise the level
of prosperity for everybody.
09:34
And this worked in many places
around the world.
09:37
But in the 21st century,
09:40
technology is changing all that
in a fundamental way.
09:43
We are now -- all the people
in the world --
09:46
living alongside the same cyber river,
09:49
and no single nation can regulate
this river by itself.
09:53
We are all living together
on a single planet,
09:58
which is threatened by our own actions.
10:02
And if you don't have some kind
of global cooperation,
10:05
nationalism is just not on the right level
to tackle the problems,
10:09
whether it's climate change
or whether it's technological disruption.
10:14
CA: So it was a beautiful idea
10:19
in a world where most of the action,
most of the issues,
10:21
took place on a national scale,
10:25
but your argument is that the issues
that matter most today
10:28
no longer take place on a national scale
but on a global scale.
10:30
YNH: Exactly. All the major problems
of the world today
10:34
are global in essence,
10:37
and they cannot be solved
10:40
except through some kind
of global cooperation.
10:41
It's not just climate change,
10:45
which is, like, the most obvious
example people give.
10:47
I think more in terms
of technological disruption.
10:50
If you think about, for example,
artificial intelligence,
10:54
over the next 20, 30 years
10:57
pushing hundreds of millions of people
out of the job market --
11:00
this is a problem on a global level.
11:04
It will disrupt the economy
of all the countries.
11:06
And similarly, if you think
about, say, bioengineering
11:09
and people being afraid of conducting,
11:13
I don't know, genetic engineering
research in humans,
11:16
it won't help if just
a single country, let's say the US,
11:19
outlaws all genetic experiments in humans,
11:24
but China or North Korea
continues to do it.
11:27
So the US cannot solve it by itself,
11:31
and very quickly, the pressure on the US
to do the same will be immense
11:34
because we are talking about
high-risk, high-gain technologies.
11:39
If somebody else is doing it,
I can't allow myself to remain behind.
11:44
The only way to have regulations,
effective regulations,
11:48
on things like genetic engineering,
11:54
is to have global regulations.
11:56
If you just have national regulations,
nobody wants to stay behind.
11:58
CA: So this is really interesting.
12:03
It seems to me that this may be one key
12:05
to provoking at least
a constructive conversation
12:07
between the different sides here,
12:11
because I think everyone can agree
that the start point
12:12
of a lot of the anger
that's propelled us to where we are
12:16
is legitimate
concern about job loss.
12:18
Work is gone, a traditional
way of life has gone,
12:21
and it's no wonder
that people are furious about that.
12:25
And in general, they have blamed
globalism, global elites,
12:28
for doing this to them
without asking their permission,
12:33
and that seems like
a legitimate complaint.
12:35
But what I hear you saying
is that -- so a key question is:
12:38
What is the real cause of job loss,
both now and going forward?
12:41
To the extent that it's about globalism,
12:46
then the right response,
yes, is to shut down borders
12:49
and keep people out
and change trade agreements and so forth.
12:53
But you're saying, I think,
12:57
that actually the bigger cause of job loss
is not going to be that at all.
12:59
It's going to originate
in technological questions,
13:04
and we have no chance of solving that
13:07
unless we operate as a connected world.
13:09
YNH: Yeah, I think that,
13:11
I don't know about the present,
but looking to the future,
13:13
it's not the Mexicans or Chinese
who will take the jobs
13:16
from the people in Pennsylvania,
13:19
it's the robots and algorithms.
13:21
So unless you plan to build a big wall
on the border of California --
13:23
(Laughter)
13:27
the wall on the border with Mexico
is going to be very ineffective.
13:28
And I was struck, when I watched
the debates before the election,
13:32
that Trump
did not even attempt to frighten people
13:38
by saying the robots will take your jobs.
13:44
Now even if it's not true,
it doesn't matter.
13:47
It could have been an extremely
effective way of frightening people --
13:49
(Laughter)
13:52
and galvanizing people:
13:53
"The robots will take your jobs!"
13:55
And nobody used that line.
13:56
And it made me afraid,
13:58
because it meant
that no matter what happens
14:00
in universities and laboratories --
14:04
and there, there is already
an intense debate about it --
14:07
in the mainstream political system
and among the general public,
14:09
people are just unaware
14:13
that there could be an immense
technological disruption --
14:16
not in 200 years,
but in 10, 20, 30 years --
14:20
and we have to do something about it now,
14:24
partly because most of what we teach
children today in school or in college
14:27
is going to be completely irrelevant
to the job market of 2040, 2050.
14:33
So it's not something we'll need
to think about in 2040.
14:39
We need to think today
what to teach the young people.
14:43
CA: Yeah, no, absolutely.
14:46
You've often written about
moments in history
14:50
where humankind has ...
entered a new era, unintentionally.
14:54
Decisions have been made,
technologies have been developed,
15:01
and suddenly the world has changed,
15:04
possibly in a way
that's worse for everyone.
15:06
So one of the examples
you give in "Sapiens"
15:09
is just the whole agricultural revolution,
15:11
which meant, for an actual person
tilling the fields,
15:13
picking up a 12-hour
backbreaking workday
15:17
instead of six hours in the jungle
and a much more interesting lifestyle.
15:20
(Laughter)
15:26
So are we at another possible
phase change here,
15:27
where we kind of sleepwalk into a future
that none of us actually wants?
15:30
YNH: Yes, very much so.
15:35
During the agricultural revolution,
15:38
what happened is that immense
technological and economic revolution
15:40
empowered the human collective,
15:44
but when you look at actual
individual lives,
15:47
the life of a tiny elite
became much better,
15:50
and the lives of the majority of people
became considerably worse.
15:54
And this can happen again
in the 21st century.
15:58
No doubt the new technologies
will empower the human collective.
16:01
But we may end up again
16:06
with a tiny elite reaping
all the benefits, taking all the fruits,
16:08
and the masses of the population
finding themselves worse off
16:13
than they were before,
16:17
certainly much worse than this tiny elite.
16:18
CA: And those elites
might not even be human elites.
16:22
They might be cyborgs or --
16:25
YNH: Yeah, they could be
enhanced super humans.
16:26
They could be cyborgs.
16:29
They could be completely
nonorganic elites.
16:30
They could even be
non-conscious algorithms.
16:32
What we see now in the world
is authority shifting away
16:35
from humans to algorithms.
16:40
More and more decisions --
about personal lives,
16:42
about economic matters,
about political matters --
16:46
are actually being taken by algorithms.
16:48
If you ask the bank for a loan,
16:51
chances are your fate is decided
by an algorithm, not by a human being.
16:54
And the general impression
is that maybe Homo sapiens just lost it.
16:58
The world is so complicated,
there is so much data,
17:04
things are changing so fast,
17:09
that this thing that evolved
on the African savanna
17:12
tens of thousands of years ago --
17:15
to cope with a particular environment,
17:17
a particular volume
of information and data --
17:20
it just can't handle the realities
of the 21st century,
17:24
and the only thing
that may be able to handle it
17:28
is big-data algorithms.
17:31
So no wonder more and more authority
is shifting from us to the algorithms.
17:33
CA: So we're in New York City
for the first of a series of TED Dialogues
17:40
with Yuval Harari,
17:44
and there's a Facebook Live
audience out there.
17:46
We're excited to have you with us.
17:50
We'll start coming
to some of your questions
17:52
and questions of people in the room
17:54
in just a few minutes,
17:56
so have those coming.
17:57
Yuval, if you're going
to make the argument
17:59
that we need to get past nationalism
because of the coming technological ...
18:03
danger, in a way,
18:11
presented by so much of what's happening,
18:12
then we've got to have
a global conversation about this.
18:14
Trouble is, it's hard to get people
really believing that, I don't know,
18:17
AI really is an imminent
threat, and so forth.
18:20
The things that people,
some people at least,
18:22
care about much more immediately, perhaps,
18:25
is climate change,
18:27
perhaps other issues like refugees,
nuclear weapons, and so forth.
18:29
Would you argue that, where
we are right now,
18:34
somehow those issues
need to be dialed up?
18:39
You've talked about climate change,
18:42
but Trump has said
he doesn't believe in that.
18:45
So in a way, your most powerful argument,
18:48
you can't actually use to make this case.
18:51
YNH: Yeah, I think with climate change,
18:54
at first sight, it's quite surprising
18:56
that there is a very close correlation
18:59
between nationalism and climate change.
19:02
I mean, almost always, the people
who deny climate change are nationalists.
19:05
And at first sight, you think: Why?
19:10
What's the connection?
19:12
Why don't you have socialists
denying climate change?
19:13
But then, when you think
about it, it's obvious --
19:16
because nationalism has no solution
to climate change.
19:18
If you want to be a nationalist
in the 21st century,
19:22
you have to deny the problem.
19:25
If you accept the reality of the problem,
then you must accept that, yes,
19:27
there is still room in the world
for patriotism,
19:32
there is still room in the world
for having special loyalties
19:35
and obligations towards your own people,
towards your own country.
19:39
I don't think anybody is really
thinking of abolishing that.
19:43
But in order to confront climate change,
19:47
we need additional loyalties
and commitments
19:50
to a level beyond the nation.
19:55
And that should not be impossible,
19:57
because people can have
several layers of loyalty.
19:59
You can be loyal to your family
20:03
and to your community
20:05
and to your nation,
20:07
so why can't you also be loyal
to humankind as a whole?
20:08
Of course, there are occasions
when it becomes difficult,
20:12
what to put first,
20:15
but, you know, life is difficult.
20:17
Handle it.
20:19
(Laughter)
20:20
CA: OK, so I would love to get
some questions from the audience here.
20:23
We've got a microphone here.
20:27
Speak into it, and Facebook,
get them coming, too.
20:29
Howard Morgan: One of the things that has
clearly made a huge difference
20:32
in this country and other countries
20:36
is the income distribution inequality,
20:38
the dramatic change
in income distribution in the US
20:40
from what it was 50 years ago,
20:44
and around the world.
20:46
Is there anything we can do
to affect that?
20:47
Because that gets at a lot
of the underlying causes.
20:50
YNH: So far I haven't heard a very
good idea about what to do about it,
20:56
again, partly because most ideas
remain on the national level,
21:01
and the problem is global.
21:05
I mean, one idea that we hear
quite a lot about now
21:06
is universal basic income.
21:09
But this is a problem.
21:11
I mean, I think it's a good start,
21:13
but it's a problematic idea because
it's not clear what "universal" is
21:14
and it's not clear what "basic" is.
21:18
Most people when they speak
about universal basic income,
21:20
they actually mean national basic income.
21:23
But the problem is global.
21:26
Let's say that you have AI and 3D printers
taking away millions of jobs
21:28
in Bangladesh,
21:33
from all the people who make
my shirts and my shoes.
21:35
So what's going to happen?
21:38
The US government will levy taxes
on Google and Apple in California,
21:39
and use that to pay basic income
to unemployed Bangladeshis?
21:46
If you believe that,
you can just as well believe
21:50
that Santa Claus will come
and solve the problem.
21:53
So unless we have really universal
and not national basic income,
21:57
the deep problems
are not going to go away.
22:02
And also it's not clear what "basic" is,
22:05
because what are basic human needs?
22:08
A thousand years ago,
just food and shelter was enough.
22:10
But today, people will say
education is a basic human need,
22:13
it should be part of the package.
22:17
But how much? Six years?
Twelve years? PhD?
22:19
Similarly, with health care,
22:22
let's say that in 20, 30, 40 years,
22:24
you'll have expensive treatments
that can extend human life
22:27
to 120, I don't know.
22:31
Will this be part of the basket
of basic income or not?
22:33
It's a very difficult problem,
22:38
because in a world where people
lose their ability to be employed,
22:39
the only thing they are going to get
is this basic income.
22:46
So deciding what's part of it is a very,
very difficult ethical question.
22:49
CA: There's a bunch of questions
on how the world affords it as well,
22:54
who pays.
22:58
There's a question here
from Facebook from Lisa Larson:
22:59
"How does nationalism in the US now
23:02
compare to that between
World War I and World War II
23:04
in the last century?"
23:08
YNH: Well, the good news is that, with regard
to the dangers of nationalism,
23:09
we are in a much better position
than a century ago.
23:14
A century ago, 1917,
23:18
Europeans were killing
each other by the millions.
23:20
In 2016, with Brexit,
as far as I remember,
23:23
a single person lost their life,
an MP who was murdered by some extremist.
23:28
Just a single person.
23:33
I mean, if Brexit was about
British independence,
23:35
this is the most peaceful
war of independence in human history.
23:37
And let's say that Scotland
will now choose to leave the UK
23:42
after Brexit.
23:48
So in the 18th century,
23:50
if Scotland wanted -- and the Scots
wanted several times --
23:52
to break out of the control of London,
23:55
the reaction of the government
in London was to send an army up north
23:59
to burn down Edinburgh
and massacre the highland tribes.
24:03
My guess is that if, in 2018,
the Scots vote for independence,
24:07
the London government
will not send an army up north
24:12
to burn down Edinburgh.
24:16
Very few people are now willing
to kill or be killed
24:17
for Scottish or for British independence.
24:22
So for all the talk
of the rise of nationalism
24:24
and going back to the 1930s,
24:30
to the 19th century, in the West at least,
24:32
the power of national sentiments
today is far, far smaller
24:36
than it was a century ago.
24:42
CA: Although some people now,
you hear publicly worrying
24:44
about whether that might be shifting,
24:48
that there could actually be
outbreaks of violence in the US
24:50
depending on how things turn out.
24:54
Should we be worried about that,
24:56
or do you really think
things have shifted?
24:58
YNH: No, we should be worried.
25:00
We should be aware of two things.
25:01
First of all, don't be hysterical.
25:03
We are not back
in the First World War yet.
25:05
But on the other hand,
don't be complacent.
25:08
We got from 1917 to 2017,
25:11
not by some divine miracle,
25:16
but simply by human decisions,
25:19
and if we now start making
the wrong decisions,
25:21
we could be back
in an analogous situation to 1917
25:23
in a few years.
25:28
One of the things I know as a historian
25:29
is that you should never
underestimate human stupidity.
25:32
(Laughter)
25:36
It's one of the most powerful
forces in history,
25:38
human stupidity and human violence.
25:42
Humans do such crazy things
for no obvious reason,
25:44
but again, at the same time,
25:48
another very powerful force
in human history is human wisdom.
25:50
We have both.
25:53
CA: We have with us here
moral psychologist Jonathan Haidt,
25:55
who I think has a question.
25:57
Jonathan Haidt: Thanks, Yuval.
26:00
So you seem to be a fan
of global governance,
26:02
but when you look at the map of the world
from Transparency International,
26:04
which rates the level of corruption
of political institutions,
26:08
it's a vast sea of red with little bits
of yellow here and there
26:11
for those with good institutions.
26:14
So if we were to have
some kind of global governance,
26:16
what makes you think it would end up
being more like Denmark
26:18
rather than more like Russia or Honduras,
26:21
and aren't there alternatives,
26:23
such as we did with CFCs?
26:25
There are ways to solve global problems
with national governments.
26:27
What would world government
actually look like,
26:30
and why do you think it would work?
26:32
YNH: Well, I don't know
what it would look like.
26:34
Nobody has a model for that yet.
26:38
The main reason we need it
26:41
is because many of these issues
are lose-lose situations.
26:44
When you have
a win-win situation like trade,
26:48
both sides can benefit
from a trade agreement,
26:51
then this is something you can work out.
26:54
Even without some kind of global government,
26:56
national governments each
have an interest in doing it.
26:58
But when you have a lose-lose situation
like with climate change,
27:01
it's much more difficult
27:05
without some overarching
authority, real authority.
27:07
Now, how to get there
and what it would look like,
27:12
I don't know.
27:15
And certainly there is no obvious reason
27:16
to think that it would look like Denmark,
27:20
or that it would be a democracy.
27:22
Most likely it wouldn't.
27:24
We don't have workable democratic models
27:26
for a global government.
27:32
So maybe it would look more
like ancient China
27:34
than like modern Denmark.
27:38
But still, given the dangers
that we are facing,
27:39
I think the imperative of having
some kind of real ability
27:45
to force through difficult decisions
on the global level
27:50
is more important
than almost anything else.
27:54
CA: There's a question from Facebook here,
27:59
and then we'll get the mic to Andrew.
28:01
So, Kat Hebron on Facebook,
28:03
calling in from Vail:
28:05
"How would developed nations manage
the millions of climate migrants?"
28:07
YNH: I don't know.
28:12
CA: That's your answer, Kat. (Laughter)
28:14
YNH: And I don't think
that they know either.
28:16
They'll just deny the problem, maybe.
28:18
CA: But immigration, generally,
is another example of a problem
28:20
that's very hard to solve
on a nation-by-nation basis.
28:23
One nation can shut its doors,
28:26
but maybe that stores up
problems for the future.
28:27
YNH: Yes, I mean --
it's another very good case,
28:30
especially because it's so much easier
28:34
to migrate today
28:36
than it was in the Middle Ages
or in ancient times.
28:38
CA: Yuval, there's a belief
among many technologists, certainly,
28:42
that political concerns
are kind of overblown,
28:46
that actually, political leaders
don't have that much influence
28:48
in the world,
28:52
that the real determination of humanity
at this point is by science,
28:53
by invention, by companies,
28:57
by many things
other than political leaders,
28:59
and it's actually very hard
for leaders to do much,
29:03
so we're actually worrying
about nothing here.
29:06
YNH: Well, first, it should be emphasized
29:09
that it's true that political leaders'
ability to do good is very limited,
29:12
but their ability to do harm is unlimited.
29:17
There is a basic imbalance here.
29:20
You can still press the button
and blow everybody up.
29:22
You have that kind of ability.
29:26
But if you want, for example,
to reduce inequality,
29:27
that's very, very difficult.
29:31
But to start a war,
29:33
you can still do so very easily.
29:34
So there is a built-in imbalance
in the political system today
29:36
which is very frustrating,
29:40
where you cannot do a lot of good
but you can still do a lot of harm.
29:42
And this makes the political system
still a very big concern.
29:46
CA: So as you look at
what's happening today,
29:51
and putting your historian's hat on,
29:53
do you look back in history at moments
when things were going just fine
29:55
and an individual leader really took
the world or their country backwards?
29:59
YNH: There are quite a few examples,
30:05
but I should emphasize,
it's never an individual leader.
30:07
I mean, somebody put him there,
30:10
and somebody allowed him
to continue to be there.
30:12
So it's never really just the fault
of a single individual.
30:15
There are a lot of people
behind every such individual.
30:19
CA: Can we have the microphone
here, please, to Andrew?
30:24
Andrew Solomon: You've talked a lot
about the global versus the national,
30:30
but increasingly, it seems to me,
30:34
the world situation
is in the hands of identity groups.
30:36
We look at people within the United States
30:38
who have been recruited by ISIS.
30:41
We look at these other groups
which have formed
30:42
which go outside of national bounds
30:45
but still represent
significant authorities.
30:47
How are they to be integrated
into the system,
30:49
and how is a diverse set of identities
to be made coherent
30:51
under either national
or global leadership?
30:55
YNH: Well, the problem
of such diverse identities
30:59
is a problem for nationalism as well.
31:02
Nationalism believes
in a single, monolithic identity,
31:05
and exclusive or at least
more extreme versions of nationalism
31:09
believe in an exclusive loyalty
to a single identity.
31:13
And therefore, nationalism has had
a lot of problems
31:17
with people wanting to divide
their identities
31:20
between various groups.
31:23
So it's not just a problem, say,
for a global vision.
31:25
And I think, again, history shows
31:30
that you shouldn't necessarily
think in such exclusive terms.
31:34
If you think that there is just
a single identity for a person,
31:40
"I am just X, that's it, I can't be
several things, I can be just that,"
31:43
that's the start of the problem.
31:48
You have religions, you have nations
31:51
that sometimes demand exclusive loyalty,
31:53
but it's not the only option.
31:57
There are many religions and many nations
31:58
that enable you to have
diverse identities at the same time.
32:01
CA: But is one explanation
of what's happened in the last year
32:05
that a group of people have got
fed up with, if you like,
32:09
the liberal elites,
for want of a better term,
32:14
obsessing over many, many different
identities and them feeling,
32:17
"But what about my identity?
I am being completely ignored here.
32:22
And by the way, I thought
I was the majority"?
32:26
And that that's actually
sparked a lot of the anger.
32:29
YNH: Yeah. Identity is always problematic,
32:32
because identity is always based
on fictional stories
32:35
that sooner or later collide with reality.
32:40
Almost all identities,
32:43
I mean, beyond the level
of the basic community
32:45
of a few dozen people,
32:48
are based on a fictional story.
32:50
They are not the truth.
32:52
They are not the reality.
32:53
It's just a story that people invent
and tell one another
32:55
and start believing.
32:58
And therefore all identities
are extremely unstable.
32:59
They are not a biological reality.
33:05
Sometimes nationalists, for example,
33:07
think that the nation
is a biological entity.
33:09
It's the combination
of soil and blood
33:12
that creates the nation.
33:16
But this is just a fictional story.
33:18
CA: Soil and blood
kind of makes a gooey mess.
33:21
(Laughter)
33:23
YNH: It does, and also
it messes with your mind
33:25
when you think too much
that you are a combination of soil and blood.
33:28
If you look from a biological perspective,
33:33
obviously none of the nations
that exist today
33:36
existed 5,000 years ago.
33:39
Homo sapiens is a social animal,
that's for sure.
33:42
But for millions of years,
33:45
Homo sapiens and our hominid ancestors
lived in small communities
33:48
of a few dozen individuals.
33:53
Everybody knew everybody else.
33:55
Whereas modern nations
are imagined communities,
33:57
in the sense that I don't even know
all these people.
34:01
I come from a relatively
small nation, Israel,
34:04
and of eight million Israelis,
34:07
I have never met most of them.
34:09
I will never meet most of them.
34:11
They basically exist here.
34:13
CA: But in terms of this identity,
34:16
this group who feel left out
and perhaps have work taken away,
34:18
I mean, in "Homo Deus,"
34:24
you actually speak of this group
in one sense expanding,
34:26
that so many people
may have their jobs taken away
34:29
by technology in some way
that we could end up with
34:33
a really large -- I think you call it
a "useless class" --
34:37
a class where traditionally,
34:41
as viewed by the economy,
these people have no use.
34:43
YNH: Yes.
34:45
CA: How likely a possibility is that?
34:47
Is that something
we should be terrified about?
34:50
And can we address it in any way?
34:52
YNH: We should think about it
very carefully.
34:55
I mean, nobody really knows
what the job market will look like
34:57
in 2040, 2050.
35:00
There is a chance
many new jobs will appear,
35:02
but it's not certain.
35:05
And even if new jobs do appear,
35:07
it won't necessarily be easy
35:09
for a 50-year-old truck driver
35:11
made unemployed by self-driving vehicles
35:14
to reinvent himself or herself
35:17
as a designer of virtual worlds.
35:21
Previously, if you look at the trajectory
of the industrial revolution,
35:25
when machines replaced humans
in one type of work,
35:30
the solution usually came
from low-skill work
35:34
in new lines of business.
35:38
So you didn't need any more
agricultural workers,
35:41
so people moved to working
in low-skill industrial jobs,
35:44
and when this was taken away
by more and more machines,
35:50
people moved to low-skill service jobs.
35:53
Now, when people say there will
be new jobs in the future,
35:56
that humans can do better than AI,
35:59
that humans can do better than robots,
36:02
they usually think about high-skill jobs,
36:04
like software engineers
designing virtual worlds.
36:06
Now, I don't see how
an unemployed cashier from Wal-Mart
36:10
reinvents herself or himself at 50
as a designer of virtual worlds,
36:16
and certainly I don't see
36:20
how the millions of unemployed
Bangladeshi textile workers
36:22
will be able to do that.
36:25
I mean, if they are going to do it,
36:27
we need to start teaching
the Bangladeshis today
36:29
how to be software designers,
36:32
and we are not doing it.
36:34
So what will they do in 20 years?
36:35
CA: So it feels like you're really
highlighting a question
36:38
that's really been bugging me
the last few months more and more.
36:42
It's almost a hard question
to ask in public,
36:46
but if any mind has some wisdom
to offer in it, maybe it's yours,
36:49
so I'm going to ask you:
36:52
What are humans for?
36:54
YNH: As far as we know, for nothing.
36:57
(Laughter)
36:59
I mean, there is no great cosmic drama,
some great cosmic plan,
37:00
that we have a role to play in,
37:06
where we just need to discover
what our role is
37:09
and then play it to the best
of our ability.
37:12
This has been the story of all religions
and ideologies and so forth,
37:15
but as a scientist, the best I can say
is this is not true.
37:20
There is no universal drama
with a role in it for Homo sapiens.
37:23
So --
37:29
CA: I'm going to push back on you
just for a minute,
37:30
just from your own book,
37:33
because in "Homo Deus,"
37:34
you give really one of the most coherent
and understandable accounts
37:35
about sentience, about consciousness,
37:40
and that unique sort of human skill.
37:43
You point out that it's different
from intelligence,
37:46
the intelligence
that we're building in machines,
37:48
and that there's actually a lot
of mystery around it.
37:51
How can you be sure there's no purpose
37:54
when we don't even understand
what this sentience thing is?
37:58
I mean, in your own thinking,
isn't there a chance
38:02
that what humans are for
is to be the universe's sentient things,
38:04
to be the centers of joy and love
and happiness and hope?
38:09
And maybe we can build machines
that actually help amplify that,
38:12
even if they're not going to become
sentient themselves?
38:15
Is that crazy?
38:18
I kind of found myself hoping that,
reading your book.
38:19
YNH: Well, I certainly think that the most
interesting question today in science
38:23
is the question
of consciousness and the mind.
38:26
We are getting better and better
in understanding the brain
38:29
and intelligence,
38:32
but we are not getting much better
38:34
in understanding the mind
and consciousness.
38:36
People often confuse intelligence
and consciousness,
38:39
especially in places like Silicon Valley,
38:42
which is understandable,
because in humans, they go together.
38:44
I mean, intelligence basically
is the ability to solve problems.
38:48
Consciousness is the ability
to feel things,
38:52
to feel joy and sadness
and boredom and pain and so forth.
38:54
In Homo sapiens and all other mammals
as well -- it's not unique to humans --
39:00
in all mammals and birds
and some other animals,
39:04
intelligence and consciousness
go together.
39:06
We often solve problems by feeling things.
39:09
So we tend to confuse them.
39:13
But they are different things.
39:14
What's happening today
in places like Silicon Valley
39:16
is that we are creating
artificial intelligence
39:19
but not artificial consciousness.
39:22
There has been an amazing development
in computer intelligence
39:24
over the last 50 years,
39:28
and exactly zero development
in computer consciousness,
39:29
and there is no indication that computers
are going to become conscious
39:33
anytime soon.
39:37
So first of all, if there is
some cosmic role for consciousness,
39:40
it's not unique to Homo sapiens.
39:45
Cows are conscious, pigs are conscious,
39:47
chimpanzees are conscious,
chickens are conscious,
39:50
so if we go that way, first of all,
we need to broaden our horizons
39:53
and remember very clearly we are not
the only sentient beings on Earth,
39:57
and when it comes to sentience --
40:01
when it comes to intelligence,
there is good reason to think
40:03
we are the most intelligent
of the whole bunch.
40:06
But when it comes to sentience,
40:10
to say that humans are more
sentient than whales,
40:12
or more sentient than baboons
or more sentient than cats,
40:16
I see no evidence for that.
40:20
So the first step is, you go
in that direction: expand.
40:22
And then the second question
of what is it for,
40:26
I would reverse it
40:30
and I would say that I don't think
sentience is for anything.
40:31
I think we don't need
to find our role in the universe.
40:36
The really important thing
is to liberate ourselves from suffering.
40:40
What characterizes sentient beings
40:46
in contrast to robots, to stones,
40:49
to whatever,
40:52
is that sentient beings
suffer, can suffer,
40:53
and what they should focus on
40:57
is not finding their place
in some mysterious cosmic drama.
40:59
They should focus on understanding
what suffering is,
41:03
what causes it and how
to be liberated from it.
41:07
CA: I know this is a big issue for you,
and that was very eloquent.
41:11
We're going to have a blizzard
of questions from the audience here,
41:14
and maybe from Facebook as well,
41:18
and maybe some comments as well.
41:20
So let's go quick.
41:21
There's one right here.
41:23
Keep your hands held up
at the back if you want the mic,
41:26
and we'll get it back to you.
41:29
Question: In your work, you talk a lot
about the fictional stories
41:31
that we accept as truth,
41:34
and we live our lives by them.
41:35
As an individual, knowing that,
41:37
how does it impact the stories
that you choose to live your life,
41:39
and do you confuse them
with the truth, like all of us?
41:43
YNH: I try not to.
41:48
I mean, for me, maybe the most
important question,
41:49
both as a scientist and as a person,
41:52
is how to tell the difference
between fiction and reality,
41:54
because reality is there.
41:58
I'm not saying that everything is fiction.
42:01
It's just very difficult for human beings
to tell the difference
42:03
between fiction and reality,
42:06
and it has become more and more difficult
as history progressed,
42:07
because the fictions
that we have created --
42:12
nations and gods and money
and corporations --
42:15
they now control the world.
42:18
So just to even think,
42:20
"Oh, this is just all fictional entities
that we've created,"
42:21
is very difficult.
42:24
But reality is there.
42:25
For me the best ...
42:28
There are several tests
42:30
to tell the difference
between fiction and reality.
42:33
The simplest one, the best one
that I can say in short,
42:35
is the test of suffering.
42:39
If it can suffer, it's real.
42:40
If it can't suffer, it's not real.
42:43
A nation cannot suffer.
42:44
That's very, very clear.
42:46
Even when a nation loses a war
42:47
and we say, "Germany suffered a defeat
in the First World War,"
42:49
it's a metaphor.
42:53
Germany cannot suffer.
Germany has no mind.
42:55
Germany has no consciousness.
42:57
Germans can suffer, yes,
but Germany cannot.
42:59
Similarly, when a bank goes bust,
43:02
the bank cannot suffer.
43:05
When the dollar loses its value,
the dollar doesn't suffer.
43:07
People can suffer. Animals can suffer.
43:11
This is real.
43:13
So I would start, if you
really want to see reality,
43:14
I would go through the door of suffering.
43:19
If you can really understand
what suffering is,
43:21
this will give you also the key
43:24
to understand what reality is.
43:26
CA: There's a Facebook question
here that connects to this,
43:28
from someone around the world
in a language that I cannot read.
43:31
YNH: Oh, it's Hebrew.
CA: Hebrew. There you go.
43:34
(Laughter)
43:36
Can you read the name?
43:37
YNH: [??]
43:38
CA: Well, thank you for writing in.
43:40
The question is: "Is the post-truth era
really a brand-new era,
43:42
or just another climax or moment
in a never-ending trend?"
43:47
YNH: Personally, I don't connect
with this idea of post-truth.
43:52
My basic reaction as a historian is:
43:55
If this is the era of post-truth,
when the hell was the era of truth?
43:58
CA: Right.
44:02
(Laughter)
44:03
YNH: Was it the 1980s, the 1950s,
the Middle Ages?
44:05
I mean, we have always lived
in an era, in a way, of post-truth.
44:09
CA: But I'd push back on that,
44:14
because I think what people
are talking about
44:17
is that there was a world
where you had fewer journalistic outlets,
44:19
where there were traditions
that things were fact-checked.
44:26
It was incorporated into the charter
of those organizations
44:30
that the truth mattered.
44:34
So if you believe in a reality,
44:36
then what you write is information.
44:38
There was a belief that that information
should connect to reality in a real way,
44:40
and if you wrote a headline,
it was a serious, earnest attempt
44:44
to reflect something
that had actually happened.
44:47
And people didn't always get it right.
44:49
But I think the concern now is you've got
44:51
a technological system
that's incredibly powerful
44:53
that, for a while at least,
massively amplified anything
44:55
with no attention paid to whether
it connected to reality,
45:00
only to whether it connected
to clicks and attention,
45:02
and that that was arguably toxic.
45:06
That's a reasonable concern, isn't it?
45:07
YNH: Yeah, it is. I mean,
the technology changes,
45:10
and it's now easier to disseminate
both truth and fiction and falsehood.
45:12
It goes both ways.
45:17
It's also much easier, though, to spread
the truth than ever before.
45:19
But I don't think there
is anything essentially new
45:24
about this disseminating
fictions and errors.
45:28
There is nothing that -- I don't know --
Joseph Goebbels didn't know
45:32
about all this idea of fake
news and post-truth.
45:36
He famously said that if you repeat
a lie often enough,
45:42
people will think it's the truth,
45:46
and the bigger the lie, the better,
45:48
because people won't even think
that something so big can be a lie.
45:50
I think that fake news
has been with us for thousands of years.
45:56
Just think of the Bible.
46:02
(Laughter)
46:04
CA: But there is a concern
46:05
that the fake news is associated
with tyrannical regimes,
46:06
and when you see a rise in fake news
46:10
that is a canary in the coal mine
that there may be dark times coming.
46:13
YNH: Yeah. I mean, the intentional use
of fake news is a disturbing sign.
46:19
But I'm not saying that it's not bad,
I'm just saying that it's not new.
46:27
CA: There's a lot of interest
on Facebook on this question
46:32
about global governance
versus nationalism.
46:35
Question here from Phil Dennis:
46:41
"How do we get people, governments,
to relinquish power?
46:42
Is that -- is that --
actually, the text is so big
46:46
I can't read the full question.
46:50
But is that a necessity?
46:51
Is it going to take war to get there?
46:53
Sorry, Phil -- I mangled your question,
but I blame the text right here.
46:55
YNH: One option
that some people talk about
46:59
is that only a catastrophe
can shake humankind
47:01
and open the path to a real system
of global governance,
47:06
and they say that we can't do it
before the catastrophe,
47:11
but we need to start
laying the foundations
47:15
so that when the disaster strikes,
47:18
we can react quickly.
47:21
But people will just not have
the motivation to do such a thing
47:23
before the disaster strikes.
47:27
Another thing that I would emphasize
47:29
is that anybody who is really
interested in global governance
47:31
should always make it very, very clear
47:36
that it doesn't replace or abolish
local identities and communities,
47:39
that it should come both as --
47:46
It should be part of a single package.
47:49
CA: I want to hear more on this,
47:52
because the very words "global governance"
47:56
are almost the epitome of evil
in the mindset of a lot of people
47:59
on the alt-right right now.
48:03
It just seems scary, remote, distant,
and it has let them down,
48:05
and so globalists,
global governance -- no, go away!
48:08
And many view the election
as the ultimate poke in the eye
48:12
to anyone who believes in that.
48:16
So how do we change the narrative
48:17
so that it doesn't seem
so scary and remote?
48:21
Build more on this idea
of it being compatible
48:24
with local identity, local communities.
48:26
YNH: Well, I think again we should start
48:29
really with the biological realities
48:32
of Homo sapiens.
48:35
And biology tells us two things
about Homo sapiens
48:37
which are very relevant to this issue:
48:41
first of all, that we are
completely dependent
48:43
on the ecological system around us,
48:46
and that today we are talking
about a global system.
48:49
You cannot escape that.
48:52
And at the same time, biology tells us
about Homo sapiens
48:54
that we are social animals,
48:57
but that we are social
on a very, very local level.
49:00
It's just a simple fact of humanity
49:04
that we cannot have intimate familiarity
49:08
with more than about 150 individuals.
49:13
The size of the natural group,
49:17
the natural community of Homo sapiens,
49:21
is not more than 150 individuals,
49:24
and everything beyond that is really
based on all kinds of imaginary stories
49:27
and large-scale institutions,
49:34
and I think that we can find a way,
49:36
again, based on a biological
understanding of our species,
49:40
to weave the two together
49:45
and to understand that today
in the 21st century,
49:47
we need both the global level
and the local community.
49:50
And I would go even further than that
49:56
and say that it starts
with the body itself.
49:58
The feelings that people today have
of alienation and loneliness
50:02
and not finding their place in the world,
50:06
I would think that the chief problem
is not global capitalism.
50:09
The chief problem is that over
the last hundred years,
50:16
people have been becoming disembodied,
50:19
have been distancing themselves
from their body.
50:22
As a hunter-gatherer or even as a peasant,
50:26
to survive, you need to be
constantly in touch
50:28
with your body and with your senses,
50:33
every moment.
50:35
If you go to the forest
to look for mushrooms
50:36
and you don't pay attention
to what you hear,
50:38
to what you smell, to what you taste,
50:41
you're dead.
50:43
So you must be very connected.
50:44
In the last hundred years,
people have been losing their ability
50:46
to be in touch with their body
and their senses,
50:51
to hear, to smell, to feel.
50:53
More and more attention goes to screens,
50:56
to what is happening elsewhere,
50:59
some other time.
51:00
This, I think, is the deep reason
51:02
for the feelings of alienation
and loneliness and so forth,
51:04
and therefore part of the solution
51:08
is not to bring back
some mass nationalism,
51:11
but also reconnect with our own bodies,
51:15
and if you are back
in touch with your body,
51:19
you will feel much more at home
in the world also.
51:22
CA: Well, depending on how things go,
we may all be back in the forest soon.
51:25
We're going to have
one more question in the room
51:29
and one more on Facebook.
51:32
Ama Adi-Dako: Hello. I'm from Ghana,
West Africa, and my question is:
51:33
I'm wondering how do you present
and justify the idea of global governance
51:36
to countries that have been
historically disenfranchised
51:41
by the effects of globalization,
51:44
and also, if we're talking about
global governance,
51:46
it sounds to me like it will definitely
come from a very Westernized idea
51:49
of what the "global"
is supposed to look like.
51:53
So how do we present and justify
that idea of global
51:55
versus wholly nationalist
51:58
to people in countries like Ghana
and Nigeria and Togo
52:01
and other countries like that?
52:04
YNH: I would start by saying
that history is extremely unfair,
52:07
and that we should realize that.
52:14
Many of the countries that suffered most
52:18
from the last 200 years of globalization
52:21
and imperialism and industrialization
52:26
are exactly the countries
which are also most likely to suffer most
52:28
from the next wave.
52:33
And we should be very,
very clear about that.
52:36
If we don't have a global governance,
52:41
and if we suffer from climate change,
52:44
from technological disruptions,
52:47
the worst suffering will not be in the US.
52:49
The worst suffering will be in Ghana,
will be in Sudan, will be in Syria,
52:53
will be in Bangladesh,
will be in those places.
52:58
So I think those countries
have an even greater incentive
53:01
to do something about
the next wave of disruption,
53:07
whether it's ecological
or whether it's technological.
53:12
Again, if you think about
technological disruption,
53:14
if AI and 3D printers and robots
take the jobs
53:17
from billions of people,
53:22
I worry far less about the Swedes
53:24
than about the people in Ghana
or in Bangladesh.
53:27
And therefore,
because history is so unfair
53:31
and the results of a calamity
53:36
will not be shared equally
between everybody,
53:41
as usual, the rich
will be able to get away
53:43
from the worst consequences
of climate change
53:47
in a way that the poor
will not be able to.
53:51
CA: And here's a great question
from Cameron Taylor on Facebook:
53:55
"At the end of 'Sapiens,'"
53:58
you said we should be asking the question,
54:00
'What do we want to want?'
54:02
Well, what do you think
we should want to want?"
54:05
YNH: I think we should want
to want to know the truth,
54:08
to understand reality.
54:11
Mostly what we want is to change reality,
54:15
to fit it to our own desires,
to our own wishes,
54:20
and I think we should first
want to understand it.
54:23
If you look at the long-term
trajectory of history,
54:27
what you see is that
for thousands of years
54:31
we humans have been gaining
control of the world outside us
54:34
and trying to shape it
to fit our own desires.
54:37
And we've gained control
of the other animals,
54:41
of the rivers, of the forests,
54:44
and reshaped them completely,
54:45
causing ecological destruction
54:49
without making ourselves satisfied.
54:52
So the next step
is we turn our gaze inwards,
54:55
and we say OK, getting control
of the world outside us
54:59
did not really make us satisfied.
55:04
Let's now try to gain control
of the world inside us.
55:06
This is the really big project
55:08
of science and technology
and industry in the 21st century --
55:11
to try and gain control
of the world inside us,
55:15
to learn how to engineer and produce
bodies and brains and minds.
55:19
These are likely to be the main
products of the 21st century economy.
55:23
When people think about the future,
very often they think in terms of,
55:28
"Oh, I want to gain control
of my body and of my brain."
55:32
And I think that's very dangerous.
55:36
If we've learned anything
from our previous history,
55:39
it's that yes, we gain
the power to manipulate,
55:42
but because we didn't really
understand the complexity
55:46
of the ecological system,
55:49
we are now facing an ecological meltdown.
55:51
And if we now try to reengineer
the world inside us
55:54
without really understanding it,
56:00
especially without understanding
the complexity of our mental system,
56:02
we might cause a kind of internal
ecological disaster,
56:06
and we'll face a kind of mental
meltdown inside us.
56:11
CA: Putting all the pieces
together here --
56:16
the current politics,
the coming technology,
56:18
concerns like the one
you've just outlined --
56:21
I mean, it seems like you yourself
are in quite a bleak place
56:23
when you think about the future.
56:26
You're pretty worried about it.
56:28
Is that right?
56:29
And if there was one cause for hope,
how would you state that?
56:31
YNH: I focus on the most
dangerous possibilities
56:37
partly because this is like
my job or responsibility
56:41
as a historian or social critic.
56:44
I mean, the industry focuses mainly
on the positive sides,
56:46
so it's the job of historians
and philosophers and sociologists
56:51
to highlight the more dangerous potential
of all these new technologies.
56:54
I don't think any of that is inevitable.
56:59
Technology is never deterministic.
57:01
You can use the same technology
57:04
to create very different
kinds of societies.
57:06
If you look at the 20th century,
57:09
so, the technologies
of the Industrial Revolution,
57:11
the trains and electricity and all that
57:14
could be used to create
a communist dictatorship
57:17
or a fascist regime
or a liberal democracy.
57:20
The trains did not tell you
what to do with them.
57:23
Similarly, now, artificial intelligence
and bioengineering and all of that --
57:26
they don't predetermine a single outcome.
57:30
Humanity can rise up to the challenge,
57:34
and the best example we have
57:37
of humanity rising up
to the challenge of a new technology
57:39
is nuclear weapons.
57:43
In the late 1940s, '50s,
57:45
many people were convinced
57:48
that sooner or later the Cold War
would end in a nuclear catastrophe,
57:50
destroying human civilization.
57:54
And this did not happen.
57:56
In fact, nuclear weapons prompted
humans all over the world
57:57
to change the way that they manage
international politics
58:04
to reduce violence.
58:09
And many countries basically took out war
58:11
from their political toolkit.
58:14
They no longer tried to pursue
their interests with warfare.
58:16
Not all countries have done so,
but many countries have.
58:21
And this is maybe
the most important reason
58:24
why international violence
has declined dramatically since 1945,
58:28
and today, as I said,
more people commit suicide
58:34
than are killed in war.
58:38
So this, I think, gives us a good example
58:40
that even with the most frightening technology,
58:45
humans can rise up to the challenge
58:49
and actually some good can come out of it.
58:51
The problem is, we have very little
margin for error.
58:54
If we don't get it right,
58:59
we might not have
a second option to try again.
59:01
CA: That's a very powerful note,
59:06
on which I think we should draw
this to a conclusion.
59:07
Before I wrap up, I just want to say
one thing to people here
59:10
and to the global TED community
watching online, anyone watching online:
59:13
help us with these dialogues.
59:19
If you believe, like we do,
59:22
that we need to find
a different kind of conversation,
59:24
now more than ever, help us do it.
59:27
Reach out to other people,
59:30
try and have conversations
with people you disagree with,
59:33
understand them,
59:35
pull the pieces together,
59:37
and help us figure out how to take
these conversations forward
59:38
so we can make a real contribution
59:42
to what's happening
in the world right now.
59:44
I think everyone feels more alive,
59:47
more concerned, more engaged
59:50
with the politics of the moment.
59:53
The stakes do seem quite high,
59:55
so help us respond to it
in a wise, wise way.
59:58
Yuval Harari, thank you.
00:02
(Applause)
00:04



Why you should listen

A lecturer in history at The Hebrew University of Jerusalem, in his book Sapiens: A Brief History of Humankind, Yuval Noah Harari asks what made Homo sapiens the most successful species on the planet. His answer: We are the only animal that can believe in things that exist purely in our imagination, such as gods, states, money, human rights, corporations and other “fictions,” and we have developed a unique ability to use these stories to unify and organize groups and ensure cooperation. In his next book, he'll explore the growth of inequality in human society. He asks: Are we on the cusp of the next great divergence?

Harari specializes in world history, medieval history and military history. His current research focuses on macro-historical questions: What is the relation between history and biology? What is the essential difference between Homo sapiens and other animals? Is there justice in history? Does history have a direction? Did people become happier as history unfolded? Harari also teaches a MOOC (Massive Open Online Course) titled A Brief History of Humankind.


Data provided by TED.
