TED2008

Jonathan Haidt: The moral roots of liberals and conservatives

March 1, 2008

Psychologist Jonathan Haidt studies the five moral values that form the basis of our political choices, whether we're left, right or center. In this eye-opening talk, he pinpoints the moral values that liberals and conservatives tend to honor most.

Jonathan Haidt - Social psychologist
Jonathan Haidt studies how -- and why -- we evolved to be moral. By understanding more about our moral roots, his hope is that we can learn to be civil and open-minded.

Suppose that two American friends are traveling together in Italy. They go to see Michelangelo's "David," and when they finally come face to face with the statue, they both freeze dead in their tracks. The first guy -- we'll call him Adam -- is transfixed by the beauty of the perfect human form. The second guy -- we'll call him Bill -- is transfixed by embarrassment, at staring at the thing there in the center.

So here's my question for you: which one of these two guys was more likely to have voted for George Bush, which for Al Gore? I don't need a show of hands because we all have the same political stereotypes. We all know that it's Bill. And in this case, the stereotype corresponds to reality. It really is a fact that liberals are much higher than conservatives on a major personality trait called openness to experience. People who are high in openness to experience just crave novelty, variety, diversity, new ideas, travel. People low on it like things that are familiar, that are safe and dependable.

If you know about this trait, you can understand a lot of puzzles about human behavior. You can understand why artists are so different from accountants. You can actually predict what kinds of books they like to read, what kinds of places they like to travel to, and what kinds of food they like to eat. Once you understand this trait, you can understand why anybody would eat at Applebee's, but not anybody that you know.

(Laughter)
This trait also tells us a lot about politics. The main researcher of this trait, Robert McCrae, says that "Open individuals have an affinity for liberal, progressive, left-wing political views" -- they like a society which is open and changing -- "whereas closed individuals prefer conservative, traditional, right-wing views."

This trait also tells us a lot about the kinds of groups people join. So here's the description of a group I found on the Web. What kinds of people would join a global community welcoming people from every discipline and culture, who seek a deeper understanding of the world, and who hope to turn that understanding into a better future for us all? This is from some guy named Ted.

(Laughter)

Well, let's see now. If openness predicts who becomes liberal, and openness predicts who becomes a TEDster, then might we predict that most TEDsters are liberal? Let's find out.
I'm going to ask you to raise your hand, whether you are liberal, left of center -- on social issues, we're talking about, primarily -- or conservative, and I'll give a third option, because I know there are a number of libertarians in the audience. So, right now, please raise your hand -- down in the simulcast rooms, too, let's let everybody see who's here -- please raise your hand if you would say that you are liberal or left of center. Please raise your hand high right now. OK. Please raise your hand if you'd say you're libertarian. OK, about a -- two dozen. And please raise your hand if you'd say you are right of center or conservative. One, two, three, four, five -- about eight or 10.

OK. This is a bit of a problem. Because if our goal is to understand the world, to seek a deeper understanding of the world, our general lack of moral diversity here is going to make it harder. Because when people all share values, when people all share morals, they become a team, and once you engage the psychology of teams, it shuts down open-minded thinking. When the liberal team loses, as it did in 2004, and as it almost did in 2000, we comfort ourselves.

(Laughter)

We try to explain why half of America voted for the other team. We think they must be blinded by religion, or by simple stupidity.

(Laughter)

(Applause)
So, if you think that half of America votes Republican because they are blinded in this way, then my message to you is that you're trapped in a moral matrix, in a particular moral matrix. And by the matrix, I mean literally the matrix, like the movie "The Matrix." But I'm here today to give you a choice. You can either take the blue pill and stick to your comforting delusions, or you can take the red pill, learn some moral psychology and step outside the moral matrix. Now, because I know --

(Applause)

OK, I assume that answers my question. I was going to ask you which one you picked, but no need. You're all high in openness to experience, and besides, it looks like it might even taste good, and you're all epicures. So anyway, let's go with the red pill. Let's study some moral psychology and see where it takes us. Let's start at the beginning. What is morality and where does it come from?
The worst idea in all of psychology is the idea that the mind is a blank slate at birth. Developmental psychology has shown that kids come into the world already knowing so much about the physical and social worlds, and programmed to make it really easy for them to learn certain things and hard to learn others. The best definition of innateness I've ever seen -- this just clarifies so many things for me -- is from the brain scientist Gary Marcus. He says, "The initial organization of the brain does not depend that much on experience. Nature provides a first draft, which experience then revises. Built-in doesn't mean unmalleable; it means organized in advance of experience."
OK, so what's on the first draft of the moral mind? To find out, my colleague, Craig Joseph, and I read through the literature on anthropology, on culture variation in morality and also on evolutionary psychology, looking for matches. What are the sorts of things that people talk about across disciplines? That you find across cultures and even across species? We found five -- five best matches, which we call the five foundations of morality.

The first one is harm/care. We're all mammals here, we all have a lot of neural and hormonal programming that makes us really bond with others, care for others, feel compassion for others, especially the weak and vulnerable. It gives us very strong feelings about those who cause harm. This moral foundation underlies about 70 percent of the moral statements I've heard here at TED.

The second foundation is fairness/reciprocity. There's actually ambiguous evidence as to whether you find reciprocity in other animals, but the evidence for people could not be clearer. This Norman Rockwell painting is called "The Golden Rule," and we heard about this from Karen Armstrong, of course, as the foundation of so many religions. That second foundation underlies the other 30 percent of the moral statements I've heard here at TED.

The third foundation is in-group/loyalty. You do find groups in the animal kingdom -- you do find cooperative groups -- but these groups are always either very small or they're all siblings. It's only among humans that you find very large groups of people who are able to cooperate, join together into groups, but in this case, groups that are united to fight other groups. This probably comes from our long history of tribal living, of tribal psychology. And this tribal psychology is so deeply pleasurable that even when we don't have tribes, we go ahead and make them, because it's fun.

(Laughter)

Sports is to war as pornography is to sex. We get to exercise some ancient, ancient drives.
The fourth foundation is authority/respect. Here you see submissive gestures from two members of very closely related species. But authority in humans is not so closely based on power and brutality, as it is in other primates. It's based on more voluntary deference, and even elements of love, at times.

The fifth foundation is purity/sanctity. This painting is called "The Allegory of Chastity," but purity's not just about suppressing female sexuality. It's about any kind of ideology, any kind of idea that tells you that you can attain virtue by controlling what you do with your body, by controlling what you put into your body. And while the political right may moralize sex much more, the political left is really doing a lot of it with food. Food is becoming extremely moralized nowadays, and a lot of it is ideas about purity, about what you're willing to touch, or put into your body.

I believe these are the five best candidates for what's written on the first draft of the moral mind. I think this is what we come with, at least a preparedness to learn all of these things.
But as my son, Max, grows up in a liberal college town, how is this first draft going to get revised? And how will it end up being different from a kid born 60 miles south of us in Lynchburg, Virginia? To think about culture variation, let's try a different metaphor. If there really are five systems at work in the mind -- five sources of intuitions and emotions -- then we can think of the moral mind as being like one of those audio equalizers that has five channels, where you can set it to a different setting on every channel. And my colleagues, Brian Nosek and Jesse Graham, and I made a questionnaire, which we put up on the Web at www.YourMorals.org. And so far, 30,000 people have taken this questionnaire, and you can too. Here are the results.
Here are the results from about 23,000 American citizens. On the left, I've plotted the scores for liberals; on the right, those for conservatives; in the middle, the moderates. The blue line shows you people's responses on the average of all the harm questions. So, as you see, people care about harm and care issues. They give high endorsement of these sorts of statements all across the board, but as you also see, liberals care about it a little more than conservatives -- the line slopes down. Same story for fairness. But look at the other three lines. For liberals, the scores are very low. Liberals are basically saying, "No, this is not morality. In-group, authority, purity -- this stuff has nothing to do with morality. I reject it." But as people get more conservative, the values rise. We can say that liberals have a kind of a two-channel, or two-foundation morality. Conservatives have more of a five-foundation, or five-channel morality. We find this in every country we look at. Here's the data for 1,100 Canadians.
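The aggregation behind graphs like these is simple: average each respondent's answers within a foundation, then average across respondents in each political group, and compare the groups. Here is a minimal sketch of that computation. The toy respondents, the item counts, and the 0-5 scale are invented for illustration; they are not YourMorals.org items or data, only numbers chosen to mimic the pattern Haidt describes.

```python
from collections import defaultdict

FOUNDATIONS = ["harm", "fairness", "ingroup", "authority", "purity"]

# Hypothetical respondents: (ideology, {foundation: item scores on a 0-5 scale}).
respondents = [
    ("liberal",      {"harm": [4, 5], "fairness": [4, 4], "ingroup": [1, 2],
                      "authority": [1, 1], "purity": [0, 1]}),
    ("moderate",     {"harm": [4, 4], "fairness": [3, 4], "ingroup": [2, 3],
                      "authority": [2, 3], "purity": [2, 2]}),
    ("conservative", {"harm": [3, 4], "fairness": [3, 3], "ingroup": [3, 4],
                      "authority": [3, 4], "purity": [3, 4]}),
]

def group_means(respondents):
    """Mean endorsement per (ideology, foundation): first average a
    respondent's items within a foundation, then average respondents."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for ideology, answers in respondents:
        for f in FOUNDATIONS:
            scores = answers[f]
            sums[(ideology, f)] += sum(scores) / len(scores)
            counts[(ideology, f)] += 1
    return {key: sums[key] / counts[key] for key in sums}

means = group_means(respondents)
for f in FOUNDATIONS:
    row = "  ".join(f"{ideo}={means[(ideo, f)]:.1f}"
                    for ideo in ("liberal", "moderate", "conservative"))
    print(f"{f:9s} {row}")
```

With these made-up numbers, harm and fairness come out high for every group, while in-group, authority, and purity rise from left to right -- the "slope" the graphs show.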
I'll just flip through a few other slides: the U.K., Australia, New Zealand, Western Europe, Eastern Europe, Latin America, the Middle East, East Asia and South Asia. Notice also that on all of these graphs, the slope is steeper on in-group, authority, purity. Which shows that within any country, the disagreement isn't over harm and fairness. Everybody -- I mean, we debate over what's fair -- but everybody agrees that harm and fairness matter. Moral arguments within cultures are especially about issues of in-group, authority, purity.

This effect is so robust that we find it no matter how we ask the question. In one recent study, we asked people to suppose you're about to get a dog. You picked a particular breed, you learned some new information about the breed. Suppose you learn that this particular breed is independent-minded, and relates to its owner as a friend and an equal? Well, if you are a liberal, you say, "Hey, that's great!" Because liberals like to say, "Fetch, please."

(Laughter)

But if you're conservative, that's not so attractive. If you're conservative, and you learn that a dog's extremely loyal to its home and family, and doesn't warm up quickly to strangers, for conservatives, well, loyalty is good -- dogs ought to be loyal. But to a liberal, it sounds like this dog is running for the Republican nomination.

(Laughter)
So, you might say, OK, there are these differences between liberals and conservatives, but what makes those three other foundations moral? Aren't those just the foundations of xenophobia and authoritarianism and Puritanism? What makes them moral? The answer, I think, is contained in this incredible triptych from Hieronymus Bosch, "The Garden of Earthly Delights." In the first panel, we see the moment of creation. All is ordered, all is beautiful, all the people and animals are doing what they're supposed to be doing, where they're supposed to be. But then, given the way of the world, things change. We get every person doing whatever he wants, with every aperture of every other person and every other animal. Some of you might recognize this as the '60s.

(Laughter)

But the '60s inevitably gives way to the '70s, where the cuttings of the apertures hurt a little bit more. Of course, Bosch called this hell. So this triptych, these three panels portray the timeless truth that order tends to decay. The truth of social entropy.
But lest you think this is just some part of the Christian imagination where Christians have this weird problem with pleasure, here's the same story, the same progression, told in a paper that was published in Nature a few years ago, in which Ernst Fehr and Simon Gächter had people play a commons dilemma. A game in which you give people money, and then, on each round of the game, they can put money into a common pot, and then the experimenter doubles what's in there, and then it's all divided among the players. So it's a really nice analog for all sorts of environmental issues, where we're asking people to make a sacrifice and they themselves don't really benefit from their own sacrifice. But you really want everybody else to sacrifice, but everybody has a temptation to a free ride.
And what happens is that, at first, people start off reasonably cooperative -- and this is all played anonymously. On the first round, people give about half of the money that they can. But they quickly see, "You know what, other people aren't doing so much though. I don't want to be a sucker. I'm not going to cooperate." And so cooperation quickly decays from reasonably good, down to close to zero. But then -- and here's the trick -- on the seventh round, Fehr and Gächter told people, "You know what? New rule. If you want to give some of your own money to punish people who aren't contributing, you can do that." And as soon as people heard about the punishment issue going on, cooperation shoots up. It shoots up and it keeps going up. There's a lot of research showing that to solve cooperative problems, it really helps. It's not enough to just appeal to people's good motives. It really helps to have some sort of punishment. Even if it's just shame or embarrassment or gossip, you need some sort of punishment to bring people, when they're in large groups, to cooperate.
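The dynamic Haidt describes can be sketched as a toy simulation of the public-goods game. This is not Fehr and Gächter's actual protocol or data: the group size, endowment, update rules, and the specific numbers below are assumptions chosen only to reproduce the qualitative pattern -- contributions decay round by round without punishment, then recover once punishment becomes available.

```python
N_PLAYERS = 4
ENDOWMENT = 20        # tokens each player can keep or contribute per round
MULTIPLIER = 2        # the experimenter doubles the common pot
PUNISH_ROUND = 7      # punishment becomes available on this round (as in the talk)
N_ROUNDS = 12

def play_round(willingness, punishment_allowed):
    """One round: each player contributes a fraction of the endowment;
    the pot is doubled and split evenly among all players."""
    contributions = [w * ENDOWMENT for w in willingness]
    total = sum(contributions)
    # (each player's payoff would be ENDOWMENT - own contribution
    #  + total * MULTIPLIER / N_PLAYERS, so free-riding pays individually)
    lowest = min(willingness)
    updated = []
    for w in willingness:
        if punishment_allowed:
            # fear of being punished pushes contributions back up
            updated.append(min(1.0, w + 0.2))
        else:
            # "I don't want to be a sucker": drift down toward the lowest giver
            updated.append(max(0.0, w - 0.05 - 0.15 * (w - lowest)))
    return updated, total

willingness = [0.5] * N_PLAYERS   # round 1: people give about half
totals = []
for rnd in range(1, N_ROUNDS + 1):
    allowed = rnd >= PUNISH_ROUND
    willingness, total = play_round(willingness, allowed)
    totals.append(total)
    print(f"round {rnd:2d}  punishment={str(allowed):5s}  total contributed={total:5.1f}")
```

Running the sketch shows total contributions falling steadily over rounds 1-6, then climbing once the punishment rule takes effect -- the shape of the curve in the experiment, produced here by hand-tuned update rules rather than real subjects.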
There's even some recent research suggesting that religion -- priming God, making people think about God -- often, in some situations, leads to more cooperative, more pro-social behavior. Some people think that religion is an adaptation evolved both by cultural and biological evolution to make groups cohere, in part for the purpose of trusting each other, and then being more effective at competing with other groups. I think that's probably right, although this is a controversial issue. But I'm particularly interested in religion, and the origin of religion, and in what it does to us and for us. Because I think that the greatest wonder in the world is not the Grand Canyon. The Grand Canyon is really simple. It's just a lot of rock, and then a lot of water and wind, and a lot of time, and you get the Grand Canyon. It's not that complicated. This is what's really complicated, that there were people living in places like the Grand Canyon, cooperating with each other, or on the savannahs of Africa, or on the frozen shores of Alaska, and then some of these villages grew into the mighty cities of Babylon, and Rome, and Tenochtitlan. How did this happen? This is an absolute miracle, much harder to explain than the Grand Canyon.

The answer, I think, is that they used every tool in the toolbox. It took all of our moral psychology to create these cooperative groups. Yes, you do need to be concerned about harm, you do need a psychology of justice. But it really helps to organize a group if you can have sub-groups, and if those sub-groups have some internal structure, and if you have some ideology that tells people to suppress their carnality, to pursue higher, nobler ends.
And now we get to the crux of the disagreement between liberals and conservatives. Because liberals reject three of these foundations. They say, "No, let's celebrate diversity, not common in-group membership." They say, "Let's question authority." And they say, "Keep your laws off my body." Liberals have very noble motives for doing this. Traditional authority, traditional morality can be quite repressive, and restrictive to those at the bottom, to women, to people that don't fit in. So liberals speak for the weak and oppressed. They want change and justice, even at the risk of chaos. This guy's shirt says, "Stop bitching, start a revolution." If you're high in openness to experience, revolution is good, it's change, it's fun.

Conservatives, on the other hand, speak for institutions and traditions. They want order, even at some cost to those at the bottom. The great conservative insight is that order is really hard to achieve. It's really precious, and it's really easy to lose. So as Edmund Burke said, "The restraints on men, as well as their liberties, are to be reckoned among their rights." This was after the chaos of the French Revolution. So once you see this -- once you see that liberals and conservatives both have something to contribute, that they form a balance on change versus stability -- then I think the way is open to step outside the moral matrix.
This is the great insight that all the Asian religions have attained. Think about yin and yang. Yin and yang aren't enemies. Yin and yang don't hate each other. Yin and yang are both necessary, like night and day, for the functioning of the world. You find the same thing in Hinduism. There are many high gods in Hinduism. Two of them are Vishnu, the preserver, and Shiva, the destroyer. This image actually is both of those gods sharing the same body. You have the markings of Vishnu on the left, so we could think of Vishnu as the conservative god. You have the markings of Shiva on the right; Shiva's the liberal god. And they work together.

You find the same thing in Buddhism. These two stanzas contain, I think, the deepest insights that have ever been attained into moral psychology. From the Zen master Seng-ts'an: "If you want the truth to stand clear before you, never be for or against. The struggle between for and against is the mind's worst disease." Now unfortunately, it's a disease that has been caught by many of the world's leaders. But before you feel superior to George Bush, before you throw a stone, ask yourself: do you accept this? Do you accept stepping out of the battle of good and evil? Can you be not for or against anything?
So, what's the point? What should you do? Well, if you take the greatest insights from ancient Asian philosophies and religions, and you combine them with the latest research on moral psychology, I think you come to these conclusions: that our righteous minds were designed by evolution to unite us into teams, to divide us against other teams and then to blind us to the truth.

So what should you do? Am I telling you to not strive? Am I telling you to embrace Seng-ts'an and stop, stop with this struggle of for and against? No, absolutely not. I'm not saying that. This is an amazing group of people who are doing so much, using so much of their talent, their brilliance, their energy, their money, to make the world a better place, to fight -- to fight wrongs, to solve problems. But as we learned from Samantha Power, in her story about Sergio Vieira de Mello, you can't just go charging in, saying, "You're wrong, and I'm right." Because, as we just heard, everybody thinks they are right.

A lot of the problems we have to solve are problems that require us to change other people. And if you want to change other people, a much better way to do it is to first understand who we are -- understand our moral psychology, understand that we all think we're right -- and then step out, even if it's just for a moment, step out -- check in with Seng-ts'an. Step out of the moral matrix, just try to see it as a struggle playing out, in which everybody does think they're right, and everybody, at least, has some reasons -- even if you disagree with them -- everybody has some reasons for what they're doing. Step out. And if you do that, that's the essential move to cultivate moral humility, to get yourself out of this self-righteousness, which is the normal human condition. Think about the Dalai Lama. Think about the enormous moral authority of the Dalai Lama -- and it comes from his moral humility.
So I think the point -- the point of my talk, and I think the point of TED -- is that this is a group that is passionately engaged in the pursuit of changing the world for the better. People here are passionately engaged in trying to make the world a better place. But there is also a passionate commitment to the truth. And so I think that the answer is to use that passionate commitment to the truth to try to turn it into a better future for us all. Thank you.

(Applause)


Jonathan Haidt - Social psychologist

Why you should listen

Haidt is a social psychologist whose research on morality across cultures led up to his much-quoted 2008 TEDTalk on the psychological roots of the American culture war. He asks, "Can't we all disagree more constructively?" In September 2009, Jonathan Haidt spoke to the TED Blog about the moral psychology behind the healthcare debate in the United States. He's also active in the study of positive psychology and human flourishing.

At TED2012 he explored the intersection of his work on morality with his work on happiness to talk about “hive psychology” – the ability that humans have to lose themselves in groups pursuing larger projects, almost like bees in a hive. This hivish ability is crucial, he argues, for understanding the origins of morality, politics, and religion. These are ideas that Haidt develops at greater length in his book The Righteous Mind: Why Good People Are Divided by Politics and Religion. Learn more about his drive for a more productive and civil politics on his website CivilPolitics.org. And take an eye-opening quiz about your own morals at YourMorals.org.

During the bruising 2012 political season, Haidt was invited to speak at TEDxMidAtlantic on the topic of civility. He developed the metaphor of The Asteroids Club to embody how we can reach common ground. Learn how to start your own Asteroids Club at www.AsteroidsClub.org.

Watch Haidt talk about the Asteroids Club on MSNBC's The Cycle.



Data provided by TED.
