TED2010

Michael Shermer: The pattern behind self-deception

February 10, 2010

Michael Shermer says the human tendency to believe strange things -- from alien abductions to dowsing rods -- boils down to two of the brain's most basic, hard-wired survival skills. He explains what they are, and how they get us into trouble.

Michael Shermer - Skeptic
Michael Shermer debunks myths, superstitions and urban legends -- and explains why we believe them. Along with publishing Skeptic Magazine, he's author of Why People Believe Weird Things and The Mind of the Market. Full bio

So since I was here last in '06, we discovered that global climate change is turning out to be a pretty serious issue, so we covered that fairly extensively in Skeptic magazine. We investigate all kinds of scientific and quasi-scientific controversies, but it turns out we don't have to worry about any of this because the world's going to end in 2012.
Another update: You will recall I introduced you guys to the Quadro Tracker. It's like a water dowsing device. It's just a hollow piece of plastic with an antenna that swivels around. And you walk around, and it points to things. Like if you're looking for marijuana in students' lockers, it'll point right to somebody. Oh, sorry. (Laughter) This particular one that was given to me finds golf balls, especially if you're at a golf course and you check under enough bushes.
Well, under the category of "What's the harm of silly stuff like this?" this device, the ADE 651, was sold to the Iraqi government for 40,000 dollars apiece. It's just like this one, completely worthless. It allegedly worked by "electrostatic magnetic ion attraction," which translates to "pseudoscientific baloney" -- that would be the nice word -- in which you string together a bunch of words that sound good, but it does absolutely nothing. In this case, at checkpoints, letting people through because your little tracker device said they were okay actually cost lives. So there is a danger to pseudoscience, in believing in this sort of thing.
So what I want to talk about today is belief. I want to believe, and you do too. And in fact, I think my thesis here is that belief is the natural state of things. It is the default option. We just believe. We believe all sorts of things. Belief is natural; disbelief, skepticism, science, is not natural. It's more difficult. It's uncomfortable to not believe things. So like Fox Mulder on "X-Files," who wants to believe in UFOs? Well, we all do, and the reason for that is because we have a belief engine in our brains.
Essentially, we are pattern-seeking primates. We connect the dots: A is connected to B; B is connected to C. And sometimes A really is connected to B, and that's called association learning. We find patterns, we make those connections, whether it's Pavlov's dog here associating the sound of the bell with the food, and then he salivates to the sound of the bell, or whether it's a Skinnerian rat, in which he's having an association between his behavior and a reward for it, and therefore he repeats the behavior.

In fact, what Skinner discovered is that, if you put a pigeon in a box like this, and he has to press one of these two keys, and he tries to figure out what the pattern is, and you give him a little reward in the hopper box there -- if you just randomly assign rewards such that there is no pattern, they will figure out any kind of pattern. And whatever they were doing just before they got the reward, they repeat that particular pattern. Sometimes it was even spinning around twice counterclockwise, once clockwise and pecking the key twice. And that's called superstition, and that, I'm afraid, we will always have with us.
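Skinner's random-reward setup is easy to caricature in code. The toy simulation below is an illustration of the idea, not Skinner's actual procedure: it assumes a pigeon that simply locks in whatever action happened to precede a reward (the action names and the "lock in the last act" learning rule are my own).

```python
import random

ACTIONS = ["peck left key", "peck right key",
           "spin clockwise", "spin counterclockwise"]

def superstitious_pigeon(steps: int, reward_prob: float, seed: int = 0) -> str:
    """Rewards arrive at random, yet a fixed ritual still forms,
    because the bird credits whatever it did just beforehand."""
    rng = random.Random(seed)
    habit = None  # the superstition, once one forms
    for _ in range(steps):
        action = habit if habit is not None else rng.choice(ACTIONS)
        if rng.random() < reward_prob:  # the reward is pure chance...
            habit = action              # ...but the last act gets the credit
    return habit

print(superstitious_pigeon(steps=100, reward_prob=0.1))
```

The reward schedule contains no pattern at all, yet the function almost always comes back with some ritual -- exactly the superstition the talk describes.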
I call this process "patternicity" -- that is, the tendency to find meaningful patterns in both meaningful and meaningless noise. When we do this process, we make two types of errors. A Type I error, or false positive, is believing a pattern is real when it's not. Our second type of error is a false negative. A Type II error is not believing a pattern is real when it is.
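The two error types form a small truth table. Here is a minimal Python sketch of that taxonomy (the function name and labels are illustrative, not from the talk):

```python
def classify_judgment(pattern_is_real: bool, we_believe: bool) -> str:
    """Label one pattern judgment with the two-error taxonomy."""
    if we_believe and not pattern_is_real:
        return "Type I error (false positive)"
    if not we_believe and pattern_is_real:
        return "Type II error (false negative)"
    return "correct"

# Believing a pattern that isn't there, and missing one that is:
print(classify_judgment(pattern_is_real=False, we_believe=True))
print(classify_judgment(pattern_is_real=True, we_believe=False))
```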
So let's do a thought experiment. You are a hominid three million years ago walking on the plains of Africa. Your name is Lucy, okay? And you hear a rustle in the grass. Is it a dangerous predator, or is it just the wind? Your next decision could be the most important one of your life. Well, if you think that the rustle in the grass is a dangerous predator and it turns out it's just the wind, you've made an error in cognition, made a Type I error, false positive. But no harm. You just move away. You're more cautious. You're more vigilant. On the other hand, if you believe that the rustle in the grass is just the wind, and it turns out it's a dangerous predator, you're lunch. You've just won a Darwin Award. You've been taken out of the gene pool.
Now the problem here is that patternicities will occur whenever the cost of making a Type I error is less than the cost of making a Type II error. This is the only equation in the talk, by the way. We have a pattern-detection problem: assessing the difference between a Type I and a Type II error is highly problematic, especially in split-second, life-and-death situations. So the default position is just: believe all patterns are real -- all rustles in the grass are dangerous predators and not just the wind. And so I think that we evolved ... there was a natural selection for the propensity for our belief engines, our pattern-seeking brain processes, to always find meaningful patterns and infuse them with these sorts of predatory or intentional agencies that I'll come back to.
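That one equation can be restated as an expected-cost comparison. The numbers below are invented purely to make the asymmetry concrete -- a false alarm costs a little wasted caution, a miss costs everything:

```python
P_PREDATOR = 0.1       # assumed chance the rustle really is a predator
COST_TYPE_I = 1.0      # fleeing when it was only the wind
COST_TYPE_II = 1000.0  # ignoring an actual predator

def expected_cost(always_believe: bool) -> float:
    """Expected cost of applying one fixed policy to every rustle."""
    if always_believe:
        # flee every time; pay only when the rustle was just wind
        return (1 - P_PREDATOR) * COST_TYPE_I
    # ignore every rustle; pay only when it was a predator
    return P_PREDATOR * COST_TYPE_II

print(expected_cost(True))   # 0.9
print(expected_cost(False))  # 100.0
```

With any numbers in which the Type I cost is much smaller than the Type II cost, "believe every pattern" wins, which is the point about belief being the default position.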
So for example, what do you see here? It's a horse head, that's right. It looks like a horse. It must be a horse. That's a pattern. And is it really a horse? Or is it more like a frog? See, our pattern-detection device, which appears to be located in the anterior cingulate cortex -- it's our little detection device there -- can be easily fooled, and this is the problem.

For example, what do you see here? Yes, of course, it's a cow. Once I prime the brain -- it's called cognitive priming -- once I prime the brain to see it, it pops back out again even without the pattern that I've imposed on it. And what do you see here? Some people see a Dalmatian dog. Yes, there it is. And there's the prime. So when I go back without the prime, your brain already has the model so you can see it again. What do you see here? Planet Saturn. Yes, that's good. How about here? Just shout out anything you see. That's a good audience, Chris. Because there's nothing in this. Well, allegedly there's nothing.
This is an experiment done by Jennifer Whitson at U.T. Austin on corporate environments and whether feelings of uncertainty and being out of control make people see illusory patterns. That is, almost everybody sees the planet Saturn. But people who are put in a condition of feeling out of control are more likely to see something in this, which is allegedly patternless. In other words, the propensity to find these patterns goes up when there's a lack of control. For example, baseball players are notoriously superstitious when they're batting, but not so much when they're fielding. Because fielders are successful 90 to 95 percent of the time. The best batters fail seven out of 10 times. So their superstitions, their patternicities, are all associated with feelings of lack of control and so forth.
What do you see in this particular one here, in this field? Anybody see an object there? There actually is something here, but it's degraded. While you're thinking about that, this was an experiment done by Susan Blackmore, a psychologist in England, who showed subjects this degraded image and then ran a correlation with their scores on an ESP test: how much did they believe in the paranormal, the supernatural, angels and so forth. And those who scored high on the ESP scale tended to not only see more patterns in the degraded images but incorrect patterns. Here is what you show subjects. The fish is degraded 20 percent, 50 percent and then the one I showed you, 70 percent.
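Blackmore-style degradation can be mimicked by overwriting a chosen fraction of an image with noise. The 1-bit "image" below is a toy stand-in, not her actual stimulus, and the function is my own sketch:

```python
import random

def degrade(image: str, fraction: float, seed: int = 0) -> str:
    """Overwrite `fraction` of the pixels with random values,
    like the 20-, 50- and 70-percent degraded fish pictures."""
    rng = random.Random(seed)
    pixels = list(image)
    n_noise = int(len(pixels) * fraction)
    for i in rng.sample(range(len(pixels)), n_noise):
        pixels[i] = rng.choice("01")  # noise may happen to match the old pixel
    return "".join(pixels)

clean = "1111000011110000"  # toy stand-in for the fish image
print(degrade(clean, 0.20))
print(degrade(clean, 0.70))
```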
A similar experiment was done by another [Swiss] psychologist named Peter Brugger, who found that significantly more meaningful patterns were perceived by the right hemisphere, via the left visual field, than by the left hemisphere. So if you present subjects an image such that it's going to end up on the right hemisphere instead of the left, then they're more likely to see patterns than if you put it on the left hemisphere. Our right hemisphere appears to be where a lot of this patternicity occurs. So what we're trying to do is bore into the brain to see where all this happens. Brugger and his colleague, Christine Mohr, gave subjects L-DOPA. L-DOPA's a drug, as you know, given for treating Parkinson's disease, which is related to a decrease in dopamine. L-DOPA increases dopamine. An increase of dopamine caused subjects to see more patterns than those that did not receive the dopamine. So dopamine appears to be the drug associated with patternicity.
In fact, neuroleptic drugs that are used to eliminate psychotic behavior -- things like paranoia, delusions and hallucinations -- these are patternicities. They're incorrect patterns. They're false positives. They're Type I errors. And if you give patients drugs that are dopamine antagonists, the symptoms go away. That is, you decrease the amount of dopamine, and their tendency to see patterns like that decreases. On the other hand, stimulants like cocaine and amphetamines are dopamine agonists. They increase the amount of dopamine. So you're more likely to feel in a euphoric state, creativity, find more patterns. In fact, I saw Robin Williams recently talk about how he thought he was much funnier when he was doing cocaine, when he had that issue, than now. So perhaps more dopamine is related to more creativity.
Dopamine, I think, changes our signal-to-noise ratio -- that is, how accurate we are in finding patterns. If it's too low, you're more likely to make too many Type II errors: you miss the real patterns. You don't want to be too skeptical. If you're too skeptical, you'll miss the really interesting good ideas. Just right, you're creative, and yet you don't fall for too much baloney. Too high, and maybe you see patterns everywhere. Every time somebody looks at you, you think people are staring at you. You think people are talking about you. And if you go too far on that, that's just simply labeled as madness. It's a distinction perhaps we might make between two Nobel laureates, Richard Feynman and John Nash. One sees maybe just the right number of patterns to win a Nobel Prize. The other one also, but maybe too many patterns. And we then call that schizophrenia. So the signal-to-noise ratio then presents us with a pattern-detection problem.
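This signal-to-noise framing maps directly onto a detection threshold. The simulation below is a sketch of that trade-off under my own assumptions (a unit-Gaussian noise model, nothing measured in the studies cited): lowering the threshold -- more "dopamine" -- trades Type II errors for Type I errors.

```python
import random

def error_rates(threshold: float, trials: int = 10_000, seed: int = 1):
    """Report 'believe a pattern' whenever a noisy reading beats threshold;
    return the Type I and Type II error rates over all trials."""
    rng = random.Random(seed)
    type1 = type2 = 0
    for _ in range(trials):
        signal_present = rng.random() < 0.5
        # a real signal shifts the reading up by 1.0 against unit Gaussian noise
        reading = rng.gauss(1.0 if signal_present else 0.0, 1.0)
        believe = reading > threshold
        if believe and not signal_present:
            type1 += 1  # saw a pattern in pure noise
        elif not believe and signal_present:
            type2 += 1  # missed a real pattern
    return type1 / trials, type2 / trials

lax = error_rates(threshold=-1.0)    # sees patterns everywhere
strict = error_rates(threshold=2.0)  # skeptical to a fault
print("lax:", lax, "strict:", strict)
```

Neither extreme wins on both counts; moving the threshold only exchanges one error type for the other, which is the "just right" point in the talk.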
And of course you all know exactly what this is, right? And what pattern do you see here? Again, I'm putting your anterior cingulate cortex to the test here, causing you conflicting pattern detections. You know, of course, these are Via Uno shoes. These are sandals. Pretty sexy feet, I must say. Maybe a little Photoshopped. And of course, the ambiguous figures that seem to flip-flop back and forth. It turns out what you're thinking about a lot influences what you tend to see. And you see the lamp here, I know. Because the light's on here. Of course, thanks to the environmentalist movement we're all sensitive to the plight of marine mammals. So what you see in this particular ambiguous figure is, of course, the dolphins, right? You see a dolphin here, and there's a dolphin, and there's a dolphin. That's a dolphin tail there, guys. (Laughter)
If we can give you conflicting data, again, your ACC is going to be going into hyperdrive. If you look down here, it's fine. If you look up here, then you get conflicting data. And then we have to flip the image for you to see that it's a setup. The impossible crate illusion. It's easy to fool the brain in 2D. So you say, "Aw, come on Shermer, anybody can do that in a Psych 101 text with an illusion like that." Well here's the late, great Jerry Andrus' "impossible crate" illusion in 3D, in which Jerry is standing inside the impossible crate. And he was kind enough to post this and give us the reveal. Of course, camera angle is everything. The photographer is over there, and this board appears to overlap with this one, and this one with that one, and so on. But even when I take it away, the illusion is so powerful because of how our brains are wired to find those certain kinds of patterns. This is a fairly new one that throws us off because of the conflicting patterns of comparing this angle with that angle. In fact, it's the exact same picture side by side. So what you're doing is comparing that angle not with this one, but with that one. And so your brain is fooled. Yet again, your pattern-detection devices are fooled.
Faces are easy to see because we have additional evolved facial-recognition software in our temporal lobes. Here are some faces on the side of a rock. I'm actually not even sure if this is -- this might be Photoshopped. But anyway, the point is still made. Now which one of these looks odd to you? In a quick reaction, which one looks odd? The one on the left. Okay. So I'll rotate it so it'll be the one on the right. And you are correct. A fairly famous illusion -- it was first done with Margaret Thatcher. Now, they trade up the politicians every time. Well, why is this happening? Well, we know exactly where it happens, in the temporal lobe, right across, sort of above your ear there, in a little structure called the fusiform gyrus. And there are two types of cells that do this, recording facial features either globally or specifically. These large, rapid-firing cells first look at the general face. So you recognize Obama immediately. And then you notice something a little bit odd about the eyes and the mouth. Especially when they're upside down, you're engaging that general facial-recognition software there.
Now I said back in our little thought experiment, you're a hominid walking on the plains of Africa. Is it just the wind or a dangerous predator? What's the difference between those? Well, the wind is inanimate; the dangerous predator is an intentional agent. And I call this process agenticity. That is the tendency to infuse patterns with meaning, intention and agency, often invisible beings from the top down. This is an idea that we got from a fellow TEDster here, Dan Dennett, who talked about taking the intentional stance. So it's that stance expanded to explain, I think, a lot of different things: souls, spirits, ghosts, gods, demons, angels, aliens, intelligent designers, government conspiracists and all manner of invisible agents with power and intention that are believed to haunt our world and control our lives. I think it's the basis of animism and polytheism and monotheism. It's the belief that aliens are somehow more advanced than us, more moral than us, and the narratives always are that they're coming here to save us and rescue us from on high. The intelligent designer is always portrayed as this super-intelligent, moral being that comes down to design life. Even the idea that government can rescue us -- that's no longer the wave of the future, but that is, I think, a type of agenticity: projecting somebody up there, big and powerful, who will come rescue us. And this is also, I think, the basis of conspiracy theories. There's somebody hiding behind there pulling the strings, whether it's the Illuminati or the Bilderbergers.
But this is a pattern-detection problem, isn't it? Some patterns are real and some are not. Was JFK assassinated by a conspiracy or by a lone assassin? Well, if you go there -- there are people there on any given day -- like when I went there, here -- showing me where the different shooters were. My favorite one was: he was in the manhole. And he popped out at the last second, took that shot. But of course, Lincoln was assassinated by a conspiracy. So we can't just uniformly dismiss all patterns like that. Because, let's face it, some patterns are real. Some conspiracies really are true. Explains a lot, maybe. And 9/11 has a conspiracy theory. It is a conspiracy. We did a whole issue on it. Nineteen members of Al Qaeda plotting to fly planes into buildings constitutes a conspiracy. But that's not what the "9/11 truthers" think. They think it was an inside job by the Bush administration. Well, that's a whole other lecture. You know how we know that 9/11 was not orchestrated by the Bush administration? Because it worked. (Laughter) (Applause)
So we are natural-born dualists. Our agenticity process comes from the fact that we can enjoy movies like these. Because we can imagine, in essence, continuing on. We know that if you stimulate the temporal lobe, you can produce a feeling of out-of-body experiences, near-death experiences, which you can do by just touching an electrode to the temporal lobe there. Or you can do it through loss of consciousness, by accelerating in a centrifuge. You get hypoxia, or lowered oxygen. And the brain then senses that there's an out-of-body experience. You can use -- which I did, went out and did -- Michael Persinger's God Helmet, that bombards your temporal lobes with electromagnetic waves. And you get a sense of an out-of-body experience. So I'm going to end here with a short video clip that sort of brings all this together. It's just a minute and a half. It ties together all this into the power of expectation and the power of belief. Go ahead and roll it.
Narrator: This is the venue they chose for their fake auditions for an advert for lip balm.

Woman: We're hoping we can use part of this in a national commercial, right? And this is a test of some lip balms that we have over here. And these are our models who are going to help us, Roger and Matt. And we have our own lip balm, and we have a leading brand. Would you have any problem kissing our models to test it?

Girl: No.

Woman: You wouldn't? (Girl: No.) You'd think that was fine.

Girl: That would be fine. (Woman: Okay.)

Woman: So this is a blind test. I'm going to ask you to go ahead and put a blindfold on. Okay, now can you see anything? (Girl: No.) Pull it so you can't even see down. (Girl: Okay.) It's completely blind now, right?

Girl: Yes. (Woman: Okay.)

Woman: Now, what I'm going to be looking for in this test is how it protects your lips, the texture, right, and maybe if you can discern any flavor or not.

Girl: Okay. (Woman: Have you ever done a kissing test before?) No.

Woman: Take a step here. Okay, now I'm going to ask you to pucker up. Pucker up big and lean in just a little bit, okay?

(Music) (Laughter)

Woman: Okay. And, Jennifer, how did that feel?

Jennifer: Good. (Laughter)

Girl: Oh my God! (Laughter)

Michael Shermer: Thank you very much. Thank you. Thanks.


Michael Shermer - Skeptic

Why you should listen

As founder and publisher of Skeptic Magazine, Michael Shermer has exposed fallacies behind intelligent design, 9/11 conspiracies, the low-carb craze, alien sightings and other popular beliefs and paranoias. But it's not about debunking for debunking's sake. Shermer defends the notion that we can understand our world better only by matching good theory with good science.

Shermer's work offers cognitive context for our often misguided beliefs: In the absence of sound science, incomplete information can powerfully combine with the power of suggestion (helping us hear Satanic lyrics when "Stairway to Heaven" plays backwards, for example). In fact, a common thread that runs through beliefs of all sorts, he says, is our tendency to convince ourselves: We overvalue the shreds of evidence that support our preferred outcome, and ignore the facts we aren't looking for.

He writes a monthly column for Scientific American, and is an adjunct at Claremont Graduate University and Chapman University. His latest book is The Believing Brain: From Ghosts and Gods to Politics and Conspiracies—How We Construct Beliefs and Reinforce Them as Truths. He is also the author of The Mind of the Market, on evolutionary economics, Why Darwin Matters: Evolution and the Case Against Intelligent Design, and The Science of Good and Evil. And his next book is titled The Moral Arc of Science.

The original video is available on TED.com


Data provided by TED.
