TEDGlobal 2011
Pamela Meyer: How to spot a liar
On any given day we're lied to anywhere from 10 to 200 times, and the clues to those lies can be subtle and counter-intuitive. Pamela Meyer, author of "Liespotting," shows the techniques and "hotspots" used by those trained to recognize deception -- and she argues that honesty is a value worth preserving.
Pamela Meyer - Lie detector
Pamela Meyer thinks we're facing a pandemic of deception, but she's arming people with tools that can help take back the truth.
00:15
Okay, now I don't want to alarm anybody in this room, but it's just come to my attention that the person to your right is a liar. (Laughter) Also, the person to your left is a liar. Also the person sitting in your very seats is a liar. We're all liars. What I'm going to do today is I'm going to show you what the research says about why we're all liars, how you can become a liespotter and why you might want to go the extra mile and go from liespotting to truth seeking, and ultimately to trust building.
00:49
Now speaking of trust, ever since I wrote this book, "Liespotting," no one wants to meet me in person anymore, no, no, no, no, no. They say, "It's okay, we'll email you." (Laughter) I can't even get a coffee date at Starbucks. My husband's like, "Honey, deception? Maybe you could have focused on cooking. How about French cooking?"
01:12
So before I get started, what I'm going to do is I'm going to clarify my goal for you, which is not to teach a game of Gotcha. Liespotters aren't those nitpicky kids, those kids in the back of the room that are shouting, "Gotcha! Gotcha! Your eyebrow twitched. You flared your nostril. I watch that TV show 'Lie To Me.' I know you're lying." No, liespotters are armed with scientific knowledge of how to spot deception. They use it to get to the truth, and they do what mature leaders do everyday; they have difficult conversations with difficult people, sometimes during very difficult times. And they start up that path by accepting a core proposition, and that proposition is the following: Lying is a cooperative act. Think about it, a lie has no power whatsoever by its mere utterance. Its power emerges when someone else agrees to believe the lie. So I know it may sound like tough love, but look, if at some point you got lied to, it's because you agreed to get lied to. Truth number one about lying: Lying's a cooperative act.
02:12
Now not all lies are harmful. Sometimes we're willing participants in deception for the sake of social dignity, maybe to keep a secret that should be kept secret, secret. We say, "Nice song." "Honey, you don't look fat in that, no." Or we say, favorite of the digiratti, "You know, I just fished that email out of my spam folder. So sorry."
02:36
But there are times when we are unwilling participants in deception. And that can have dramatic costs for us. Last year saw 997 billion dollars in corporate fraud alone in the United States. That's an eyelash under a trillion dollars. That's seven percent of revenues. Deception can cost billions. Think Enron, Madoff, the mortgage crisis. Or in the case of double agents and traitors, like Robert Hanssen or Aldrich Ames, lies can betray our country, they can compromise our security, they can undermine democracy, they can cause the deaths of those that defend us.
03:11
Deception is actually serious business. This con man, Henry Oberlander, he was such an effective con man British authorities say he could have undermined the entire banking system of the Western world. And you can't find this guy on Google; you can't find him anywhere. He was interviewed once, and he said the following. He said, "Look, I've got one rule." And this was Henry's rule, he said, "Look, everyone is willing to give you something. They're ready to give you something for whatever it is they're hungry for." And that's the crux of it. If you don't want to be deceived, you have to know, what is it that you're hungry for? And we all kind of hate to admit it. We wish we were better husbands, better wives, smarter, more powerful, taller, richer -- the list goes on. Lying is an attempt to bridge that gap, to connect our wishes and our fantasies about who we wish we were, how we wish we could be, with what we're really like. And boy are we willing to fill in those gaps in our lives with lies.
04:09
On a given day, studies show that you may be lied to anywhere from 10 to 200 times. Now granted, many of those are white lies. But in another study, it showed that strangers lied three times within the first 10 minutes of meeting each other. (Laughter) Now when we first hear this data, we recoil. We can't believe how prevalent lying is. We're essentially against lying. But if you look more closely, the plot actually thickens. We lie more to strangers than we lie to coworkers. Extroverts lie more than introverts. Men lie eight times more about themselves than they do other people. Women lie more to protect other people. If you're an average married couple, you're going to lie to your spouse in one out of every 10 interactions. Now you may think that's bad. If you're unmarried, that number drops to three.
05:02
Lying's complex. It's woven into the fabric of our daily and our business lives. We're deeply ambivalent about the truth. We parse it out on an as-needed basis, sometimes for very good reasons, other times just because we don't understand the gaps in our lives. That's truth number two about lying. We're against lying, but we're covertly for it in ways that our society has sanctioned for centuries and centuries and centuries. It's as old as breathing. It's part of our culture, it's part of our history. Think Dante, Shakespeare, the Bible, News of the World. (Laughter)
05:38
Lying has evolutionary value to us as a species. Researchers have long known that the more intelligent the species, the larger the neocortex, the more likely it is to be deceptive. Now you might remember Koko. Does anybody remember Koko the gorilla who was taught sign language? Koko was taught to communicate via sign language. Here's Koko with her kitten. It's her cute little, fluffy pet kitten. Koko once blamed her pet kitten for ripping a sink out of the wall. (Laughter) We're hardwired to become leaders of the pack. It starts really, really early. How early? Well babies will fake a cry, pause, wait to see who's coming and then go right back to crying. One-year-olds learn concealment. (Laughter) Two-year-olds bluff. Five-year-olds lie outright. They manipulate via flattery. Nine-year-olds, masters of the cover up. By the time you enter college, you're going to lie to your mom in one out of every five interactions. By the time we enter this work world and we're breadwinners, we enter a world that is just cluttered with spam, fake digital friends, partisan media, ingenious identity thieves, world-class Ponzi schemers, a deception epidemic -- in short, what one author calls a post-truth society. It's been very confusing for a long time now.
07:03
What do you do? Well there are steps we can take to navigate our way through the morass. Trained liespotters get to the truth 90 percent of the time. The rest of us, we're only 54 percent accurate. Why is it so easy to learn? There are good liars and there are bad liars. There are no real original liars. We all make the same mistakes. We all use the same techniques. So what I'm going to do is I'm going to show you two patterns of deception. And then we're going to look at the hot spots and see if we can find them ourselves. We're going to start with speech.
07:33
(Video) Bill Clinton: I want you to listen to me. I'm going to say this again. I did not have sexual relations with that woman, Miss Lewinsky. I never told anybody to lie, not a single time, never. And these allegations are false. And I need to go back to work for the American people. Thank you.
07:58
Pamela Meyer: Okay, what were the telltale signs? Well first we heard what's known as a non-contracted denial. Studies show that people who are overdetermined in their denial will resort to formal rather than informal language. We also heard distancing language: "that woman." We know that liars will unconsciously distance themselves from their subject using language as their tool. Now if Bill Clinton had said, "Well, to tell you the truth ..." or Richard Nixon's favorite, "In all candor ..." he would have been a dead giveaway for any liespotter that knows that qualifying language, as it's called, qualifying language like that, further discredits the subject. Now if he had repeated the question in its entirety, or if he had peppered his account with a little too much detail -- and we're all really glad he didn't do that -- he would have further discredited himself. Freud had it right. Freud said, look, there's much more to it than speech: "No mortal can keep a secret. If his lips are silent, he chatters with his fingertips." And we all do it no matter how powerful you are. We all chatter with our fingertips. I'm going to show you Dominique Strauss-Kahn with Obama who's chattering with his fingertips. (Laughter)
09:11
Now this brings us to our next pattern, which is body language. With body language, here's what you've got to do. You've really got to just throw your assumptions out the door. Let the science temper your knowledge a little bit. Because we think liars fidget all the time. Well guess what, they're known to freeze their upper bodies when they're lying. We think liars won't look you in the eyes. Well guess what, they look you in the eyes a little too much just to compensate for that myth. We think warmth and smiles convey honesty, sincerity. But a trained liespotter can spot a fake smile a mile away. Can you all spot the fake smile here?
09:50
You can consciously contract the muscles in your cheeks. But the real smile's in the eyes, the crow's feet of the eyes. They cannot be consciously contracted, especially if you overdid the Botox. Don't overdo the Botox; nobody will think you're honest.
10:05
Now we're going to look at the hot spots. Can you tell what's happening in a conversation? Can you start to find the hot spots to see the discrepancies between someone's words and someone's actions? Now I know it seems really obvious, but when you're having a conversation with someone you suspect of deception, attitude is by far the most overlooked but telling of indicators. An honest person is going to be cooperative. They're going to show they're on your side. They're going to be enthusiastic. They're going to be willing and helpful to getting you to the truth. They're going to be willing to brainstorm, name suspects, provide details. They're going to say, "Hey, maybe it was those guys in payroll that forged those checks." They're going to be infuriated if they sense they're wrongly accused throughout the entire course of the interview, not just in flashes; they'll be infuriated throughout the entire course of the interview. And if you ask someone honest what should happen to whomever did forge those checks, an honest person is much more likely to recommend strict rather than lenient punishment.
11:03
Now let's say you're having that exact same conversation with someone deceptive. That person may be withdrawn, look down, lower their voice, pause, be kind of herky-jerky. Ask a deceptive person to tell their story, they're going to pepper it with way too much detail in all kinds of irrelevant places. And then they're going to tell their story in strict chronological order. And what a trained interrogator does is they come in and in very subtle ways over the course of several hours, they will ask that person to tell that story backwards, and then they'll watch them squirm, and track which questions produce the highest volume of deceptive tells. Why do they do that? Well we all do the same thing. We rehearse our words, but we rarely rehearse our gestures. We say "yes," we shake our heads "no." We tell very convincing stories, we slightly shrug our shoulders. We commit terrible crimes, and we smile at the delight in getting away with it. Now that smile is known in the trade as "duping delight." And we're going to see that in several videos moving forward, but we're going to start -- for those of you who don't know him, this is presidential candidate John Edwards who shocked America by fathering a child out of wedlock. We're going to see him talk about getting a paternity test. See now if you can spot him saying, "yes" while shaking his head "no," slightly shrugging his shoulders.
12:18
(Video) John Edwards: I'd be happy to participate in one. I know that it's not possible that this child could be mine, because of the timing of events. So I know it's not possible. Happy to take a paternity test, and would love to see it happen.
12:31
Interviewer: Are you going to do that soon? Is there somebody --
12:34
JE: Well, I'm only one side. I'm only one side of the test. But I'm happy to participate in one.
12:40
PM: Okay, those head shakes are much easier to spot once you know to look for them. There're going to be times when someone makes one expression while masking another that just kind of leaks through in a flash. Murderers are known to leak sadness. Your new joint venture partner might shake your hand, celebrate, go out to dinner with you and then leak an expression of anger. And we're not all going to become facial expression experts overnight here, but there's one I can teach you that's very dangerous, and it's easy to learn, and that's the expression of contempt. Now with anger, you've got two people on an even playing field. It's still somewhat of a healthy relationship. But when anger turns to contempt, you've been dismissed. It's associated with moral superiority. And for that reason, it's very, very hard to recover from. Here's what it looks like. It's marked by one lip corner pulled up and in. It's the only asymmetrical expression. And in the presence of contempt, whether or not deception follows -- and it doesn't always follow -- look the other way, go the other direction, reconsider the deal, say, "No thank you. I'm not coming up for just one more nightcap. Thank you."
13:47
Science has surfaced many, many more indicators. We know, for example, we know liars will shift their blink rate, point their feet towards an exit. They will take barrier objects and put them between themselves and the person that is interviewing them. They'll alter their vocal tone, often making their vocal tone much lower. Now here's the deal. These behaviors are just behaviors. They're not proof of deception. They're red flags. We're human beings. We make deceptive flailing gestures all over the place all day long. They don't mean anything in and of themselves. But when you see clusters of them, that's your signal. Look, listen, probe, ask some hard questions, get out of that very comfortable mode of knowing, walk into curiosity mode, ask more questions, have a little dignity, treat the person you're talking to with rapport. Don't try to be like those folks on "Law & Order" and those other TV shows that pummel their subjects into submission. Don't be too aggressive, it doesn't work.
14:46
Now we've talked a little bit about how to talk to someone who's lying and how to spot a lie. And as I promised, we're now going to look at what the truth looks like. But I'm going to show you two videos, two mothers -- one is lying, one is telling the truth. And these were surfaced by researcher David Matsumoto in California. And I think they're an excellent example of what the truth looks like. This mother, Diane Downs, shot her kids at close range, drove them to the hospital while they bled all over the car, claimed a scraggy-haired stranger did it. And you'll see when you see the video, she can't even pretend to be an agonizing mother. What you want to look for here is an incredible discrepancy between horrific events that she describes and her very, very cool demeanor. And if you look closely, you'll see duping delight throughout this video.
15:33
(Video) Diane Downs: At night when I close my eyes, I can see Christie reaching her hand out to me while I'm driving, and the blood just kept coming out of her mouth. And that -- maybe it'll fade too with time -- but I don't think so. That bothers me the most.
15:55
PM: Now I'm going to show you a video of an actual grieving mother, Erin Runnion, confronting her daughter's murderer and torturer in court. Here you're going to see no false emotion, just the authentic expression of a mother's agony.
16:08
(Video) Erin Runnion: I wrote this statement on the third anniversary of the night you took my baby, and you hurt her, and you crushed her, you terrified her until her heart stopped. And she fought, and I know she fought you. But I know she looked at you with those amazing brown eyes, and you still wanted to kill her. And I don't understand it, and I never will.
16:35
PM: Okay, there's no doubting the veracity of those emotions.
396
980000
4000
潘蜜拉:這些情緒的真實無庸致疑。
16:39
Now the technology around what the truth looks like
397
984000
3000
以實相為中心的科技
16:42
is progressing on, the science of it.
398
987000
3000
日新月異──真相的科學。
16:45
We know for example
399
990000
2000
打個比方,我們知道
16:47
that we now have specialized eye trackers and infrared brain scans,
400
992000
3000
我們現在有了專業的眼球追蹤器和紅外線腦部掃瞄,
16:50
MRI's that can decode the signals that our bodies send out
401
995000
3000
核磁共振造影,來解碼我們身體釋放的訊號,
16:53
when we're trying to be deceptive.
402
998000
2000
當我們企圖欺騙時。
16:55
And these technologies are going to be marketed to all of us
403
1000000
3000
這些科技即將上市,
16:58
as panaceas for deceit,
404
1003000
2000
為解欺騙的萬靈丹,
17:00
and they will prove incredibly useful some day.
405
1005000
3000
終有一日它們會證明是出奇的有用。
17:03
But you've got to ask yourself in the meantime:
406
1008000
2000
但同時你必須自問:
17:05
Who do you want on your side of the meeting,
407
1010000
2000
在這場會面中,你想要誰站在你這一邊:
17:07
someone who's trained in getting to the truth
408
1012000
3000
是一位受過訓練、懂得探求真相的人,
17:10
or some guy who's going to drag a 400-pound electroencephalogram
409
1015000
2000
或打算拖著400磅重的腦電圖儀進門
17:12
through the door?
410
1017000
2000
的某人呢?
17:14
Liespotters rely on human tools.
411
1019000
4000
謊言辨識者依據『人性工具』作判斷。
17:18
They know, as someone once said,
412
1023000
2000
他們知道,如某人曾說過,
17:20
"Character's who you are in the dark."
413
1025000
2000
「人格是在黑暗中的你。」
17:22
And what's kind of interesting
414
1027000
2000
有趣的是,
17:24
is that today we have so little darkness.
415
1029000
2000
今日我們的黑暗少之又少。
17:26
Our world is lit up 24 hours a day.
416
1031000
3000
我們的世界一天24小時亮著,
17:29
It's transparent
417
1034000
2000
它是透明的,
17:31
with blogs and social networks
418
1036000
2000
因部落格和社群網絡
17:33
broadcasting the buzz of a whole new generation of people
419
1038000
2000
傳播全新世代的人們的議論
17:35
that have made a choice to live their lives in public.
420
1040000
3000
這些人選擇公開地過自己的生活,
17:38
It's a much more noisy world.
421
1043000
4000
那是一個更為紛擾的世界。
17:42
So one challenge we have
422
1047000
2000
我們有一項挑戰,
17:44
is to remember,
423
1049000
2000
是牢記
17:46
oversharing, that's not honesty.
424
1051000
3000
過度分享,不是誠實。
17:49
Our manic tweeting and texting
425
1054000
2000
我們狂熱的推文與簡訊
17:51
can blind us to the fact
426
1056000
2000
會讓我們看不清一個事實:
17:53
that the subtleties of human decency -- character integrity --
427
1058000
3000
人類道德禮儀的精妙之處──人格正直──
17:56
that's still what matters, that's always what's going to matter.
428
1061000
3000
依舊重要,也永遠都會重要。
17:59
So in this much noisier world,
429
1064000
2000
所以在這個更為嘈雜的世界
18:01
it might make sense for us
430
1066000
2000
對我們而言,
18:03
to be just a little bit more explicit
431
1068000
2000
更明確一點地表明
18:05
about our moral code.
432
1070000
3000
我們的道德規範是合理的。
18:08
When you combine the science of recognizing deception
433
1073000
2000
當你結合辨識詐欺的科學
18:10
with the art of looking, listening,
434
1075000
2000
和觀察、傾聽的藝術,
18:12
you exempt yourself from collaborating in a lie.
435
1077000
3000
你免除了自己成為謊言共犯。
18:15
You start up that path
436
1080000
2000
你開始踏上了這條
18:17
of being just a little bit more explicit,
437
1082000
2000
變得更直截了當、明確清楚一點的路徑,
18:19
because you signal to everyone around you,
438
1084000
2000
因為你對週圍的人發出訊號,
18:21
you say, "Hey, my world, our world,
439
1086000
3000
你說:「嘿,我的世界、我們的世界
18:24
it's going to be an honest one.
440
1089000
2000
將會是一個真誠的世界。
18:26
My world is going to be one where truth is strengthened
441
1091000
2000
我的世界將是一個真相獲得鞏固、
18:28
and falsehood is recognized and marginalized."
442
1093000
3000
而謊言被識破、被邊緣化的世界。」
18:31
And when you do that,
443
1096000
2000
當你這樣做,
18:33
the ground around you starts to shift just a little bit.
444
1098000
3000
你周遭的局勢就會開始稍稍轉變。
18:36
And that's the truth. Thank you.
445
1101000
3000
而這是事實,謝謝你們。
18:39
(Applause)
446
1104000
5000
(掌聲)
ABOUT THE SPEAKER
Pamela Meyer - Lie detector
Pamela Meyer thinks we’re facing a pandemic of deception, but she’s arming people with tools that can help take back the truth.
Why you should listen
Social media expert Pamela Meyer can tell when you’re lying. If it’s not your words that give you away, it’s your posture, eyes, breathing rate, fidgets, and a host of other indicators. Worse, we are all lied to up to 200 times a day, she says, from the white lies that allow society to function smoothly to the devastating duplicities that bring down corporations and break up families.
Working with a team of researchers over several years, Meyer, who is CEO of social networking company Simpatico Networks, collected and reviewed most of the published research on deception, from such fields as law enforcement, military, psychology and espionage. She then became an expert herself, receiving advanced training in deception detection, including multiple courses of advanced training in interrogation, microexpression analysis, statement analysis, behavior and body language interpretation, and emotion recognition. Her research is synthesized in her bestselling book Liespotting.
Pamela Meyer | Speaker | TED.com