TEDGlobal 2011
Pamela Meyer: How to spot a liar
28,415,176 views
On any given day we're lied to between 10 and 200 times, and the clues to spotting those lies can be subtle and counter-intuitive. Pamela Meyer, author of Liespotting, shows the methods and "hotspots" used by those trained to recognize deception -- and she argues that honesty is a value worth preserving.
Pamela Meyer - Lie detector
Pamela Meyer thinks we’re facing a pandemic of deception, but she’s arming people with tools that can help take back the truth.
00:15
Okay, now I don't want to alarm anybody in this room,
00:20
but it's just come to my attention
00:22
that the person to your right is a liar.
00:24
(Laughter)
00:26
Also, the person to your left is a liar.
00:29
Also the person sitting in your very seats is a liar.
00:32
We're all liars.
00:34
What I'm going to do today
00:36
is I'm going to show you what the research says about why we're all liars,
00:39
how you can become a liespotter
00:41
and why you might want to go the extra mile
00:44
and go from liespotting to truth seeking,
00:47
and ultimately to trust building.
00:49
Now speaking of trust,
00:52
ever since I wrote this book, "Liespotting,"
00:55
no one wants to meet me in person anymore, no, no, no, no, no.
00:58
They say, "It's okay, we'll email you."
01:01
(Laughter)
01:03
I can't even get a coffee date at Starbucks.
01:07
My husband's like, "Honey, deception?
01:09
Maybe you could have focused on cooking. How about French cooking?"
01:12
So before I get started, what I'm going to do
01:14
is I'm going to clarify my goal for you,
01:17
which is not to teach a game of Gotcha.
01:19
Liespotters aren't those nitpicky kids,
01:21
those kids in the back of the room that are shouting, "Gotcha! Gotcha!
01:24
Your eyebrow twitched. You flared your nostril.
01:27
I watch that TV show 'Lie To Me.' I know you're lying."
01:30
No, liespotters are armed
01:32
with scientific knowledge of how to spot deception.
01:35
They use it to get to the truth,
01:37
and they do what mature leaders do everyday;
01:39
they have difficult conversations with difficult people,
01:42
sometimes during very difficult times.
01:44
And they start up that path
01:46
by accepting a core proposition,
01:48
and that proposition is the following:
01:50
Lying is a cooperative act.
01:53
Think about it, a lie has no power whatsoever by its mere utterance.
01:57
Its power emerges
01:59
when someone else agrees to believe the lie.
02:01
So I know it may sound like tough love,
02:03
but look, if at some point you got lied to,
02:07
it's because you agreed to get lied to.
02:09
Truth number one about lying: Lying's a cooperative act.
02:12
Now not all lies are harmful.
02:14
Sometimes we're willing participants in deception
02:17
for the sake of social dignity,
02:20
maybe to keep a secret that should be kept secret, secret.
02:23
We say, "Nice song."
02:25
"Honey, you don't look fat in that, no."
02:28
Or we say, favorite of the digerati,
02:30
"You know, I just fished that email out of my spam folder.
02:33
So sorry."
02:36
But there are times when we are unwilling participants in deception.
02:39
And that can have dramatic costs for us.
02:42
Last year saw 997 billion dollars
02:45
in corporate fraud alone in the United States.
02:49
That's an eyelash under a trillion dollars.
02:51
That's seven percent of revenues.
02:53
Deception can cost billions.
02:55
Think Enron, Madoff, the mortgage crisis.
02:58
Or in the case of double agents and traitors,
03:01
like Robert Hanssen or Aldrich Ames,
03:03
lies can betray our country,
03:05
they can compromise our security, they can undermine democracy,
03:08
they can cause the deaths of those that defend us.
03:11
Deception is actually serious business.
03:14
This con man, Henry Oberlander,
03:16
he was such an effective con man
03:18
British authorities say
03:20
he could have undermined the entire banking system of the Western world.
03:23
And you can't find this guy on Google; you can't find him anywhere.
03:25
He was interviewed once, and he said the following.
03:28
He said, "Look, I've got one rule."
03:30
And this was Henry's rule, he said,
03:33
"Look, everyone is willing to give you something.
03:35
They're ready to give you something for whatever it is they're hungry for."
03:38
And that's the crux of it.
03:40
If you don't want to be deceived, you have to know,
03:42
what is it that you're hungry for?
03:44
And we all kind of hate to admit it.
03:47
We wish we were better husbands, better wives,
03:50
smarter, more powerful,
03:52
taller, richer --
03:54
the list goes on.
03:56
Lying is an attempt to bridge that gap,
03:58
to connect our wishes and our fantasies
04:00
about who we wish we were, how we wish we could be,
04:03
with what we're really like.
04:06
And boy are we willing to fill in those gaps in our lives with lies.
04:09
On a given day, studies show that you may be lied to
04:12
anywhere from 10 to 200 times.
04:14
Now granted, many of those are white lies.
04:17
But in another study,
04:19
it showed that strangers lied three times
04:21
within the first 10 minutes of meeting each other.
04:23
(Laughter)
04:25
Now when we first hear this data, we recoil.
04:28
We can't believe how prevalent lying is.
04:30
We're essentially against lying.
04:32
But if you look more closely,
04:34
the plot actually thickens.
04:36
We lie more to strangers than we lie to coworkers.
04:39
Extroverts lie more than introverts.
04:43
Men lie eight times more about themselves
04:46
than they do other people.
04:48
Women lie more to protect other people.
04:51
If you're an average married couple,
04:54
you're going to lie to your spouse
04:56
in one out of every 10 interactions.
04:58
Now you may think that's bad.
05:00
If you're unmarried, that number drops to three.
05:02
Lying's complex.
05:04
It's woven into the fabric of our daily and our business lives.
05:07
We're deeply ambivalent about the truth.
05:09
We parse it out on an as-needed basis,
05:11
sometimes for very good reasons,
05:13
other times just because we don't understand the gaps in our lives.
05:16
That's truth number two about lying.
05:18
We're against lying,
05:20
but we're covertly for it
05:22
in ways that our society has sanctioned
05:24
for centuries and centuries and centuries.
05:26
It's as old as breathing.
05:28
It's part of our culture, it's part of our history.
05:30
Think Dante, Shakespeare,
05:33
the Bible, News of the World.
05:36
(Laughter)
05:38
Lying has evolutionary value to us as a species.
05:40
Researchers have long known
05:42
that the more intelligent the species,
05:44
the larger the neocortex,
05:46
the more likely it is to be deceptive.
05:48
Now you might remember Koko.
05:50
Does anybody remember Koko the gorilla who was taught sign language?
05:53
Koko was taught to communicate via sign language.
05:56
Here's Koko with her kitten.
05:58
It's her cute little, fluffy pet kitten.
06:01
Koko once blamed her pet kitten
06:03
for ripping a sink out of the wall.
06:05
(Laughter)
06:07
We're hardwired to become leaders of the pack.
06:09
It starts really, really early.
06:11
How early?
06:13
Well babies will fake a cry,
06:15
pause, wait to see who's coming
06:17
and then go right back to crying.
06:19
One-year-olds learn concealment.
06:21
(Laughter)
06:23
Two-year-olds bluff.
06:25
Five-year-olds lie outright.
06:27
They manipulate via flattery.
06:29
Nine-year-olds, masters of the cover up.
06:32
By the time you enter college,
06:34
you're going to lie to your mom in one out of every five interactions.
06:37
By the time we enter this work world and we're breadwinners,
06:40
we enter a world that is just cluttered
06:42
with spam, fake digital friends,
06:44
partisan media,
06:46
ingenious identity thieves,
06:48
world-class Ponzi schemers,
06:50
a deception epidemic --
06:52
in short, what one author calls
06:54
a post-truth society.
06:57
It's been very confusing
06:59
for a long time now.
07:03
What do you do?
07:05
Well there are steps we can take
07:07
to navigate our way through the morass.
07:09
Trained liespotters get to the truth 90 percent of the time.
07:12
The rest of us, we're only 54 percent accurate.
07:15
Why is it so easy to learn?
07:17
There are good liars and there are bad liars. There are no real original liars.
07:20
We all make the same mistakes. We all use the same techniques.
07:23
So what I'm going to do
07:25
is I'm going to show you two patterns of deception.
07:27
And then we're going to look at the hot spots and see if we can find them ourselves.
07:30
We're going to start with speech.
07:33
(Video) Bill Clinton: I want you to listen to me.
07:35
I'm going to say this again.
07:37
I did not have sexual relations
07:40
with that woman, Miss Lewinsky.
07:44
I never told anybody to lie,
07:46
not a single time, never.
07:48
And these allegations are false.
07:51
And I need to go back to work for the American people.
07:53
Thank you.
07:58
Pamela Meyer: Okay, what were the telltale signs?
08:01
Well first we heard what's known as a non-contracted denial.
08:05
Studies show that people who are overdetermined in their denial
08:08
will resort to formal rather than informal language.
08:11
We also heard distancing language: "that woman."
08:14
We know that liars will unconsciously distance themselves
08:16
from their subject
08:18
using language as their tool.
08:21
Now if Bill Clinton had said, "Well, to tell you the truth ... "
08:24
or Richard Nixon's favorite, "In all candor ... "
08:26
he would have been a dead giveaway
08:28
for any liespotter that knows
08:30
that qualifying language, as it's called, qualifying language like that,
08:33
further discredits the subject.
08:35
Now if he had repeated the question in its entirety,
08:38
or if he had peppered his account with a little too much detail --
08:42
and we're all really glad he didn't do that --
08:44
he would have further discredited himself.
08:46
Freud had it right.
08:48
Freud said, look, there's much more to it than speech:
08:51
"No mortal can keep a secret.
08:54
If his lips are silent, he chatters with his fingertips."
08:57
And we all do it no matter how powerful you are.
09:00
We all chatter with our fingertips.
09:02
I'm going to show you Dominique Strauss-Kahn with Obama
09:05
who's chattering with his fingertips.
09:08
(Laughter)
09:11
Now this brings us to our next pattern,
09:14
which is body language.
09:17
With body language, here's what you've got to do.
09:20
You've really got to just throw your assumptions out the door.
09:23
Let the science temper your knowledge a little bit.
09:25
Because we think liars fidget all the time.
09:28
Well guess what, they're known to freeze their upper bodies when they're lying.
09:31
We think liars won't look you in the eyes.
09:34
Well guess what, they look you in the eyes a little too much
09:36
just to compensate for that myth.
09:38
We think warmth and smiles
09:40
convey honesty, sincerity.
09:42
But a trained liespotter
09:44
can spot a fake smile a mile away.
09:46
Can you all spot the fake smile here?
09:50
You can consciously contract
09:52
the muscles in your cheeks.
09:55
But the real smile's in the eyes, the crow's feet of the eyes.
09:58
They cannot be consciously contracted,
10:00
especially if you overdid the Botox.
10:02
Don't overdo the Botox; nobody will think you're honest.
10:05
Now we're going to look at the hot spots.
10:07
Can you tell what's happening in a conversation?
10:09
Can you start to find the hot spots
10:12
to see the discrepancies
10:14
between someone's words and someone's actions?
10:16
Now I know it seems really obvious,
10:18
but when you're having a conversation
10:20
with someone you suspect of deception,
10:23
attitude is by far the most overlooked but telling of indicators.
10:26
An honest person is going to be cooperative.
10:28
They're going to show they're on your side.
10:30
They're going to be enthusiastic.
10:32
They're going to be willing and helpful to getting you to the truth.
10:34
They're going to be willing to brainstorm, name suspects,
10:37
provide details.
10:39
They're going to say, "Hey,
10:41
maybe it was those guys in payroll that forged those checks."
10:44
They're going to be infuriated if they sense they're wrongly accused
10:47
throughout the entire course of the interview, not just in flashes;
10:49
they'll be infuriated throughout the entire course of the interview.
10:52
And if you ask someone honest
10:54
what should happen to whomever did forge those checks,
10:57
an honest person is much more likely
10:59
to recommend strict rather than lenient punishment.
11:03
Now let's say you're having that exact same conversation
11:05
with someone deceptive.
11:07
That person may be withdrawn,
11:09
look down, lower their voice,
11:11
pause, be kind of herky-jerky.
11:13
Ask a deceptive person to tell their story,
11:15
they're going to pepper it with way too much detail
11:18
in all kinds of irrelevant places.
11:21
And then they're going to tell their story in strict chronological order.
11:24
And what a trained interrogator does
11:26
is they come in and in very subtle ways
11:28
over the course of several hours,
11:30
they will ask that person to tell that story backwards,
11:33
and then they'll watch them squirm,
11:35
and track which questions produce the highest volume of deceptive tells.
11:38
Why do they do that? Well we all do the same thing.
11:41
We rehearse our words,
11:43
but we rarely rehearse our gestures.
11:45
We say "yes," we shake our heads "no."
11:47
We tell very convincing stories, we slightly shrug our shoulders.
11:50
We commit terrible crimes,
11:52
and we smile at the delight in getting away with it.
11:55
Now that smile is known in the trade as "duping delight."
11:58
And we're going to see that in several videos moving forward,
12:01
but we're going to start -- for those of you who don't know him,
12:03
this is presidential candidate John Edwards
12:06
who shocked America by fathering a child out of wedlock.
12:09
We're going to see him talk about getting a paternity test.
12:12
See now if you can spot him
12:14
saying, "yes" while shaking his head "no,"
12:16
slightly shrugging his shoulders.
12:18
(Video) John Edwards: I'd be happy to participate in one.
12:20
I know that it's not possible that this child could be mine,
12:23
because of the timing of events.
12:25
So I know it's not possible.
12:27
Happy to take a paternity test,
12:29
and would love to see it happen.
12:31
Interviewer: Are you going to do that soon? Is there somebody --
12:34
JE: Well, I'm only one side. I'm only one side of the test.
12:37
But I'm happy to participate in one.
12:40
PM: Okay, those head shakes are much easier to spot
12:42
once you know to look for them.
12:44
There're going to be times
12:46
when someone makes one expression
12:48
while masking another that just kind of leaks through in a flash.
12:52
Murderers are known to leak sadness.
12:54
Your new joint venture partner might shake your hand,
12:56
celebrate, go out to dinner with you
12:58
and then leak an expression of anger.
13:01
And we're not all going to become facial expression experts overnight here,
13:04
but there's one I can teach you that's very dangerous, and it's easy to learn,
13:07
and that's the expression of contempt.
13:10
Now with anger, you've got two people on an even playing field.
13:13
It's still somewhat of a healthy relationship.
13:15
But when anger turns to contempt,
13:17
you've been dismissed.
13:19
It's associated with moral superiority.
13:21
And for that reason, it's very, very hard to recover from.
13:24
Here's what it looks like.
13:26
It's marked by one lip corner
13:28
pulled up and in.
13:30
It's the only asymmetrical expression.
13:33
And in the presence of contempt,
13:35
whether or not deception follows --
13:37
and it doesn't always follow --
13:39
look the other way, go the other direction,
13:41
reconsider the deal,
13:43
say, "No thank you. I'm not coming up for just one more nightcap. Thank you."
13:47
Science has surfaced
13:49
many, many more indicators.
13:51
We know, for example,
13:53
we know liars will shift their blink rate,
13:55
point their feet towards an exit.
13:57
They will take barrier objects
13:59
and put them between themselves and the person that is interviewing them.
14:02
They'll alter their vocal tone,
14:04
often making their vocal tone much lower.
14:07
Now here's the deal.
14:09
These behaviors are just behaviors.
14:12
They're not proof of deception.
14:14
They're red flags.
14:16
We're human beings.
14:18
We make deceptive flailing gestures all over the place all day long.
14:21
They don't mean anything in and of themselves.
14:23
But when you see clusters of them, that's your signal.
14:26
Look, listen, probe, ask some hard questions,
14:29
get out of that very comfortable mode of knowing,
14:32
walk into curiosity mode, ask more questions,
14:35
have a little dignity, treat the person you're talking to with rapport.
14:38
Don't try to be like those folks on "Law & Order" and those other TV shows
14:41
that pummel their subjects into submission.
14:43
Don't be too aggressive, it doesn't work.
14:46
Now we've talked a little bit
14:48
about how to talk to someone who's lying
14:50
and how to spot a lie.
14:52
And as I promised, we're now going to look at what the truth looks like.
14:55
But I'm going to show you two videos,
14:57
two mothers -- one is lying, one is telling the truth.
15:00
And these were surfaced
15:02
by researcher David Matsumoto in California.
15:04
And I think they're an excellent example
15:06
of what the truth looks like.
15:08
This mother, Diane Downs,
15:10
shot her kids at close range,
15:12
drove them to the hospital
15:14
while they bled all over the car,
15:16
claimed a scraggy-haired stranger did it.
15:18
And you'll see when you see the video,
15:20
she can't even pretend to be an agonizing mother.
15:22
What you want to look for here
15:24
is an incredible discrepancy
15:26
between horrific events that she describes
15:28
and her very, very cool demeanor.
15:30
And if you look closely, you'll see duping delight throughout this video.
15:33
(Video) Diane Downs: At night when I close my eyes,
15:35
I can see Christie reaching her hand out to me while I'm driving,
15:38
and the blood just kept coming out of her mouth.
15:41
And that -- maybe it'll fade too with time --
15:43
but I don't think so.
15:45
That bothers me the most.
15:55
PM: Now I'm going to show you a video
15:57
of an actual grieving mother, Erin Runnion,
15:59
confronting her daughter's murderer and torturer in court.
16:03
Here you're going to see no false emotion,
16:05
just the authentic expression of a mother's agony.
16:08
(Video) Erin Runnion: I wrote this statement on the third anniversary
16:10
of the night you took my baby,
16:12
and you hurt her,
16:14
and you crushed her,
16:16
you terrified her until her heart stopped.
16:20
And she fought, and I know she fought you.
16:23
But I know she looked at you
16:25
with those amazing brown eyes,
16:27
and you still wanted to kill her.
16:30
And I don't understand it,
16:32
and I never will.
16:35
PM: Okay, there's no doubting the veracity of those emotions.
16:39
Now the technology around what the truth looks like
16:42
is progressing on, the science of it.
16:45
We know for example
16:47
that we now have specialized eye trackers and infrared brain scans,
16:50
MRI's that can decode the signals that our bodies send out
16:53
when we're trying to be deceptive.
16:55
And these technologies are going to be marketed to all of us
16:58
as panaceas for deceit,
17:00
and they will prove incredibly useful some day.
17:03
But you've got to ask yourself in the meantime:
17:05
Who do you want on your side of the meeting,
17:07
someone who's trained in getting to the truth
17:10
or some guy who's going to drag a 400-pound electroencephalogram
17:12
through the door?
17:14
Liespotters rely on human tools.
17:18
They know, as someone once said,
17:20
"Character's who you are in the dark."
17:22
And what's kind of interesting
17:24
is that today we have so little darkness.
17:26
Our world is lit up 24 hours a day.
17:29
It's transparent
17:31
with blogs and social networks
17:33
broadcasting the buzz of a whole new generation of people
17:35
that have made a choice to live their lives in public.
17:38
It's a much more noisy world.
17:42
So one challenge we have
17:44
is to remember,
17:46
oversharing, that's not honesty.
17:49
Our manic tweeting and texting
17:51
can blind us to the fact
17:53
that the subtleties of human decency -- character integrity --
17:56
that's still what matters, that's always what's going to matter.
17:59
So in this much noisier world,
18:01
it might make sense for us
18:03
to be just a little bit more explicit
18:05
about our moral code.
18:08
When you combine the science of recognizing deception
18:10
with the art of looking, listening,
18:12
you exempt yourself from collaborating in a lie.
18:15
You start up that path
18:17
of being just a little bit more explicit,
18:19
because you signal to everyone around you,
18:21
you say, "Hey, my world, our world,
18:24
it's going to be an honest one.
18:26
My world is going to be one where truth is strengthened
18:28
and falsehood is recognized and marginalized."
18:31
And when you do that,
18:33
the ground around you starts to shift just a little bit.
18:36
And that's the truth. Thank you.
18:39
(Applause)
ABOUT THE SPEAKER
Pamela Meyer - Lie detector
Pamela Meyer thinks we’re facing a pandemic of deception, but she’s arming people with tools that can help take back the truth.
Why you should listen
Social media expert Pamela Meyer can tell when you’re lying. If it’s not your words that give you away, it’s your posture, eyes, breathing rate, fidgets, and a host of other indicators. Worse, we are all lied to up to 200 times a day, she says, from the white lies that allow society to function smoothly to the devastating duplicities that bring down corporations and break up families.
Working with a team of researchers over several years, Meyer, who is CEO of social networking company Simpatico Networks, collected and reviewed most of the research on deception that has been published, from such fields as law enforcement, military, psychology and espionage. She then became an expert herself, receiving advanced training in deception detection, including multiple courses of advanced training in interrogation, microexpression analysis, statement analysis, behavior and body language interpretation, and emotion recognition. Her research is synthesized in her bestselling book Liespotting.
Pamela Meyer | Speaker | TED.com