TEDGlobal 2011
Pamela Meyer: How to spot a liar
Pamela Meyer: Com detectar un mentider
28,415,176 views
On any given day we are lied to between 10 and 200 times, and the clues to spotting those lies can be subtle and counterintuitive. Pamela Meyer, author of Liespotting, shows us the manners and tricks used by those trained to recognize deception, and argues that honesty is a value worth preserving.
Pamela Meyer - Lie detector
Pamela Meyer thinks we’re facing a pandemic of deception, but she’s arming people with tools that can help take back the truth.
00:15
Okay, now I don't want to alarm anybody in this room,
0
0
5000
Bé, no vull alarmar ningú d'aquesta sala,
00:20
but it's just come to my attention
1
5000
2000
però m'acabo d'adonar
00:22
that the person to your right is a liar.
2
7000
2000
que la persona de la vostra dreta és un mentider.
00:24
(Laughter)
3
9000
2000
(Rialles)
00:26
Also, the person to your left is a liar.
4
11000
3000
I també la persona de la vostra esquerra és un mentider.
00:29
Also the person sitting in your very seats is a liar.
5
14000
3000
I la persona que s'asseu just al vostre seient també és un mentider.
00:32
We're all liars.
6
17000
2000
Tots som mentiders.
00:34
What I'm going to do today
7
19000
2000
El que vull fer avui
00:36
is I'm going to show you what the research says about why we're all liars,
8
21000
3000
és mostrar-vos què diuen les investigacions sobre què fa que tots siguem mentiders,
00:39
how you can become a liespotter
9
24000
2000
com us podeu convertir en detectors de mentides
00:41
and why you might want to go the extra mile
10
26000
3000
i per què podríeu voler anar més enllà
00:44
and go from liespotting to truth seeking,
11
29000
3000
i anar des de detectar mentides fins a buscar la veritat,
00:47
and ultimately to trust building.
12
32000
2000
i finalment generar confiança.
00:49
Now speaking of trust,
13
34000
3000
Parlant de confiança,
00:52
ever since I wrote this book, "Liespotting,"
14
37000
3000
des que vaig escriure aquest llibre, "Liespotting" (Detector de mentides),
00:55
no one wants to meet me in person anymore, no, no, no, no, no.
15
40000
3000
ningú ja no em vol conèixer en persona, no, no, no, no, no.
00:58
They say, "It's okay, we'll email you."
16
43000
3000
Diuen: "D'acord, t'enviarem un e-mail".
01:01
(Laughter)
17
46000
2000
(Rialles)
01:03
I can't even get a coffee date at Starbucks.
18
48000
4000
Ni tan sols puc quedar per fer un cafè a l'Starbucks.
01:07
My husband's like, "Honey, deception?
19
52000
2000
El meu marit em diu: "Reina, enganys?
01:09
Maybe you could have focused on cooking. How about French cooking?"
20
54000
3000
Et podries haver centrat en la cuina. Què et sembla la cuina francesa?"
01:12
So before I get started, what I'm going to do
21
57000
2000
Per tant, abans de començar,
01:14
is I'm going to clarify my goal for you,
22
59000
3000
us vull aclarir quin és el meu objectiu,
01:17
which is not to teach a game of Gotcha.
23
62000
2000
que no és ensenyar-vos a jugar al joc del "Ja et tinc!".
01:19
Liespotters aren't those nitpicky kids,
24
64000
2000
Els detectors de mentides no són aquests nens primmirats,
01:21
those kids in the back of the room that are shouting, "Gotcha! Gotcha!
25
66000
3000
que es queden al fons de la sala i criden: "Ja et tinc! Ja et tinc!
01:24
Your eyebrow twitched. You flared your nostril.
26
69000
3000
Has aixecat la cella. Has obert els forats del nas.
01:27
I watch that TV show 'Lie To Me.' I know you're lying."
27
72000
3000
Segueixo el programa de la TV 'Menteix-me'. Sé que estàs mentint."
01:30
No, liespotters are armed
28
75000
2000
No, els detectors de mentides van armats
01:32
with scientific knowledge of how to spot deception.
29
77000
3000
amb coneixement científic sobre com detectar enganys.
01:35
They use it to get to the truth,
30
80000
2000
El fan servir per arribar a la veritat,
01:37
and they do what mature leaders do every day;
31
82000
2000
i fan allò que els líders madurs fan cada dia;
01:39
they have difficult conversations with difficult people,
32
84000
3000
tenen converses difícils amb persones difícils,
01:42
sometimes during very difficult times.
33
87000
2000
de vegades en temps molt difícils.
01:44
And they start up that path
34
89000
2000
I comencen aquest camí
01:46
by accepting a core proposition,
35
91000
2000
acceptant una proposició fonamental,
01:48
and that proposition is the following:
36
93000
2000
i aquesta proposició és la següent:
01:50
Lying is a cooperative act.
37
95000
3000
Mentir és un acte cooperatiu.
01:53
Think about it, a lie has no power whatsoever by its mere utterance.
38
98000
4000
Penseu-hi, una mentida no té cap poder pel sol fet de dir-la.
01:57
Its power emerges
39
102000
2000
El seu poder sorgeix
01:59
when someone else agrees to believe the lie.
40
104000
2000
quan algú altre accepta creure's aquesta mentida.
02:01
So I know it may sound like tough love,
41
106000
2000
I sé que pot sonar dur,
02:03
but look, if at some point you got lied to,
42
108000
4000
però mireu, si algun cop us han mentit,
02:07
it's because you agreed to get lied to.
43
112000
2000
és perquè vau acceptar que us mentissin.
02:09
Truth number one about lying: Lying's a cooperative act.
44
114000
3000
Veritat número u sobre mentir: mentir és un acte cooperatiu.
02:12
Now not all lies are harmful.
45
117000
2000
Però no totes les mentides són perjudicials.
02:14
Sometimes we're willing participants in deception
46
119000
3000
De vegades participem voluntàriament en enganys
02:17
for the sake of social dignity,
47
122000
3000
per la dignitat social,
02:20
maybe to keep a secret that should be kept secret, secret.
48
125000
3000
potser per guardar un secret que hauria de seguir sent secret, secret.
02:23
We say, "Nice song."
49
128000
2000
Diem: "Quina cançó més maca."
02:25
"Honey, you don't look fat in that, no."
50
130000
3000
"Reina, no et veus grassa, no."
02:28
Or we say, favorite of the digerati,
51
133000
2000
O diem, el preferit de l'elit digital,
02:30
"You know, I just fished that email out of my spam folder.
52
135000
3000
"Acabo de recuperar el correu electrònic de la carpeta de correu brossa.
02:33
So sorry."
53
138000
3000
Em sap greu."
02:36
But there are times when we are unwilling participants in deception.
54
141000
3000
Però de vegades participem en l'engany sense voler-ho
02:39
And that can have dramatic costs for us.
55
144000
3000
i això pot tenir costos dramàtics per a nosaltres.
02:42
Last year saw 997 billion dollars
56
147000
3000
L'any passat hi va haver 997 mil milions de dòlars
02:45
in corporate fraud alone in the United States.
57
150000
4000
en frau corporatiu, només als Estats Units.
02:49
That's an eyelash under a trillion dollars.
58
154000
2000
Això són gairebé un milió de milions de dòlars.
02:51
That's seven percent of revenues.
59
156000
2000
Representa el 7% dels ingressos recaptats.
02:53
Deception can cost billions.
60
158000
2000
L'engany pot costar milers de milions.
02:55
Think Enron, Madoff, the mortgage crisis.
61
160000
3000
Penseu en Enron, Madoff, la crisi hipotecària.
02:58
Or in the case of double agents and traitors,
62
163000
3000
O en el cas dels espies i traïdors,
03:01
like Robert Hanssen or Aldrich Ames,
63
166000
2000
com Robert Hanssen o Aldrich Ames,
03:03
lies can betray our country,
64
168000
2000
les mentides poden trair el nostre país,
03:05
they can compromise our security, they can undermine democracy,
65
170000
3000
comprometre la nostra seguretat, afeblir la democràcia,
03:08
they can cause the deaths of those that defend us.
66
173000
3000
poden causar la mort d'aquells qui ens defensen.
03:11
Deception is actually serious business.
67
176000
3000
En realitat l'engany és quelcom seriós.
03:14
This con man, Henry Oberlander,
68
179000
2000
Aquest estafador, Henry Oberlander,
03:16
he was such an effective con man
69
181000
2000
era tan bon estafador
03:18
British authorities say
70
183000
2000
que les autoritats britàniques diuen
03:20
he could have undermined the entire banking system of the Western world.
71
185000
3000
que podria haver perjudicat tot el sistema bancari del món occidental.
03:23
And you can't find this guy on Google; you can't find him anywhere.
72
188000
2000
I no pots trobar aquest paio a Google, no es troba enlloc.
03:25
He was interviewed once, and he said the following.
73
190000
3000
El van entrevistar una vegada i va dir el següent.
03:28
He said, "Look, I've got one rule."
74
193000
2000
Va dir: "Mira, tinc una norma."
03:30
And this was Henry's rule, he said,
75
195000
3000
I aquesta era la norma del Henry, va dir:
03:33
"Look, everyone is willing to give you something.
76
198000
2000
"Mira, tothom està disposat a donar-te alguna cosa.
03:35
They're ready to give you something for whatever it is they're hungry for."
77
200000
3000
Estan a punt per donar-te alguna cosa a canvi d'allò que més anhelen."
03:38
And that's the crux of it.
78
203000
2000
I aquest és el quid de la qüestió.
03:40
If you don't want to be deceived, you have to know,
79
205000
2000
Si no vols que t'enganyin, has de saber:
03:42
what is it that you're hungry for?
80
207000
2000
què és allò que més anheles?
03:44
And we all kind of hate to admit it.
81
209000
3000
I a tots ens costa una mica admetre-ho.
03:47
We wish we were better husbands, better wives,
82
212000
3000
Ens agradaria ser millors marits, millors mullers,
03:50
smarter, more powerful,
83
215000
2000
més intel·ligents, més poderosos,
03:52
taller, richer --
84
217000
2000
més alts, més rics...
03:54
the list goes on.
85
219000
2000
la llista continua.
03:56
Lying is an attempt to bridge that gap,
86
221000
2000
Mentir és un intent de reduir aquesta bretxa,
03:58
to connect our wishes and our fantasies
87
223000
2000
de connectar els nostres desitjos i fantasies
04:00
about who we wish we were, how we wish we could be,
88
225000
3000
sobre qui ens agradaria ser i com ens agradaria ser,
04:03
with what we're really like.
89
228000
3000
amb com som realment.
04:06
And boy are we willing to fill in those gaps in our lives with lies.
90
231000
3000
I estem disposats a omplir aquests buits de les nostres vides amb mentides.
04:09
On a given day, studies show that you may be lied to
91
234000
3000
Hi ha estudis que mostren que cada dia et poden mentir
04:12
anywhere from 10 to 200 times.
92
237000
2000
entre 10 i 200 vegades.
04:14
Now granted, many of those are white lies.
93
239000
3000
Tot i que moltes d'aquestes mentides són mentides pietoses.
04:17
But in another study,
94
242000
2000
Però un altre estudi
04:19
it showed that strangers lied three times
95
244000
2000
va mostrar que els desconeguts es mentien 3 vegades
04:21
within the first 10 minutes of meeting each other.
96
246000
2000
en els 10 primers minuts després de conèixer-se.
04:23
(Laughter)
97
248000
2000
(Rialles)
04:25
Now when we first hear this data, we recoil.
98
250000
3000
Quan sentim aquestes dades per primer cop, ens fem enrere.
04:28
We can't believe how prevalent lying is.
99
253000
2000
No ens podem creure que mentir sigui tan corrent.
04:30
We're essentially against lying.
100
255000
2000
Estem essencialment en contra de les mentides.
04:32
But if you look more closely,
101
257000
2000
Però si pares més atenció,
04:34
the plot actually thickens.
102
259000
2000
la cosa es complica.
04:36
We lie more to strangers than we lie to coworkers.
103
261000
3000
Mentim més als desconeguts que als companys de feina.
04:39
Extroverts lie more than introverts.
104
264000
4000
Els extravertits menteixen més que els introvertits.
04:43
Men lie eight times more about themselves
105
268000
3000
Els homes menteixen 8 vegades més sobre ells mateixos
04:46
than they do other people.
106
271000
2000
que sobre altres persones.
04:48
Women lie more to protect other people.
107
273000
3000
Les dones menteixen més per protegir altres persones.
04:51
If you're an average married couple,
108
276000
3000
Si sou un matrimoni mitjà,
04:54
you're going to lie to your spouse
109
279000
2000
mentireu a la vostra parella
04:56
in one out of every 10 interactions.
110
281000
2000
en una de cada deu interaccions.
04:58
Now you may think that's bad.
111
283000
2000
Deveu pensar que això no és bo.
05:00
If you're unmarried, that number drops to three.
112
285000
2000
Si no esteu casats, el número baixa fins a tres.
05:02
Lying's complex.
113
287000
2000
Mentir és complex.
05:04
It's woven into the fabric of our daily and our business lives.
114
289000
3000
És una part integral de la nostra vida quotidiana i professional.
05:07
We're deeply ambivalent about the truth.
115
292000
2000
Som profundament ambivalents pel que fa a la veritat.
05:09
We parse it out on an as-needed basis,
116
294000
2000
La desglossem segons les nostres necessitats,
05:11
sometimes for very good reasons,
117
296000
2000
a vegades per una bona causa,
05:13
other times just because we don't understand the gaps in our lives.
118
298000
3000
a vegades simplement perquè no entenem els buits de les nostres vides.
05:16
That's truth number two about lying.
119
301000
2000
Aquesta és la veritat número dos sobre mentir.
05:18
We're against lying,
120
303000
2000
Estem en contra de mentir,
05:20
but we're covertly for it
121
305000
2000
però hi estem secretament a favor
05:22
in ways that our society has sanctioned
122
307000
2000
d'una manera que la nostra societat ha aprovat
05:24
for centuries and centuries and centuries.
123
309000
2000
durant segles i segles i segles.
05:26
It's as old as breathing.
124
311000
2000
Mentir és tan antic com respirar.
05:28
It's part of our culture, it's part of our history.
125
313000
2000
És part de la nostra cultura, és part de la nostra història.
05:30
Think Dante, Shakespeare,
126
315000
3000
Penseu en Dante, Shakespeare,
05:33
the Bible, News of the World.
127
318000
3000
la Bíblia, News of the World.
05:36
(Laughter)
128
321000
2000
(Rialles)
05:38
Lying has evolutionary value to us as a species.
129
323000
2000
Mentir té un valor evolutiu per a nosaltres, com a espècie.
05:40
Researchers have long known
130
325000
2000
Els investigadors saben des de fa temps
05:42
that the more intelligent the species,
131
327000
2000
que com més intel·ligent és l'espècie,
05:44
the larger the neocortex,
132
329000
2000
més gran és el neocòrtex
05:46
the more likely it is to be deceptive.
133
331000
2000
i és més probable que enganyi.
05:48
Now you might remember Koko.
134
333000
2000
Potser recordeu la Koko.
05:50
Does anybody remember Koko the gorilla who was taught sign language?
135
335000
3000
Algú recorda la goril·la Koko, a qui van ensenyar llengua de signes?
05:53
Koko was taught to communicate via sign language.
136
338000
3000
A la Koko li van ensenyar a comunicar-se amb llengua de signes.
05:56
Here's Koko with her kitten.
137
341000
2000
Aquí teniu la Koko amb el seu gatet.
05:58
It's her cute little, fluffy pet kitten.
138
343000
3000
El seu gatet bufó, petitó i suau.
06:01
Koko once blamed her pet kitten
139
346000
2000
La Koko una vegada va culpar el seu gatet
06:03
for ripping a sink out of the wall.
140
348000
2000
d'haver arrencat una pica de la paret.
06:05
(Laughter)
141
350000
2000
(Rialles)
06:07
We're hardwired to become leaders of the pack.
142
352000
2000
Estem dissenyats per ser líders de la manada.
06:09
It starts really, really early.
143
354000
2000
Comença molt, molt d'hora.
06:11
How early?
144
356000
2000
Però quan?
06:13
Well babies will fake a cry,
145
358000
2000
Bé, els nadons simulen el plor,
06:15
pause, wait to see who's coming
146
360000
2000
paren, esperen a veure si ve algú
06:17
and then go right back to crying.
147
362000
2000
i després tornen a plorar.
06:19
One-year-olds learn concealment.
148
364000
2000
Els nens d'un any aprenen a amagar coses.
06:21
(Laughter)
149
366000
2000
(Rialles)
06:23
Two-year-olds bluff.
150
368000
2000
Els de dos anys fan farols.
06:25
Five-year-olds lie outright.
151
370000
2000
Els nens de cinc anys menteixen descaradament.
06:27
They manipulate via flattery.
152
372000
2000
Manipulen fent la pilota.
06:29
Nine-year-olds, masters of the cover up.
153
374000
3000
Els nens de nou anys són mestres de l'encobriment.
06:32
By the time you enter college,
154
377000
2000
Quan comences a la universitat,
06:34
you're going to lie to your mom in one out of every five interactions.
155
379000
3000
mentiràs a la teva mare en una de cada cinc interaccions.
06:37
By the time we enter this work world and we're breadwinners,
156
382000
3000
Quan entrem al món laboral i portem el pes de la família,
06:40
we enter a world that is just cluttered
157
385000
2000
entrem en un món que està impregnat
06:42
with spam, fake digital friends,
158
387000
2000
de correu brossa, ciberamics falsos,
06:44
partisan media,
159
389000
2000
mitjans partidistes,
06:46
ingenious identity thieves,
160
391000
2000
lladres d'identitat enginyosos,
06:48
world-class Ponzi schemers,
161
393000
2000
estafadors piramidals de talla mundial,
06:50
a deception epidemic --
162
395000
2000
una epidèmia d'enganys.
06:52
in short, what one author calls
163
397000
2000
En resum, allò que un autor anomena
06:54
a post-truth society.
164
399000
3000
la societat de la post-veritat.
06:57
It's been very confusing
165
402000
2000
Ha estat molt confús
06:59
for a long time now.
166
404000
3000
durant molt de temps.
07:03
What do you do?
167
408000
2000
Què fem?
07:05
Well there are steps we can take
168
410000
2000
Bé, hi ha alguns passos que podem seguir
07:07
to navigate our way through the morass.
169
412000
2000
per obrir-nos camí a través d'aquest fangar.
07:09
Trained liespotters get to the truth 90 percent of the time.
170
414000
3000
Els detectors de mentides entrenats arriben a la veritat en el 90% de les vegades.
07:12
The rest of us, we're only 54 percent accurate.
171
417000
3000
La resta, només tenim un 54% de precisió.
07:15
Why is it so easy to learn?
172
420000
2000
Per què és tan fàcil d'aprendre?
07:17
There are good liars and there are bad liars. There are no real original liars.
173
422000
3000
Hi ha bons mentiders i mals mentiders. No hi ha mentiders realment originals.
07:20
We all make the same mistakes. We all use the same techniques.
174
425000
3000
Tots fem els mateixos errors. Tots utilitzem les mateixes tècniques.
07:23
So what I'm going to do
175
428000
2000
Per tant el que faré
07:25
is I'm going to show you two patterns of deception.
176
430000
2000
és mostrar-vos dos models d'engany
07:27
And then we're going to look at the hot spots and see if we can find them ourselves.
177
432000
3000
i després farem un cop d'ull als punts delators i veurem si els podem identificar.
07:30
We're going to start with speech.
178
435000
3000
Començarem amb el discurs.
07:33
(Video) Bill Clinton: I want you to listen to me.
179
438000
2000
(Vídeo) Bill Clinton: Vull que m'escolteu.
07:35
I'm going to say this again.
180
440000
2000
Ho diré una altra vegada.
07:37
I did not have sexual relations
181
442000
3000
No he tingut relacions sexuals
07:40
with that woman, Miss Lewinsky.
182
445000
4000
amb aquella dona, la senyoreta Lewinsky.
07:44
I never told anybody to lie,
183
449000
2000
No he demanat a ningú que mentís,
07:46
not a single time, never.
184
451000
2000
ni una vegada, mai.
07:48
And these allegations are false.
185
453000
3000
I aquestes acusacions són falses.
07:51
And I need to go back to work for the American people.
186
456000
2000
I he de tornar a treballar per als americans.
07:53
Thank you.
187
458000
2000
Gràcies.
07:58
Pamela Meyer: Okay, what were the telltale signs?
188
463000
3000
Pamela Meyer: D'acord, quins eren els senyals reveladors?
08:01
Well first we heard what's known as a non-contracted denial.
189
466000
4000
Bé, primer hem sentit allò que es coneix com a negació sense contraccions.
08:05
Studies show that people who are overdetermined in their denial
190
470000
3000
Hi ha estudis que demostren que les persones entestades a negar els seus actes
08:08
will resort to formal rather than informal language.
191
473000
3000
recorren al llenguatge formal, més que a l'informal.
08:11
We also heard distancing language: "that woman."
192
476000
3000
També hem sentit llenguatge de distanciament: "aquella dona".
08:14
We know that liars will unconsciously distance themselves
193
479000
2000
Sabem que els mentiders inconscientment es distancien
08:16
from their subject
194
481000
2000
del subjecte de qui parlen
08:18
using language as their tool.
195
483000
3000
fent servir el llenguatge.
08:21
Now if Bill Clinton had said, "Well, to tell you the truth ... "
196
486000
3000
Si Bill Clinton hagués dit: "Bé, sincerament..."
08:24
or Richard Nixon's favorite, "In all candor ... "
197
489000
2000
o la preferida de Richard Nixon: "Amb tota sinceritat..."
08:26
he would have been a dead giveaway
198
491000
2000
s'hauria delatat de seguida
08:28
for any liespotter that knows
199
493000
2000
per a qualsevol detector de mentides que sàpiga
08:30
that qualifying language, as it's called, qualifying language like that,
200
495000
3000
que el llenguatge qualificatiu, com se'n diu, el llenguatge qualificatiu com aquest,
08:33
further discredits the subject.
201
498000
2000
desacredita encara més la persona.
08:35
Now if he had repeated the question in its entirety,
202
500000
3000
Si hagués repetit la pregunta sencera,
08:38
or if he had peppered his account with a little too much detail --
203
503000
4000
o si hagués farcit l'explicació amb massa detalls
08:42
and we're all really glad he didn't do that --
204
507000
2000
(i sort que no ho va fer)
08:44
he would have further discredited himself.
205
509000
2000
encara s'hauria desacreditat més.
08:46
Freud had it right.
206
511000
2000
Freud tenia raó.
08:48
Freud said, look, there's much more to it than speech:
207
513000
3000
Freud va dir que hi ha molt més que el discurs:
08:51
"No mortal can keep a secret.
208
516000
3000
"Cap mortal pot guardar un secret.
08:54
If his lips are silent, he chatters with his fingertips."
209
519000
3000
Si els llavis callen, parlarà amb les mans."
08:57
And we all do it no matter how powerful you are.
210
522000
3000
I tots ho fem, és igual el poder que tinguem.
09:00
We all chatter with our fingertips.
211
525000
2000
Tots parlem amb les mans.
09:02
I'm going to show you Dominique Strauss-Kahn with Obama
212
527000
3000
Us ensenyaré Dominique Strauss-Kahn amb Obama,
09:05
who's chattering with his fingertips.
213
530000
3000
que parla amb les mans.
09:08
(Laughter)
214
533000
3000
(Rialles)
09:11
Now this brings us to our next pattern,
215
536000
3000
Això ens porta al nostre següent patró,
09:14
which is body language.
216
539000
3000
que és el llenguatge corporal.
09:17
With body language, here's what you've got to do.
217
542000
3000
Amb el llenguatge corporal, això és el que heu de fer.
09:20
You've really got to just throw your assumptions out the door.
218
545000
3000
Heu de desfer-vos de les vostres suposicions.
09:23
Let the science temper your knowledge a little bit.
219
548000
2000
Deixeu que la ciència moderi una mica el vostre coneixement.
09:25
Because we think liars fidget all the time.
220
550000
3000
Creiem que els mentiders no paren de moure's.
09:28
Well guess what, they're known to freeze their upper bodies when they're lying.
221
553000
3000
Doncs mireu, se sap que immobilitzen la part superior del cos quan menteixen.
09:31
We think liars won't look you in the eyes.
222
556000
3000
Pensem que els mentiders no miren als ulls.
09:34
Well guess what, they look you in the eyes a little too much
223
559000
2000
Doncs mireu, us miren als ulls una mica massa
09:36
just to compensate for that myth.
224
561000
2000
simplement per compensar aquest mite.
09:38
We think warmth and smiles
225
563000
2000
Ens pensem que la cordialitat i els somriures
09:40
convey honesty, sincerity.
226
565000
2000
expressen honestedat i sinceritat.
09:42
But a trained liespotter
227
567000
2000
Però un detector de mentides entrenat
09:44
can spot a fake smile a mile away.
228
569000
2000
pot detectar un somriure fals d'una hora lluny.
09:46
Can you all spot the fake smile here?
229
571000
3000
Podeu identificar el somriure fals?
09:50
You can consciously contract
230
575000
2000
Es poden contreure conscientment
09:52
the muscles in your cheeks.
231
577000
3000
els músculs de les galtes.
09:55
But the real smile's in the eyes, the crow's feet of the eyes.
232
580000
3000
Però el somriure real és als ulls, a les potes de gall,
09:58
They cannot be consciously contracted,
233
583000
2000
que no es poden contreure expressament,
10:00
especially if you overdid the Botox.
234
585000
2000
sobretot si t'has passat amb el Botox.
10:02
Don't overdo the Botox; nobody will think you're honest.
235
587000
3000
No us passeu mai amb el Botox, o ningú no es creurà que dieu la veritat.
10:05
Now we're going to look at the hot spots.
236
590000
2000
Ara farem un cop d'ull als punts delators.
10:07
Can you tell what's happening in a conversation?
237
592000
2000
Sabeu dir-me què passa en una conversa?
10:09
Can you start to find the hot spots
238
594000
3000
Podeu trobar aquests punts
10:12
to see the discrepancies
239
597000
2000
que reflecteixen les diferències
10:14
between someone's words and someone's actions?
240
599000
2000
entre les paraules d'una persona i les seves accions?
10:16
Now I know it seems really obvious,
241
601000
2000
Sé que sembla evident,
10:18
but when you're having a conversation
242
603000
2000
però quan tens una conversa
10:20
with someone you suspect of deception,
243
605000
3000
amb algú que creus que t'enganya,
10:23
attitude is by far the most overlooked but telling of indicators.
244
608000
3000
l'actitud és, de lluny, l'indicador més passat per alt però també el més revelador.
10:26
An honest person is going to be cooperative.
245
611000
2000
Les persones honestes seran cooperatives.
10:28
They're going to show they're on your side.
246
613000
2000
Us mostraran que estan de part vostra.
10:30
They're going to be enthusiastic.
247
615000
2000
Seran entusiastes.
10:32
They're going to be willing and helpful to getting you to the truth.
248
617000
2000
Estaran disposades a ajudar-vos a trobar la veritat.
10:34
They're going to be willing to brainstorm, name suspects,
249
619000
3000
Estaran disposades a donar idees, anomenar sospitosos,
10:37
provide details.
250
622000
2000
donar detalls.
10:39
They're going to say, "Hey,
251
624000
2000
Diran: "Ei,
10:41
maybe it was those guys in payroll that forged those checks."
252
626000
3000
potser van ser els encarregats de la nòmina que van falsificar aquests xecs."
10:44
They're going to be infuriated if they sense they're wrongly accused
253
629000
3000
Es posaran furiosos si creuen que se'ls acusa injustament
10:47
throughout the entire course of the interview, not just in flashes;
254
632000
2000
durant tota la conversa, no només en moments puntuals;
10:49
they'll be infuriated throughout the entire course of the interview.
255
634000
3000
es posaran furiosos durant tota la conversa.
10:52
And if you ask someone honest
256
637000
2000
I si pregunteu a una persona honesta
10:54
what should happen to whomever did forge those checks,
257
639000
3000
què s'hauria de fer amb els falsificadors dels xecs,
10:57
an honest person is much more likely
258
642000
2000
una persona honesta segurament
10:59
to recommend strict rather than lenient punishment.
259
644000
4000
recomanarà un càstig sever, més que no pas un de benèvol.
11:03
Now let's say you're having that exact same conversation
260
648000
2000
Posem per cas que teniu la mateixa conversa
11:05
with someone deceptive.
261
650000
2000
amb algú que no és honest.
11:07
That person may be withdrawn,
262
652000
2000
Pot ser que aquesta persona es mostri retreta,
11:09
look down, lower their voice,
263
654000
2000
que miri cap avall, que abaixi la veu,
11:11
pause, be kind of herky-jerky.
264
656000
2000
que faci pauses, que vagi a batzegades.
11:13
Ask a deceptive person to tell their story,
265
658000
2000
Demaneu a una persona no honesta que us expliqui la seva història,
11:15
they're going to pepper it with way too much detail
266
660000
3000
hi ficarà massa detalls
11:18
in all kinds of irrelevant places.
267
663000
3000
en tota mena de llocs irrellevants
11:21
And then they're going to tell their story in strict chronological order.
268
666000
3000
i us explicarà la història en un ordre cronològic estricte.
11:24
And what a trained interrogator does
269
669000
2000
Un interrogador entrenat
11:26
is they come in and in very subtle ways
270
671000
2000
hi va indagant de manera subtil
11:28
over the course of several hours,
271
673000
2000
durant hores i hores,
11:30
they will ask that person to tell that story backwards,
272
675000
3000
li demanarà a la persona que expliqui la història cap enrere
11:33
and then they'll watch them squirm,
273
678000
2000
i la veurà retorçar-se, incòmoda,
11:35
and track which questions produce the highest volume of deceptive tells.
274
680000
3000
i anotarà quines preguntes provoquen més indicis d'engany.
11:38
Why do they do that? Well we all do the same thing.
275
683000
3000
Per què ho fan? Bé, tots fem el mateix.
11:41
We rehearse our words,
276
686000
2000
Assagem les paraules,
11:43
but we rarely rehearse our gestures.
277
688000
2000
però gairebé mai no assagem els gests.
11:45
We say "yes," we shake our heads "no."
278
690000
2000
Diem "sí" i amb el cap fem que "no".
11:47
We tell very convincing stories, we slightly shrug our shoulders.
279
692000
3000
Expliquem històries molt convincents, però ens arronsem una mica d'espatlles.
11:50
We commit terrible crimes,
280
695000
2000
Cometem delictes greus
11:52
and we smile at the delight in getting away with it.
281
697000
3000
i somriem davant el plaer de sortir-nos-en.
11:55
Now that smile is known in the trade as "duping delight."
282
700000
3000
A aquest somriure se'l coneix com a "plaer per l'engany".
11:58
And we're going to see that in several videos moving forward,
283
703000
3000
I ho veurem en diversos vídeos a continuació,
12:01
but we're going to start -- for those of you who don't know him,
284
706000
2000
però començarem, per aquells qui no el coneixeu,
12:03
this is presidential candidate John Edwards
285
708000
3000
aquest és el candidat a la presidència, John Edwards,
12:06
who shocked America by fathering a child out of wedlock.
286
711000
3000
que va sorprendre els EUA en tenir un fill fora del matrimoni.
12:09
We're going to see him talk about getting a paternity test.
287
714000
3000
El veurem parlar sobre fer-se la prova de paternitat.
12:12
See now if you can spot him
288
717000
2000
A veure si podeu veure quan
12:14
saying, "yes" while shaking his head "no,"
289
719000
2000
diu "sí" mentre amb el cap fa que "no",
12:16
slightly shrugging his shoulders.
290
721000
2000
mentre s'arronsa una mica d'espatlles.
12:18
(Video) John Edwards: I'd be happy to participate in one.
291
723000
2000
(Vídeo) John Edwards: "No em faria res fer-me'n una.
12:20
I know that it's not possible that this child could be mine,
292
725000
3000
Sé que és impossible que aquest nen sigui meu,
12:23
because of the timing of events.
293
728000
2000
per com han anat els fets.
12:25
So I know it's not possible.
294
730000
2000
Per tant sé que és impossible.
12:27
Happy to take a paternity test,
295
732000
2000
No em faria res fer-me una prova de paternitat,
12:29
and would love to see it happen.
296
734000
2000
m'agradaria fer-me-la.
12:31
Interviewer: Are you going to do that soon? Is there somebody --
297
736000
3000
Entrevistador: Te la faràs aviat? Hi ha algú...
12:34
JE: Well, I'm only one side. I'm only one side of the test.
298
739000
3000
JE: Bé, només sóc una part, jo. Només sóc una part de la prova.
12:37
But I'm happy to participate in one.
299
742000
3000
Però m'agradaria participar-hi.
12:40
PM: Okay, those head shakes are much easier to spot
300
745000
2000
PM: D'acord, aquests moviments de cap són molt fàcils de reconèixer
12:42
once you know to look for them.
301
747000
2000
quan saps buscar-los.
12:44
There're going to be times
302
749000
2000
De vegades
12:46
when someone makes one expression
303
751000
2000
algú fa una expressió
12:48
while masking another that just kind of leaks through in a flash.
304
753000
3000
mentre n'amaga una altra que de cop es filtra durant un instant.
12:52
Murderers are known to leak sadness.
305
757000
2000
Se sap que els assassins mostren tristesa.
12:54
Your new joint venture partner might shake your hand,
306
759000
2000
Pot ser que el teu nou soci et doni la mà,
12:56
celebrate, go out to dinner with you
307
761000
2000
ho celebri, vagi a sopar amb tu
12:58
and then leak an expression of anger.
308
763000
3000
i després deixi anar una expressió d'enuig.
13:01
And we're not all going to become facial expression experts overnight here,
309
766000
3000
No ens convertirem en experts en expressions facials de la nit al dia,
13:04
but there's one I can teach you that's very dangerous, and it's easy to learn,
310
769000
3000
però us en puc ensenyar una que és molt perillosa i molt fàcil d'aprendre:
13:07
and that's the expression of contempt.
311
772000
3000
l'expressió de menyspreu.
13:10
Now with anger, you've got two people on an even playing field.
312
775000
3000
Amb l'enuig, tens dues persones jugant en un camp igualat.
13:13
It's still somewhat of a healthy relationship.
313
778000
2000
D'alguna manera no deixa de ser una relació sana.
13:15
But when anger turns to contempt,
314
780000
2000
Però quan l'enuig es torna menyspreu,
13:17
you've been dismissed.
315
782000
2000
ja et pots retirar.
13:19
It's associated with moral superiority.
316
784000
2000
S'associa amb la superioritat moral.
13:21
And for that reason, it's very, very hard to recover from.
317
786000
3000
I és per això que és molt difícil recuperar-se'n.
13:24
Here's what it looks like.
318
789000
2000
És així.
13:26
It's marked by one lip corner
319
791000
2000
Es marca quan un cantó del llavi
13:28
pulled up and in.
320
793000
2000
puja una mica i va cap endins.
13:30
It's the only asymmetrical expression.
321
795000
3000
És l'única expressió asimètrica.
13:33
And in the presence of contempt,
322
798000
2000
I quan hi hagi menyspreu,
13:35
whether or not deception follows --
323
800000
2000
seguit d'engany o no
13:37
and it doesn't always follow --
324
802000
2000
(no sempre ve seguit d'engany)
13:39
look the other way, go the other direction,
325
804000
2000
mireu cap a una altra banda, aneu en una altra direcció,
13:41
reconsider the deal,
326
806000
2000
reconsidereu el tracte,
13:43
say, "No thank you. I'm not coming up for just one more nightcap. Thank you."
327
808000
4000
digueu: "No, gràcies. No pujaré per només una copa més. Gràcies."
13:47
Science has surfaced
328
812000
2000
La ciència ha descobert
13:49
many, many more indicators.
329
814000
2000
molts altres indicadors.
13:51
We know, for example,
330
816000
2000
Sabem, per exemple,
13:53
we know liars will shift their blink rate,
331
818000
2000
que els mentiders canvien el ritme del parpelleig,
13:55
point their feet towards an exit.
332
820000
2000
apunten els peus a una sortida.
13:57
They will take barrier objects
333
822000
2000
Agafen objectes com a barrera
13:59
and put them between themselves and the person that is interviewing them.
334
824000
3000
i els posen entre ells i la persona que els interroga.
14:02
They'll alter their vocal tone,
335
827000
2000
Tots alteren el seu to de veu,
14:04
often making their vocal tone much lower.
336
829000
3000
moltes vegades fent-lo molt més baix.
14:07
Now here's the deal.
337
832000
2000
La cosa és així.
14:09
These behaviors are just behaviors.
338
834000
3000
Aquests comportaments són només comportaments.
14:12
They're not proof of deception.
339
837000
2000
No són cap prova d'engany.
14:14
They're red flags.
340
839000
2000
Són senyals d'alerta.
14:16
We're human beings.
341
841000
2000
Som éssers humans.
14:18
We make deceptive flailing gestures all over the place all day long.
342
843000
3000
Fem gests enganyosos i exagerats tot el dia i pertot arreu.
14:21
They don't mean anything in and of themselves.
343
846000
2000
No volen dir res per si sols.
14:23
But when you see clusters of them, that's your signal.
344
848000
3000
Però quan en veus molts de cop, això és el teu senyal.
14:26
Look, listen, probe, ask some hard questions,
345
851000
3000
Mireu, escolteu, investigueu, feu preguntes difícils,
14:29
get out of that very comfortable mode of knowing,
346
854000
3000
sortiu d'aquesta actitud tan còmoda de qui ja ho sap
14:32
walk into curiosity mode, ask more questions,
347
857000
3000
i endinseu-vos en l'actitud de la curiositat, feu més preguntes,
14:35
have a little dignity, treat the person you're talking to with rapport.
348
860000
3000
tingueu dignitat, tracteu bé la persona amb qui parleu.
14:38
Don't try to be like those folks on "Law & Order" and those other TV shows
349
863000
3000
No intenteu ser com els de "Llei i Ordre" i aquestes sèries de TV
14:41
that pummel their subjects into submission.
350
866000
2000
que pressionen els sospitosos fins a sotmetre'ls.
14:43
Don't be too aggressive, it doesn't work.
351
868000
3000
No sigueu massa agressius, no funciona.
14:46
Now we've talked a little bit
352
871000
2000
Bé, hem parlat una mica
14:48
about how to talk to someone who's lying
353
873000
2000
de com parlar amb algú que menteix
14:50
and how to spot a lie.
354
875000
2000
i de com detectar mentides.
14:52
And as I promised, we're now going to look at what the truth looks like.
355
877000
3000
I com us he promès, ara veurem com és la veritat.
14:55
But I'm going to show you two videos,
356
880000
2000
Us ensenyaré dos vídeos,
14:57
two mothers -- one is lying, one is telling the truth.
357
882000
3000
dues mares: una menteix, l'altra diu la veritat.
15:00
And these were surfaced
358
885000
2000
Els va treure a la llum
15:02
by researcher David Matsumoto in California.
359
887000
2000
l'investigador David Matsumoto a Califòrnia.
15:04
And I think they're an excellent example
360
889000
2000
I crec que són un exemple excel·lent
15:06
of what the truth looks like.
361
891000
2000
de com és la veritat.
15:08
This mother, Diane Downs,
362
893000
2000
Aquesta mare, la Diane Downs,
15:10
shot her kids at close range,
363
895000
2000
va disparar els seus fills de prop,
15:12
drove them to the hospital
364
897000
2000
els va portar a l'hospital
15:14
while they bled all over the car,
365
899000
2000
mentre es dessagnaven al cotxe,
15:16
claimed a scraggy-haired stranger did it.
366
901000
2000
i va dir que ho havia fet un desconegut despentinat.
15:18
And you'll see when you see the video,
367
903000
2000
I quan vegeu el vídeo notareu
15:20
she can't even pretend to be an agonizing mother.
368
905000
2000
que ni tan sols sap actuar com una mare que pateix.
15:22
What you want to look for here
369
907000
2000
Heu de fixar-vos
15:24
is an incredible discrepancy
370
909000
2000
en la gran discrepància
15:26
between horrific events that she describes
371
911000
2000
entre les accions terribles que descriu
15:28
and her very, very cool demeanor.
372
913000
2000
i el seu comportament tan calmat.
15:30
And if you look closely, you'll see duping delight throughout this video.
373
915000
3000
Si us hi fixeu, veureu el "plaer per l'engany" al llarg de tot el vídeo.
15:33
(Video) Diane Downs: At night when I close my eyes,
374
918000
2000
(Vídeo) Diane Downs: "A la nit, quan tanco els ulls,
15:35
I can see Christie reaching her hand out to me while I'm driving,
375
920000
3000
veig la Christie volent-me donar la mà mentre condueixo,
15:38
and the blood just kept coming out of her mouth.
376
923000
3000
i la sang que no li parava de sortir de la boca.
15:41
And that -- maybe it'll fade too with time --
377
926000
2000
Potser amb el temps ho oblidaré,
15:43
but I don't think so.
378
928000
2000
però no ho crec.
15:45
That bothers me the most.
379
930000
3000
I això és el que més em preocupa."
15:55
PM: Now I'm going to show you a video
380
940000
2000
PM: Ara us ensenyaré un vídeo
15:57
of an actual grieving mother, Erin Runnion,
381
942000
2000
d'una mare realment afligida, l'Erin Runnion,
15:59
confronting her daughter's murderer and torturer in court.
382
944000
4000
enfrontant-se al judici amb l'home que va torturar i assassinar la seva filla.
16:03
Here you're going to see no false emotion,
383
948000
2000
Aquí no hi veureu emocions falses,
16:05
just the authentic expression of a mother's agony.
384
950000
3000
simplement l'expressió autèntica de l'agonia d'una mare.
16:08
(Video) Erin Runnion: I wrote this statement on the third anniversary
385
953000
2000
(Vídeo) Erin Runnion: Vaig escriure aquesta declaració en el tercer aniversari
16:10
of the night you took my baby,
386
955000
2000
de la nit en què em vas prendre la meva filla,
16:12
and you hurt her,
387
957000
2000
i li vas fer mal,
16:14
and you crushed her,
388
959000
2000
i la vas torturar,
16:16
you terrified her until her heart stopped.
389
961000
4000
la vas aterrir fins que se li va aturar el cor.
16:20
And she fought, and I know she fought you.
390
965000
3000
I ella va lluitar, sé que va lluitar contra tu.
16:23
But I know she looked at you
391
968000
2000
Però sé que et va mirar
16:25
with those amazing brown eyes,
392
970000
2000
amb els seus increïbles ulls marrons,
16:27
and you still wanted to kill her.
393
972000
3000
i tu la vas voler matar igualment.
16:30
And I don't understand it,
394
975000
2000
I no ho entenc,
16:32
and I never will.
395
977000
3000
i mai no ho entendré.
16:35
PM: Okay, there's no doubting the veracity of those emotions.
396
980000
4000
PM: Bé, no hi ha cap dubte sobre la veracitat d'aquestes emocions.
16:39
Now the technology around what the truth looks like
397
984000
3000
Ara la tecnologia que ens permet saber com és la veritat
16:42
is progressing on, the science of it.
398
987000
3000
va progressant.
16:45
We know for example
399
990000
2000
Sabem, per exemple,
16:47
that we now have specialized eye trackers and infrared brain scans,
400
992000
3000
que ara tenim rastrejadors oculars especialitzats i escàners cerebrals d'infrarojos,
16:50
MRI's that can decode the signals that our bodies send out
401
995000
3000
ressonàncies magnètiques que poden descodificar els senyals que emet el cos
16:53
when we're trying to be deceptive.
402
998000
2000
quan intentem enganyar.
16:55
And these technologies are going to be marketed to all of us
403
1000000
3000
I ens vendran aquestes tecnologies
16:58
as panaceas for deceit,
404
1003000
2000
com a remei contra els enganys,
17:00
and they will prove incredibly useful some day.
405
1005000
3000
i un dia es demostrarà que són molt útils.
17:03
But you've got to ask yourself in the meantime:
406
1008000
2000
Mentrestant, però, us heu de preguntar:
17:05
Who do you want on your side of the meeting,
407
1010000
2000
Qui voleu al costat vostre en una reunió,
17:07
someone who's trained in getting to the truth
408
1012000
3000
algú entrenat per arribar a la veritat
17:10
or some guy who's going to drag a 400-pound electroencephalogram
409
1015000
2000
o algú que entri amb un electroencefalograma de 180 kg
17:12
through the door?
410
1017000
2000
per la porta?
17:14
Liespotters rely on human tools.
411
1019000
4000
Els detectors de mentides es basen en mecanismes humans.
17:18
They know, as someone once said,
412
1023000
2000
Saben, com algú bé va dir,
17:20
"Character's who you are in the dark."
413
1025000
2000
"El caràcter és qui ets en la foscor."
17:22
And what's kind of interesting
414
1027000
2000
I és interessant veure
17:24
is that today we have so little darkness.
415
1029000
2000
que avui dia no tenim gaire foscor.
17:26
Our world is lit up 24 hours a day.
416
1031000
3000
El nostre món està il·luminat les 24 hores del dia.
17:29
It's transparent
417
1034000
2000
És transparent
17:31
with blogs and social networks
418
1036000
2000
amb blocs i xarxes socials
17:33
broadcasting the buzz of a whole new generation of people
419
1038000
2000
que transmeten les paraules de tota una nova generació de persones
17:35
that have made a choice to live their lives in public.
420
1040000
3000
que han decidit viure les seves vides en públic.
17:38
It's a much more noisy world.
421
1043000
4000
És un món molt més sorollós.
17:42
So one challenge we have
422
1047000
2000
O sigui que tenim el repte
17:44
is to remember,
423
1049000
2000
de recordar que
17:46
oversharing, that's not honesty.
424
1051000
3000
compartir-ho tot no és honestedat.
17:49
Our manic tweeting and texting
425
1054000
2000
La nostra obsessió per les piulades o els missatges de text
17:51
can blind us to the fact
426
1056000
2000
pot impedir-nos veure
17:53
that the subtleties of human decency -- character integrity --
427
1058000
3000
que les subtileses de la decència humana, la personalitat, la integritat,
17:56
that's still what matters, that's always what's going to matter.
428
1061000
3000
continuen sent importants i sempre ho seran.
17:59
So in this much noisier world,
429
1064000
2000
En aquest món tan sorollós,
18:01
it might make sense for us
430
1066000
2000
té sentit
18:03
to be just a little bit more explicit
431
1068000
2000
ser una mica més explícits
18:05
about our moral code.
432
1070000
3000
sobre el nostre codi moral.
18:08
When you combine the science of recognizing deception
433
1073000
2000
Quan combines la ciència de reconèixer l'engany
18:10
with the art of looking, listening,
434
1075000
2000
amb l'art d'observar i escoltar,
18:12
you exempt yourself from collaborating in a lie.
435
1077000
3000
evites col·laborar en una mentida.
18:15
You start up that path
436
1080000
2000
Comences per aquest camí
18:17
of being just a little bit more explicit,
437
1082000
2000
de ser tan sols una mica més explícit,
18:19
because you signal to everyone around you,
438
1084000
2000
perquè dius a tothom qui t'envolta,
18:21
you say, "Hey, my world, our world,
439
1086000
3000
dius: "Ei, el meu món, el nostre món,
18:24
it's going to be an honest one.
440
1089000
2000
serà un món honest.
18:26
My world is going to be one where truth is strengthened
441
1091000
2000
El meu món serà un món on es reforci la veritat
18:28
and falsehood is recognized and marginalized."
442
1093000
3000
i on es reconegui i margini la falsedat."
18:31
And when you do that,
443
1096000
2000
I quan ho fas,
18:33
the ground around you starts to shift just a little bit.
444
1098000
3000
tot allò que t'envolta canvia una mica.
18:36
And that's the truth. Thank you.
445
1101000
3000
I aquesta és la veritat. Gràcies.
18:39
(Applause)
446
1104000
5000
(Aplaudiments)
ABOUT THE SPEAKER
Pamela Meyer - Lie detector
Pamela Meyer thinks we’re facing a pandemic of deception, but she’s arming people with tools that can help take back the truth.
Why you should listen
Social media expert Pamela Meyer can tell when you’re lying. If it’s not your words that give you away, it’s your posture, eyes, breathing rate, fidgets, and a host of other indicators. Worse, we are all lied to up to 200 times a day, she says, from the white lies that allow society to function smoothly to the devastating duplicities that bring down corporations and break up families.
Working with a team of researchers over several years, Meyer, who is CEO of social networking company Simpatico Networks, collected and reviewed most of the research on deception that has been published, from such fields as law enforcement, military, psychology and espionage. She then became an expert herself, receiving advanced training in deception detection, including multiple courses in interrogation, microexpression analysis, statement analysis, behavior and body language interpretation, and emotion recognition. Her research is synthesized in her bestselling book Liespotting.