ABOUT THE SPEAKER
Pamela Meyer - Lie detector
Pamela Meyer thinks we’re facing a pandemic of deception, but she’s arming people with tools that can help take back the truth.

Why you should listen
Social media expert Pamela Meyer can tell when you’re lying. If it’s not your words that give you away, it’s your posture, eyes, breathing rate, fidgets, and a host of other indicators. Worse, we are all lied to up to 200 times a day, she says, from the white lies that allow society to function smoothly to the devastating duplicities that bring down corporations and break up families.

Working with a team of researchers over several years, Meyer, who is CEO of social networking company Simpatico Networks, collected and reviewed most of the published research on deception, drawing on fields such as law enforcement, the military, psychology and espionage. She then became an expert herself, receiving advanced training in deception detection, including multiple courses in interrogation, microexpression analysis, statement analysis, behavior and body language interpretation, and emotion recognition. Her research is synthesized in her bestselling book Liespotting.
TEDGlobal 2011

Pamela Meyer: How to spot a liar

28,415,176 views

Every day we are lied to, anywhere from 10 to 200 times, and the clues to detecting those lies can be subtle and counter-intuitive. Pamela Meyer, author of the bestseller "Liespotting" and an expert in deception-detection techniques, shows the manners and subtle cues of those trained to recognize deception -- and she argues that honesty is a virtue worth preserving.

00:15
Okay, now I don't want to alarm anybody in this room,
0
0
5000
Ok, sa per fillim nuk dua qe asnjeri nga ju ketu ne salle te filloje panikun
00:20
but it's just come to my attention
1
5000
2000
por ja qe me ka shkuar neper mendje ideja
00:22
that the person to your right is a liar.
2
7000
2000
qe personi ne te djathten tuaj eshte thjesht nje genjeshtar.
00:24
(Laughter)
3
9000
2000
(Qeshje)
00:26
Also, the person to your left is a liar.
4
11000
3000
Gjithashtu, edhe ai ne te majten tuaj eshte nje genjeshtar.
00:29
Also the person sitting in your very seats is a liar.
5
14000
3000
Edhe ai qe eshte ulur ne karrigen tuaj mban nje epitet te tille.
00:32
We're all liars.
6
17000
2000
Ne te gjithe jemi genjeshtare.
00:34
What I'm going to do today
7
19000
2000
Ajo cka une do te bej sot
00:36
is I'm going to show you what the research says about why we're all liars,
8
21000
3000
eshte se do t`ju tregoj se pse studimet e fundit na bejne te gjitheve genjeshtare
00:39
how you can become a liespotter
9
24000
2000
si mund te shnderrohesh ne nje kapes genjeshtrash
00:41
and why you might want to go the extra mile
10
26000
3000
dhe pse ndoshta do ta kerkosh edhe me shume te verteten
00:44
and go from liespotting to truth seeking,
11
29000
3000
duke kaluar nga nje kapes i genjeshtrave ne nje kerkues te vetem te te vertetes
00:47
and ultimately to trust building.
12
32000
2000
dhe ne fund te krijosh ate qe quhet besim.
00:49
Now speaking of trust,
13
34000
3000
Dhe per te folur rreth besimit,
00:52
ever since I wrote this book, "Liespotting,"
14
37000
3000
qekur e kam shkruar librin "Liespotting,"
00:55
no one wants to meet me in person anymore, no, no, no, no, no.
15
40000
3000
askush sdeshiron te me takoje me personalisht, jo jo jo jo,jo.
00:58
They say, "It's okay, we'll email you."
16
43000
3000
Ata thone "Pa problem, ta dergojme me e-mail".
01:01
(Laughter)
17
46000
2000
(Qeshje)
01:03
I can't even get a coffee date at Starbucks.
18
48000
4000
Madje nuk mund as te pij nje kafe ne Starbuck.
01:07
My husband's like, "Honey, deception?
19
52000
2000
Burri im me thote "He zemer, ndonje mashtrim?
01:09
Maybe you could have focused on cooking. How about French cooking?"
20
54000
3000
Ndoshta me mire do kish qene te fokusoheshe tek gatimi. Si mendon rreth gatimit te ushqimit francez?"
01:12
So before I get started, what I'm going to do
21
57000
2000
Keshtu qe para se te filloj, ajo cfare do te bej
01:14
is I'm going to clarify my goal for you,
22
59000
3000
eshte se do te bej te qarte synimin tim
01:17
which is not to teach a game of Gotcha.
23
62000
2000
qe nuk eshte rreth te mesuarit te ndonje loje.
01:19
Liespotters aren't those nitpicky kids,
24
64000
2000
Mashtruesit nuk jane vetem ata femijet e llastuar,
01:21
those kids in the back of the room that are shouting, "Gotcha! Gotcha!
25
66000
3000
qe ne fund te dhomes bertasin "Gotcha,Gotcha"
01:24
Your eyebrow twitched. You flared your nostril.
26
69000
3000
Vetulla jote menjere ngrihet. Je prekur aty ku nuk duhet.
01:27
I watch that TV show 'Lie To Me.' I know you're lying."
27
72000
3000
Une e shikoj ate TV show "Me genje mua" ose "E di qe po me genjen".
01:30
No, liespotters are armed
28
75000
2000
Jo, mashtruesit jane te armatosur
01:32
with scientific knowledge of how to spot deception.
29
77000
3000
me nje dije shkencerisht te madhe si ta qendisin me se miri mashtrimin
01:35
They use it to get to the truth,
30
80000
2000
Ata e perdorin mashtrimin per te arritur shpejt tek e verteta
01:37
and they do what mature leaders do everyday;
31
82000
2000
dhe bejne pikerisht ate qe bejne cdo dite lideret boterore;
01:39
they have difficult conversations with difficult people,
32
84000
3000
ata kane biseda te veshtira me njerez te veshtire
01:42
sometimes during very difficult times.
33
87000
2000
shpesh edhe ne kohera te veshtira.
01:44
And they start up that path
34
89000
2000
Dhe kshtu ata e fillojne kete udhetim
01:46
by accepting a core proposition,
35
91000
2000
duke e pranuar nje propozim te rendesishem
01:48
and that proposition is the following:
36
93000
2000
i cili eshte:
01:50
Lying is a cooperative act.
37
95000
3000
Genjeshtra eshte nje akt bashkepunimi.
01:53
Think about it, a lie has no power whatsoever by its mere utterance.
38
98000
4000
Mendoje pak, genjeshtra ska asnjehere fuqi, cfaredo qofte menyra e te shprehurit dhe e te folurit.
01:57
Its power emerges
39
102000
2000
"Fuqia" e saj zbarkon
01:59
when someone else agrees to believe the lie.
40
104000
2000
vetem kur dikush tjeter pranon per ta besuar genjeshtren.
02:01
So I know it may sound like tough love,
41
106000
2000
E di qe mund te tingelloj si nje dashuri e veshtire,
02:03
but look, if at some point you got lied to,
42
108000
4000
por, nese ndonjehere edhe ju keni qene pala e genjyer
02:07
it's because you agreed to get lied to.
43
112000
2000
ka ndodhur sepse keni pranuar te jeni e tille.
02:09
Truth number one about lying: Lying's a cooperative act.
44
114000
3000
Rregulla e pare per genjeshtren: Genjeshtra eshte veprim bashkepunimi.
02:12
Now not all lies are harmful.
45
117000
2000
Tani, jo te gjitha genjeshtrat jane te demshme.
02:14
Sometimes we're willing participants in deception
46
119000
3000
Ndonjehere, pa vetedije, ne jemi te gatshem te marrim pjese tek mashtrimi
02:17
for the sake of social dignity,
47
122000
3000
per interesin e dinjitetit social,
02:20
maybe to keep a secret that should be kept secret, secret.
48
125000
3000
vetem per arsyen qe ta ruajme sekretin, si sekret.
02:23
We say, "Nice song."
49
128000
2000
Ne themi, " Kenge e mire."
02:25
"Honey, you don't look fat in that, no."
50
130000
3000
"Zemer, nuk e dukesh e shendosh nen te, jo"
02:28
Or we say, favorite of the digiratti,
51
133000
2000
Ose themi, kjo eshte e preferuara,
02:30
"You know, I just fished that email out of my spam folder.
52
135000
3000
"E dini cfare, sapo e fshiva ate e-mail nga folderi spam.
02:33
So sorry."
53
138000
3000
Me vjen keq."
02:36
But there are times when we are unwilling participants in deception.
54
141000
3000
Por ka edhe raste, kur ne sduam te jemi pjese ne mashtrim.
02:39
And that can have dramatic costs for us.
55
144000
3000
Dhe kjo mund te na kushtoje ne menyre drastike.
02:42
Last year saw 997 billion dollars
56
147000
3000
Vitin e kaluar u regjistruan 991 bilione dollare deme
02:45
in corporate fraud alone in the United States.
57
150000
4000
vetem nga mashtrimet e korporatave ne Amerike.
02:49
That's an eyelash under a trillion dollars.
58
154000
2000
Kjo eshte vetem nje pjesez e vogel nen nje trilion dollar.
02:51
That's seven percent of revenues.
59
156000
2000
Kjo i bie te jete gati shtate perqind e totalit te te ardhurave.
02:53
Deception can cost billions.
60
158000
2000
Ky eshte shembulli kur mashtrimi mund te kushtoje biliona.
02:55
Think Enron, Madoff, the mortgage crisis.
61
160000
3000
Mendoni rastin Enron, Madoff, krizen e hipotekave.
02:58
Or in the case of double agents and traitors,
62
163000
3000
Ose ne rastin e agjenteve te dyfishte dhe tradhtare,
03:01
like Robert Hanssen or Aldrich Ames,
63
166000
2000
si rastet e Robert Hanssen ose Aldrich Ames,
03:03
lies can betray our country,
64
168000
2000
kur genjeshtra mund edhe ta tradhtoje shtetin tone
03:05
they can compromise our security, they can undermine democracy,
65
170000
3000
mund ta kompromentoje sigurine dhe mund ta minoje zhvillimin e demokracise,
03:08
they can cause the deaths of those that defend us.
66
173000
3000
ato mund te shkaktojne vdekjen e atyre qe na mbrojne
03:11
Deception is actually serious business.
67
176000
3000
Mashtrimi realisht eshte nje teme serioze
03:14
This con man, Henry Oberlander,
68
179000
2000
Ky njeri batakçi, Henry Oberlander,
03:16
he was such an effective con man
69
181000
2000
ishte nje mashtrues efektiv me diplome.
03:18
British authorities say
70
183000
2000
Autoritetet britaneze thone se
03:20
he could have undermined the entire banking system of the Western world.
71
185000
3000
ai mund te jete njeriu i cili ka demtuar gjith sistemin bankar te botes perendimore.
03:23
And you can't find this guy on Google; you can't find him anywhere.
72
188000
2000
Dhe ju nuk mund ta gjeni kete njeri ne Google, nuk mund ta gjeni ne asnje vend.
03:25
He was interviewed once, and he said the following.
73
190000
3000
Ate e kishin intervistuar njehere, dhe kishte thene:
03:28
He said, "Look, I've got one rule."
74
193000
2000
"Shikoni, une e kam nje rregull."
03:30
And this was Henry's rule, he said,
75
195000
3000
Dhe ky ishte rregulli i Henrit, ai tha,
03:33
"Look, everyone is willing to give you something.
76
198000
2000
"Shikoni, cdokush eshte i gatshem t'ju jape juve dicka.
03:35
They're ready to give you something for whatever it is they're hungry for."
77
200000
3000
Ata jane te gatshem per t'ju dhene juve dicka per cfaredolloj gjeje qe ju jeni te etur per ta patur."
03:38
And that's the crux of it.
78
203000
2000
Dhe ky eshte thelbi.
03:40
If you don't want to be deceived, you have to know,
79
205000
2000
Nese nuk doni te jeni te mashtruar, duhet te dini
03:42
what is it that you're hungry for?
80
207000
2000
se per cfare ju jeni te etur?
03:44
And we all kind of hate to admit it.
81
209000
3000
Ky eshte fakti qe na vjen inat ta pranojme.
03:47
We wish we were better husbands, better wives,
82
212000
3000
Ne do te donim te ishim bashkeshort me te mire, gra me te mira,
03:50
smarter, more powerful,
83
215000
2000
me te mencur, me me shume fuqi,
03:52
taller, richer --
84
217000
2000
me te gjate, me te pasur
03:54
the list goes on.
85
219000
2000
dhe lista vazhdon pafundesisht.
03:56
Lying is an attempt to bridge that gap,
86
221000
2000
Genjeshtra eshte nje perpjekje per ta mbushur kete zbrazetire
03:58
to connect our wishes and our fantasies
87
223000
2000
nje menyre per ti lidhur deshirat dhe fantazite tona
04:00
about who we wish we were, how we wish we could be,
88
225000
3000
per ate cfare do te deshironim te ishim, dhe si do te deshironim te ishim,
04:03
with what we're really like.
89
228000
3000
me ate qe ne jemi ne te vertete.
04:06
And boy are we willing to fill in those gaps in our lives with lies.
90
231000
3000
Dhe eshte absurde nese keto zbrazetira duam ti mbushim me shtirje e genjeshtra
04:09
On a given day, studies show that you may be lied to
91
234000
3000
Ketyre diteve, disa studime konkluduan se ju mund te genjeheni
04:12
anywhere from 10 to 200 times.
92
237000
2000
ne cdo vend e kohe, nga 10 deri ne 200 here.
04:14
Now granted, many of those are white lies.
93
239000
3000
E keni te sigurt, qe shumica prej ketyre jane genjeshtra te bardha.
04:17
But in another study,
94
242000
2000
Por ne nje studimin tjeter,
04:19
it showed that strangers lied three times
95
244000
2000
thuhet se te huajt genjejne mbi tre here
04:21
within the first 10 minutes of meeting each other.
96
246000
2000
ne 10 minuteshin e pare te takimit te tyre.
04:23
(Laughter)
97
248000
2000
(Qeshje)
04:25
Now when we first hear this data, we recoil.
98
250000
3000
Dhe kur i degjojme per here te pare keto te dhena, ne terhiqemi.
04:28
We can't believe how prevalent lying is.
99
253000
2000
Ne smund te besojme se sa mbizoteron kudo genjeshtra.
04:30
We're essentially against lying.
100
255000
2000
Ne esence te gjithe jemi kunder saj.
04:32
But if you look more closely,
101
257000
2000
Por nese shikon pak me thelle
04:34
the plot actually thickens.
102
259000
2000
komploti vetem sa shkon e trashet.
04:36
We lie more to strangers than we lie to coworkers.
103
261000
3000
Ne genjejme me shume te huajt sesa koleget tane.
04:39
Extroverts lie more than introverts.
104
264000
4000
Njerezit me te gjalle genjejne me shume se ata qe jane me pak te socializuar.
04:43
Men lie eight times more about themselves
105
268000
3000
Meshkujt genjejne tete here me shume per veten e tyre
04:46
than they do other people.
106
271000
2000
sesa per te tjeret.
04:48
Women lie more to protect other people.
107
273000
3000
Ndersa femrat genjejne me shume per te mbrojtur te tjeret.
04:51
If you're an average married couple,
108
276000
3000
Nese ju jeni nje cift i martuar me jete mesatare,
04:54
you're going to lie to your spouse
109
279000
2000
do ta genjeni partnerin
04:56
in one out of every 10 interactions.
110
281000
2000
ne nje nga 10 raste te ndryshme.
04:58
Now you may think that's bad.
111
283000
2000
Tani ju mund ta mendoni se kjo eshte dicka e gabuar.
05:00
If you're unmarried, that number drops to three.
112
285000
2000
E ndersa jeni i pamartuar, numri shkon ne tre.
05:02
Lying's complex.
113
287000
2000
Genjeshtra eshte e kompleksuar.
05:04
It's woven into the fabric of our daily and our business lives.
114
289000
3000
Ajo eshte e fabrikuar ne fabriken e perditshmerise sone, jetes private e personale.
05:07
We're deeply ambivalent about the truth.
115
292000
2000
Ne jemi thellesit te irrituar nga ky realitet.
05:09
We parse it out on an as-needed basis,
116
294000
2000
Genjeshtren madje e kemi futur si nje baze te nevojshme ditore,
05:11
sometimes for very good reasons,
117
296000
2000
ndonjehere per arsye te mira,
05:13
other times just because we don't understand the gaps in our lives.
118
298000
3000
e ndonjehere vetem sepse nuk i kuptojme hapesirat e jetes tone.
05:16
That's truth number two about lying.
119
301000
2000
Kjo eshte e verteta e dyte e hidhur mbi genjeshtren.
05:18
We're against lying,
120
303000
2000
Ne jemi kunder te genjyerit,
05:20
but we're covertly for it
121
305000
2000
por ne menyre te fshehte jemi pro saj
05:22
in ways that our society has sanctioned
122
307000
2000
ne menyrat te cilat shoqeria jone i ka denuar
05:24
for centuries and centuries and centuries.
123
309000
2000
per shekuj e shekuj e shekuj me rradhe.
05:26
It's as old as breathing.
124
311000
2000
Genjeshtra eshte aq e vjeter sa edhe frymemarrja e njerezimit.
05:28
It's part of our culture, it's part of our history.
125
313000
2000
Eshte pjese e kultures sone, e historise sone.
05:30
Think Dante, Shakespeare,
126
315000
3000
Mendoni Danten, Shekspirin,
05:33
the Bible, News of the World.
127
318000
3000
Biblen, Te rejat e Botes.
05:36
(Laughter)
128
321000
2000
(Qeshje)
05:38
Lying has evolutionary value to us as a species.
129
323000
2000
Genjeshtra ka nje vlere shume perparimtare per ne si qenie njerezore.
05:40
Researchers have long known
130
325000
2000
Kerkuesit kane njohuri te medha
05:42
that the more intelligent the species,
131
327000
2000
se sa me inteligjente te jene speciet,
05:44
the larger the neocortex,
132
329000
2000
aq me tru te zhvilluar kane,
05:46
the more likely it is to be deceptive.
133
331000
2000
aq me te prirur per te mashtruar e genjyer jane.
05:48
Now you might remember Koko.
134
333000
2000
Tani ju mund te kujtoni Kokon.
05:50
Does anybody remember Koko the gorilla who was taught sign language?
135
335000
3000
A ka ndonjeri qe kujton Kokon, Kokoja, gorilles qe iu mesua gjuha e shenjave?
05:53
Koko was taught to communicate via sign language.
136
338000
3000
Koko ishte trajnuar per te komunikuar me gjuhen e shenjave.
05:56
Here's Koko with her kitten.
137
341000
2000
Ketu eshte Koko me kotelen e saj.
05:58
It's her cute little, fluffy pet kitten.
138
343000
3000
Eshte kotelja e saj simpatike, plot gezof.
06:01
Koko once blamed her pet kitten
139
346000
2000
Koko njehere ka fajesuar kotelen e saj
06:03
for ripping a sink out of the wall.
140
348000
2000
per problemet me shkelqimin e lavamanit nga muri i saj.
06:05
(Laughter)
141
350000
2000
(Qeshje)
06:07
We're hardwired to become leaders of the pack.
142
352000
2000
Ne jemi te ster-trajnuar per te qene lideret e nje pakete.
06:09
It's starts really, really early.
143
354000
2000
Dhe kjo fillon shume shume heret ne kohe
06:11
How early?
144
356000
2000
Sa heret?
06:13
Well babies will fake a cry,
145
358000
2000
Foshnjet do te shtiren kur qajne,
06:15
pause, wait to see who's coming
146
360000
2000
do te pauzojne, te shohin se kush po shfaqet
06:17
and then go right back to crying.
147
362000
2000
dhe me pas do te kthehen e te qajne perseri.
06:19
One-year-olds learn concealment.
148
364000
2000
Nje vjecaret i mesojne keto fshehje mistrece.
06:21
(Laughter)
149
366000
2000
(Qeshje)
06:23
Two-year-olds bluff.
150
368000
2000
Dy vjecaret bejne blof.
06:25
Five-year-olds lie outright.
151
370000
2000
Pese vjecaret genjejne keshtu, plotesisht troç.
06:27
They manipulate via flattery.
152
372000
2000
Ata manipulojne me ane te shakave.
06:29
Nine-year-olds, masters of the cover up.
153
374000
3000
Nente vjecaret jane eksperte per ti fshehur pasojat e veprimeve.
06:32
By the time you enter college,
154
377000
2000
Ne kohen kur futesh ne shkolle te mesme
06:34
you're going to lie to your mom in one out of every five interactions.
155
379000
3000
do ta genjesh nenen nje nga nenete rastet e mundshme.
06:37
By the time we enter this work world and we're breadwinners,
156
382000
3000
Ndersa ne momentin qe ne i futemi punes se te qenurit mbajtes te familjes
06:40
we enter a world that is just cluttered
157
385000
2000
ne futemi ne botera te ndryshme te shperndara
06:42
with spam, fake digital friends,
158
387000
2000
me miq te shtirur dixhital,
06:44
partisan media,
159
389000
2000
media te animeve partiake,
06:46
ingenious identity thieves,
160
391000
2000
hajdute te zgjuar te identitetit,
06:48
world-class Ponzi schemers,
161
393000
2000
deshtime bankare si ajo Ponzi Schemers,
06:50
a deception epidemic --
162
395000
2000
nje mashtrim epidemik --
06:52
in short, what one author calls
163
397000
2000
shkurtimisht, siç nje autor e quan
06:54
a post-truth society.
164
399000
3000
nje shoqeri e post-realitetit.
06:57
It's been very confusing
165
402000
2000
Ka qene shume konfuze
06:59
for a long time now.
166
404000
3000
per nje kohe te gjate.
07:03
What do you do?
167
408000
2000
Cfare ben ti?
07:05
Well there are steps we can take
168
410000
2000
Epo, jane disa masa qe mund ti marrim
07:07
to navigate our way through the morass.
169
412000
2000
per te ndricuar rrugen tone.
07:09
Trained liespotters get to the truth 90 percent of the time.
170
414000
3000
Detektivet e genjeshtareve jane te trajnuar dhe arrijne tek e verteta ne 90 perqind te rasteve.
07:12
The rest of us, we're only 54 percent accurate.
171
417000
3000
Pjesa tjeter e jona, jemi veten 54 perqind te sakte.
07:15
Why is it so easy to learn?
172
420000
2000
Pse eshte kaq e lehte per te mesuar?
07:17
There are good liars and there are bad liars. There are no real original liars.
173
422000
3000
Ekzistojne genjeshtare te mire, dhe te keqinj. Por nuk ekzistojne genjeshtare origjinale.
07:20
We all make the same mistakes. We all use the same techniques.
174
425000
3000
Ne te gjithe bejme te njejtat gabime. Ne te gjithe perdorim te njejtat teknika.
07:23
So what I'm going to do
175
428000
2000
Pra, ajo cka une do te bej
07:25
is I'm going to show you two patterns of deception.
176
430000
2000
eshte se do tju tregoj dy modele te mashtrimit.
07:27
And then we're going to look at the hot spots and see if we can find them ourselves.
177
432000
3000
Dhe me pas do ti shikojme dhe analizojme ato, dhe ti gjejme vete tiparet.
07:30
We're going to start with speech.
178
435000
3000
Le te fillojme me fjalimin.
07:33
(Video) Bill Clinton: I want you to listen to me.
179
438000
2000
(Video) Bill Clinton: Dua qe ju te gjithe te me degjoni.
07:35
I'm going to say this again.
180
440000
2000
Dhe do ta them perseri
07:37
I did not have sexual relations
181
442000
3000
Une nuk kam pasur marredhenie seksuale
07:40
with that woman, Miss Lewinsky.
182
445000
4000
me ate gruan, Zonjen Lewinsky.
07:44
I never told anybody to lie,
183
449000
2000
Asnjehere si kam thene dikujt qe te genjej,
07:46
not a single time, never.
184
451000
2000
asnjehere te vetme, kurre.
07:48
And these allegations are false.
185
453000
3000
Dhe keto akuza jane te gjitha te rreme
07:51
And I need to go back to work for the American people.
186
456000
2000
Mua me duhet ti kthehem punes per popullin amerikan
07:53
Thank you.
187
458000
2000
Faleminderit.
07:58
Pamela Meyer: Okay, what were the telltale signs?
188
463000
3000
Pamela Meyer: Ne rregull, cilat ishin shenjat prej llafazani?
08:01
Well first we heard what's known as a non-contracted denial.
189
466000
4000
Epo se pari degjuam se cka do te thote mohim jo-protokolar.
08:05
Studies show that people who are overdetermined in their denial
190
470000
3000
Studimet tregojne se njerezit qe jane shume parashikues ne mohimet e tyre
08:08
will resort to formal rather than informal language.
191
473000
3000
do te perdorin me shume gjuhen zyrtare sesa informale.
08:11
We also heard distancing language: "that woman."
192
476000
3000
Gjithashtu degjuam pak edhe nga fraza distanciale " ajo grua".
08:14
We know that liars will unconsciously distance themselves
193
479000
2000
Ne e dime se genjeshtaret ne menyre jo te pergjegjshme distancojne veten
08:16
from their subject
194
481000
2000
nga subjekti i tyre
08:18
using language as their tool.
195
483000
3000
duke perdorur gjuhen si nje arme te forte.
08:21
Now if Bill Clinton had said, "Well, to tell you the truth ... "
196
486000
3000
Tashme, nese Bill Clinton kishte thene " Epo, me ju thene te drejten..."
08:24
or Richard Nixon's favorite, "In all candor ... "
197
489000
2000
ose te famshmit e Richard Nixon, "Me tere sinqeritetin...."
08:26
he would have been a dead giveaway
198
491000
2000
ai do te ishte nje anashkalues i vdekur
08:28
for any liespotter than knows
199
493000
2000
per çdo genjeshtar qe di
08:30
that qualifying language, as it's called, qualifying language like that,
200
495000
3000
se gjuha e kualifikuar, siç quhet, nese kualifikohet ne ate menyre
08:33
further discredits the subject.
201
498000
2000
e diskretiton me tutje temen.
08:35
Now if he had repeated the question in its entirety,
202
500000
3000
Tani ne qoftese ai e kishte perseritur ne teresi pyetjen,
08:38
or if he had peppered his account with a little too much detail --
203
503000
4000
ose ne qoftese ai e kishte mbushur llogarine e tij me nje detaj te vogel --
08:42
and we're all really glad he didn't do that --
204
507000
2000
dhe ne jemi te gjithe te kenaqur qe ai nuk e beri --
08:44
he would have further discredited himself.
205
509000
2000
ai do ta kishte fshehur veten edhe me tutje.
08:46
Freud had it right.
206
511000
2000
Frojdi kishte te drejtat e tij.
08:48
Freud said, look, there's much more to it than speech:
207
513000
3000
Frojdi ka thene, shiko, ketu ka me shume se nje fjalim:
08:51
"No mortal can keep a secret.
208
516000
3000
"Asnje njeri i vdekshem smund te mbaje nje sekret.
08:54
If his lips are silent, he chatters with his fingertips."
209
519000
3000
Nese buzet nuk i flasin, ai flet me ane te gishtave."
08:57
And we all do it no matter how powerful you are.
210
522000
3000
Dhe ne te gjithe e bejme kete, pa marre parasysh fuqine.
09:00
We all chatter with our fingertips.
211
525000
2000
Ne te gjithe merremi vesh me ane te shenjave te gishtave
09:02
I'm going to show you Dominique Strauss-Kahn with Obama
212
527000
3000
Une do tju tregoj Dominique Strauss Kahn me Obamen
09:05
who's chattering with his fingertips.
213
530000
3000
te cilet po merren vesh me ane te shenjave te gishtave.
09:08
(Laughter)
214
533000
3000
(Qeshje)
09:11
Now this brings us to our next pattern,
215
536000
3000
Tani kjo na con ne nje model tjeter,
09:14
which is body language.
216
539000
3000
qe eshte gjuha e trupit.
09:17
With body language, here's what you've got to do.
217
542000
3000
Me gjuhen e trupit, ja cfare duhet te bejme.
09:20
You've really got to just throw your assumptions out the door.
218
545000
3000
You vetem duhet te hidhni supozimet tuaj nga dera.
09:23
Let the science temper your knowledge a little bit.
219
548000
2000
Lere shkencen qe te kete durim pak me njohurite tuaj.
09:25
Because we think liars fidget all the time.
220
550000
3000
Sepse ne mendojme qe genjeshtaret luajne gjate gjithe kohes.
09:28
Well guess what, they're known to freeze their upper bodies when they're lying.
221
553000
3000
Epo degjojeni kete, ata jane te njohur qe te ngrijne trupat e tyre te siperm kur jane duke genjyer.
09:31
We think liars won't look you in the eyes.
222
556000
3000
Ne mendojme se genjeshtaret nuk na shikojne ne sy.
09:34
Well guess what, they look you in the eyes a little too much
223
559000
2000
Epo degjoni kete, ata ju shikojne ne sy me shume sec duhet
09:36
just to compensate for that myth.
224
561000
2000
vetem per ta kompensuar per ate mit.
09:38
We think warmth and smiles
225
563000
2000
Ne gjithmone mendojme se ngrohtesia dhe buzeqeshja
09:40
convey honesty, sincerity.
226
565000
2000
pecjellin sinqeritet dhe besnikeri
09:42
But a trained liespotter
227
567000
2000
Por nje zbulues i trajnuar genjeshtrash
09:44
can spot a fake smile a mile away.
228
569000
2000
mund ta dalloje nje buzeqeshje te shtirur me kilometra.
09:46
Can you all spot the fake smile here?
229
571000
3000
A mundeni ju te gjithe ta identifikoni buzeqeshjen e shtirur?
09:50
You can consciously contract
230
575000
2000
Ju me vetedije mund ti kontrolloni
09:52
the muscles in your cheeks.
231
577000
3000
muskujt ne faqen tuaj.
09:55
But the real smile's in the eyes, the crow's feet of the eyes.
232
580000
3000
Por buzeqeshja e vertete eshte tek syte, qendron tek syte.
09:58
They cannot be consciously contracted,
233
583000
2000
Ata nuk mund te kontrollohen me vetedije,
10:00
especially if you overdid the Botox.
234
585000
2000
sidomos nese ke mbiperdorur Botoks.
10:02
Don't overdo the Botox; nobody will think you're honest.
235
587000
3000
Mos e mbi-perdor Botoksin, askush sdo te mendoj se je i sinqerte.
10:05
Now we're going to look at the hot spots.
236
590000
2000
Tani do te shikojme disa spote.
10:07
Can you tell what's happening in a conversation?
237
592000
2000
A mund te tregoni cfare po ndodh ne bisede?
10:09
Can you start to find the hot spots
238
594000
3000
A mund te filloni ti shihni genjeshtrat
10:12
to see the discrepancies
239
597000
2000
mosperputhjet
10:14
between someone's words and someone's actions?
240
599000
2000
midis veprimeve dhe fjaleve te njerezve?
10:16
Now I know it seems really obvious,
241
601000
2000
Tani e di se eshte me te vertete e dukshme
10:18
but when you're having a conversation
242
603000
2000
por kur je buke bere bisede
10:20
with someone you suspect of deception,
243
605000
3000
me dike per te cilin dyshon se po te mashtron,
10:23
attitude is by far the most overlooked but telling of indicators.
244
608000
3000
sjellja, deri me sot eshte indikatori me i madh i se vertetes.
10:26
An honest person is going to be cooperative.
245
611000
2000
Nje person i ndershem do te jete bashkepunues.
10:28
They're going to show they're on your side.
246
613000
2000
Ata do tju tregojne qe jane ne anen tuaj.
10:30
They're going to be enthusiastic.
247
615000
2000
Ata do te jene entuziaste.
10:32
They're going to be willing and helpful to getting you to the truth.
248
617000
2000
Ata do te shprehin gatishmeri dhe ndihme per te mberritur tek e verteta.
10:34
They're going to be willing to brainstorm, name suspects,
249
619000
3000
Ata do te jene te gatshem per ide, emra te dyshimte,
10:37
provide details.
250
622000
2000
dhe te sigurojne detaje.
10:39
They're going to say, "Hey,
251
624000
2000
Ata do te thone,"Hej,
10:41
maybe it was those guys in payroll that forged those checks."
252
626000
3000
ndoshta ishin ata djemte e listave te pagave qe kane harruar cekun."
10:44
They're going to be infuriated if they sense they're wrongly accused
253
629000
3000
Ata do te zemerohen nese e ndjejne se po akuzohen padrejtesisht
10:47
throughout the entire course of the interview, not just in flashes;
254
632000
2000
pergjate tere intervistes, jo vetem ne fleshe te ndara;
10:49
they'll be infuriated throughout the entire course of the interview.
255
634000
3000
ata do te jene te irrituar gjate tere intervistes.
10:52
And if you ask someone honest
256
637000
2000
Nese pyet dike me sinqeritet
10:54
what should happen to whomever did forge those checks,
257
639000
3000
cfare duhet te bejme me ata djemte e cekut,
10:57
an honest person is much more likely
258
642000
2000
nje person i sinqerte ka shume mundesi
10:59
to recommend strict rather than lenient punishment.
259
644000
4000
te rekomandoje nje denim strikt, sesa nje te bute
11:03
Now let's say you're having that exact same conversation
260
648000
2000
Le te themi se ju e keni po ate bisede
11:05
with someone deceptive.
261
650000
2000
me nje mashtrues.
11:07
That person may be withdrawn,
262
652000
2000
Ky person mund te terhiqet,
11:09
look down, lower their voice,
263
654000
2000
te shikoje posht, ta ul zerin
11:11
pause, be kind of herky-jerky.
264
656000
2000
te pushoje, te sillet si herky-jerky.
11:13
Ask a deceptive person to tell their story,
265
658000
2000
Pyet nje mashtrues ta tregoje historine e tyre,
11:15
they're going to pepper it with way too much detail
266
660000
3000
ata do ta mbushin ate me plot detaje
11:18
in all kinds of irrelevant places.
267
663000
3000
ne disa vende te panevojshme.
11:21
And then they're going to tell their story in strict chronological order.
268
666000
3000
Dhe pastaj do ta ri-tregojne historine ne nje renditje kronologjike.
11:24
And what a trained interrogator does
269
669000
2000
Dhe ajo cfare ben nje hetues i trajnuar
11:26
is they come in and in very subtle ways
270
671000
2000
eshte se futen ne disa skuta dhe menyra te holla
11:28
over the course of several hours,
271
673000
2000
per disa ore te tera,
11:30
they will ask that person to tell that story backwards,
272
675000
3000
dhe do ta pyesin ate person per ta treguar historine edhe njehere
11:33
and then they'll watch them squirm,
273
678000
2000
dhe do te shikojne ato pershperitjet, belbezimet,
11:35
and track which questions produce the highest volume of deceptive tells.
274
680000
3000
dhe do ti bejne disa pyetje nga te cilat del perfundimi se ky eshte mashtrim.
11:38
Why do they do that? Well we all do the same thing.
275
683000
3000
Pse e bejne ata kete gje? Epo ne te gjithe e bejme te njejten gje.
11:41
We rehearse our words,
276
686000
2000
Ne i ushtrojme fjalet tona,
11:43
but we rarely rehearse our gestures.
277
688000
2000
por rralle i ushtrojme edhe gjestet.
11:45
We say "yes," we shake our heads "no."
278
690000
2000
Ne themi "po", ndersa e tundim koken "jo".
11:47
We tell very convincing stories, we slightly shrug our shoulders.
279
692000
3000
Ne tregojme histori shume te bindshme, dhe i ngreme lehte supet.
11:50
We commit terrible crimes,
280
695000
2000
Ne kryejme krime te tmerrshme,
11:52
and we smile at the delight in getting away with it.
281
697000
3000
dhe buzeqeshim me kenaqesi qe shpetuam pa u kapur.
11:55
Now that smile is known in the trade as "duping delight."
282
700000
3000
Dhe kjo buzeqeshje njihet ne profesion si "kenaqesia e mashtrimit".
11:58
And we're going to see that in several videos moving forward,
283
703000
3000
Dhe ne do te shohim te tilla ne disa sekuenca te videove,
12:01
but we're going to start -- for those of you who don't know him,
284
706000
2000
por ne do te fillojme - per ju te cilet nuk e njihni ate,
12:03
this is presidential candidate John Edwards
285
708000
3000
ky eshte kanditati per president, John Edwards
12:06
who shocked America by fathering a child out of wedlock.
286
711000
3000
i cili shokoi Ameriken kur u be baba i nje femije jashte martese.
12:09
We're going to see him talk about getting a paternity test.
287
714000
3000
Ne do ta shohim ate duke folur per testin e atesise.
12:12
See now if you can spot him
288
717000
2000
Shikojeni tani nese mund ta vereni
12:14
saying, "yes" while shaking his head "no,"
289
719000
2000
ai thote "Po" , ndersa e shkund koken per "Jo"
12:16
slightly shrugging his shoulders.
290
721000
2000
duke i ngritur lehte supet.
12:18
(Video) John Edwards: I'd be happy to participate in one.
291
723000
2000
(Video) John Edwards: Do te isha i lumtur te merrja pjese.
12:20
I know that it's not possible that this child could be mine,
292
725000
3000
E di qe eshte e pamundur qe ky femije te jete i imi
12:23
because of the timing of events.
293
728000
2000
per shkak te kohes se ngjarjeve qe kane ndodhur
12:25
So I know it's not possible.
294
730000
2000
Keshtu qe, e di qe eshte e pamundur.
12:27
Happy to take a paternity test,
295
732000
2000
Jam i lumtur per te bere testin e atesise,
12:29
and would love to see it happen.
296
734000
2000
dhe do te doja ta shihja ate te ndodhte.
12:31
Interviewer: Are you going to do that soon? Is there somebody --
297
736000
3000
Intervistuesi: A do ta beni ate se shpejti? A ka dikush --
12:34
JE: Well, I'm only one side. I'm only one side of the test.
298
739000
3000
JE: Epo, une jam vetem njera pale. Jam vetem njera pale e testit.
12:37
But I'm happy to participate in one.
299
742000
3000
Por jam i lumtur te marr pjese ne nje.
12:40
PM: Okay, those head shakes are much easier to spot
300
745000
2000
PM: Ne rregull, keto levizje te kokes jane me te lehta per ti kapur
12:42
once you know to look for them.
301
747000
2000
sapo te dish se duhet t'i kerkosh.
12:44
There're going to be times
302
749000
2000
Do te kete raste
12:46
when someone makes one expression
303
751000
2000
kur dikush ben nje shprehje
12:48
while masking another that just kind of leaks through in a flash.
304
753000
3000
duke maskuar nje tjeter qe rrjedh per nje cast.
12:52
Murderers are known to leak sadness.
305
757000
2000
Vrasesit njihen se lene te rrjedhe trishtim.
12:54
Your new joint venture partner might shake your hand,
306
759000
2000
Partneri juaj i ri mund t`ju shtrengoje doren
12:56
celebrate, go out to dinner with you
307
761000
2000
te festoje, te dale ne darke me ju
12:58
and then leak an expression of anger.
308
763000
3000
dhe pastaj te shfaqe nje ndjenje te zemerimit.
13:01
And we're not all going to become facial expression experts overnight here,
309
766000
3000
Asnjeri prej nesh nuk do te shnderrohet ne nje ekspert te dallimit te fytyrave, brenda nates
13:04
but there's one I can teach you that's very dangerous, and it's easy to learn,
310
769000
3000
por mund tju mesoj dicka qe eshte vertet e rrezikshme, dhe eshte e lehte per tu mesuar
13:07
and that's the expression of contempt.
311
772000
3000
dhe kjo eshte shprehja e perbuzjes.
13:10
Now with anger, you've got two people on an even playing field.
312
775000
3000
Me zemerimin, ke dy njerez ne nje fushe te barabarte loje.
13:13
It's still somewhat of a healthy relationship.
313
778000
2000
Eshte ende nje marredhenie disi e shendetshme.
13:15
But when anger turns to contempt,
314
780000
2000
Por kur zemerimi kthehet ne perbuzje,
13:17
you've been dismissed.
315
782000
2000
ju jeni hedhur poshte.
13:19
It's associated with moral superiority.
316
784000
2000
Eshte e lidhur me superioritetin moral.
13:21
And for that reason, it's very, very hard to recover from.
317
786000
3000
Dhe per kete arsye, eshte shume, shume i veshtire sherimi.
13:24
Here's what it looks like.
318
789000
2000
Ja se si duket.
13:26
It's marked by one lip corner
319
791000
2000
Shenohet nga nje qoshe e buzes
13:28
pulled up and in.
320
793000
2000
e terhequr lart dhe brenda.
13:30
It's the only asymmetrical expression.
321
795000
3000
Eshte e vetmja shprehje asimetrike.
13:33
And in the presence of contempt,
322
798000
2000
Dhe ne pranine e perbuzjes,
13:35
whether or not deception follows --
323
800000
2000
pavaresisht nese pason mashtrimi --
13:37
and it doesn't always follow --
324
802000
2000
dhe ai nuk pason gjithmone --
13:39
look the other way, go the other direction,
325
804000
2000
shikojeni anen tjeter, shkoni ne tjetrin drejtim
13:41
reconsider the deal,
326
806000
2000
rishqyrtojeni marreveshjen,
13:43
say, "No thank you. I'm not coming up for just one more nightcap. Thank you."
327
808000
4000
thoni "Jo faleminderit. Nuk do te ngjitem lart vetem per nje gote tjeter. Faleminderit."
13:47
Science has surfaced
328
812000
2000
Shkenca ka nxjerr ne pah
13:49
many, many more indicators.
329
814000
2000
shume e shume indikatore te tjere.
13:51
We know, for example,
330
816000
2000
Ne e dime, pershembull
13:53
we know liars will shift their blink rate,
331
818000
2000
qe genjeshtaret e ndryshojne shpeshtesine e pulitjes se syve,
13:55
point their feet towards an exit.
332
820000
2000
dhe kembet i cojne per tek dalja.
13:57
They will take barrier objects
333
822000
2000
Ata do te marrin objekte penguese
13:59
and put them between themselves and the person that is interviewing them.
334
824000
3000
duke i futur ato mes vetes dhe personit me te cilin po bejne interviste
14:02
They'll alter their vocal tone,
335
827000
2000
Ata do ta ndryshojne tonin e zerit te tyre,
14:04
often making their vocal tone much lower.
336
829000
3000
shpesh duke e ulur shume tonin e zerit.
14:07
Now here's the deal.
337
832000
2000
Por ja ku qendron ceshtja.
14:09
These behaviors are just behaviors.
338
834000
3000
Keto sjellje jane vetem sjellje.
14:12
They're not proof of deception.
339
837000
2000
Ato nuk jane prova te nje mashtrimi.
14:14
They're red flags.
340
839000
2000
Ato jane vetem flamuj te kuq.
14:16
We're human beings.
341
841000
2000
Ne jemi qenie njerezore
14:18
We make deceptive flailing gestures all over the place all day long.
342
843000
3000
Ne bejme shpesh xheste mashtruese ne shume vende, tere diten.
14:21
They don't mean anything in and of themselves.
343
846000
2000
Ne vetvete, ato nuk do te thone asgje.
14:23
But when you see clusters of them, that's your signal.
344
848000
3000
Por kur i sheh se ata po grupohen, atehere kjo eshte shenja.
14:26
Look, listen, probe, ask some hard questions,
345
851000
3000
Shiko, degjo, heto, bej disa pyetje te veshtira,
14:29
get out of that very comfortable mode of knowing,
346
854000
3000
dil pak prej asaj pozites komode te te njohurit,
14:32
walk into curiosity mode, ask more questions,
347
857000
3000
por dil ne ate rrugen e kuriozitetit, bej ende pyetje,
14:35
have a little dignity, treat the person you're talking to with rapport.
348
860000
3000
ki pak dinjitet, dhe trajtoje me mirekuptim personin me te cilin po flet.
14:38
Don't try to be like those folks on "Law & Order" and those other TV shows
349
863000
3000
Mos u mundo te behesh si ata tek "Ligji & Rregulli" ose si ata te programeve tjera
14:41
that pummel their subjects into submission.
350
866000
2000
qe i rrahin te pyeturit e tyre derisa te dorezohen.
14:43
Don't be too aggressive, it doesn't work.
351
868000
3000
Mos u bej fort agresiv, sepse nuk funksionon.
14:46
Now we've talked a little bit
352
871000
2000
Tani ne biseduam pak
14:48
about how to talk to someone who's lying
353
873000
2000
se si te komunikosh me dike qe genjen
14:50
and how to spot a lie.
354
875000
2000
dhe si te zbulosh nje genjeshter.
14:52
And as I promised, we're now going to look at what the truth looks like.
355
877000
3000
Dhe sic kam premtuar, tashme do te shohim se si duket e verteta.
14:55
But I'm going to show you two videos,
356
880000
2000
Por une do tju tregoj dy video,
14:57
two mothers -- one is lying, one is telling the truth.
357
882000
3000
dy nena - njera po genjen, njera thote te verteten.
15:00
And these were surfaced
358
885000
2000
Dhe te dyja kane dale ne siperfaqe
15:02
by researcher David Matsumoto in California.
359
887000
2000
nga studiusi David Matsumoto nga Kalifornia.
15:04
And I think they're an excellent example
360
889000
2000
Dhe mendoj se ato jane nje shembull brilant
15:06
of what the truth looks like.
361
891000
2000
se si duket e verteta.
15:08
This mother, Diane Downs,
362
893000
2000
Kjo nene, Diane Downs
15:10
shot her kids at close range,
363
895000
2000
i qelloi femijet e saj nga nje distance e afert,
15:12
drove them to the hospital
364
897000
2000
i dergoi ata per ne spital
15:14
while they bled all over the car,
365
899000
2000
nderkohe qe ata gjakosnin ne te gjithe makinen,
15:16
claimed a scraggy-haired stranger did it.
366
901000
2000
duke pretenduar se e kishte bere nje i panjohur me floke te shprishur.
15:18
And you'll see when you see the video,
367
903000
2000
Dhe ju do ta shihni edhe ne video
15:20
she can't even pretend to be an agonizing mother.
368
905000
2000
se ajo as nuk mund te shtiret si nje nene e pikelluar.
15:22
What you want to look for here
369
907000
2000
Cfare ju deshironi te shikoni ketu
15:24
is an incredible discrepancy
370
909000
2000
eshte nje mosperputhje e pabesueshme
15:26
between horrific events that she describes
371
911000
2000
midis ngjarjeve te tmerrshme qe ajo pershkruan
15:28
and her very, very cool demeanor.
372
913000
2000
dhe sjelljes se saj shume te ftohte.
15:30
And if you look closely, you'll see duping delight throughout this video.
373
915000
3000
Dhe nese shikoni me vemendje, do ta shihni kenaqesine e mashtrimit gjate gjithe kesaj videoje.
15:33
(Video) Diane Downs: At night when I close my eyes,
374
918000
2000
(Video) Diane Downs: Naten kur i mbyll syte,
15:35
I can see Christie reaching her hand out to me while I'm driving,
375
920000
3000
mund ta shoh Christie-n duke e zgjatur doren drejt meje nderkohe qe une ngas makinen,
15:38
and the blood just kept coming out of her mouth.
376
923000
3000
dhe gjaku vazhdonte t'i dilte nga goja.
15:41
And that -- maybe it'll fade too with time --
377
926000
2000
Dhe kjo -- ndoshta do te zhduket me kohen --
15:43
but I don't think so.
378
928000
2000
por une nuk e besoj.
15:45
That bothers me the most.
379
930000
3000
Kjo me shqeteson me se shumti.
15:55
PM: Now I'm going to show you a video
380
940000
2000
PM: Tani do tju tregoj nje video
15:57
of an actual grieving mother, Erin Runnion,
381
942000
2000
te nje nene vertet te pikelluar, Erin Runnion,
15:59
confronting her daughter's murderer and torturer in court.
382
944000
4000
duke u perballur ne gjykate me vrasesin dhe torturuesin e vajzes se saj.
16:03
Here you're going to see no false emotion,
383
948000
2000
Ketu ju nuk do te shihni emocione te shtirura,
16:05
just the authentic expression of a mother's agony.
384
950000
3000
por vetem shprehjen autentike te agonise se nje nene.
16:08
(Video) Erin Runnion: I wrote this statement on the third anniversary
385
953000
2000
(Video) Erin Runnion: E kam shkruar kete deklarate ne pervjetorin e trete
16:10
of the night you took my baby,
386
955000
2000
te asaj nate qe ti ma more femijen,
16:12
and you hurt her,
387
957000
2000
dhe e lendove ate,
16:14
and you crushed her,
388
959000
2000
e copetove ate,
16:16
you terrified her until her heart stopped.
389
961000
4000
e tmerrove deri ne momentin kur zemra e saj ndaloi.
16:20
And she fought, and I know she fought you.
390
965000
3000
Dhe ajo ka luftuar, e di qe ajo ka luftuar kunder teje.
16:23
But I know she looked at you
391
968000
2000
Por e di se ajo te ka shikuar
16:25
with those amazing brown eyes,
392
970000
2000
me syte e saj te kafte dhe te mrekullueshem,
16:27
and you still wanted to kill her.
393
972000
3000
dhe ti prape ke dashur ta vrasesh.
16:30
And I don't understand it,
394
975000
2000
Dhe une nuk e kuptoj
16:32
and I never will.
395
977000
3000
dhe skam per ta kuptuar kurre.
16:35
PM: Okay, there's no doubting the veracity of those emotions.
396
980000
4000
PM: Ne rregull, s`ka asnje pike dyshimi ne vertetesine e ketyre emocioneve.
16:39
Now the technology around what the truth looks like
397
984000
3000
Tani teknologjia rreth asaj se si duket e verteta
16:42
is progressing on, the science of it.
398
987000
3000
eshte duke perparuar, sikurse edhe shkenca.
16:45
We know for example
399
990000
2000
Ne e dime, per shembull
16:47
that we now have specialized eye trackers and infrared brain scans,
400
992000
3000
se tani kemi gjurmues te specializuar te syve dhe skanera infrared te trurit,
16:50
MRI's that can decode the signals that our bodies send out
401
995000
3000
MRI qe mund te dekodojne sinjalet qe trupi yne leshon
16:53
when we're trying to be deceptive.
402
998000
2000
kur ne deshirojme te mashtrojme.
16:55
And these technologies are going to be marketed to all of us
403
1000000
3000
Dhe keto teknologji do ti serviren secilit prej nesh
16:58
as panaceas for deceit,
404
1003000
2000
si ilace universale kunder mashtrimit,
17:00
and they will prove incredibly useful some day.
405
1005000
3000
dhe nje dite do te deshmohen jashtezakonisht te dobishme.
17:03
But you've got to ask yourself in the meantime:
406
1008000
2000
Por gjithashtu ne te njejten kohe ju duhet te pyesni veten:
17:05
Who do you want on your side of the meeting,
407
1010000
2000
Ke do te deshironi ne anen tuaj gjate takimit
17:07
someone who's trained in getting to the truth
408
1012000
3000
dike qe eshte trajnuar ta gjeje te verteten
17:10
or some guy who's going to drag a 400-pound electroencephalogram
409
1015000
2000
apo dike qe do te terheqe nje elektroencefalograf prej 400 paundesh
17:12
through the door?
410
1017000
2000
permes deres?
17:14
Liespotters rely on human tools.
411
1019000
4000
Zbuluesit e genjeshtrave mbeshteten ne mjetet e njeriut.
17:18
They know, as someone once said,
412
1023000
2000
Ata e dine, sic dikush ka thene,
17:20
"Character's who you are in the dark."
413
1025000
2000
"Karakteri eshte ai qe je ne erresire."
17:22
And what's kind of interesting
414
1027000
2000
Dhe cfare eshte shume interesante
17:24
is that today we have so little darkness.
415
1029000
2000
eshte se diteve te sotme ne kemi shume pak erresire.
17:26
Our world is lit up 24 hours a day.
416
1031000
3000
Bota jone ndricohet 24 ore pa pushim.
17:29
It's transparent
417
1034000
2000
Eshte transparente
17:31
with blogs and social networks
418
1036000
2000
me blogje dhe rrjete sociale
17:33
broadcasting the buzz of a whole new generation of people
419
1038000
2000
duke transmetuar levizjet e gjenerates se re te njerezve
17:35
that have made a choice to live their lives in public.
420
1040000
3000
qe kane bere zgjedhjen te jetojne jeten e tyre ne publik.
17:38
It's a much more noisy world.
421
1043000
4000
Eshte nje bote me e zhurmshme.
17:42
So one challenge we have
422
1047000
2000
Dhe, nje sfide qe ne e kemi
17:44
is to remember,
423
1049000
2000
eshte qe te kujtojme
17:46
oversharing, that's not honesty.
424
1051000
3000
se ndarja e tepruar e informacionit nuk eshte ndershmeri.
17:49
Our manic tweeting and texting
425
1054000
2000
Mania jone per te twittuar apo shkruar
17:51
can blind us to the fact
426
1056000
2000
mund te na verboje ne faktin
17:53
that the subtleties of human decency -- character integrity --
427
1058000
3000
se hollesite e dinjitetit njerezor -- integriteti i karakterit --
17:56
that's still what matters, that's always what's going to matter.
428
1061000
3000
kane shume rendesi, dhe eshte ajo qe do te kete gjithmone rendesi.
17:59
So in this much noisier world,
429
1064000
2000
Pra, ne nje bote te tille te zhurmshme,
18:01
it might make sense for us
430
1066000
2000
mund te kete kuptim per ne
18:03
to be just a little bit more explicit
431
1068000
2000
qe te jemi pak me shume te qarte
18:05
about our moral code.
432
1070000
3000
rreth kodit tone moral.
18:08
When you combine the science of recognizing deception
433
1073000
2000
Kur e kombinon shkencen e te zbuluarit te mashtrimit
18:10
with the art of looking, listening,
434
1075000
2000
me artin e shikimit, degjimit,
18:12
you exempt yourself from collaborating in a lie.
435
1077000
3000
ti e largon veten nga bashkepunimi ne nje genjeshter.
18:15
You start up that path
436
1080000
2000
Ju e nisni kete rruge
18:17
of being just a little bit more explicit,
437
1082000
2000
duke qene pak me te qarte,
18:19
because you signal to everyone around you,
438
1084000
2000
sepse u sinjalizon te gjitheve rreth teje,
18:21
you say, "Hey, my world, our world,
439
1086000
3000
dhe thua, "Hej, bota ime, bota jone,
18:24
it's going to be an honest one.
440
1089000
2000
do te jete nje bote e ndershme.
18:26
My world is going to be one where truth is strengthened
441
1091000
2000
Bota ime do te jete nje bote ku e verteta forcohet
18:28
and falsehood is recognized and marginalized."
442
1093000
3000
ku genjeshtra njihet dhe margjinalizohet."
18:31
And when you do that,
443
1096000
2000
Dhe kur ju do ta beni kete,
18:33
the ground around you starts to shift just a little bit.
444
1098000
3000
terreni perreth vetes suaj do te zhvendoset pak.
18:36
And that's the truth. Thank you.
445
1101000
3000
Dhe kjo eshte e verteta. Faleminderit.
18:39
(Applause)
446
1104000
5000
(Duartrokitje)
Translated by Arber Selmani
Reviewed by Amantia Gjikondi
