ABOUT THE SPEAKER
Philip Zimbardo - Psychologist
Philip Zimbardo was the leader of the notorious 1971 Stanford Prison Experiment -- and an expert witness at Abu Ghraib. His book The Lucifer Effect explores the nature of evil; now, in his new work, he studies the nature of heroism.

Why you should listen

Philip Zimbardo knows what evil looks like. After serving as an expert witness during the Abu Ghraib trials, he wrote The Lucifer Effect: Understanding How Good People Turn Evil. From Nazi comic books to the tactics of used-car salesmen, he explores a wealth of sources in trying to explain the psychology of evil.

A past president of the American Psychological Association and a professor emeritus at Stanford, Zimbardo retired in 2008 from lecturing, after 50 years of teaching his legendary introductory course in psychology. In addition to his work on evil and heroism, Zimbardo recently published The Time Paradox, exploring different cultural and personal perspectives on time.

Still well-known for his controversial Stanford Prison Experiment, Zimbardo in his new research looks at the psychology of heroism. He asks, "What pushes some people to become perpetrators of evil, while others act heroically on behalf of those in need?"

TED2008

Philip Zimbardo: The psychology of evil


7,078,283 views

Philip Zimbardo knows how easy it is for good people to turn evil. In this talk, he shares unpublished findings and photos from the Abu Ghraib trials. But then he jumps to the other side: how easy it is to become a hero, and how we can take up that challenge.


00:13
Philosophers, dramatists, theologians have grappled with this question for centuries: what makes people go wrong? Interestingly, I asked this question when I was a little kid. When I was a kid growing up in the South Bronx, inner-city ghetto in New York, I was surrounded by evil, as all kids are who grew up in an inner city. And I had friends who were really good kids, who lived out the Dr. Jekyll and Mr. Hyde scenario -- Robert Louis Stevenson. That is, they took drugs, got in trouble, went to jail. Some got killed, and some did it without drug assistance. So when I read Robert Louis Stevenson, that wasn't fiction. The only question is, what was in the juice?

00:48
And more importantly, that line between good and evil -- which privileged people like to think is fixed and impermeable, with them on the good side, and the others on the bad side -- I knew that line was movable, and it was permeable. Good people could be seduced across that line, and under good and some rare circumstances, bad kids could recover with help, with reform, with rehabilitation.
01:12
So I want to begin with this wonderful illusion by the Dutch artist M.C. Escher. If you look at it and focus on the white, what you see is a world full of angels. But let's look more deeply, and as we do, what appears is the demons, the devils in the world. And that tells us several things. One, the world is, was, will always be filled with good and evil, because good and evil is the yin and yang of the human condition.

01:34
It tells me something else. If you remember, God's favorite angel was Lucifer. Apparently, Lucifer means "the light." It also means "the morning star," in some scripture. And apparently, he disobeyed God, and that's the ultimate disobedience to authority. And when he did, Michael, the archangel, was sent to kick him out of heaven along with the other fallen angels. And so Lucifer descends into hell, becomes Satan, becomes the devil, and the force of evil in the universe begins. Paradoxically, it was God who created hell as a place to store evil. He didn't do a good job of keeping it there though.

02:10
So, this arc of the cosmic transformation of God's favorite angel into the Devil, for me, sets the context for understanding human beings who are transformed from good, ordinary people into perpetrators of evil.
02:24
So the Lucifer effect, although it focuses on the negatives -- the negatives that people can become, not the negatives that people are -- leads me to a psychological definition. Evil is the exercise of power. And that's the key: it's about power. To intentionally harm people psychologically, to hurt people physically, to destroy people mortally, or ideas, and to commit crimes against humanity. If you Google "evil," a word that should surely have withered by now, you come up with 136 million hits in a third of a second.
02:58
A few years ago -- I am sure all of you were shocked, as I was, with the revelation of American soldiers abusing prisoners in a strange place in a controversial war, Abu Ghraib in Iraq. And these were men and women who were putting prisoners through unbelievable humiliation. I was shocked, but I wasn't surprised, because I had seen those same visual parallels when I was the prison superintendent of the Stanford Prison Study. Immediately the Bush administration military said ... what? What all administrations say when there's a scandal: "Don't blame us. It's not the system. It's the few bad apples, the few rogue soldiers." My hypothesis is, American soldiers are good, usually. Maybe it was the barrel that was bad. But how am I going to deal with that hypothesis?

03:43
I became an expert witness for one of the guards, Sergeant Chip Frederick, and in that position, I had access to the dozen investigative reports. I had access to him. I could study him, have him come to my home, get to know him, do psychological analysis to see, was he a good apple or bad apple. And thirdly, I had access to all of the 1,000 pictures that these soldiers took. These pictures are of a violent or sexual nature. All of them come from the cameras of American soldiers. Because everybody has a digital camera or cell phone camera, they took pictures of everything -- more than 1,000. And what I've done is I organized them into various categories.

04:18
But these are by United States military police, army reservists. They are not soldiers prepared for this mission at all. And it all happened in a single place, Tier 1-A, on the night shift. Why? Tier 1-A was the center for military intelligence. It was the interrogation hold. The CIA was there. Interrogators from Titan Corporation, all there, and they're getting no information about the insurgency. So they're going to put pressure on these soldiers, military police, to cross the line, give them permission to break the will of the enemy, to prepare them for interrogation, to soften them up, to take the gloves off. Those are the euphemisms, and this is how it was interpreted. Let's go down to that dungeon.
05:01
(Camera shutter)
[Abuse at Abu Ghraib Prison, Iraq, 2003. Photographs by the military police]
[These images include nudity and violent content]

05:38
(Thuds)

05:45
(Camera shutter)

05:59
(Thuds)

06:09
(Breathing)

06:17
(Bells)
06:49
So, pretty horrific. That's one of the visual illustrations of evil. And it should not have escaped you that the reason I paired the prisoner with his arms out with Leonardo da Vinci's ode to humanity is that that prisoner was mentally ill. That prisoner covered himself with shit every day, and they used to have to roll him in dirt so he wouldn't stink. But the guards ended up calling him "Shit Boy." What was he doing in that prison rather than in some mental institution?
07:17
In any event, here's former Secretary of Defense Rumsfeld. He comes down and says, "I want to know, who is responsible? Who are the bad apples?" Well, that's a bad question. You have to reframe it and ask, "What is responsible?" Because "what" could be the who of people, but it could also be the what of the situation, and obviously that's wrongheaded.

07:42
So how do psychologists go about understanding such transformations of human character, if you believe that they were good soldiers before they went down to that dungeon? There are three ways. The main way is called dispositional. We look at what's inside of the person, the bad apples. This is the foundation of all of social science, the foundation of religion, the foundation of war. Social psychologists like me come along and say, "Yeah, people are the actors on the stage, but you'll have to be aware of what that situation is. Who are the cast of characters? What's the costume? Is there a stage director?" And so we're interested in, what are the external factors around the individual -- the bad barrel?

08:16
And social scientists stop there, and they miss the big point that I discovered when I became an expert witness for Abu Ghraib. The power is in the system. The system creates the situation that corrupts the individuals, and the system is the legal, political, economic, cultural background. And this is where the power is of the bad-barrel makers. So if you want to change a person, you've got to change the situation. If you want to change the situation, you've got to know where the power is, in the system. So the Lucifer effect involves understanding human character transformations with these three factors. And it's a dynamic interplay. What do the people bring into the situation? What does the situation bring out of them? And what is the system that creates and maintains that situation?
08:52
So my book, "The Lucifer Effect," recently published, is about, how do you understand how good people turn evil? And it has a lot of detail about what I'm going to talk about today. So Dr. Z's "Lucifer Effect," although it focuses on evil, really is a celebration of the human mind's infinite capacity to make any of us kind or cruel, caring or indifferent, creative or destructive, and it makes some of us villains. And the good news story that I'm going to hopefully come to at the end is that it makes some of us heroes. This is a wonderful cartoon in the New Yorker, which really summarizes my whole talk: "I'm neither a good cop nor a bad cop, Jerome. Like yourself, I'm a complex amalgam of positive and negative personality traits that emerge or not, depending on the circumstances."

09:35
(Laughter)
09:37
There's a study some of you think you know about, but very few people have ever read the story. You watched the movie. This is Stanley Milgram, little Jewish kid from the Bronx, and he asked the question, "Could the Holocaust happen here, now?" People say, "No, that's Nazi Germany, that's Hitler, you know, that's 1939." He said, "Yeah, but suppose Hitler asked you, 'Would you electrocute a stranger?' 'No way, not me, I'm a good person.'" He said, "Why don't we put you in a situation and give you a chance to see what you would do?" And so what he did was he tested 1,000 ordinary people: 500 in New Haven, Connecticut, 500 in Bridgeport. And the ad said, "Psychologists want to understand memory. We want to improve people's memory, because memory is the key to success." OK? "We're going to give you five bucks -- four dollars for your time." And it said, "We don't want college students. We want men between 20 and 50." In the later studies, they ran women. Ordinary people: barbers, clerks, white-collar people.

10:32
So, you go down, and one of you is going to be a learner, and one of you is going to be a teacher. The learner's a genial, middle-aged guy. He gets tied up to the shock apparatus in another room. The learner could be middle-aged, could be as young as 20. And one of you is told by the authority, the guy in the lab coat, "Your job as teacher is to give this guy material to learn. Gets it right, reward him. Gets it wrong, you press a button on the shock box. The first button is 15 volts. He doesn't even feel it." That's the key. All evil starts with 15 volts. And then the next step is another 15 volts. The problem is, at the end of the line, it's 450 volts. And as you go along, the guy is screaming, "I've got a heart condition! I'm out of here!" You're a good person. You complain. "Sir, who's going to be responsible if something happens to him?" The experimenter says, "Don't worry, I will be responsible. Continue, teacher." And the question is, who would go all the way to 450 volts? You should notice here, when it gets up to 375, it says, "Danger. Severe Shock." When it gets up to here, there's "XXX" -- the pornography of power.

11:31
(Laughter)

11:32
So Milgram asks 40 psychiatrists, "What percent of American citizens would go to the end?" They said only one percent. Because that's sadistic behavior, and we know, psychiatry knows, only one percent of Americans are sadistic. OK. Here's the data. They could not be more wrong. Two thirds go all the way to 450 volts. This was just one study. Milgram did more than 16 studies. And look at this. In study 16, where you see somebody like you go all the way, 90 percent go all the way. In study five, if you see people rebel, 90 percent rebel. What about women? Study 13 -- no different than men. So Milgram is quantifying evil as the willingness of people to blindly obey authority, to go all the way to 450 volts. And it's like a dial on human nature. A dial in a sense that you can make almost everybody totally obedient, down to the majority, down to none.
12:25
So what are the external parallels? For all research is artificial. What's the validity in the real world? 912 American citizens committed suicide or were murdered by family and friends in the Guyana jungle in 1978, because they were blindly obedient to this guy, their pastor -- not their priest -- their pastor, Reverend Jim Jones. He persuaded them to commit mass suicide. And so, he's the modern Lucifer effect, a man of God who becomes the Angel of Death.
12:52
Milgram's study is all about individual authority to control people. Most of the time, we are in institutions, so the Stanford Prison Study is a study of the power of institutions to influence individual behavior. Interestingly, Stanley Milgram and I were in the same high school class in James Monroe in the Bronx, 1954.

13:13
So this study, which I did with my graduate students, especially Craig Haney -- we also began work with an ad. We didn't have money, so we had a cheap, little ad, but we wanted college students for a study of prison life. 75 people volunteered, took personality tests. We did interviews. Picked two dozen: the most normal, the most healthy. Randomly assigned them to be prisoner and guard. So on day one, we knew we had good apples. I'm going to put them in a bad situation. And secondly, we know there's no difference between the boys who are going to be guards and the boys who are going to be prisoners. The kids who were going to be prisoners, we said, "Wait at home in the dormitories. The study will begin Sunday." We didn't tell them that the city police were going to come and do realistic arrests.

(Music)
[Day 1]
14:22
(Video) Student: A police car pulls up in front, and a cop comes to the front door, and knocks, and says he's looking for me. So they, right there, you know, they took me out the door, they put my hands against the car. It was a real cop car, it was a real policeman, and there were real neighbors in the street, who didn't know that this was an experiment. And there was cameras all around and neighbors all around. They put me in the car, then they drove me around Palo Alto. They took me to the police station, the basement of the police station. Then they put me in a cell. I was the first one to be picked up, so they put me in a cell, which was just like a room with a door with bars on it. You could tell it wasn't a real jail. They locked me in there, in this degrading little outfit. They were taking this experiment too seriously.
15:21
Philip Zimbardo: Here are the prisoners who are going to be dehumanized. They're going to become numbers. Here are the guards with the symbols of power and anonymity. Guards get prisoners to clean the toilet bowls out with their bare hands, to do other humiliating tasks. They strip them naked. They sexually taunt them. They begin to do degrading activities, like having them simulate sodomy. You saw simulating fellatio in soldiers in Abu Ghraib. My guards did it in five days. The stress reaction was so extreme that normal kids we picked because they were healthy had breakdowns within 36 hours. The study ended after six days, because it was out of control. Five kids had emotional breakdowns.
15:58
Does it make a difference if warriors go to battle changing their appearance or not? Does it make a difference if they're anonymous, in how they treat their victims? We know in some cultures, they go to war, they don't change their appearance. In other cultures, they paint themselves like "Lord of the Flies." In some, they wear masks. In many, soldiers are anonymous in uniform.

16:14
So this anthropologist, John Watson, found 23 cultures that had two bits of data. Do they change their appearance? 15. Do they kill, torture, mutilate? 13. If they don't change their appearance, only one of eight kills, tortures or mutilates. The key is in the red zone. If they change their appearance, 12 of 13 -- that's 90 percent -- kill, torture, mutilate. And that's the power of anonymity.
16:36
So what are the seven social processes that grease the slippery slope of evil? Mindlessly taking the first small step. Dehumanization of others. De-individuation of self. Diffusion of personal responsibility. Blind obedience to authority. Uncritical conformity to group norms. Passive tolerance of evil through inaction or indifference.

16:54
And it happens when you're in a new or unfamiliar situation. Your habitual response patterns don't work. Your personality and morality are disengaged. "Nothing is easier than to denounce the evildoer; nothing more difficult than understanding him," Dostoyevsky tells us. Understanding is not excusing. Psychology is not excuse-iology. So social and psychological research reveals how ordinary, good people can be transformed without the drugs. You don't need it. You just need the social-psychological processes. Real-world parallels? Compare this with this.
17:26
James Schlesinger -- and I'm going to have to end with this -- says, "Psychologists have attempted to understand how and why individuals and groups who usually act humanely can sometimes act otherwise in certain circumstances." That's the Lucifer effect. And he goes on to say, "The landmark Stanford study provides a cautionary tale for all military operations." If you give people power without oversight, it's a prescription for abuse. They knew that, and let that happen.

17:50
So another report, an investigative report by General Fay, says the system is guilty. And in this report, he says it was the environment that created Abu Ghraib, by leadership failures that contributed to the occurrence of such abuse, and the fact that it remained undiscovered by higher authorities for a long period of time. Those abuses went on for three months. Who was watching the store? The answer is nobody, and, I think, nobody on purpose. He gave the guards permission to do those things, and they knew nobody was ever going to come down to that dungeon.

18:18
So you need a paradigm shift in all of these areas. The shift is away from the medical model that focuses only on the individual. The shift is toward a public health model that recognizes situational and systemic vectors of disease. Bullying is a disease. Prejudice is a disease. Violence is a disease. And since the Inquisition, we've been dealing with problems at the individual level. And you know what? It doesn't work. Aleksandr Solzhenitsyn says, "The line between good and evil cuts through the heart of every human being." That means that line is not out there. That's a decision that you have to make. That's a personal thing.
18:50
So I want to end very quickly on a positive note. Heroism as the antidote to evil, by promoting the heroic imagination, especially in our kids, in our educational system. We want kids to think, "I'm the hero in waiting, waiting for the right situation to come along, and I will act heroically." My whole life is now going to focus away from evil -- that I've been in since I was a kid -- to understanding heroes. The banality of heroism is, it's ordinary people who do heroic deeds. It's the counterpoint to Hannah Arendt's "banality of evil." Our traditional societal heroes are wrong, because they are the exceptions. They organize their whole life around this. That's why we know their names. And our kids' heroes are also wrong models for them, because they have supernatural talents. We want our kids to realize most heroes are everyday people, and the heroic act is unusual. This is Joe Darby. He was the one that stopped those abuses you saw, because when he saw those images, he turned them over to a senior investigating officer. He was a low-level private, and that stopped it. Was he a hero? No. They had to put him in hiding, because people wanted to kill him, and then his mother and his wife. For three years, they were in hiding.

19:53
This is the woman who stopped the Stanford Prison Study. When I said it got out of control, I was the prison superintendent. I didn't know it was out of control. I was totally indifferent. She came down, saw that madhouse and said, "You know what, it's terrible what you're doing to those boys. They're not prisoners, they're not guards, they're boys, and you are responsible." And I ended the study the next day. The good news is I married her the next year.

20:15
(Laughter)

20:18
(Applause)
20:25
I just came to my senses, obviously. So situations have the power to do, through -- but the point is, this is the same situation that can inflame the hostile imagination in some of us, that makes us perpetrators of evil, can inspire the heroic imagination in others. It's the same situation. And you're on one side or the other. Most people are guilty of the evil of inaction, because your mother said, "Don't get involved. Mind your own business." And you have to say, "Mama, humanity is my business." So the psychology of heroism is -- we're going to end in a moment -- how do we encourage children in new hero courses, that I'm working with Matt Langdon -- he has a hero workshop -- to develop this heroic imagination, this self-labeling, "I am a hero in waiting," and teach them skills. To be a hero, you have to learn to be a deviant, because you're always going against the conformity of the group. Heroes are ordinary people whose social actions are extraordinary. Who act.

21:15
The key to heroism is two things. A: you've got to act when other people are passive. B: you have to act socio-centrically, not egocentrically. And I want to end with the story that some of you know, about Wesley Autrey, New York subway hero.
422
1273000
2000
21:27
Fifty-year-old African-American construction worker.
423
1275000
2000
Un obreiro da construción afroamericano
de 55 anos.
21:29
He's standing on a subway in New York.
424
1277000
2000
Está na estación do metro.
Un tío branco cae ás vías.
21:31
A white guy falls on the tracks.
425
1279000
1000
21:32
The subway train is coming. There's 75 people there.
426
1280000
3000
O tren está chegando.
Hai 75 persoas na estación.
21:35
You know what? They freeze.
427
1283000
1000
E saben que? Quedan pegados.
21:36
He's got a reason not to get involved.
428
1284000
2000
El ten razóns para non meterse.
21:38
He's black, the guy's white, and he's got two little kids.
429
1286000
2000
É negro, o tío é branco,
e el está cos seus dous nenos.
Pero déixalle os fillos a un descoñecido,
21:41
Instead, he gives his kids to a stranger,
430
1289000
1000
21:42
jumps on the tracks, puts the guy between the tracks,
431
1290000
3000
salta ás vías, move ao tío
cara ao medio dos raís,
21:45
lies on him, the subway goes over him.
432
1293000
2000
bótaselle enriba,
o tren pásalles por riba.
21:47
Wesley and the guy -- 20 and a half inches height.
433
1295000
3000
Wesley e o tío,
52 centímetros de altura.
O espazo que deixa o tren é de 53,5 cm.
21:51
The train clearance is 21 inches.
434
1299000
2000
21:53
A half an inch would have taken his head off.
435
1301000
2000
Un centímetro e medio
e levaríalle a cabeza.
21:56
And he said, "I did what anyone could do,"
436
1304000
3000
E di: "Fixen o que faría calquera"
Saltar ás vías non é para tanto.
21:59
no big deal to jump on the tracks.
437
1307000
1000
22:00
And the moral imperative is "I did what everyone should do."
438
1308000
4000
E o imperativo moral é
"fixen o que faría calquera".
22:04
And so one day, you will be in a new situation.
439
1312000
2000
Así que calquera día,
atoparanse nunha situación nova.
22:07
Take path one, you're going to be a perpetrator of evil.
440
1315000
2000
Se collen o camiño un,
serán axentes do mal.
22:09
Evil, meaning you're going to be Arthur Andersen.
441
1317000
3000
O mal no sentido de que serán
Arthur Andersen.
22:12
You're going to cheat, or you're going to allow bullying.
442
1320000
2000
Mentirán, ou permitirán abusos.
22:14
Path two, you become guilty of the evil of passive inaction.
443
1322000
2000
Co camiño dous, serán culpables
do mal por pasividade.
22:17
Path three, you become a hero.
444
1325000
1000
Co camiño tres, serán heroes.
22:18
The point is, are we ready to take the path
445
1326000
3000
A cuestión é: estamos listos para coller
o camiño que leva a ser heroes normais,
22:21
to celebrating ordinary heroes,
446
1329000
2000
22:23
waiting for the right situation to come along
447
1331000
2000
a esperar pola situación adecuada
22:25
to put heroic imagination into action?
448
1333000
2000
para poñer en práctica
o imaxinario heroico?
22:27
Because it may only happen once in your life,
449
1335000
3000
Porque isto só pasa unha vez na vida.
e cando che pasa, sábelo.
22:31
and when you pass it by, you'll always know,
450
1339000
1000
22:32
I could have been a hero and I let it pass me by.
451
1340000
3000
Eu puiden ser un heroe
e deixei pasar a ocasión.
22:35
So the point is thinking it and then doing it.
452
1343000
2000
Así que a cousa é pensalo e logo facelo.
22:37
So I want to thank you. Thank you. Thank you.
453
1345000
3000
Quero darlles as grazas, grazas, grazas.
22:40
Let's oppose the power of evil systems at home and abroad,
454
1348000
3000
Opoñámonos ao poder dos sistemas do mal
na casa e fóra dela.
22:43
and let's focus on the positive.
455
1351000
2000
E concentrémonos no positivo.
22:45
Advocate for respect of personal dignity, for justice and peace,
456
1353000
3000
Avoguemos polo respecto á dignidade
das persoas, á xustiza e á paz,
22:48
which sadly our administration has not been doing.
457
1356000
2000
cousa que, por desgraza,
a Administración non fai.
22:50
Thanks so much.
458
1358000
1000
Moitas grazas.
22:51
(Applause)
459
1359000
13000
(Aplausos)
