ABOUT THE SPEAKER
Philip Zimbardo - Psychologist
Philip Zimbardo was the leader of the notorious 1971 Stanford Prison Experiment -- and an expert witness at Abu Ghraib. His book The Lucifer Effect explores the nature of evil; now, in his new work, he studies the nature of heroism.

Why you should listen

Philip Zimbardo knows what evil looks like. After serving as an expert witness during the Abu Ghraib trials, he wrote The Lucifer Effect: Understanding How Good People Turn Evil. From Nazi comic books to the tactics of used-car salesmen, he explores a wealth of sources in trying to explain the psychology of evil.

A past president of the American Psychological Association and a professor emeritus at Stanford, Zimbardo retired in 2008 from lecturing, after 50 years of teaching his legendary introductory course in psychology. In addition to his work on evil and heroism, Zimbardo recently published The Time Paradox, exploring different cultural and personal perspectives on time.

Still well-known for his controversial Stanford Prison Experiment, Zimbardo in his new research looks at the psychology of heroism. He asks, "What pushes some people to become perpetrators of evil, while others act heroically on behalf of those in need?"

TED2008

Philip Zimbardo: The psychology of evil

Philip Zimbardo: How ordinary people become monsters ... or heroes

Filmed:
7,078,283 views

Philip Zimbardo knows how easy it is for a good person to turn bad. In this talk, he shares what he learned from the Abu Ghraib prisoner-abuse case, along with photos never before made public. Then he turns to the flip side: how easy it is to become a hero, and how we can rise to that challenge.


00:13
Philosophers, dramatists, theologians
00:16
have grappled with this question for centuries:
00:18
what makes people go wrong?
00:20
Interestingly, I asked this question when I was a little kid.
00:23
When I was a kid growing up in the South Bronx, inner-city ghetto
00:25
in New York, I was surrounded by evil,
00:28
as all kids are who grew up in an inner city.
00:30
And I had friends who were really good kids,
00:32
who lived out the Dr. Jekyll Mr. Hyde scenario -- Robert Louis Stevenson.
00:36
That is, they took drugs, got in trouble, went to jail.
00:40
Some got killed, and some did it without drug assistance.
00:44
So when I read Robert Louis Stevenson, that wasn't fiction.
00:47
The only question is, what was in the juice?
00:48
And more importantly, that line between good and evil --
00:53
which privileged people like to think is fixed and impermeable,
00:56
with them on the good side, and the others on the bad side --
00:59
I knew that line was movable, and it was permeable.
01:02
Good people could be seduced across that line,
01:05
and under good and some rare circumstances, bad kids could recover
01:08
with help, with reform, with rehabilitation.
01:12
So I want to begin with this wonderful illusion
01:14
by [Dutch] artist M.C. Escher.
01:17
If you look at it and focus on the white,
01:18
what you see is a world full of angels.
01:20
But let's look more deeply, and as we do,
01:23
what appears is the demons, the devils in the world.
01:27
And that tells us several things.
01:28
One, the world is, was, will always be filled with good and evil,
01:31
because good and evil is the yin and yang of the human condition.
01:34
It tells me something else. If you remember,
01:36
God's favorite angel was Lucifer.
01:39
Apparently, Lucifer means "the light."
01:42
It also means "the morning star," in some scripture.
01:44
And apparently, he disobeyed God,
01:47
and that's the ultimate disobedience to authority.
01:50
And when he did, Michael, the archangel, was sent
01:54
to kick him out of heaven along with the other fallen angels.
01:57
And so Lucifer descends into hell, becomes Satan,
02:01
becomes the devil, and the force of evil in the universe begins.
02:03
Paradoxically, it was God who created hell as a place to store evil.
02:09
He didn't do a good job of keeping it there though.
02:10
So, this arc of the cosmic transformation
02:13
of God's favorite angel into the Devil,
02:15
for me, sets the context for understanding human beings
02:19
who are transformed from good, ordinary people
02:22
into perpetrators of evil.
02:24
So the Lucifer effect, although it focuses on the negatives --
02:28
the negatives that people can become,
02:31
not the negatives that people are --
02:32
leads me to a psychological definition. Evil is the exercise of power.
02:38
And that's the key: it's about power.
02:40
To intentionally harm people psychologically,
02:44
to hurt people physically, to destroy people mortally, or ideas,
02:47
and to commit crimes against humanity.
02:51
If you Google "evil," a word that should surely have withered by now,
02:54
you come up with 136 million hits in a third of a second.
02:58
A few years ago -- I am sure all of you were shocked, as I was,
03:02
with the revelation of American soldiers
03:05
abusing prisoners in a strange place
03:08
in a controversial war, Abu Ghraib in Iraq.
03:11
And these were men and women
03:12
who were putting prisoners through unbelievable humiliation.
03:18
I was shocked, but I wasn't surprised,
03:19
because I had seen those same visual parallels
03:22
when I was the prison superintendent of the Stanford Prison Study.
03:25
Immediately the Bush administration military said ... what?
03:28
What all administrations say when there's a scandal.
03:30
"Don't blame us. It's not the system. It's the few bad apples,
03:34
the few rogue soldiers."
03:35
My hypothesis is, American soldiers are good, usually.
03:38
Maybe it was the barrel that was bad.
03:40
But how am I going to -- how am I going to deal with that hypothesis?
03:43
I became an expert witness
03:44
for one of the guards, Sergeant Chip Frederick,
03:46
and in that position, I had access to the dozen investigative reports.
03:50
I had access to him. I could study him,
03:54
have him come to my home, get to know him,
03:55
do psychological analysis to see, was he a good apple or bad apple.
03:59
And thirdly, I had access to all of the 1,000 pictures
04:03
that these soldiers took.
04:05
These pictures are of a violent or sexual nature.
04:07
All of them come from the cameras of American soldiers.
04:11
Because everybody has a digital camera or cell phone camera,
04:13
they took pictures of everything. More than 1,000.
04:16
And what I've done is I organized them into various categories.
04:18
But these are by United States military police, army reservists.
04:24
They are not soldiers prepared for this mission at all.
04:27
And it all happened in a single place, Tier 1-A, on the night shift.
04:32
Why? Tier 1-A was the center for military intelligence.
04:35
It was the interrogation hold. The CIA was there.
04:39
Interrogators from Titan Corporation, all there,
04:41
and they're getting no information about the insurgency.
04:45
So they're going to put pressure on these soldiers,
04:46
military police, to cross the line,
04:49
give them permission to break the will of the enemy,
04:52
to prepare them for interrogation, to soften them up,
04:54
to take the gloves off. Those are the euphemisms,
04:56
and this is how it was interpreted.
05:00
Let's go down to that dungeon.
05:01
(Camera shutter)
05:38
(Thuds)
05:45
(Camera shutter)
05:59
(Thuds)
06:09
(Breathing)
06:17
(Bells)
06:49
So, pretty horrific.
06:52
That's one of the visual illustrations of evil.
06:55
And it should not have escaped you that
06:56
the reason I paired the prisoner with his arms out
07:00
with Leonardo da Vinci's ode to humanity
07:03
is that that prisoner was mentally ill.
07:05
That prisoner covered himself with shit every day,
07:07
and they used to have to roll him in dirt so he wouldn't stink.
07:10
But the guards ended up calling him "Shit Boy."
07:12
What was he doing in that prison
07:14
rather than in some mental institution?
07:17
In any event, here's former Secretary of Defense Rumsfeld.
07:20
He comes down and says, "I want to know, who is responsible?
07:22
Who are the bad apples?" Well, that's a bad question.
07:25
You have to reframe it and ask, "What is responsible?"
07:28
Because "what" could be the who of people,
07:30
but it could also be the what of the situation,
07:32
and obviously that's wrongheaded.
07:34
So how do psychologists go about understanding
07:36
such transformations of human character,
07:38
if you believe that they were good soldiers
07:40
before they went down to that dungeon?
07:42
There are three ways. The main way is -- it's called dispositional.
07:44
We look at what's inside of the person, the bad apples.
07:48
This is the foundation of all of social science,
07:51
the foundation of religion, the foundation of war.
07:55
Social psychologists like me come along and say, "Yeah,
07:57
people are the actors on the stage,
07:59
but you'll have to be aware of what that situation is.
08:01
Who are the cast of characters? What's the costume?
08:04
Is there a stage director?"
08:05
And so we're interested in, what are the external factors
08:08
around the individual -- the bad barrel?
08:10
And social scientists stop there, and they miss the big point
08:13
that I discovered when I became an expert witness for Abu Ghraib.
08:16
The power is in the system.
08:18
The system creates the situation that corrupts the individuals,
08:21
and the system is the legal, political, economic, cultural background.
08:26
And this is where the power is of the bad-barrel makers.
08:29
So if you want to change a person, you've got to change the situation.
08:32
If you want to change the situation,
08:33
you've got to know where the power is, in the system.
08:35
So the Lucifer effect involves understanding
08:37
human character transformations with these three factors.
08:43
And it's a dynamic interplay.
08:44
What do the people bring into the situation?
08:46
What does the situation bring out of them?
08:48
And what is the system that creates and maintains that situation?
08:52
So my book, "The Lucifer Effect," recently published, is about,
08:54
how do you understand how good people turn evil?
08:57
And it has a lot of detail
08:58
about what I'm going to talk about today.
09:01
So Dr. Z's "Lucifer Effect," although it focuses on evil,
09:04
really is a celebration of the human mind's
09:06
infinite capacity to make any of us kind or cruel,
09:10
caring or indifferent, creative or destructive,
09:13
and it makes some of us villains.
09:15
And the good news story that I'm going to hopefully come to
09:18
at the end is that it makes some of us heroes.
09:20
This is a wonderful cartoon in the New Yorker,
09:23
which really summarizes my whole talk:
09:25
"I'm neither a good cop nor a bad cop, Jerome.
09:27
Like yourself, I'm a complex amalgam
09:29
of positive and negative personality traits
09:32
that emerge or not, depending on the circumstances."
09:35
(Laughter)
09:37
There's a study some of you think you know about,
09:40
but very few people have ever read the story. You watched the movie.
09:44
This is Stanley Milgram, little Jewish kid from the Bronx,
09:47
and he asked the question, "Could the Holocaust happen here, now?"
09:51
People say, "No, that's Nazi Germany,
09:52
that's Hitler, you know, that's 1939."
09:54
He said, "Yeah, but suppose Hitler asked you,
09:56
'Would you electrocute a stranger?' 'No way, not me, I'm a good person.'"
10:00
He said, "Why don't we put you in a situation
10:01
and give you a chance to see what you would do?"
10:03
And so what he did was he tested 1,000 ordinary people.
10:07
500 New Haven, Connecticut, 500 Bridgeport.
10:10
And the ad said, "Psychologists want to understand memory.
10:14
We want to improve people's memory,
10:15
because memory is the key to success." OK?
10:18
"We're going to give you five bucks -- four dollars for your time."
10:24
And it said, "We don't want college students.
10:25
We want men between 20 and 50."
10:27
In the later studies, they ran women.
10:28
Ordinary people: barbers, clerks, white-collar people.
10:32
So, you go down, and one of you is going to be a learner,
10:35
and one of you is going to be a teacher.
10:36
The learner's a genial, middle-aged guy.
10:38
He gets tied up to the shock apparatus in another room.
10:41
The learner could be middle-aged, could be as young as 20.
10:44
And one of you is told by the authority, the guy in the lab coat,
10:48
"Your job as teacher is to give this guy material to learn.
10:51
Gets it right, reward him.
10:52
Gets it wrong, you press a button on the shock box.
10:54
The first button is 15 volts. He doesn't even feel it."
10:58
That's the key. All evil starts with 15 volts.
11:01
And then the next step is another 15 volts.
11:04
The problem is, at the end of the line, it's 450 volts.
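The escalation just described can be sanity-checked in a few lines. This is a minimal sketch in Python; the step size and endpoints come straight from the transcript, and nothing else about the apparatus is assumed:

```python
# Milgram's shock generator as described in the talk: switches climb
# in equal 15-volt increments, starting at 15 V and ending at 450 V.
switches = list(range(15, 451, 15))

print(len(switches))   # 30 small steps separate the first press from the last
print(switches[0])     # 15  -- "He doesn't even feel it."
print(switches[-1])    # 450 -- the end of the line
```

Thirty individually tiny steps is the whole point of the design: no single press ever looks like a large escalation.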
11:06
And as you go along, the guy is screaming,
11:09
"I've got a heart condition! I'm out of here!"
11:11
You're a good person. You complain.
11:13
"Sir, who's going to be responsible if something happens to him?"
11:15
The experimenter says, "Don't worry, I will be responsible.
11:18
Continue, teacher."
11:19
And the question is, who would go all the way to 450 volts?
11:24
You should notice here, when it gets up to 375,
11:26
it says, "Danger. Severe Shock."
11:28
When it gets up to here, there's "XXX" -- the pornography of power.
11:31
(Laughter)
11:32
So Milgram asks 40 psychiatrists,
11:33
"What percent of American citizens would go to the end?"
11:37
They said only one percent. Because that's sadistic behavior,
11:40
and we know, psychiatry knows, only one percent of Americans are sadistic.
11:44
OK. Here's the data. They could not be more wrong.
11:48
Two thirds go all the way to 450 volts. This was just one study.
11:53
Milgram did more than 16 studies. And look at this.
11:56
In study 16, where you see somebody like you go all the way,
12:00
90 percent go all the way. In study five, if you see people rebel, 90 percent rebel.
12:06
What about women? Study 13 -- no different than men.
12:10
So Milgram is quantifying evil as the willingness of people
12:13
to blindly obey authority, to go all the way to 450 volts.
12:16
And it's like a dial on human nature.
12:18
A dial in a sense that you can make almost everybody totally obedient,
12:23
down to the majority, down to none.
12:25
So what are the external parallels? For all research is artificial.
12:29
What's the validity in the real world?
12:30
912 American citizens committed suicide or were murdered
12:34
by family and friends in Guyana jungle in 1978,
12:37
because they were blindly obedient to this guy, their pastor --
12:40
not their priest -- their pastor, Reverend Jim Jones.
12:42
He persuaded them to commit mass suicide.
12:46
And so, he's the modern Lucifer effect,
12:47
a man of God who becomes the Angel of Death.
12:52
Milgram's study is all about individual authority to control people.
12:56
Most of the time, we are in institutions,
13:00
so the Stanford Prison Study is a study of the power of institutions
13:03
to influence individual behavior.
13:05
Interestingly, Stanley Milgram and I were in the same high school class
13:08
in James Monroe in the Bronx, 1954.
13:13
So this study, which I did
13:14
with my graduate students, especially Craig Haney --
13:16
we also began work with an ad.
13:17
We didn't have money, so we had a cheap, little ad,
13:19
but we wanted college students for a study of prison life.
13:22
75 people volunteered, took personality tests.
13:25
We did interviews. Picked two dozen:
13:27
the most normal, the most healthy.
13:29
Randomly assigned them to be prisoner and guard.
13:31
So on day one, we knew we had good apples.
13:33
I'm going to put them in a bad situation.
13:35
And secondly, we know there's no difference
13:38
between the boys who are going to be guards
13:39
and the boys who are going to be prisoners.
13:41
The kids who were going to be prisoners,
13:42
we said, "Wait at home in the dormitories. The study will begin Sunday."
13:45
We didn't tell them
13:46
that the city police were going to come and do realistic arrests.
14:22
(Video) Student: A police car pulls up in front, and a cop comes to the front door,
14:28
and knocks, and says he's looking for me.
14:30
So they, right there, you know, they took me out the door,
14:33
they put my hands against the car.
14:36
It was a real cop car, it was a real policeman,
14:39
and there were real neighbors in the street,
14:40
who didn't know that this was an experiment.
14:44
And there was cameras all around and neighbors all around.
14:47
They put me in the car, then they drove me around Palo Alto.
14:52
They took me to the police station,
14:55
the basement of the police station. Then they put me in a cell.
15:05
I was the first one to be picked up, so they put me in a cell,
15:07
which was just like a room with a door with bars on it.
15:12
You could tell it wasn't a real jail.
15:13
They locked me in there, in this degrading little outfit.
15:19
They were taking this experiment too seriously.
15:21
Philip Zimbardo: Here are the prisoners who are going to be dehumanized.
15:23
They're going to become numbers.
15:24
Here are the guards with the symbols of power and anonymity.
15:27
Guards get prisoners
15:28
to clean the toilet bowls out with their bare hands,
15:30
to do other humiliating tasks.
15:32
They strip them naked. They sexually taunt them.
15:34
They begin to do degrading activities,
15:36
like having them simulate sodomy.
15:38
You saw simulating fellatio in soldiers in Abu Ghraib.
15:41
My guards did it in five days. The stress reaction was so extreme
15:46
that normal kids we picked because they were healthy
15:48
had breakdowns within 36 hours.
15:50
The study ended after six days, because it was out of control.
15:54
Five kids had emotional breakdowns.
15:58
Does it make a difference if warriors go to battle
16:00
changing their appearance or not?
16:02
Does it make a difference if they're anonymous,
16:03
in how they treat their victims?
16:05
We know in some cultures, they go to war,
16:06
they don't change their appearance.
16:07
In other cultures, they paint themselves like "Lord of the Flies."
16:09
In some, they wear masks.
16:11
In many, soldiers are anonymous in uniform.
16:14
So this anthropologist, John Watson, found
16:17
23 cultures that had two bits of data.
16:19
Do they change their appearance? 15.
16:21
Do they kill, torture, mutilate? 13.
16:23
If they don't change their appearance,
16:25
only one of eight kills, tortures or mutilates.
16:27
The key is in the red zone.
16:29
If they change their appearance,
16:30
12 of 13 -- that's 90 percent -- kill, torture, mutilate.
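The two conditional rates in Watson's comparison can be checked directly, taking the counts exactly as quoted in the talk ("12 of 13" and "only one of eight"); note that the talk's "90 percent" is the speaker's rounding of the actual fraction:

```python
# Watson's cross-cultural comparison, using the fractions quoted in the talk:
# warriors who change their appearance: 12 of 13 cultures kill, torture, mutilate;
# warriors who do not:                   1 of  8 cultures do.
changed = 12 / 13
unchanged = 1 / 8

print(f"{changed:.1%}")    # 92.3%
print(f"{unchanged:.1%}")  # 12.5%
```

A roughly seven-fold difference in the rate of atrocity between anonymous and non-anonymous warriors.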
16:35
And that's the power功率 of anonymity匿名.
312
983000
1000
這就是匿名性的威力。
16:36
So what are the seven social processes
16:38
that grease the slippery slope of evil?
16:40
Mindlessly taking the first small step.
16:42
Dehumanization of others. De-individuation of self.
16:45
Diffusion of personal responsibility. Blind obedience to authority.
16:49
Uncritical conformity to group norms.
16:50
Passive tolerance of evil through inaction or indifference.
16:54
And it happens when you're in a new or unfamiliar situation.
16:56
Your habitual response patterns don't work.
16:59
Your personality and morality are disengaged.
17:01
"Nothing is easier than to denounce the evildoer;
17:04
nothing more difficult than understanding him," Dostoyevsky tells us.
17:07
Understanding is not excusing. Psychology is not excuse-iology.
17:12
So social and psychological research reveals
17:13
how ordinary, good people can be transformed without the drugs.
17:17
You don't need it. You just need the social-psychological processes.
17:20
Real-world parallels? Compare this with this.
17:26
James Schlesinger -- and I'm going to have to end with this -- says,
17:28
"Psychologists have attempted to understand how and why
17:31
individuals and groups who usually act humanely
17:33
can sometimes act otherwise in certain circumstances."
17:37
That's the Lucifer effect.
17:38
And he goes on to say, "The landmark Stanford study
17:40
provides a cautionary tale for all military operations."
17:44
If you give people power without oversight,
17:47
it's a prescription for abuse. They knew that, and let that happen.
17:50
So another report, an investigative report by General Fay,
17:55
says the system is guilty. And in this report,
17:57
he says it was the environment that created Abu Ghraib,
18:00
by leadership failures that contributed
18:02
to the occurrence of such abuse,
18:03
and the fact that it remained undiscovered
18:05
by higher authorities for a long period of time.
18:07
Those abuses went on for three months. Who was watching the store?
18:11
The answer is nobody, and, I think, nobody on purpose.
18:14
He gave the guards permission to do those things,
18:15
and they knew nobody was ever going to come down to that dungeon.
18:18
So you need a paradigm shift in all of these areas.
18:21
The shift is away from the medical model
18:23
that focuses only on the individual.
18:25
The shift is toward a public health model
18:28
that recognizes situational and systemic vectors of disease.
18:31
Bullying is a disease. Prejudice is a disease. Violence is a disease.
18:35
And since the Inquisition, we've been dealing with problems
18:37
at the individual level. And you know what? It doesn't work.
18:40
Aleksandr Solzhenitsyn says, "The line between good and evil
18:43
cuts through the heart of every human being."
18:45
That means that line is not out there.
18:47
That's a decision that you have to make. That's a personal thing.
18:50
So I want to end very quickly on a positive note.
18:53
Heroism as the antidote to evil,
18:56
by promoting the heroic imagination,
18:57
especially in our kids, in our educational system.
19:00
We want kids to think, "I'm the hero in waiting,
19:02
waiting for the right situation to come along,
19:05
and I will act heroically."
19:06
My whole life is now going to focus away from evil --
19:08
that I've been in since I was a kid -- to understanding heroes.
19:11
Banality of heroism
19:13
is, it's ordinary people who do heroic deeds.
19:15
It's the counterpoint to Hannah Arendt's "Banality of Evil."
19:18
Our traditional societal heroes are wrong,
19:21
because they are the exceptions.
19:22
They organize their whole life around this.
19:24
That's why we know their names.
19:25
And our kids' heroes are also wrong models for them,
19:27
because they have supernatural talents.
19:30
We want our kids to realize most heroes are everyday people,
19:32
and the heroic act is unusual. This is Joe Darby.
19:36
He was the one that stopped those abuses you saw,
19:38
because when he saw those images,
19:40
he turned them over to a senior investigating officer.
19:43
He was a low-level private, and that stopped it. Was he a hero? No.
19:47
They had to put him in hiding, because people wanted to kill him,
19:50
and then his mother and his wife.
19:51
For three years, they were in hiding.
19:53
This is the woman who stopped the Stanford Prison Study.
19:56
When I said it got out of control, I was the prison superintendent.
19:59
I didn't know it was out of control. I was totally indifferent.
20:02
She came down, saw that madhouse and said,
20:04
"You know what, it's terrible what you're doing to those boys.
20:07
They're not prisoners, they're not guards,
20:08
they're boys, and you are responsible."
20:11
And I ended the study the next day.
20:13
The good news is I married her the next year.
20:15
(Laughter)
20:18
(Applause)
20:25
I just came to my senses, obviously.
20:27
So situations have the power to do, through --
20:31
but the point is, this is the same situation
20:32
that can inflame the hostile imagination in some of us,
20:36
that makes us perpetrators of evil,
20:38
can inspire the heroic imagination in others. It's the same situation.
20:42
And you're on one side or the other.
20:43
Most people are guilty of the evil of inaction,
20:45
because your mother said, "Don't get involved. Mind your own business."
20:48
And you have to say, "Mama, humanity is my business."
20:51
So the psychology of heroism is -- we're going to end in a moment --
20:53
how do we encourage children in new hero courses,
20:57
that I'm working with Matt Langdon -- he has a hero workshop --
21:00
to develop this heroic imagination, this self-labeling,
21:03
"I am a hero in waiting," and teach them skills.
21:06
To be a hero, you have to learn to be a deviant,
21:09
because you're always going against the conformity of the group.
21:11
Heroes are ordinary people whose social actions are extraordinary. Who act.
21:15
The key to heroism is two things.
21:17
A: you've got to act when other people are passive.
21:20
B: you have to act socio-centrically, not egocentrically.
21:23
And I want to end with the story that some of you know,
21:25
about Wesley Autrey, New York subway hero.
21:27
Fifty-year-old African-American construction worker.
21:29
He's standing on a subway in New York.
21:31
A white guy falls on the tracks.
21:32
The subway train is coming. There's 75 people there.
21:35
You know what? They freeze.
21:36
He's got a reason not to get involved.
21:38
He's black, the guy's white, and he's got two little kids.
21:41
Instead, he gives his kids to a stranger,
21:42
jumps on the tracks, puts the guy between the tracks,
21:45
lies on him, the subway goes over him.
21:47
Wesley and the guy -- 20 and a half inches height.
21:51
The train clearance is 21 inches.
21:53
A half an inch would have taken his head off.
22:04
And so one day, you will be in a new situation情況.
439
1312000
2000
那麼,將來有一天,你會遇到一個新的情境。
22:07
Take path路徑 one, you're going to be a perpetrator肇事者 of evil邪惡.
440
1315000
2000
第一條路,你會成為惡之犯人。
22:09
Evil邪惡, meaning含義 you're going to be Arthur亞瑟 Andersen安德森.
441
1317000
3000
惡,即你將成為亞瑟·安德森。
22:12
You're going to cheat作弊, or you're going to allow允許 bullying欺凌.
442
1320000
2000
你將會欺騙,或允許欺侮。
22:14
Path路徑 two, you become成為 guilty有罪 of the evil邪惡 of passive被動 inaction無為.
443
1322000
2000
第二條路:你將因漠不關心袖手旁觀而內疚。
22:17
Path路徑 three, you become成為 a hero英雄.
444
1325000
1000
第三條路:你成為一個英雄。
22:18
The point is, are we ready準備 to take the path路徑
445
1326000
3000
關鍵是,我們是否做好準備來選擇這條路
22:21
to celebrating慶祝 ordinary普通 heroes英雄,
446
1329000
2000
以頌揚平凡的英雄,
22:23
waiting等候 for the right situation情況 to come along沿
447
1331000
2000
等待合適的情境出現,
22:25
to put heroic英勇 imagination想像力 into action行動?
448
1333000
2000
將對於英雄的想像付諸於實施呢?
22:27
Because it may可能 only happen發生 once一旦 in your life,
449
1335000
3000
因為這可能是你平生僅有的機會,
22:31
and when you pass通過 it by, you'll你會 always know,
450
1339000
1000
而當你錯過的時候,你將永遠記得,
22:32
I could have been a hero英雄 and I let it pass通過 me by.
451
1340000
3000
我本可以成為一個英雄但我讓這機會溜走了。
22:35
So the point is thinking思維 it and then doing it.
452
1343000
2000
所以關鍵是先想再做。
22:37
So I want to thank you. Thank you. Thank you.
453
1345000
3000
所以我想謝謝你們。謝謝你們。謝謝。
22:40
Let's oppose反對 the power功率 of evil邪惡 systems系統 at home and abroad國外,
454
1348000
3000
讓我們反對國內外惡之系統的力量,
22:43
and let's focus焦點 on the positive.
455
1351000
2000
並集中於積極的一面。
22:45
Advocate主張 for respect尊重 of personal個人 dignity尊嚴, for justice正義 and peace和平,
456
1353000
3000
倡導對個人高尚行為之尊敬,倡導正義與和平,
22:48
which哪一個 sadly可悲的是 our administration行政 has not been doing.
457
1356000
2000
遺憾的是,我們的當局並沒有做這些。
22:50
Thanks謝謝 so much.
458
1358000
1000
非常感謝。
22:51
(Applause掌聲)
459
1359000
13000
(掌聲)
Translated by Coco Shen
Reviewed by Geoff Chen
