ABOUT THE SPEAKER
Anne Milgram - Criminal justice reformer
Anne Milgram is committed to using data and analytics to fight crime.

Why you should listen

Anne Milgram is focused on reforming systems through smart data, analytics and technology. She is currently a Professor of Practice and Distinguished Scholar in Residence at New York University School of Law, where she is building a Criminal Justice Innovation Lab, dedicated to using data and technology to transform the American criminal justice system. She also teaches seminars on criminal justice policy and human trafficking. Milgram began her career as a criminal prosecutor, serving in state, local and federal prosecution offices.  She then became the Attorney General of the State of New Jersey, where she served as the Chief Law Enforcement Officer for the State and oversaw the Camden Police Department.

Through her work, Milgram seeks to bring the best of the modern world -- data, technology and analytics -- to bear in an effort to transform outdated systems and practices. Milgram is centered on creating a paradigm shift in how we think about innovation and reform in the criminal justice system and beyond.

Milgram graduated summa cum laude from Rutgers University and holds a Master of Philosophy in social and political theory from the University of Cambridge. She received her law degree from New York University School of Law.

Anne Milgram | Speaker | TED.com
TED@BCG San Francisco

Anne Milgram: Why smart statistics are the key to fighting crime


Filmed:
1,034,735 views

When Anne Milgram became attorney general of New Jersey in 2007, she quickly discovered something startling: not only did her team not really know who they were putting in jail, they also had no way to tell whether their decisions were actually improving public safety. So she began an inspiring push to bring data analytics and statistical analysis to the US criminal justice system.


00:12
In 2007, I became the attorney general of the state of New Jersey. Before that, I'd been a criminal prosecutor, first in the Manhattan district attorney's office, and then at the United States Department of Justice. But when I became the attorney general, two things happened that changed the way I see criminal justice.

00:30
The first is that I asked what I thought were really basic questions. I wanted to understand who we were arresting, who we were charging, and who we were putting in our nation's jails and prisons. I also wanted to understand if we were making decisions in a way that made us safer. And I couldn't get this information out. It turned out that most big criminal justice agencies like my own didn't track the things that matter. So after about a month of being incredibly frustrated, I walked down into a conference room that was filled with detectives and stacks and stacks of case files, and the detectives were sitting there with yellow legal pads taking notes. They were trying to get the information I was looking for by going through case by case for the past five years. And as you can imagine, when we finally got the results, they weren't good. It turned out that we were doing a lot of low-level drug cases on the streets just around the corner from our office in Trenton.

01:30
The second thing that happened is that I spent the day in the Camden, New Jersey police department. Now, at that time, Camden, New Jersey, was the most dangerous city in America. I ran the Camden Police Department because of that. I spent the day in the police department, and I was taken into a room with senior police officials, all of whom were working hard and trying very hard to reduce crime in Camden. And what I saw in that room, as we talked about how to reduce crime, were a series of officers with a lot of little yellow sticky notes. And they would take a yellow sticky and they would write something on it and they would put it up on a board. And one of them said, "We had a robbery two weeks ago. We have no suspects." And another said, "We had a shooting in this neighborhood last week. We have no suspects." We weren't using data-driven policing. We were essentially trying to fight crime with yellow Post-it notes.

02:22
Now, both of these things made me realize fundamentally that we were failing. We didn't even know who was in our criminal justice system, we didn't have any data about the things that mattered, and we didn't share data or use analytics or tools to help us make better decisions and to reduce crime.

02:41
And for the first time, I started to think about how we made decisions. When I was an assistant D.A., and when I was a federal prosecutor, I looked at the cases in front of me, and I generally made decisions based on my instinct and my experience. When I became attorney general, I could look at the system as a whole, and what surprised me is that I found that that was exactly how we were doing it across the entire system -- in police departments, in prosecutors' offices, in courts and in jails. And what I learned very quickly is that we weren't doing a good job. So I wanted to do things differently. I wanted to introduce data and analytics and rigorous statistical analysis into our work. In short, I wanted to moneyball criminal justice.

03:25
Now, moneyball, as many of you know, is what the Oakland A's did, where they used smart data and statistics to figure out how to pick players that would help them win games. They went from a system that was based on baseball scouts, who used to go out and watch players and use their instinct and experience -- the scouts' instincts and experience -- to pick players, to one that used smart data and rigorous statistical analysis to figure out how to pick players that would help them win games.

03:50
It worked for the Oakland A's, and it worked in the state of New Jersey. We took Camden off the top of the list as the most dangerous city in America. We reduced murders there by 41 percent, which actually means 37 lives were saved. And we reduced all crime in the city by 26 percent. We also changed the way we did criminal prosecutions. So we went from doing low-level drug crimes that were outside our building to doing cases of statewide importance, on things like reducing violence with the most violent offenders, prosecuting street gangs, gun and drug trafficking, and political corruption.

04:26
And all of this matters greatly, because public safety to me is the most important function of government. If we're not safe, we can't be educated, we can't be healthy, we can't do any of the other things we want to do in our lives. And we live in a country today where we face serious criminal justice problems. We have 12 million arrests every single year. The vast majority of those arrests are for low-level crimes, like misdemeanors, 70 to 80 percent. Less than five percent of all arrests are for violent crime. Yet we spend 75 billion, that's b for billion, dollars a year on state and local corrections costs. Right now, today, we have 2.3 million people in our jails and prisons. And we face unbelievable public safety challenges because we have a situation in which two thirds of the people in our jails are there waiting for trial. They haven't yet been convicted of a crime. They're just waiting for their day in court. And 67 percent of people come back. Our recidivism rate is amongst the highest in the world. Almost seven in 10 people who are released from prison will be rearrested in a constant cycle of crime and incarceration.

05:39
So when I started my job at the Arnold Foundation, I came back to looking at a lot of these questions, and I came back to thinking about how we had used data and analytics to transform the way we did criminal justice in New Jersey. And when I look at the criminal justice system in the United States today, I feel the exact same way that I did about the state of New Jersey when I started there, which is that we absolutely have to do better, and I know that we can do better. So I decided to focus on using data and analytics to help make the most critical decision in public safety, and that decision is the determination of whether, when someone has been arrested, they pose a risk to public safety and should be detained, or whether they don't pose a risk to public safety and should be released. Everything that happens in criminal cases comes out of this one decision. It impacts everything. It impacts sentencing. It impacts whether someone gets drug treatment. It impacts crime and violence. And when I talk to judges around the United States, which I do all the time now, they all say the same thing, which is that we put dangerous people in jail, and we let non-dangerous, nonviolent people out. They mean it and they believe it. But when we start to look at the data -- which, by the way, the judges don't have -- what we find time and time again is that this isn't the case.

06:59
We find low-risk offenders, who make up 50 percent of our entire criminal justice population, and we find that they're in jail. Take Leslie Chew, a Texas man who stole four blankets on a cold winter night. He was arrested, and he was kept in jail on 3,500 dollars bail, an amount that he could not afford to pay. And he stayed in jail for eight months until his case came up for trial, at a cost to taxpayers of more than 9,000 dollars. And at the other end of the spectrum, we're doing an equally terrible job. The people who we find are the highest-risk offenders, the people who we think have the highest likelihood of committing a new crime if they're released -- we see nationally that 50 percent of those people are being released.

07:46
The reason for this is the way we make decisions. Judges have the best intentions when they make these decisions about risk, but they're making them subjectively. They're like the baseball scouts 20 years ago who were using their instinct and their experience to try to decide what risk someone poses. They're being subjective, and we know what happens with subjective decision making, which is that we are often wrong. What we need in this space are strong data and analytics.

08:13
What I decided to look for was a strong data and analytic risk assessment tool, something that would let judges actually understand, in a scientific and objective way, the risk posed by the person in front of them. I looked all over the country, and I found that between five and 10 percent of all U.S. jurisdictions actually use any type of risk assessment tool, and when I looked at these tools, I quickly realized why. They were unbelievably expensive to administer, they were time-consuming, and they were limited to the local jurisdiction in which they'd been created. So basically, they couldn't be scaled or transferred to other places.

08:49
So I went out and built a phenomenal team of data scientists and researchers and statisticians to build a universal risk assessment tool, so that every single judge in the United States of America can have an objective, scientific measure of risk. In the tool that we've built, what we did was we collected 1.5 million cases from all around the United States -- from cities, from counties, from every single state in the country, from the federal districts. And with those 1.5 million cases, which is the largest data set on pretrial in the United States today, we were able to find that there were 900-plus risk factors that we could look at to try to figure out what mattered most. And we found that there were nine specific things that mattered all across the country and that were the most highly predictive of risk. And so we built a universal risk assessment tool.

09:41
And it looks like this. As you'll see, we put some information in, but most of it is incredibly simple. It's easy to use, and it focuses on things like the defendant's prior convictions, whether they've been sentenced to incarceration, whether they've engaged in violence before, whether they've even failed to come back to court. And with this tool, we can predict three things. First, whether or not someone will commit a new crime if they're released. Second, for the first time, and I think this is incredibly important, we can predict whether someone will commit an act of violence if they're released. And that's the single most important thing that judges say when you talk to them. And third, we can predict whether someone will come back to court. And every single judge in the United States of America can use it, because it's been created on a universal data set.

10:25
What judges see if they run the risk assessment tool is this -- it's a dashboard. At the top, you see the New Criminal Activity Score, six of course being the highest, and then in the middle you see, "Elevated risk of violence." What that says is that this person is someone who has an elevated risk of violence that the judge should look twice at. And then, towards the bottom, you see the Failure to Appear Score, which again is the likelihood that someone will fail to come back to court.
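The dashboard described here can be imagined as something like the following sketch. Everything in it -- the factor names, the point weights, the clamping to a 1-6 scale -- is invented for illustration; the actual scoring model behind the Arnold Foundation tool is not specified in the talk.

```python
# Illustrative sketch of a pretrial risk "dashboard": three outputs (a new
# criminal activity score, an elevated-violence flag, and a failure-to-appear
# score) computed from a few history factors like those named in the talk.
# All weights and thresholds below are hypothetical, chosen only to show the
# shape of such a tool, not the real model.

from dataclasses import dataclass


@dataclass
class DefendantHistory:
    prior_convictions: int        # count of prior convictions
    prior_incarceration: bool     # ever sentenced to incarceration
    prior_violence: bool          # ever engaged in violence
    prior_failures_to_appear: int # count of prior failures to appear


def clamp(points: int, lo: int = 1, hi: int = 6) -> int:
    """Map raw points onto the 1-6 scale shown on the dashboard."""
    return max(lo, min(hi, points))


def assess(h: DefendantHistory) -> dict:
    """Return the three dashboard values for one defendant."""
    nca = 1 + h.prior_convictions + (2 if h.prior_incarceration else 0)
    fta = 1 + 2 * h.prior_failures_to_appear
    return {
        "new_criminal_activity": clamp(nca),          # 6 is the highest
        "elevated_risk_of_violence": h.prior_violence,
        "failure_to_appear": clamp(fta),
    }


if __name__ == "__main__":
    scores = assess(DefendantHistory(prior_convictions=3,
                                     prior_incarceration=True,
                                     prior_violence=False,
                                     prior_failures_to_appear=1))
    print(scores)
```

The point of the sketch is the design, not the numbers: a handful of objective history factors go in, and a small set of scores comes out that a judge can read at a glance and weigh alongside instinct and experience.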
10:51
Now I want to say something really important. It's not that I think we should be eliminating the judge's instinct and experience from this process. I don't. I actually believe the problem that we see, and the reason that we have these incredible system errors, where we're incarcerating low-level, nonviolent people and we're releasing high-risk, dangerous people, is that we don't have an objective measure of risk. But what I believe should happen is that we should take that data-driven risk assessment and combine that with the judge's instinct and experience to lead us to better decision making.

11:24
The tool went statewide in Kentucky on July 1, and we're about to go up in a number of other U.S. jurisdictions. Our goal, quite simply, is that every single judge in the United States will use a data-driven risk tool within the next five years. We're now working on risk tools for prosecutors and for police officers as well, to try to take a system that runs today in America the same way it did 50 years ago, based on instinct and experience, and make it into one that runs on data and analytics.

11:55
Now, the great news about all this, and we have a ton of work left to do, and we have a lot of culture to change, but the great news about all of it is that we know it works. It's why Google is Google, and it's why all these baseball teams use moneyball to win games. The great news for us as well is that it's the way that we can transform the American criminal justice system. It's how we can make our streets safer, we can reduce our prison costs, and we can make our system much fairer and more just. Some people call it data science. I call it moneyballing criminal justice.

12:29
Thank you.

12:31
(Applause)
Translated by Marssi Draw
Reviewed by William Choi


Data provided by TED.
