ABOUT THE SPEAKER
Anne Milgram - Criminal justice reformer
Anne Milgram is committed to using data and analytics to fight crime.

Why you should listen

Anne Milgram is focused on reforming systems through smart data, analytics and technology. She is currently a Professor of Practice and Distinguished Scholar in Residence at New York University School of Law, where she is building a Criminal Justice Innovation Lab, dedicated to using data and technology to transform the American criminal justice system. She also teaches seminars on criminal justice policy and human trafficking. Milgram began her career as a criminal prosecutor, serving in state, local and federal prosecution offices. She then became the Attorney General of the State of New Jersey, where she served as the Chief Law Enforcement Officer for the State and oversaw the Camden Police Department.

Through her work, Milgram seeks to bring the best of the modern world -- data, technology and analytics -- to bear in an effort to transform outdated systems and practices. Milgram aims to create a paradigm shift in how we think about innovation and reform in the criminal justice system and beyond.

Milgram graduated summa cum laude from Rutgers University and holds a Master of Philosophy in social and political theory from the University of Cambridge. She received her law degree from New York University School of Law.

Anne Milgram | Speaker | TED.com
TED@BCG San Francisco

Anne Milgram: Why smart statistics are the key to fighting crime

1,034,735 views

When she became the attorney general of New Jersey in 2007, Anne Milgram quickly discovered a few startling facts: not only did her team not really know who they were putting in jail, but they had no way of understanding if their decisions were actually making the public safer. And so began her ongoing, inspirational quest to bring data analytics and statistical analysis to the US criminal justice system.


00:12
In 2007, I became the attorney general of the state of New Jersey. Before that, I'd been a criminal prosecutor, first in the Manhattan district attorney's office, and then at the United States Department of Justice. But when I became the attorney general, two things happened that changed the way I see criminal justice. The first is that I asked what I thought were really basic questions. I wanted to understand who we were arresting, who we were charging, and who we were putting in our nation's jails and prisons. I also wanted to understand if we were making decisions in a way that made us safer. And I couldn't get this information out. It turned out that most big criminal justice agencies like my own didn't track the things that matter. So after about a month of being incredibly frustrated, I walked down into a conference room that was filled with detectives and stacks and stacks of case files, and the detectives were sitting there with yellow legal pads taking notes. They were trying to get the information I was looking for by going through case by case for the past five years. And as you can imagine, when we finally got the results, they weren't good. It turned out that we were doing a lot of low-level drug cases on the streets just around the corner from our office in Trenton.
01:30
The second thing that happened is that I spent the day in the Camden, New Jersey police department. Now, at that time, Camden, New Jersey, was the most dangerous city in America. I ran the Camden Police Department because of that. I spent the day in the police department, and I was taken into a room with senior police officials, all of whom were working hard and trying very hard to reduce crime in Camden. And what I saw in that room, as we talked about how to reduce crime, were a series of officers with a lot of little yellow sticky notes. And they would take a yellow sticky and they would write something on it and they would put it up on a board. And one of them said, "We had a robbery two weeks ago. We have no suspects." And another said, "We had a shooting in this neighborhood last week. We have no suspects." We weren't using data-driven policing. We were essentially trying to fight crime with yellow Post-it notes.
02:22
Now, both of these things made me realize fundamentally that we were failing. We didn't even know who was in our criminal justice system, we didn't have any data about the things that mattered, and we didn't share data or use analytics or tools to help us make better decisions and to reduce crime. And for the first time, I started to think about how we made decisions. When I was an assistant D.A., and when I was a federal prosecutor, I looked at the cases in front of me, and I generally made decisions based on my instinct and my experience. When I became attorney general, I could look at the system as a whole, and what surprised me is that I found that that was exactly how we were doing it across the entire system -- in police departments, in prosecutors' offices, in courts and in jails. And what I learned very quickly is that we weren't doing a good job.
03:14
So I wanted to do things differently. I wanted to introduce data and analytics and rigorous statistical analysis into our work. In short, I wanted to moneyball criminal justice. Now, moneyball, as many of you know, is what the Oakland A's did, where they used smart data and statistics to figure out how to pick players that would help them win games, and they went from a system that was based on baseball scouts who used to go out and watch players and use their instinct and experience, the scouts' instincts and experience, to pick players, to one that used smart data and rigorous statistical analysis to figure out how to pick players that would help them win games. It worked for the Oakland A's, and it worked in the state of New Jersey. We took Camden off the top of the list as the most dangerous city in America. We reduced murders there by 41 percent, which actually means 37 lives were saved. And we reduced all crime in the city by 26 percent. We also changed the way we did criminal prosecutions. So we went from doing low-level drug crimes that were outside our building to doing cases of statewide importance, on things like reducing violence with the most violent offenders, prosecuting street gangs, gun and drug trafficking, and political corruption.
04:26
And all of this matters greatly, because public safety to me is the most important function of government. If we're not safe, we can't be educated, we can't be healthy, we can't do any of the other things we want to do in our lives. And we live in a country today where we face serious criminal justice problems. We have 12 million arrests every single year. The vast majority of those arrests are for low-level crimes, like misdemeanors, 70 to 80 percent. Less than five percent of all arrests are for violent crime. Yet we spend 75 billion, that's b for billion, dollars a year on state and local corrections costs. Right now, today, we have 2.3 million people in our jails and prisons. And we face unbelievable public safety challenges because we have a situation in which two thirds of the people in our jails are there waiting for trial. They haven't yet been convicted of a crime. They're just waiting for their day in court. And 67 percent of people come back. Our recidivism rate is amongst the highest in the world. Almost seven in 10 people who are released from prison will be rearrested in a constant cycle of crime and incarceration.
05:39
So when I started my job at the Arnold Foundation, I came back to looking at a lot of these questions, and I came back to thinking about how we had used data and analytics to transform the way we did criminal justice in New Jersey. And when I look at the criminal justice system in the United States today, I feel the exact same way that I did about the state of New Jersey when I started there, which is that we absolutely have to do better, and I know that we can do better. So I decided to focus on using data and analytics to help make the most critical decision in public safety, and that decision is the determination of whether, when someone has been arrested, whether they pose a risk to public safety and should be detained, or whether they don't pose a risk to public safety and should be released. Everything that happens in criminal cases comes out of this one decision. It impacts everything. It impacts sentencing. It impacts whether someone gets drug treatment. It impacts crime and violence. And when I talk to judges around the United States, which I do all the time now, they all say the same thing, which is that we put dangerous people in jail, and we let non-dangerous, nonviolent people out. They mean it and they believe it. But when you start to look at the data, which, by the way, the judges don't have, when we start to look at the data, what we find time and time again, is that this isn't the case. We find low-risk offenders, who make up 50 percent of our entire criminal justice population, and we find that they're in jail. Take Leslie Chew, who was a Texas man who stole four blankets on a cold winter night. He was arrested, and he was kept in jail on 3,500 dollars bail, an amount that he could not afford to pay. And he stayed in jail for eight months until his case came up for trial, at a cost to taxpayers of more than 9,000 dollars. And at the other end of the spectrum, we're doing an equally terrible job. The people who we find are the highest-risk offenders, the people who we think have the highest likelihood of committing a new crime if they're released, we see nationally that 50 percent of those people are being released.
07:46
The reason for this is the way we make decisions. Judges have the best intentions when they make these decisions about risk, but they're making them subjectively. They're like the baseball scouts 20 years ago who were using their instinct and their experience to try to decide what risk someone poses. They're being subjective, and we know what happens with subjective decision making, which is that we are often wrong. What we need in this space are strong data and analytics.
08:13
What I decided to look for was a strong data and analytic risk assessment tool, something that would let judges actually understand, in a scientific and objective way, what the risk was that was posed by someone in front of them. I looked all over the country, and I found that between five and 10 percent of all U.S. jurisdictions actually use any type of risk assessment tool, and when I looked at these tools, I quickly realized why. They were unbelievably expensive to administer, they were time-consuming, they were limited to the local jurisdiction in which they'd been created. So basically, they couldn't be scaled or transferred to other places.
08:49
So I went out and built a phenomenal team of data scientists and researchers and statisticians to build a universal risk assessment tool, so that every single judge in the United States of America can have an objective, scientific measure of risk. In the tool that we've built, what we did was we collected 1.5 million cases from all around the United States, from cities, from counties, from every single state in the country, the federal districts. And with those 1.5 million cases, which is the largest data set on pretrial in the United States today, we were able to basically find that there were 900-plus risk factors that we could look at to try to figure out what mattered most. And we found that there were nine specific things that mattered all across the country and that were the most highly predictive of risk. And so we built a universal risk assessment tool.
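The winnowing she describes, from 900-plus candidate factors down to the few that predict best, can be sketched as a simple univariate screen: measure how much each factor shifts the rearrest rate, then keep the top of the ranking. Everything below (the factor names, the synthetic cases, the screening rule itself) is an invented illustration, not the Arnold Foundation's actual methodology.

```python
import random

random.seed(0)

FACTORS = [f"factor_{i}" for i in range(30)]        # stand-in for the 900+ candidates
PREDICTIVE = {"factor_3", "factor_7", "factor_12"}  # hypothetical "true" signals

def make_case():
    """Synthetic pretrial case: binary risk factors plus a rearrest outcome."""
    x = {f: random.random() < 0.3 for f in FACTORS}
    signal = sum(x[f] for f in PREDICTIVE)
    rearrested = random.random() < 0.1 + 0.25 * signal
    return x, rearrested

def screen(cases, top_k=5):
    """Rank factors by how far the rearrest rate moves when the factor is present."""
    base = sum(y for _, y in cases) / len(cases)
    lift = {}
    for f in FACTORS:
        with_f = [y for x, y in cases if x[f]]
        lift[f] = abs(sum(with_f) / max(len(with_f), 1) - base)
    return sorted(FACTORS, key=lambda f: lift[f], reverse=True)[:top_k]

cases = [make_case() for _ in range(20000)]
selected = screen(cases)
print(selected)  # the planted factors should surface near the top
```

A real effort would use multivariate methods and out-of-sample validation rather than a raw univariate screen, but the shape of the exercise, many candidate factors in, a short predictive list out, is the same.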
09:41
And it looks like this. As you'll see, we put some information in, but most of it is incredibly simple, it's easy to use, it focuses on things like the defendant's prior convictions, whether they've been sentenced to incarceration, whether they've engaged in violence before, whether they've ever failed to come back to court. And with this tool, we can predict three things. First, whether or not someone will commit a new crime if they're released. Second, for the first time, and I think this is incredibly important, we can predict whether someone will commit an act of violence if they're released. And that's the single most important thing that judges say when you talk to them. And third, we can predict whether someone will come back to court. And every single judge in the United States of America can use it, because it's been created on a universal data set.
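A tool that turns a handful of prior-history inputs into three scores like the ones she describes could look like the following points-style sketch. The inputs mirror the ones named in the talk, but the weights and the 1-to-6 banding are hypothetical placeholders; the real instrument's scoring is not disclosed here.

```python
from dataclasses import dataclass

@dataclass
class Defendant:
    prior_convictions: int
    prior_incarceration: bool
    prior_violence: bool
    prior_failure_to_appear: bool

def assess(d: Defendant) -> dict:
    """Return three 1-6 risk scores: new criminal activity, violence,
    failure to appear. The point weights are invented for illustration."""
    def band(points, cap):
        # Map raw points onto the 1 (lowest) to 6 (highest) scale.
        return max(1, min(6, 1 + round(5 * points / cap)))

    nca = min(d.prior_convictions, 5) + 2 * d.prior_incarceration
    vio = 4 * d.prior_violence + min(d.prior_convictions, 3)
    fta = 4 * d.prior_failure_to_appear + min(d.prior_convictions, 3)
    return {
        "new_criminal_activity": band(nca, 7),
        "violence": band(vio, 7),
        "failure_to_appear": band(fta, 7),
    }

low = assess(Defendant(0, False, False, False))    # clean history -> all 1s
high = assess(Defendant(6, True, True, True))      # heavy history -> all 6s
print(low, high)
```

The point of such a design is exactly what the talk emphasizes: the inputs are cheap administrative facts rather than interviews, so the same instrument can run anywhere.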
10:25
What judges see if they run the risk assessment tool is this -- it's a dashboard. At the top, you see the New Criminal Activity Score, six of course being the highest, and then in the middle you see, "Elevated risk of violence." What that says is that this person is someone who has an elevated risk of violence that the judge should look twice at. And then, towards the bottom, you see the Failure to Appear Score, which again is the likelihood that someone will come back to court.
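The judge-facing dashboard she walks through, a New Criminal Activity Score at the top, a violence flag in the middle, and a Failure to Appear Score at the bottom, might be mocked up in plain text like this; the layout is an approximation of the slide, not the actual interface.

```python
def render_dashboard(nca: int, elevated_violence: bool, fta: int) -> str:
    """Plain-text sketch of the judge-facing dashboard described in the talk.
    Scores run 1 (lowest) to 6 (highest); layout is approximate."""
    def bar(score):
        return "#" * score + "." * (6 - score)

    lines = [
        f"New Criminal Activity Score : {nca}/6  [{bar(nca)}]",
        "Elevated risk of violence" if elevated_violence else "No violence flag",
        f"Failure to Appear Score     : {fta}/6  [{bar(fta)}]",
    ]
    return "\n".join(lines)

print(render_dashboard(5, True, 2))
```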
10:51
Now I want to say something really important. It's not that I think we should be eliminating the judge's instinct and experience from this process. I don't. I actually believe the problem that we see and the reason that we have these incredible system errors, where we're incarcerating low-level, nonviolent people and we're releasing high-risk, dangerous people, is that we don't have an objective measure of risk. But what I believe should happen is that we should take that data-driven risk assessment and combine that with the judge's instinct and experience to lead us to better decision making.
11:24
The tool went statewide in Kentucky on July 1, and we're about to go up in a number of other U.S. jurisdictions. Our goal, quite simply, is that every single judge in the United States will use a data-driven risk tool within the next five years. We're now working on risk tools for prosecutors and for police officers as well, to try to take a system that runs today in America the same way it did 50 years ago, based on instinct and experience, and make it into one that runs on data and analytics.
11:55
Now, the great news about all this, and we have a ton of work left to do, and we have a lot of culture to change, but the great news about all of it is that we know it works. It's why Google is Google, and it's why all these baseball teams use moneyball to win games. The great news for us as well is that it's the way that we can transform the American criminal justice system. It's how we can make our streets safer, we can reduce our prison costs, and we can make our system much fairer and more just. Some people call it data science. I call it moneyballing criminal justice. Thank you.

12:31
(Applause)
