Cathy O'Neil: The era of blind faith in big data must end
Data skeptic Cathy O’Neil uncovers the dark secrets of big data, showing how our "objective" algorithms could in fact reinforce human bias.
Algorithms are everywhere. They sort and separate the winners from the losers. The winners get the job or the good credit card offer; the losers don't even get an interview, or they pay more for insurance. We're being scored with secret formulas that we don't understand, formulas that often have no system of appeal. That begs the question: what if the algorithms are wrong?

To build an algorithm you need two things: data, a record of what happened in the past, and a definition of success, the thing you're looking for and often hoping for. You train the algorithm by looking, figuring out. The algorithm figures out what is associated with success, what kind of situation leads to success.

Actually, everyone uses algorithms; most people just don't formalize them in written code. Let me give you an example. I use an algorithm every day to make a meal for my family. The data I use is the ingredients in my kitchen, the time I have and the ambition I have, and I curate that data: I don't count those little packages of ramen noodles as food. My definition of success is that a meal is successful if my kids eat vegetables. That is very different from if my youngest son were in charge; he'd say success is if he gets to eat lots of Nutella. But I get to choose success. I am in charge. My opinion matters. That is the first rule of algorithms: algorithms are opinions embedded in code. That is very different from how most people think of algorithms. They think algorithms are objective and true and scientific. That's a marketing trick. It's also a trick to intimidate you with algorithms, to make you trust and fear them because you trust and fear mathematics. A lot can go wrong when we put blind faith in big data.
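The two ingredients described above, data plus a chosen definition of success, can be made concrete with a small sketch. Everything in it (the meal records, the field names, the counting step) is a hypothetical Python illustration, not anything from the talk itself; the point is only that a human decides what counts as data and what counts as success before any "objective" answer comes out.

    # A toy illustration, not from the talk: every "algorithm" starts with two
    # human choices -- which data counts, and what success means.
    from collections import Counter

    past_meals = [
        {"ingredients": ["broccoli", "rice", "chicken"], "kids_ate_vegetables": True},
        {"ingredients": ["ramen"], "kids_ate_vegetables": False},
        {"ingredients": ["pasta", "spinach"], "kids_ate_vegetables": True},
    ]

    # Choice 1: curate the data (here, ramen packets simply don't count as food).
    curated = [meal for meal in past_meals if meal["ingredients"] != ["ramen"]]

    # Choice 2: define success (the cook's definition, not the youngest son's).
    def successful(meal):
        return meal["kids_ate_vegetables"]   # not "got to eat lots of Nutella"

    # "Training": count which ingredients are associated with successful meals.
    ingredient_counts = Counter(
        ingredient
        for meal in curated if successful(meal)
        for ingredient in meal["ingredients"]
    )
    print(ingredient_counts.most_common(3))

Change either choice, the curation rule or the success test, and the "findings" change with it.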
This is Kiri Soares. She's a high school principal in Brooklyn. She told me her teachers were being scored with a complex, secret algorithm called the value-added model. I told her, "Well, figure out what the formula is, show it to me, and I'll explain it to you." She said, "Well, I tried to get the formula, but my Department of Education contact told me it was math and I wouldn't understand it."

It gets worse. The New York Post filed a Freedom of Information Act request, got all the teachers' names and all their scores, and published them as an act of teacher-shaming. When I tried to get the formulas, the source code, through the same means, I was told I couldn't. Nobody in New York City, it turned out, had access to that formula. No one understood it.

Then someone really smart got involved, Gary Rubinstein. He found teachers from that New York Post data who had two scores because they taught math in two different grades, and when he plotted one score against the other, the points looked like random noise. A score like that should never have been used for individual assessment. But it was. Sarah Wysocki was fired, along with 205 other teachers, from the Washington, DC school district, even though she had strong recommendations from her principal and from the parents of her kids.
I know what a lot of you guys are thinking, especially the data scientists and the AI experts here: "I would never build an algorithm that inconsistent." But algorithms can go wrong, even have deeply destructive effects, with good intentions. And whereas an airplane that's designed badly crashes and everyone sees it, an algorithm that's designed badly can go on for a long time, silently wreaking havoc.
Think of Fox News. More than 20 women complained about sexual harassment; they said they weren't allowed to succeed at Fox News. Its founder was ousted, but we've seen recently that the problems have persisted. That begs the question: what should Fox News do to turn over another leaf? Well, what if it replaced its hiring process with a machine-learning algorithm? That sounds good, right? Think about it. The data -- what would the data be? A reasonable choice would be the last 21 years of applications to Fox News. And the definition of success? A reasonable choice would be someone who, say, stayed there for four years and was promoted at least once. Sounds reasonable. The algorithm would then be trained to learn what led to success, what kind of application historically led to success by that definition. Now think about what would happen if we applied it to a current pool of applicants: it would filter out women, because women do not look like the people who were successful in the past.
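Here is a hedged sketch of that failure mode: when the historical "success" labels already encode who was allowed to succeed, a model fit to them reproduces the pattern. The data below is synthetic, and the model choice (a scikit-learn logistic regression) is an assumption made purely for illustration, not anything the talk or any company actually uses.

    # Synthetic illustration: a classifier trained on biased historical outcomes
    # learns the bias. Assumes numpy and scikit-learn are installed.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000
    group = rng.integers(0, 2, n)      # 0 = historically favored group, 1 = not
    skill = rng.normal(size=n)         # skill is distributed identically in both groups

    # Historical "success" (stayed four years, promoted once) depended partly on
    # skill and heavily on belonging to the favored group -- the baked-in bias.
    success = (0.5 * skill + 2.0 * (group == 0) + rng.normal(size=n) > 1.0).astype(int)

    model = LogisticRegression().fit(np.column_stack([skill, group]), success)

    # Score two new applicants with identical skill but different group membership.
    applicants = np.column_stack([np.zeros(2), np.array([0, 1])])
    print(model.predict_proba(applicants)[:, 1])   # favored group scores far higher

Dropping the group column rarely fixes this in practice, because other features tend to act as proxies for it.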
Algorithms don't make things fair if you just blithely, blindly apply algorithms. They repeat our past practices, our patterns; they automate the status quo. That would be great if we had a perfect world, but we don't. And most companies don't have embarrassing lawsuits, but the data scientists in those companies are told to follow the data, to focus on accuracy. Think about what that means. Because we all have bias, it means they could be codifying sexism or any other kind of bigotry.

Here's a thought experiment. Imagine a society that is completely segregated, in all towns, all neighborhoods, and where we send the police only to the minority neighborhoods to look for crime. The arrest data would be very biased. What if, on top of that, we found the data scientists and paid them to predict where the next crime would occur? Minority neighborhood. Or to predict who the next criminal would be? A minority.
The data scientists would brag about how great and how accurate their model was, and they'd be right. Now, reality isn't that drastic, but we do have severe segregations in many cities and towns, and we have plenty of evidence of bias in policing and justice system data. And we really do predict hotspots, the places where crimes will occur, and we do in fact predict the individual criminality, the criminality of individuals. The news organization ProPublica recently looked into one of the "recidivism risk" algorithms being used in Florida during sentencing by judges. In the pair of cases they highlighted, the black defendant was scored a 10 out of 10, high risk; the white defendant, 3 out of 10, low risk. Both had been brought in for drug possession. This matters, because the higher score you are, the more likely you are to be given a longer sentence.
What's going on here is data laundering: a process by which technologists hide ugly truths inside black-box algorithms and call them objective, call them meritocratic. When they're secret, important and destructive, I call these algorithms weapons of math destruction. They're everywhere, and it's not a mistake. These are private companies building private algorithms for private ends. Even the ones I talked about, for teachers and the public police, were built by private companies and sold to government institutions. They call it their "secret sauce"; that's why they can't tell us about it. It's also private power, profiting from wielding the authority of the inscrutable. You might think that since all this stuff is private and there's competition, maybe the free market will solve this problem. It won't. There's a lot of money to be made in unfairness.
Besides, we are not economically rational agents. We are all biased. We're all racist and bigoted in ways that we wish we weren't, in ways that we don't even know. We know this in the aggregate, because sociologists have consistently demonstrated this with experiments where they send a bunch of applications to jobs out, equally qualified, but some have white-sounding names and some have black-sounding names, and it's always disappointing, the results -- always.

So we are the ones who are biased, and we are injecting those biases into the algorithms by choosing what data to collect -- like when I chose not to think about ramen noodles -- and by choosing the definition of success. By trusting data that is picking up on past practices, how can we expect the algorithms to emerge unscathed? We can't. We have to check them. The good news is, we can check them for fairness. Algorithms can be interrogated, and they will tell us the truth every time. And we can fix them. We can make them better. Call it an algorithmic audit. Here is what it would involve.
First, a data integrity check. For the recidivism risk algorithm I talked about, a data integrity check would mean coming to terms with the fact that in the US, whites and blacks smoke pot at the same rate but blacks are far more likely to be arrested -- four or five times more likely, depending on the area. What does that bias look like in other crime categories, and how do we account for it?
Second, we should think about the definition of success, and audit that. Remember the hiring algorithm? We talked about it. Someone who stays for four years and is promoted once? Well, that is a successful employee, but it is also an employee that is supported by their culture, and that can be quite biased. We need to separate those two things. We should look to the blind orchestra audition as an example, where the people auditioning are behind a sheet. The people listening have decided what's important and what's not important, and they don't get distracted by anything else. When the blind orchestra auditions started, the number of women in orchestras went up by a factor of five.
Next, we have to consider accuracy. This is where the value-added model for teachers would fail immediately. No algorithm is perfect, so we have to consider the errors of every algorithm. How often are there errors, and for whom does this model fail? What is the cost of that failure?
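One concrete way to ask "for whom does this model fail?" is to break error rates out by group, roughly in the spirit of the recidivism analysis mentioned earlier. This is a generic sketch under assumed column names (group, label, prediction); it is not the talk's, ProPublica's, or any auditor's actual procedure.

    # Minimal per-group error audit: compare false positive and false negative rates.
    # The column names ("group", "label", "prediction") are assumptions for illustration.
    import pandas as pd

    def error_rates_by_group(df: pd.DataFrame) -> pd.DataFrame:
        out = {}
        for g, sub in df.groupby("group"):
            fp = ((sub.prediction == 1) & (sub.label == 0)).sum() / max((sub.label == 0).sum(), 1)
            fn = ((sub.prediction == 0) & (sub.label == 1)).sum() / max((sub.label == 1).sum(), 1)
            out[g] = {"false_positive_rate": fp, "false_negative_rate": fn}
        return pd.DataFrame(out).T

    # Example: a model whose mistakes fall far more heavily on group "B" than group "A".
    df = pd.DataFrame({
        "group":      ["A", "A", "A", "B", "B", "B"],
        "label":      [0,    1,   0,   0,   0,   1],
        "prediction": [0,    1,   0,   1,   1,   1],
    })
    print(error_rates_by_group(df))

Two models with the same overall accuracy can distribute their errors very differently across groups, which is exactly why overall accuracy alone is not enough.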
Finally, we have to consider the long-term effects of algorithms, the feedback loops they engender. That sounds abstract, but imagine if Facebook's engineers had considered that before they decided to show us only things that our friends had posted.
I have two final messages, one for the data scientists out there. Data scientists: we should not be the arbiters of truth. We should be translators of the ethical discussions that happen in larger society. And for the rest of you, the non-data scientists: this is not a math test; it is a political fight. We need to demand accountability for our algorithmic overlords. The era of blind faith in big data must end. Thank you.
ABOUT THE SPEAKER
Cathy O'Neil - Mathematician, data scientist
Why you should listen
In 2008, as a hedge-fund quant, mathematician Cathy O’Neil saw firsthand how really, really bad math could lead to financial disaster. Disillusioned, O’Neil became a data scientist and eventually joined Occupy Wall Street’s Alternative Banking Group.
With her popular blog mathbabe.org, O’Neil emerged as an investigative journalist. Her acclaimed book Weapons of Math Destruction details how opaque, black-box algorithms rely on biased historical data to do everything from sentencing defendants to hiring workers. In 2017, O’Neil founded the consulting firm ORCAA to audit algorithms for racial, gender and economic inequality.