Cathy O'Neil: The era of blind faith in big data must end
Data skeptic Cathy O'Neil uncovers the dark secrets of big data, showing how our "objective" algorithms could in fact reinforce human bias.
the winners from the losers.
a chance at an interview
that we don't understand
you need two things:
and often hoping for.
by looking, figuring out.
what is associated with success.
in written code.
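In code, those two ingredients look something like the following; a minimal, hypothetical sketch in Python, with made-up column names and an arbitrary success threshold, not anything taken from the talk itself.

# Ingredient 1: data about what happened in the past (hypothetical records).
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.DataFrame({
    "years_experience": [1, 4, 7, 2, 9, 3],
    "referrals":        [0, 1, 2, 0, 3, 1],
    "stayed_years":     [1, 5, 6, 2, 8, 1],
})

# Ingredient 2: a definition of success -- a human choice, not a fact.
# Here "success" is arbitrarily defined as staying at least four years.
history["success"] = (history["stayed_years"] >= 4).astype(int)

# "Training" just means finding what is associated with that definition.
model = LogisticRegression()
model.fit(history[["years_experience", "referrals"]], history["success"])

# The fitted coefficients are the opinion, now embedded in code.
print(dict(zip(["years_experience", "referrals"], model.coef_[0])))

Change the definition of success and the same data yields a different model; that choice is where the opinion enters.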
to make a meal for my family.
of ramen noodles as food.
if my kids eat vegetables.
from if my youngest son were in charge.
he gets to eat lots of Nutella.
matters.
most people think of algorithms.
and true and scientific.
blind faith in big data.
She's a high school principal in Brooklyn.
her teachers were being scored
what the formula is, show it to me.
to get the formula,
told me it was math
a Freedom of Information Act request,
and all their scores
as an act of teacher-shaming.
the source code, through the same means,
had access to that formula.
got involved, Gary Rubenstein.
from that New York Post data
for individual assessment.
with 205 other teachers,
recommendations from her principal
of you guys are thinking,
the AI experts here.
an algorithm that inconsistent."
with good intentions.
a deeply destructive effect.
that's designed badly
silently wreaking havoc.
about sexual harassment.
to succeed at Fox News.
but we've seen recently
to turn over another leaf?
their hiring process
21 years of applications to Fox News.
stayed there for four years
could be trained.
to learn what led to success,
historically led to success
to a current pool of applicants.
who were successful in the past.
blindly apply algorithms.
if we had a perfect world,
don't have embarrassing lawsuits,
it means they could be codifying sexism
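The mechanics of that codification fit in a few lines. Below is a minimal sketch with simulated, hypothetical data (not anyone's real hiring records): the only bias lives in the historical "success" labels, yet the filter trained on them carries it forward to a brand-new applicant pool.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Historical applicants: a protected attribute and an unrelated skill score.
is_woman = rng.integers(0, 2, n)
skill = rng.normal(0, 1, n)

# Past "success" (e.g. stayed four years, promoted once) depended on skill,
# but also on an environment that pushed women out. The bias sits in the
# labels, not in the learning code.
success = (skill + rng.normal(0, 1, n) - 1.5 * is_woman) > 0

model = LogisticRegression().fit(np.column_stack([is_woman, skill]), success)

# Apply the trained filter to a fresh pool with identical skill distributions.
new_women = np.column_stack([np.ones(1000), rng.normal(0, 1, 1000)])
new_men = np.column_stack([np.zeros(1000), rng.normal(0, 1, 1000)])
print("predicted hire rate, women:", model.predict(new_women).mean())
print("predicted hire rate, men:  ", model.predict(new_men).mean())

In real systems the protected attribute is rarely an explicit column; the model finds proxies for it instead, but the mechanism, imitating whoever was labeled successful before, is the same.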
all neighborhoods
only to the minority neighborhoods
we found the data scientists
where the next crime would occur?
criminal would be?
about how great and how accurate
but we do have severe segregations
and justice system data.
the individual criminality,
recently looked into
during sentencing by judges.
was scored a 10 out of 10.
3 out of 10, low risk.
for drug possession.
the higher score you are,
a longer sentence.
technologists hide ugly truths
important and destructive,
and it's not a mistake.
building private algorithms
for teachers and the public police,
the authority of the inscrutable.
since all this stuff is private
will solve this problem.
to be made in unfairness.
in ways that we wish we weren't,
have consistently demonstrated this
of applications to jobs out,
qualified workers,
have white-sounding names
the results -- always.
into the algorithms
about ramen noodles --
picking up on past practices
to emerge unscathed?
we can check them for fairness.
the truth every time.
We can make them better.
the recidivism algorithm I talked about,
we'd have to come to terms with the fact
smoke pot at the same rate
to be arrested --
depending on the area.
in other crime categories,
the definition of success,
algorithm? We talked about it.
and is promoted once?
that is supported by their culture.
the blind orchestra audition
are behind a sheet.
have decided what's important
distracted by that.
auditions started,
went up by a factor of five.
for teachers would fail immediately.
the errors of every algorithm.
and for whom does this model fail?
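One way to make that question concrete is to tabulate error rates separately for every group a model touches. A minimal sketch, with hypothetical data and a made-up helper name:

import numpy as np

def error_rates_by_group(y_true, y_pred, group):
    """For each group, report how often the model raises false alarms
    (false positive rate) and how often it misses (false negative rate)."""
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    report = {}
    for g in np.unique(group):
        m = group == g
        negatives = y_pred[m][y_true[m] == 0]
        positives = y_pred[m][y_true[m] == 1]
        report[g] = {
            "false_positive_rate": negatives.mean() if negatives.size else float("nan"),
            "false_negative_rate": (1 - positives).mean() if positives.size else float("nan"),
        }
    return report

# Toy example: the model flags group "b" far more often at identical behavior.
y_true = [0, 0, 1, 1, 0, 0, 1, 1]
y_pred = [0, 0, 0, 1, 1, 1, 1, 1]
group  = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(error_rates_by_group(y_true, y_pred, group))

If the false positive rate for one group is far higher than for another, the model is failing for that group, whatever its overall accuracy says.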
had considered that
only things that our friends had posted.
one for the data scientists out there.
not be the arbiters of truth.
of ethical discussions that happen
for our algorithmic overlords.
in big data must end.
ABOUT THE SPEAKER
Cathy O'Neil - Mathematician, data scientist
Why you should listen
In 2008, as a hedge-fund quant, mathematician Cathy O’Neil saw firsthand how really really bad math could lead to financial disaster. Disillusioned, O’Neil became a data scientist and eventually joined Occupy Wall Street’s Alternative Banking Group.
With her popular blog mathbabe.org, O’Neil emerged as an investigative journalist. Her acclaimed book Weapons of Math Destruction details how opaque, black-box algorithms rely on biased historical data to do everything from sentence defendants to hire workers. In 2017, O’Neil founded consulting firm ORCAA to audit algorithms for racial, gender and economic inequality.