Zeynep Tufekci: Machine intelligence makes human morals more important
Techno-sociologist Zeynep Tufekci asks big questions about our societies and our lives, as both algorithms and digital connectivity spread.
So, I started my first job as a computer programmer in my very first year of college, basically as a teenager. Soon after I started working, writing software in a company, a manager who worked at the company came down to where I was, and he whispered to me, "Can he tell if I'm lying?" There was nobody else in the room.

"Can who tell if you're lying? And why are we whispering?"

The manager pointed at the computer in the room. "Can he tell if I'm lying?" Well, that manager was having an affair with the receptionist. And I was still a teenager, so I whisper-shouted back to him, "Yes, the computer can tell if you're lying." Well, I laughed, but actually, the laugh's on me. Nowadays, there are computational systems that can suss out emotional states and even lying from processing human faces. Advertisers and even governments are very interested.
I had become a computer programmer because I was one of those kids crazy about math and science. But somewhere along the line I'd learned about nuclear weapons, and I'd gotten really concerned with the ethics of science. I was troubled. However, because of family circumstances, I also needed to start working as soon as possible. So I thought to myself, hey, let me pick a technical field where I can get a job easily and where I don't have to deal with any troublesome questions of ethics. So I picked computers. Well, ha, ha, ha! All the laughs are on me.

Nowadays, computer scientists are building platforms that control what a billion people see every day. They're developing cars that could decide who to run over. They're even building machines, weapons, that might kill human beings in war. It's ethics all the way down.
Machine intelligence is here. We're now using computation to make all sorts of decisions, but also new kinds of decisions. We're asking questions of computation that have no single right answers, questions that are subjective, open-ended and value-laden. We're asking questions like, "Whom should the company hire?" "Which update from which friend should you be shown?" "Which convict is more likely to reoffend?" "Which news item or movie should be recommended to people?"
Look, yes, we've been using computers for a while, but this is different. This is a historical twist, because we cannot anchor computation for such subjective decisions the way we can anchor computation for flying airplanes, building bridges, going to the moon. Are airplanes safer? Did the bridge sway and fall? There, we have agreed-upon, fairly clear benchmarks, and we have laws of nature to guide us. We have no such anchors and benchmarks for decisions in messy human affairs.
To make things more complicated, our software is getting more powerful, but it's also getting less transparent and more complex. Recently, in the past decade, complex algorithms have made great strides. They can recognize human faces. They can decipher handwriting. They can detect credit card fraud, block spam and translate between languages. They can detect tumors in medical imaging. They can beat humans in chess and Go.
Much of this progress comes from a method called "machine learning." Machine learning is different than traditional programming, where you give the computer detailed, exact, painstaking instructions. It's more like you take the system and you feed it lots of data, including unstructured data, like the kind we generate in our digital lives. And the system learns by churning through this data. And also, crucially, these systems don't operate under a single-answer logic. They don't produce a simple answer; it's more probabilistic: "This one is probably more like what you're looking for."
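To make that contrast concrete, here is a minimal sketch of the two styles side by side. It is mine, not from the talk; the spam task, feature names and numbers are all invented for illustration:

```python
# Illustrative sketch only: the same task done two ways.
from sklearn.linear_model import LogisticRegression

# Traditional programming: a human spells out the rule.
def is_spam_by_rule(num_links: int, num_exclamations: int) -> bool:
    # Detailed, exact, painstaking instructions.
    return num_links > 3 and num_exclamations > 5

# Machine learning: hand the system labeled examples instead.
X = [[0, 1], [1, 0], [5, 8], [6, 9], [0, 0], [7, 7]]  # [links, exclamations]
y = [0, 0, 1, 1, 0, 1]                                # 0 = ham, 1 = spam
model = LogisticRegression().fit(X, y)

# No single-answer logic: the output is a probability, not a verdict.
print(model.predict_proba([[4, 6]])[0][1])  # "probably more like spam"
```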
Now, the upside is, this method is really powerful. The downside is, we don't really understand what the system learned. In fact, that's its power. This is less like giving instructions to a computer; it's more like training a puppy-machine-creature we don't really understand or control. So this is our problem. It's a problem when this artificial intelligence system gets things wrong. It's also a problem when it gets things right, because we don't even know which is which when it's a subjective problem. We don't know what this thing is thinking.
So, consider a hiring algorithm, a system used to hire people, using machine-learning systems. Such a system would have been trained on previous employees' data and instructed to find and hire people like the existing high performers in the company. Sounds good. I once attended a conference that brought together human resources managers and executives, high-level people, using such systems in hiring. They were super excited. They thought this would make hiring more objective, less biased, and give women and minorities a better shot against biased human managers.
And look, human hiring is biased. I know. I mean, in one of my early jobs as a programmer, my immediate manager would sometimes come down to where I was really early in the morning or really late in the afternoon, and she'd say, "Zeynep, let's go to lunch!" I'd be puzzled by the weird timing. It's 4pm. Lunch? I was broke, so: free lunch. I always went. I later realized what was happening. My immediate managers had not confessed to their higher-ups that the programmer they hired for a serious job was a teen girl who wore jeans and sneakers to work. I was doing a good job, I just looked wrong and was the wrong age and gender.
So hiring in a gender- and race-blind way certainly sounds good to me. But with these systems, it is more complicated, and here's why: currently, computational systems can infer all sorts of things about you from your digital crumbs, even if you have not disclosed those things. They can infer your sexual orientation, your personality traits, your political leanings. They have predictive power with high levels of accuracy. Remember, for things you haven't even disclosed. This is inference.
I have a friend who developed such computational systems to predict the likelihood of clinical or postpartum depression from social media data. The results are impressive. Her system can predict the likelihood of depression months before the onset of any symptoms. Months before. No symptoms, there's prediction. She hopes it will be used for early intervention. Great! But now put this in the context of hiring.
So at this human resources managers conference, I approached a high-level manager in a very large company, and I said to her, "Look, what if, unbeknownst to you, your system is weeding out people with high future likelihood of depression? They're not depressed now, just maybe in the future, more likely. What if it's weeding out women more likely to be pregnant in the next year or two but aren't pregnant now? What if it's hiring aggressive people because that's your workplace culture?"
You can't tell this by looking at gender breakdowns. Those may be balanced. And since this is machine learning, not traditional coding, there is no variable there labeled "higher risk of depression," "higher risk of pregnancy," "aggressive guy scale." Not only do you not know what your system is selecting on, you don't even know where to begin to look. It's a black box. It has predictive power, but you don't understand it.
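A toy sketch of what that can look like in practice. Everything here, the feature names, the data, the weights, is fabricated to illustrate one point: the trained model contains no variable announcing what it is really selecting on.

```python
# Hypothetical illustration: a model that quietly selects on a proxy.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
years_experience = rng.normal(5, 2, n)
# An innocuous-looking signal that, in this made-up history, happens
# to correlate with something sensitive (a proxy):
late_night_activity = rng.normal(0, 1, n)

# Suppose past "high performer" labels were tangled up with the proxy.
label = (0.2 * years_experience - 0.8 * late_night_activity
         + rng.normal(0, 1, n)) > 1.0

X = np.column_stack([years_experience, late_night_activity])
model = LogisticRegression().fit(X, label)

# The model "works," but its learned weights are just bare numbers.
# Neither one is labeled "risk of depression." Where would you look?
print(model.coef_)
```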
"What safeguards," I asked, "do you have to make sure that your black box isn't doing something shady?" She looked at me as if I had just stepped on 10 puppy tails. She stared at me and she said, "I don't want to hear another word about this." And she turned around and walked away. Mind you, she wasn't rude. It was clearly: what I don't know isn't my problem, go away, death stare.
Look, such a system may even be less biased than human managers in some ways, and it could make monetary sense. But it could also lead to a steady but stealthy shutting out of the job market of people with higher risk of depression. Is this the kind of society we want to build, without even knowing we've done it, because we turned decision-making over to machines we don't totally understand?
Another problem is this: these systems are often trained on data generated by our actions, human imprints. Well, they could just be reflecting our biases, and these systems could be picking up on our biases and amplifying them and showing them back to us, while we're telling ourselves, "We're just doing objective, neutral computation."
Researchers found that on Google, women are less likely than men to be shown job ads for high-paying jobs. And searching for African-American names is more likely to bring up ads suggesting criminal history, even when there is none. Such hidden biases and black-box algorithms, which researchers sometimes uncover but sometimes we don't know about, can have life-altering consequences.
In Wisconsin, a defendant was sentenced to six years in prison for evading the police. You may not know this, but algorithms are increasingly used in parole and sentencing decisions, and his sentence was informed in part by a risk score produced by an algorithm. He wanted to know: How is this score calculated? It's a commercial black box, and the company refused to have its algorithm be challenged in open court. But ProPublica, an investigative nonprofit, audited that very algorithm with what public data they could find, and found that its outcomes were biased and its predictive power was dismal, barely better than chance, and that it was wrongly labeling black defendants as future criminals at twice the rate of white defendants.
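The core of such an audit is simple to state, even if gathering the data is not. Here is a toy version, with made-up records, of the kind of group-wise error-rate comparison ProPublica ran:

```python
# Toy audit: compare false positive rates across groups.
def false_positive_rate(scores, reoffended):
    # "High risk" labels among people who did NOT reoffend.
    fp = sum(1 for s, r in zip(scores, reoffended) if s == "high" and not r)
    negatives = sum(1 for r in reoffended if not r)
    return fp / negatives

# Fabricated records: (score the tool assigned, reoffended within 2 years?)
group_a = [("high", False), ("high", False), ("low", False), ("high", True)]
group_b = [("low", False), ("low", False), ("high", False), ("high", True)]

for name, group in [("A", group_a), ("B", group_b)]:
    scores = [s for s, _ in group]
    outcomes = [r for _, r in group]
    print(name, false_positive_rate(scores, outcomes))
# Similar overall accuracy can still hide very unequal error rates.
```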
So, consider this case: a woman was late picking up her godsister from a school in Broward County, Florida, and was running down the street with a friend of hers. They spotted an unlocked kid's bike and a scooter on a porch, and foolishly jumped on them. As they were speeding away, a woman came out and said, "Hey! That's my kid's bike!" They dropped it and walked away, but they were arrested. She was wrong, she was foolish, but she was also just 18, with a couple of juvenile misdemeanors. Meanwhile, a man had been arrested for shoplifting in Home Depot, 85 dollars' worth of stuff, a similar petty crime. But he had two prior armed robbery convictions. Yet the algorithm scored her as high risk, and not him. Two years later, ProPublica found that she had not reoffended; it was just hard to get a job for her with her record. He, on the other hand, did reoffend, and is now serving a prison term for a later crime. Clearly, we need to audit our black boxes and not have them have this kind of unchecked power.
Audits are great and important, but they don't solve all our problems. Take Facebook's powerful news feed algorithm, you know, the one that ranks everything and decides what to show you from all the friends and pages you follow. Should you be shown another baby picture? A sullen note from an acquaintance? An important but difficult news item? There's no right answer. Facebook optimizes for engagement on the site: likes, shares, comments.
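In its barest form, that objective is just a scoring function. This sketch is invented and hugely simplified; real feed ranking is vastly more complex, but the shape of the incentive is the point:

```python
# Bare-bones engagement ranking with made-up predicted probabilities.
posts = [
    {"id": "baby_photo",   "p_like": 0.30, "p_comment": 0.10, "p_share": 0.05},
    {"id": "charity_meme", "p_like": 0.40, "p_comment": 0.15, "p_share": 0.20},
    {"id": "hard_news",    "p_like": 0.02, "p_comment": 0.03, "p_share": 0.01},
]

def engagement_score(post):
    # Optimize for likes, comments, shares. Not for importance.
    return post["p_like"] + post["p_comment"] + post["p_share"]

for post in sorted(posts, key=engagement_score, reverse=True):
    print(post["id"], round(engagement_score(post), 2))
# The difficult story ranks last, so it gets shown to even fewer people.
```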
In August of 2014, protests broke out in Ferguson, Missouri, after the killing of an African-American teenager by a white police officer, under murky circumstances. The news of the protests was all over my algorithmically unfiltered Twitter feed, but nowhere on my Facebook. Was it my Facebook friends? I disabled Facebook's algorithm, which is hard because Facebook keeps wanting to make you come under the algorithm's control, and I saw that my friends were talking about it. It's just that the algorithm wasn't showing it to me. I researched this and found it was a widespread problem. The story of Ferguson wasn't algorithm-friendly. It's not "likable." Who's going to click on "like"? It's not even easy to comment on. Without likes and comments, the algorithm was likely showing it to even fewer people, so we didn't get to see it. Instead, that week, Facebook's algorithm highlighted the ALS Ice Bucket Challenge. Worthy cause: dump ice water, donate to charity, fine. But it was super algorithm-friendly. The machine made this decision for us. A very important but difficult conversation might have been smothered, had Facebook been the only channel.

Now, finally, these systems can also be wrong in ways that don't resemble human systems.
Do you guys remember Watson, IBM's machine-intelligence system that wiped the floor with human contestants on Jeopardy? It was a great player. But then, for Final Jeopardy, Watson was asked this question: "Its largest airport is named for a World War II hero, its second largest for a World War II battle." Chicago. The two humans got it right. Watson, on the other hand, answered "Toronto" for a US city category! The impressive system made an error that a human would never make, that a second-grader wouldn't make. Our machine intelligence can fail in ways that don't fit the error patterns of humans, in ways we won't expect and be prepared for.
It'd be lousy not to get a job one is qualified for, but it would triple suck if it was because of stack overflow in some subroutine. In May of 2010, a flash crash on Wall Street, fueled by a feedback loop in Wall Street's "sell" algorithm, wiped a trillion dollars of value in 36 minutes.
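A feedback loop of that kind is easy to caricature in a few lines. This toy simulation is mine, with made-up numbers, and bears no resemblance to real market mechanics beyond the loop itself: falling prices trigger selling, which makes prices fall further.

```python
# Toy price-crash feedback loop (invented numbers, illustration only).
price = 100.0
for minute in range(1, 37):
    drop = (100.0 - price) / 100.0     # fraction of value already lost
    sell_pressure = 1.0 + 5.0 * drop   # more losses trigger more selling
    price -= sell_pressure             # ...which deepens the losses
    print(f"minute {minute}: {price:6.2f}")
```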
what "error" means
autonomous weapons.
but that's exactly my point.
these difficult questions.
our responsibilities to machines.
a "Get out of ethics free" card.
calls this math-washing.
scrutiny and investigation.
algorithmic accountability,
that bringing math and computation
invades the algorithms.
to our moral responsibility to judgment,
and outsource our responsibilities
ABOUT THE SPEAKER
Zeynep Tufekci - Techno-sociologist
Why you should listen
We've entered an era of digital connectivity and machine intelligence. Complex algorithms are increasingly used to make consequential decisions about us. Many of these decisions are subjective and have no right answer: who should be hired, fired or promoted; what news should be shown to whom; which of your friends do you see updates from; which convict should be paroled. With increasing use of machine learning in these systems, we often don't even understand how exactly they are making these decisions. Zeynep Tufekci studies what this historic transition means for culture, markets, politics and personal life.
Tufekci is a contributing opinion writer at the New York Times, an associate professor at the School of Information and Library Science at the University of North Carolina, Chapel Hill, and a faculty associate at Harvard's Berkman Klein Center for Internet and Society.
Her book, Twitter and Tear Gas: The Power and Fragility of Networked Protest, was published in 2017 by Yale University Press. Her next book, from Penguin Random House, will be about algorithms that watch, judge and nudge us.