TEDxBeaconStreet

Joy Buolamwini: How I'm fighting bias in algorithms

MIT grad student Joy Buolamwini was working with facial recognition software when she noticed a problem: the software didn't recognize her face -- because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures. Now she's on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze." It's an eye-opening talk about the need for accountability in coding ... as algorithms take over more and more aspects of our lives.

Hello, I'm Joy, a poet of code,
00:13
on a mission to stop
an unseen force that's rising,
00:16
a force that I call "the coded gaze,"
00:21
my term for algorithmic bias.
00:24
Algorithmic bias, like human bias,
results in unfairness.
00:27
However, algorithms, like viruses,
can spread bias on a massive scale
00:31
at a rapid pace.
00:37
Algorithmic bias can also lead
to exclusionary experiences
00:39
and discriminatory practices.
00:44
Let me show you what I mean.
00:46
(Video) Joy Buolamwini: Hi, camera.
I've got a face.
00:48
Can you see my face?
00:52
No-glasses face?
00:54
You can see her face.
00:55
What about my face?
00:58
I've got a mask. Can you see my mask?
01:03
Joy Buolamwini: So how did this happen?
01:08
Why am I sitting in front of a computer
01:10
in a white mask,
01:14
trying to be detected by a cheap webcam?
01:15
Well, when I'm not fighting the coded gaze
01:19
as a poet of code,
01:21
I'm a graduate student
at the MIT Media Lab,
01:23
and there I have the opportunity to work
on all sorts of whimsical projects,
01:26
including the Aspire Mirror,
01:31
a project I did so I could project
digital masks onto my reflection.
01:33
So in the morning, if I wanted
to feel powerful,
01:38
I could put on a lion.
01:40
If I wanted to be uplifted,
I might have a quote.
01:42
So I used generic
facial recognition software
01:45
to build the system,
01:48
but found it was really hard to test it
unless I wore a white mask.
01:50
Unfortunately, I've run
into this issue before.
01:56
When I was an undergraduate
at Georgia Tech studying computer science,
02:00
I used to work on social robots,
02:04
and one of my tasks was to get a robot
to play peek-a-boo,
02:07
a simple turn-taking game
02:10
where partners cover their face
and then uncover it saying, "Peek-a-boo!"
02:12
The problem is, peek-a-boo
doesn't really work if I can't see you,
02:16
and my robot couldn't see me.
02:21
But I borrowed my roommate's face
to get the project done,
02:23
submitted the assignment,
02:27
and figured, you know what,
somebody else will solve this problem.
02:29
Not too long after,
02:33
I was in Hong Kong
for an entrepreneurship competition.
02:35
The organizers decided
to take participants
02:40
on a tour of local start-ups.
02:43
One of the start-ups had a social robot,
02:45
and they decided to do a demo.
02:48
The demo worked on everybody
until it got to me,
02:50
and you can probably guess it.
02:53
It couldn't detect my face.
02:55
I asked the developers what was going on,
02:58
and it turned out we had used the same
generic facial recognition software.
03:00
Halfway around the world,
03:06
I learned that algorithmic bias
can travel as quickly
03:07
as it takes to download
some files off of the internet.
03:11
So what's going on?
Why isn't my face being detected?
03:15
Well, we have to look
at how we give machines sight.
03:18
Computer vision uses
machine learning techniques
03:22
to do facial recognition.
03:25
So how this works is, you create
a training set with examples of faces.
03:27
This is a face. This is a face.
This is not a face.
03:31
And over time, you can teach a computer
how to recognize other faces.
03:34
However, if the training sets
aren't really that diverse,
03:38
any face that deviates too much
from the established norm
03:42
will be harder to detect,
03:46
which is what was happening to me.
03:47
But don't worry -- there's some good news.
03:49
Training sets don't just
materialize out of nowhere.
03:52
We actually can create them.
03:55
So there's an opportunity to create
full-spectrum training sets
03:56
that reflect a richer
portrait of humanity.
04:01
Now you've seen in my examples
04:04
how social robots
04:07
were how I found out about exclusion
through algorithmic bias.
04:09
But algorithmic bias can also lead
to discriminatory practices.
04:13
Across the US,
04:19
police departments are starting to use
facial recognition software
04:20
in their crime-fighting arsenal.
04:25
Georgetown Law published a report
04:27
showing that one in two adults
in the US -- that's 117 million people --
04:29
have their faces
in facial recognition networks.
04:36
Police departments can currently look
at these networks unregulated,
04:40
using algorithms that have not
been audited for accuracy.
04:44
Yet we know facial recognition
is not fail-proof,
04:48
and labeling faces consistently
remains a challenge.
04:52
You might have seen this on Facebook.
04:56
My friends and I laugh all the time
when we see other people
04:58
mislabeled in our photos.
05:01
But misidentifying a suspected criminal
is no laughing matter,
05:04
nor is breaching civil liberties.
05:09
Machine learning is being used
for facial recognition,
05:12
but it's also extending beyond the realm
of computer vision.
05:15
In her book, "Weapons
of Math Destruction,"
05:21
data scientist Cathy O'Neil
talks about the rising new WMDs --
05:25
widespread, mysterious
and destructive algorithms
05:32
that are increasingly being used
to make decisions
05:36
that impact more aspects of our lives.
05:39
So who gets hired or fired?
05:42
Do you get that loan?
Do you get insurance?
05:44
Are you admitted into the college
you wanted to get into?
05:46
Do you and I pay the same price
for the same product
05:50
purchased on the same platform?
05:53
Law enforcement is also starting
to use machine learning
05:56
for predictive policing.
05:59
Some judges use machine-generated
risk scores to determine
06:02
how long an individual
is going to spend in prison.
06:05
So we really have to think
about these decisions.
06:10
Are they fair?
06:12
And we've seen that algorithmic bias
06:13
doesn't necessarily
lead to fair outcomes.
06:16
So what can we do about it?
06:20
Well, we can start thinking about
how we create more inclusive code
06:22
and employ inclusive coding practices.
06:25
It really starts with people.
06:28
So who codes matters.
06:31
Are we creating full-spectrum teams
with diverse individuals
06:33
who can check each other's blind spots?
06:37
On the technical side,
how we code matters.
06:40
Are we factoring in fairness
as we're developing systems?
06:43
And finally, why we code matters.
06:47
We've used tools of computational creation
to unlock immense wealth.
06:50
We now have the opportunity
to unlock even greater equality
06:55
if we make social change a priority
07:00
and not an afterthought.
07:03
And so these are the three tenets
that will make up the "incoding" movement.
07:06
Who codes matters,
07:10
how we code matters
07:12
and why we code matters.
07:13
So to go towards incoding,
we can start thinking about
07:15
building platforms that can identify bias
07:18
by collecting people's experiences
like the ones I shared,
07:22
but also auditing existing software.
07:25
We can also start to create
more inclusive training sets.
07:28
Imagine a "Selfies for Inclusion" campaign
07:32
where you and I can help
developers test and create
07:34
more inclusive training sets.
07:38
And we can also start thinking
more conscientiously
07:41
about the social impact
of the technology that we're developing.
07:44
To get the incoding movement started,
07:49
I've launched the Algorithmic
Justice League,
07:51
where anyone who cares about fairness
can help fight the coded gaze.
07:54
On codedgaze.com, you can report bias,
08:00
request audits, become a tester
08:04
and join the ongoing conversation,
08:06
#codedgaze.
08:09
So I invite you to join me
08:12
in creating a world where technology
works for all of us,
08:15
not just some of us,
08:18
a world where we value inclusion
and center social change.
08:20
Thank you.
08:25
(Applause)
08:26
But I have one question:
08:32
Will you join me in the fight?
08:35
(Laughter)
08:37
(Applause)
08:39

About the Speaker:

Joy Buolamwini - Poet of code
Joy Buolamwini's research explores the intersection of social impact technology and inclusion.

Why you should listen

Joy Buolamwini is a poet of code on a mission to show compassion through computation. As a graduate researcher at the MIT Media Lab, she leads the Algorithmic Justice League to fight coded bias. Her research explores the intersection of social impact technology and inclusion. In support of this work, Buolamwini was awarded a $50,000 grant as the Grand Prize winner of a national contest inspired by the critically acclaimed film Hidden Figures, based on the book by Margot Lee Shetterly.

Driven by an entrepreneurial spirit, Buolamwini has pursued technology for social impact across multiple industries and countries. As the inaugural Chief Technology Officer for Techturized Inc., a hair care technology company, and Swift Tech Solutions, a global health tech consultancy, she led software development for underserved communities in the United States, Ethiopia, Mali, Nigeria and Niger. In Zambia, she explored empowering citizens with skills to create their own technology through the Zamrize Project. In the United Kingdom, Buolamwini piloted a Service Year Initiative to launch Code4Rights, which supports youth in creating meaningful technology for their communities in partnership with local organizations.

Through Filmmakers Collaborative, Buolamwini produces media that highlight diverse creators of technology. Her short documentary, The Coded Gaze: Unmasking Algorithmic Bias, debuted at the Museum of Fine Arts Boston and her pilot of the Code4Rights: Journey To Code training series debuted at the Vatican. She has presented keynote speeches and public talks at various forums including #CSforAll at the White House, Harvard University, Saïd Business School, Rutgers University, NCWIT, Grace Hopper Celebration and SXSWedu.

Buolamwini is a Rhodes Scholar, Fulbright Fellow, Google Anita Borg Scholar, Astronaut Scholar, Stamps President's Scholar and Carter Center technical consultant recognized as a distinguished volunteer. She holds a master's degree in Learning and Technology from Oxford University and a bachelor's degree in Computer Science from the Georgia Institute of Technology. Buolamwini serves as a Harvard resident tutor at Adams House, where she mentors students seeking scholarships or pursuing entrepreneurship.
