ABOUT THE SPEAKER
Joy Buolamwini - Poet of code
Joy Buolamwini's research explores the intersection of social impact technology and inclusion.

Why you should listen

Joy Buolamwini is a poet of code on a mission to show compassion through computation. As a graduate researcher at the MIT Media Lab, she leads the Algorithmic Justice League to fight coded bias. Her research explores the intersection of social impact technology and inclusion. In support of this work, Buolamwini was awarded a $50,000 grant as the Grand Prize winner of a national contest inspired by the critically acclaimed film Hidden Figures, based on the book by Margot Lee Shetterly.

Driven by an entrepreneurial spirit, Buolamwini has pursued technology for social impact across multiple industries and countries. As the inaugural Chief Technology Officer for Techturized Inc., a hair care technology company, and Swift Tech Solutions, a global health tech consultancy, she led software development for underserved communities in the United States, Ethiopia, Mali, Nigeria and Niger. In Zambia, she explored empowering citizens with skills to create their own technology through the Zamrize Project. In the United Kingdom, Buolamwini piloted a Service Year Initiative to launch Code4Rights, which supports youth in creating meaningful technology for their communities in partnership with local organizations.

Through Filmmakers Collaborative, Buolamwini produces media that highlight diverse creators of technology. Her short documentary, The Coded Gaze: Unmasking Algorithmic Bias, debuted at the Museum of Fine Arts Boston and her pilot of the Code4Rights: Journey To Code training series debuted at the Vatican. She has presented keynote speeches and public talks at various forums including #CSforAll at the White House, Harvard University, Saïd Business School, Rutgers University, NCWIT, Grace Hopper Celebration and SXSWedu.

Buolamwini is a Rhodes Scholar, Fulbright Fellow, Google Anita Borg Scholar, Astronaut Scholar, Stamps President's Scholar and Carter Center technical consultant recognized as a distinguished volunteer. She holds a master's degree in Learning and Technology from Oxford University and a bachelor's degree in Computer Science from the Georgia Institute of Technology. Buolamwini serves as a Harvard resident tutor at Adams House, where she mentors students seeking scholarships or pursuing entrepreneurship.

TEDxBeaconStreet

Joy Buolamwini: How I'm fighting bias in algorithms

1,223,943 views

MIT grad student Joy Buolamwini was working with facial analysis software when she noticed a problem: the software didn't detect her face -- because the people who coded the algorithm hadn't taught it to identify a broad range of skin tones and facial structures. Now she's on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze." It's an eye-opening talk about the need for accountability in coding ... as algorithms take over more and more aspects of our lives.


00:13
Hello, I'm Joy, a poet of code,
00:16
on a mission to stop an unseen force that's rising,
00:21
a force that I called "the coded gaze,"
00:24
my term for algorithmic bias.
00:27
Algorithmic bias, like human bias, results in unfairness.
00:31
However, algorithms, like viruses, can spread bias on a massive scale
00:37
at a rapid pace.
00:39
Algorithmic bias can also lead to exclusionary experiences
00:44
and discriminatory practices.
00:46
Let me show you what I mean.
00:48
(Video) Joy Buolamwini: Hi, camera. I've got a face.
00:52
Can you see my face?
00:54
No-glasses face?
00:55
You can see her face.
00:58
What about my face?
01:03
I've got a mask. Can you see my mask?
01:08
Joy Buolamwini: So how did this happen?
01:10
Why am I sitting in front of a computer
01:14
in a white mask,
01:15
trying to be detected by a cheap webcam?
01:19
Well, when I'm not fighting the coded gaze
01:21
as a poet of code,
01:23
I'm a graduate student at the MIT Media Lab,
01:26
and there I have the opportunity to work on all sorts of whimsical projects,
01:31
including the Aspire Mirror,
01:33
a project I did so I could project digital masks onto my reflection.
01:38
So in the morning, if I wanted to feel powerful,
01:40
I could put on a lion.
01:42
If I wanted to be uplifted, I might have a quote.
01:45
So I used generic facial recognition software
01:48
to build the system,
01:50
but found it was really hard to test it unless I wore a white mask.
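
To make "generic facial recognition software" concrete, here is a minimal sketch of off-the-shelf face detection in Python using OpenCV's bundled Haar cascade model. The library choice is an assumption for illustration; the talk does not say which software she used. A detector like this can only find faces that resemble the examples its model was trained on.

    # Illustrative sketch only: off-the-shelf face detection with OpenCV's
    # pre-trained Haar cascade, standing in for the "generic facial
    # recognition software" mentioned in the talk.
    import cv2

    # Load the frontal-face model that ships with OpenCV.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    # Grab one frame from the default webcam and look for faces in it.
    camera = cv2.VideoCapture(0)
    ok, frame = camera.read()
    camera.release()

    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        # If the model was trained on a narrow range of faces, this list can
        # come back empty even when a face is plainly in front of the camera.
        print("faces detected:", len(faces))
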
01:56
Unfortunately, I've run into this issue before.
02:00
When I was an undergraduate at Georgia Tech studying computer science,
02:04
I used to work on social robots,
02:07
and one of my tasks was to get a robot to play peek-a-boo,
02:10
a simple turn-taking game
02:12
where partners cover their face and then uncover it saying, "Peek-a-boo!"
02:16
The problem is, peek-a-boo doesn't really work if I can't see you,
02:21
and my robot couldn't see me.
02:23
But I borrowed my roommate's face to get the project done,
02:27
submitted the assignment,
02:29
and figured, you know what, somebody else will solve this problem.
02:33
Not too long after,
02:35
I was in Hong Kong for an entrepreneurship competition.
02:40
The organizers decided to take participants
02:43
on a tour of local start-ups.
02:45
One of the start-ups had a social robot,
02:48
and they decided to do a demo.
02:50
The demo worked on everybody until it got to me,
02:53
and you can probably guess it.
02:55
It couldn't detect my face.
02:58
I asked the developers what was going on,
03:00
and it turned out we had used the same generic facial recognition software.
03:06
Halfway around the world,
03:07
I learned that algorithmic bias can travel as quickly
03:11
as it takes to download some files off of the internet.
03:15
So what's going on? Why isn't my face being detected?
03:18
Well, we have to look at how we give machines sight.
03:22
Computer vision uses machine learning techniques
03:25
to do facial recognition.
03:27
So how this works is, you create a training set with examples of faces.
03:31
This is a face. This is a face. This is not a face.
03:34
And over time, you can teach a computer how to recognize other faces.
03:38
However, if the training sets aren't really that diverse,
03:42
any face that deviates too much from the established norm
03:46
will be harder to detect,
03:47
which is what was happening to me.
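
The training-set idea -- "this is a face, this is not a face" -- can be sketched in a few lines. The snippet below is illustrative only: the scikit-learn classifier and the faces/ and not_faces/ folder layout are assumptions, not anything from the talk. What it demonstrates is that the model can only learn the range of faces its training examples contain.

    # Illustrative sketch of learning from a labeled training set.
    # The classifier only learns the kinds of faces present in the examples;
    # faces that deviate from them are missed more often.
    import numpy as np
    from pathlib import Path
    from PIL import Image
    from sklearn.linear_model import LogisticRegression

    def load_examples(folder, label):
        """Load grayscale 32x32 images from a folder and pair them with a label."""
        xs, ys = [], []
        for path in sorted(Path(folder).glob("*.png")):
            img = Image.open(path).convert("L").resize((32, 32))
            xs.append(np.asarray(img, dtype=np.float32).ravel() / 255.0)
            ys.append(label)
        return xs, ys

    # 1 = "this is a face", 0 = "this is not a face". Whatever ends up in the
    # faces/ folder defines the "norm" the classifier learns.
    face_x, face_y = load_examples("faces", 1)
    other_x, other_y = load_examples("not_faces", 0)

    X = np.array(face_x + other_x)
    y = np.array(face_y + other_y)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    print("accuracy on its own training examples:", model.score(X, y))

Swapping in a broader, full-spectrum set of face examples changes what the same code learns, which is the opportunity she describes next.
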
03:49
But don't worry -- there's some good news.
03:52
Training sets don't just materialize out of nowhere.
03:55
We actually can create them.
03:56
So there's an opportunity to create full-spectrum training sets
04:01
that reflect a richer portrait of humanity.
04:04
Now you've seen in my examples
04:07
how social robots
04:09
was how I found out about exclusion with algorithmic bias.
04:13
But algorithmic bias can also lead to discriminatory practices.
04:19
Across the US,
04:20
police departments are starting to use facial recognition software
04:25
in their crime-fighting arsenal.
04:27
Georgetown Law published a report
04:29
showing that one in two adults in the US -- that's 117 million people --
04:36
have their faces in facial recognition networks.
04:40
Police departments can currently look at these networks unregulated,
04:44
using algorithms that have not been audited for accuracy.
04:48
Yet we know facial recognition is not fail proof,
04:52
and labeling faces consistently remains a challenge.
04:56
You might have seen this on Facebook.
04:58
My friends and I laugh all the time when we see other people
05:01
mislabeled in our photos.
05:04
But misidentifying a suspected criminal is no laughing matter,
05:09
nor is breaching civil liberties.
05:12
Machine learning is being used for facial recognition,
05:15
but it's also extending beyond the realm of computer vision.
05:21
In her book, "Weapons of Math Destruction,"
05:25
data scientist Cathy O'Neil talks about the rising new WMDs --
05:32
widespread, mysterious and destructive algorithms
05:36
that are increasingly being used to make decisions
05:39
that impact more aspects of our lives.
05:42
So who gets hired or fired?
05:44
Do you get that loan? Do you get insurance?
05:46
Are you admitted into the college you wanted to get into?
05:50
Do you and I pay the same price for the same product
05:53
purchased on the same platform?
05:56
Law enforcement is also starting to use machine learning
05:59
for predictive policing.
06:02
Some judges use machine-generated risk scores to determine
06:05
how long an individual is going to spend in prison.
06:10
So we really have to think about these decisions.
06:12
Are they fair?
06:13
And we've seen that algorithmic bias
06:16
doesn't necessarily always lead to fair outcomes.
06:20
So what can we do about it?
06:22
Well, we can start thinking about how we create more inclusive code
06:25
and employ inclusive coding practices.
06:28
It really starts with people.
06:31
So who codes matters.
06:33
Are we creating full-spectrum teams with diverse individuals
06:37
who can check each other's blind spots?
06:40
On the technical side, how we code matters.
06:43
Are we factoring in fairness as we're developing systems?
06:47
And finally, why we code matters.
06:50
We've used tools of computational creation to unlock immense wealth.
06:55
We now have the opportunity to unlock even greater equality
07:00
if we make social change a priority
07:03
and not an afterthought.
07:06
And so these are the three tenets that will make up the "incoding" movement.
07:10
Who codes matters,
07:12
how we code matters
07:13
and why we code matters.
07:15
So to go towards incoding, we can start thinking about
07:18
building platforms that can identify bias
07:22
by collecting people's experiences like the ones I shared,
07:25
but also auditing existing software.
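
One concrete reading of "auditing existing software": run the detector under review over a benchmark whose images are labeled by demographic group and compare success rates. The sketch below is hypothetical -- the benchmark.csv layout is invented for illustration, and OpenCV's stock Haar cascade stands in for whatever system is being audited.

    # Illustrative audit sketch: measure a face detector's success rate per
    # demographic group on a labeled benchmark. The CSV layout ("path","group")
    # is a hypothetical placeholder.
    import csv
    from collections import defaultdict
    import cv2

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def detect_face(image_path):
        """Return True if the detector finds at least one face in the image."""
        img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        if img is None:
            return False
        return len(detector.detectMultiScale(img, scaleFactor=1.1, minNeighbors=5)) > 0

    counts = defaultdict(lambda: {"hit": 0, "total": 0})
    with open("benchmark.csv", newline="") as f:
        for row in csv.DictReader(f):        # expected columns: path, group
            counts[row["group"]]["total"] += 1
            counts[row["group"]]["hit"] += int(detect_face(row["path"]))

    # Large gaps between groups' detection rates are exactly the kind of
    # disparity an audit like this is meant to surface.
    for group, c in sorted(counts.items()):
        rate = c["hit"] / c["total"]
        print(f"{group}: {rate:.1%} ({c['hit']}/{c['total']})")
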
07:28
We can also start to create more inclusive training sets.
07:32
Imagine a "Selfies for Inclusion" campaign
07:34
where you and I can help developers test and create
07:38
more inclusive training sets.
07:41
And we can also start thinking more conscientiously
07:44
about the social impact of the technology that we're developing.
07:49
To get the incoding movement started,
07:51
I've launched the Algorithmic Justice League,
07:54
where anyone who cares about fairness can help fight the coded gaze.
08:00
On codedgaze.com, you can report bias,
08:04
request audits, become a tester
08:06
and join the ongoing conversation,
08:09
#codedgaze.
08:12
So I invite you to join me
08:15
in creating a world where technology works for all of us,
08:18
not just some of us,
08:20
a world where we value inclusion and center social change.
08:25
Thank you.
08:26
(Applause)
08:32
But I have one question:
08:35
Will you join me in the fight?
08:37
(Laughter)
08:39
(Applause)
