ABOUT THE SPEAKER
Joy Buolamwini - Poet of code
Joy Buolamwini's research explores the intersection of social impact technology and inclusion.

Why you should listen

Joy Buolamwini is a poet of code on a mission to show compassion through computation. As a graduate researcher at the MIT Media Lab, she leads the Algorithmic Justice League to fight coded bias. Her research explores the intersection of social impact technology and inclusion. In support of this work, Buolamwini was awarded a $50,000 grant as the Grand Prize winner of a national contest inspired by the critically acclaimed film Hidden Figures, based on the book by Margot Lee Shetterly.

Driven by an entrepreneurial spirit, Buolamwini's global interest in creating technology for social impact spans multiple industries and countries. As the inaugural Chief Technology Officer for Techturized Inc., a hair care technology company, and Swift Tech Solutions, a global health tech consultancy, she led software development for underserved communities in the United States, Ethiopia, Mali, Nigeria and Niger. In Zambia, she explored empowering citizens with skills to create their own technology through the Zamrize Project. In the United Kingdom, Buolamwini piloted a Service Year Initiative to launch Code4Rights which supports youth in creating meaningful technology for their communities in partnership with local organizations.

Through Filmmakers Collaborative, Buolamwini produces media that highlight diverse creators of technology. Her short documentary, The Coded Gaze: Unmasking Algorithmic Bias, debuted at the Museum of Fine Arts Boston and her pilot of the Code4Rights: Journey To Code training series debuted at the Vatican. She has presented keynote speeches and public talks at various forums including #CSforAll at the White House, Harvard University, Saïd Business School, Rutgers University, NCWIT, Grace Hopper Celebration and SXSWedu.

Buolamwini is a Rhodes Scholar, Fulbright Fellow, Google Anita Borg Scholar, Astronaut Scholar, Stamps President's Scholar and Carter Center technical consultant recognized as a distinguished volunteer. She holds a master's degree in Learning and Technology from Oxford University and a bachelor's degree in Computer Science from the Georgia Institute of Technology. Buolamwini serves as a Harvard resident tutor at Adams House, where she mentors students seeking scholarships or pursuing entrepreneurship.

TEDxBeaconStreet

Joy Buolamwini: How I'm fighting bias in algorithms


1,223,943 views

While developing facial recognition software, MIT graduate student Joy Buolamwini noticed a problem: the software couldn't detect her face, because its programmers hadn't taught it to recognize faces across a broad range of skin tones and facial structures. Now she is on a mission to fight bias in machine learning, a phenomenon she calls the "coded gaze." As algorithms take over more and more aspects of our lives, this is an eye-opening talk about accountability in coding.


00:13
Hello, I'm Joy, a poet of code,
00:16
on a mission to stop an unseen force that's rising,
00:21
a force that I called "the coded gaze,"
00:24
my term for algorithmic bias.
00:27
Algorithmic bias, like human bias, results in unfairness.
00:31
However, algorithms, like viruses, can spread bias on a massive scale
00:37
at a rapid pace.
00:39
Algorithmic bias can also lead to exclusionary experiences
00:44
and discriminatory practices.
00:46
Let me show you what I mean.
00:48
(Video) Joy Buolamwini: Hi, camera. I've got a face.
00:52
Can you see my face?
00:54
No-glasses face?
00:55
You can see her face.
00:58
What about my face?
01:03
I've got a mask. Can you see my mask?
01:08
Joy Buolamwini: So how did this happen?
01:10
Why am I sitting in front of a computer
01:14
in a white mask,
01:15
trying to be detected by a cheap webcam?
01:19
Well, when I'm not fighting the coded gaze
01:21
as a poet of code,
01:23
I'm a graduate student at the MIT Media Lab,
01:26
and there I have the opportunity to work on all sorts of whimsical projects,
01:31
including the Aspire Mirror,
01:33
a project I did so I could project digital masks onto my reflection.
01:38
So in the morning, if I wanted to feel powerful,
01:40
I could put on a lion.
01:42
If I wanted to be uplifted, I might have a quote.
01:45
So I used generic facial recognition software
01:48
to build the system,
01:50
but found it was really hard to test it unless I wore a white mask.
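
[Editor's note: the talk doesn't name the software, but the "generic facial recognition software" described is the kind of off-the-shelf face detector sketched below. This is a minimal, hypothetical illustration using OpenCV's stock Haar cascade and a webcam, not Buolamwini's actual code.]

    # Minimal sketch of an off-the-shelf face-detection pipeline.
    # (OpenCV and its pretrained Haar cascade are assumptions; the talk
    # does not specify which software was used.)
    import cv2

    # Load a pretrained frontal-face detector that ships with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)  # the "cheap webcam"
    ret, frame = cap.read()
    if ret:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        # If the detector's training data under-represented faces like
        # yours, this list can simply come back empty.
        print(f"Faces found: {len(faces)}")
    cap.release()
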
01:56
Unfortunately, I've run into this issue before.
02:00
When I was an undergraduate at Georgia Tech studying computer science,
02:04
I used to work on social robots,
02:07
and one of my tasks was to get a robot to play peek-a-boo,
02:10
a simple turn-taking game
02:12
where partners cover their face and then uncover it saying, "Peek-a-boo!"
02:16
The problem is, peek-a-boo doesn't really work if I can't see you,
02:21
and my robot couldn't see me.
02:23
But I borrowed my roommate's face to get the project done,
02:27
submitted the assignment,
02:29
and figured, you know what, somebody else will solve this problem.
02:33
Not too long after,
02:35
I was in Hong Kong for an entrepreneurship competition.
02:40
The organizers decided to take participants
02:43
on a tour of local start-ups.
02:45
One of the start-ups had a social robot,
02:48
and they decided to do a demo.
02:50
The demo worked on everybody until it got to me,
02:53
and you can probably guess it.
02:55
It couldn't detect my face.
02:58
I asked the developers what was going on,
03:00
and it turned out we had used the same generic facial recognition software.
03:06
Halfway around the world,
03:07
I learned that algorithmic bias can travel as quickly
03:11
as it takes to download some files off of the internet.
03:15
So what's going on? Why isn't my face being detected?
03:18
Well, we have to look at how we give machines sight.
03:22
Computer vision uses machine learning techniques
03:25
to do facial recognition.
03:27
So how this works is, you create a training set with examples of faces.
03:31
This is a face. This is a face. This is not a face.
03:34
And over time, you can teach a computer how to recognize other faces.
03:38
However, if the training sets aren't really that diverse,
03:42
any face that deviates too much from the established norm
03:46
will be harder to detect,
03:47
which is what was happening to me.
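
[Editor's note: in code terms, the training process described above looks roughly like the sketch below: label examples as face / not-a-face, fit a classifier, and watch it miss inputs far from its training distribution. Everything here (the synthetic data, the feature stand-in, the classifier choice) is a hypothetical placeholder for illustration.]

    # Sketch of "this is a face, this is not a face" supervised training,
    # and of how a non-diverse training set misses atypical faces.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def example(kind):
        # Stand-in for an image patch flattened to a feature vector.
        return rng.normal(loc=1.0 if kind == "face" else -1.0, size=64)

    # Build the labeled training set: faces vs. non-faces.
    X = np.array([example("face") for _ in range(200)] +
                 [example("not_face") for _ in range(200)])
    y = np.array([1] * 200 + [0] * 200)

    clf = LogisticRegression().fit(X, y)

    # A face close to the training set's "established norm" is found;
    # one that deviates too far from that norm is not.
    typical = example("face")
    atypical = typical - 2.5  # shifted away from the norm
    print(clf.predict(np.vstack([typical, atypical])))  # e.g. [1 0]
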
03:49
But don't worry -- there's some good news.
03:52
Training sets don't just materialize out of nowhere.
03:55
We actually can create them.
03:56
So there's an opportunity to create full-spectrum training sets
04:01
that reflect a richer portrait of humanity.
04:04
Now you've seen in my examples
04:07
how social robots
04:09
was how I found out about exclusion with algorithmic bias.
04:13
But algorithmic bias can also lead to discriminatory practices.
04:19
Across the US,
04:20
police departments are starting to use facial recognition software
04:25
in their crime-fighting arsenal.
04:27
Georgetown Law published a report
04:29
showing that one in two adults in the US -- that's 117 million people --
04:36
have their faces in facial recognition networks.
04:40
Police departments can currently look at these networks unregulated,
04:44
using algorithms that have not been audited for accuracy.
04:48
Yet we know facial recognition is not fail proof,
04:52
and labeling faces consistently remains a challenge.
04:56
You might have seen this on Facebook.
04:58
My friends and I laugh all the time when we see other people
05:01
mislabeled in our photos.
05:04
But misidentifying a suspected criminal is no laughing matter,
05:09
nor is breaching civil liberties.
05:12
Machine learning is being used for facial recognition,
05:15
but it's also extending beyond the realm of computer vision.
05:21
In her book, "Weapons of Math Destruction,"
05:25
data scientist Cathy O'Neil talks about the rising new WMDs --
05:32
widespread, mysterious and destructive algorithms
05:36
that are increasingly being used to make decisions
05:39
that impact more aspects of our lives.
05:42
So who gets hired or fired?
05:44
Do you get that loan? Do you get insurance?
05:46
Are you admitted into the college you wanted to get into?
05:50
Do you and I pay the same price for the same product
05:53
purchased on the same platform?
05:56
Law enforcement is also starting to use machine learning
05:59
for predictive policing.
06:02
Some judges use machine-generated risk scores to determine
06:05
how long an individual is going to spend in prison.
06:10
So we really have to think about these decisions.
06:12
Are they fair?
06:13
And we've seen that algorithmic bias
06:16
doesn't necessarily always lead to fair outcomes.
06:20
So what can we do about it?
06:22
Well, we can start thinking about how we create more inclusive code
06:25
and employ inclusive coding practices.
06:28
It really starts with people.
06:31
So who codes matters.
06:33
Are we creating full-spectrum teams with diverse individuals
06:37
who can check each other's blind spots?
06:40
On the technical side, how we code matters.
06:43
Are we factoring in fairness as we're developing systems?
06:47
And finally, why we code matters.
06:50
We've used tools of computational creation to unlock immense wealth.
06:55
We now have the opportunity to unlock even greater equality
07:00
if we make social change a priority
07:03
and not an afterthought.
07:06
And so these are the three tenets that will make up the "incoding" movement.
07:10
Who codes matters,
07:12
how we code matters
07:13
and why we code matters.
07:15
So to go towards incoding, we can start thinking about
07:18
building platforms that can identify bias
07:22
by collecting people's experiences like the ones I shared,
07:25
but also auditing existing software.
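
[Editor's note: one concrete form such an audit can take is disaggregated evaluation: run the software over a benchmark labeled by group and compare accuracy across groups. The sketch below is hypothetical (the detector, benchmark, and group labels are placeholders), not the Algorithmic Justice League's actual methodology.]

    # Sketch of a disaggregated audit of existing face-detection software:
    # measure detection rates per group and surface any gaps.
    # `detector` and `benchmark` are hypothetical placeholders.
    from collections import defaultdict

    def audit(detector, benchmark):
        """benchmark: (image, group) pairs where every image contains a face."""
        found, total = defaultdict(int), defaultdict(int)
        for image, group in benchmark:
            total[group] += 1
            if detector(image):  # True if the software found any face
                found[group] += 1
        return {group: found[group] / total[group] for group in total}

    # Usage: audit(my_detector, labeled_faces) might return something like
    # {"lighter-skinned": 0.99, "darker-skinned": 0.65} -- exactly the kind
    # of unaudited accuracy gap described above.
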
07:28
We can also start to create more inclusive training sets.
07:32
Imagine a "Selfies for Inclusion" campaign
07:34
where you and I can help developers test and create
07:38
more inclusive training sets.
07:41
And we can also start thinking more conscientiously
07:44
about the social impact of the technology that we're developing.
07:49
To get the incoding movement started,
07:51
I've launched the Algorithmic Justice League,
07:54
where anyone who cares about fairness can help fight the coded gaze.
08:00
On codedgaze.com, you can report bias,
08:04
request audits, become a tester
08:06
and join the ongoing conversation,
08:09
#codedgaze.
08:12
So I invite you to join me
08:15
in creating a world where technology works for all of us,
08:18
not just some of us,
08:20
a world where we value inclusion and center social change.
08:25
Thank you.
08:26
(Applause)
08:32
But I have one question:
08:35
Will you join me in the fight?
08:37
(Laughter)
08:39
(Applause)


