ABOUT THE SPEAKER
Joy Buolamwini - Poet of code
Joy Buolamwini's research explores the intersection of social impact technology and inclusion.

Why you should listen

Joy Buolamwini is a poet of code on a mission to show compassion through computation. As a graduate researcher at the MIT Media Lab, she leads the Algorithmic Justice League to fight coded bias. Her research explores the intersection of social impact technology and inclusion. In support of this work, Buolamwini was awarded a $50,000 grant as the Grand Prize winner of a national contest inspired by the critically acclaimed film Hidden Figures, based on the book by Margot Lee Shetterly.

Driven by an entrepreneurial spirit, Buolamwini's global interest in creating technology for social impact spans multiple industries and countries. As the inaugural Chief Technology Officer for Techturized Inc., a hair care technology company, and Swift Tech Solutions, a global health tech consultancy, she led software development for underserved communities in the United States, Ethiopia, Mali, Nigeria and Niger. In Zambia, she explored empowering citizens with skills to create their own technology through the Zamrize Project. In the United Kingdom, Buolamwini piloted a Service Year Initiative to launch Code4Rights which supports youth in creating meaningful technology for their communities in partnership with local organizations.

Through Filmmakers Collaborative, Buolamwini produces media that highlight diverse creators of technology. Her short documentary, The Coded Gaze: Unmasking Algorithmic Bias, debuted at the Museum of Fine Arts Boston and her pilot of the Code4Rights: Journey To Code training series debuted at the Vatican. She has presented keynote speeches and public talks at various forums including #CSforAll at the White House, Harvard University, Saïd Business School, Rutgers University, NCWIT, Grace Hopper Celebration and SXSWedu.

Buolamwini is a Rhodes Scholar, Fulbright Fellow, Google Anita Borg Scholar, Astronaut Scholar, Stamps President's Scholar and Carter Center technical consultant recognized as a distinguished volunteer. She holds a master's degree in Learning and Technology from Oxford University and a bachelor's degree in Computer Science from the Georgia Institute of Technology. Buolamwini serves as a Harvard resident tutor at Adams House, where she mentors students seeking scholarships or pursuing entrepreneurship.

More profile about the speaker
Joy Buolamwini | Speaker | TED.com
TEDxBeaconStreet

Joy Buolamwini: How I'm fighting bias in algorithms

1,223,943 views

MIT graduate student Joy Buolamwini discovered a problem while using facial recognition software: it couldn't detect her face, because the people who coded it had built bias into its artificial-intelligence logic, leaving it unresponsive to certain skin tones and facial structures. Now she's on a mission to correct this bias in machine learning, an error she calls "the coded gaze." It's an eye-opening talk about accountability in code, as algorithms take over more and more aspects of our lives.


00:13
Hello, I'm Joy, a poet of code,
00:16
on a mission to stop an unseen force that's rising,
00:21
a force that I called "the coded gaze,"
00:24
my term for algorithmic bias.
00:27
Algorithmic bias, like human bias, results in unfairness.
00:31
However, algorithms, like viruses, can spread bias on a massive scale
00:37
at a rapid pace.
00:39
Algorithmic bias can also lead to exclusionary experiences
00:44
and discriminatory practices.
00:46
Let me show you what I mean.
00:48
(Video) Joy Buolamwini: Hi, camera. I've got a face.
00:52
Can you see my face?
00:54
No-glasses face?
00:55
You can see her face.
00:58
What about my face?
01:03
I've got a mask. Can you see my mask?
01:08
Joy Buolamwini: So how did this happen?
01:10
Why am I sitting in front of a computer
01:14
in a white mask,
01:15
trying to be detected by a cheap webcam?
01:19
Well, when I'm not fighting the coded gaze
01:21
as a poet of code,
01:23
I'm a graduate student at the MIT Media Lab,
01:26
and there I have the opportunity to work on all sorts of whimsical projects,
01:31
including the Aspire Mirror,
01:33
a project I did so I could project digital masks onto my reflection.
01:38
So in the morning, if I wanted to feel powerful,
01:40
I could put on a lion.
01:42
If I wanted to be uplifted, I might have a quote.
01:45
So I used generic facial recognition software
01:48
to build the system,
01:50
but found it was really hard to test it unless I wore a white mask.
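
A minimal sketch of the kind of webcam test she describes, assuming OpenCV's stock Haar-cascade detector; the talk does not name the software she actually used, so treat this as illustration, not her setup.

```python
# Illustrative webcam face-detection loop (assumed tooling: OpenCV).
import cv2

# Load the pretrained frontal-face Haar cascade bundled with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Returns one bounding box per detected face; a face unlike the
    # detector's training data may simply return nothing at all.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("Can you see my face?", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```
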
01:56
Unfortunately, I've run into this issue before.
02:00
When I was an undergraduate at Georgia Tech studying computer science,
02:04
I used to work on social robots,
02:07
and one of my tasks was to get a robot to play peek-a-boo,
02:10
a simple turn-taking game
02:12
where partners cover their face and then uncover it saying, "Peek-a-boo!"
02:16
The problem is, peek-a-boo doesn't really work if I can't see you,
02:21
and my robot couldn't see me.
02:23
But I borrowed my roommate's face to get the project done,
02:27
submitted the assignment,
02:29
and figured, you know what, somebody else will solve this problem.
02:33
Not too long after,
02:35
I was in Hong Kong for an entrepreneurship competition.
02:40
The organizers decided to take participants
02:43
on a tour of local start-ups.
02:45
One of the start-ups had a social robot,
02:48
and they decided to do a demo.
02:50
The demo worked on everybody until it got to me,
02:53
and you can probably guess it.
02:55
It couldn't detect my face.
02:58
I asked the developers what was going on,
03:00
and it turned out we had used the same generic facial recognition software.
03:06
Halfway around the world,
03:07
I learned that algorithmic bias can travel as quickly
03:11
as it takes to download some files off of the internet.
03:15
So what's going on? Why isn't my face being detected?
03:18
Well, we have to look at how we give machines sight.
03:22
Computer vision uses machine learning techniques
03:25
to do facial recognition.
03:27
So how this works is, you create a training set with examples of faces.
03:31
This is a face. This is a face. This is not a face.
03:34
And over time, you can teach a computer how to recognize other faces.
03:38
However, if the training sets aren't really that diverse,
03:42
any face that deviates too much from the established norm
03:46
will be harder to detect,
03:47
which is what was happening to me.
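
A minimal sketch of the training idea just described: labeled "face" and "not a face" examples fed to a classifier. Synthetic stand-in vectors replace real images so the sketch runs end to end, and scikit-learn is an assumed tool, not anything named in the talk.

```python
# "This is a face. This is not a face." -- sketched as binary classification.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X_faces = rng.normal(loc=1.0, size=(500, 64))    # stand-ins for face images
X_other = rng.normal(loc=-1.0, size=(500, 64))   # stand-ins for non-faces
X = np.vstack([X_faces, X_other])
y = np.array([1] * 500 + [0] * 500)              # 1 = face, 0 = not a face

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))

# The point of the talk: if the "face" examples cluster around one
# established norm, faces far from that cluster are exactly the ones
# the model misses, even while aggregate accuracy looks fine.
```
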
03:49
But don't worry -- there's some good news.
03:52
Training sets don't just materialize out of nowhere.
03:55
We actually can create them.
03:56
So there's an opportunity to create full-spectrum training sets
04:01
that reflect a richer portrait of humanity.
04:04
Now you've seen in my examples
04:07
how social robots
04:09
was how I found out about exclusion with algorithmic bias.
04:13
But algorithmic bias can also lead to discriminatory practices.
04:19
Across the US,
04:20
police departments are starting to use facial recognition software
04:25
in their crime-fighting arsenal.
04:27
Georgetown Law published a report
04:29
showing that one in two adults in the US -- that's 117 million people --
04:36
have their faces in facial recognition networks.
04:40
Police departments can currently look at these networks unregulated,
04:44
using algorithms that have not been audited for accuracy.
04:48
Yet we know facial recognition is not fail proof,
04:52
and labeling faces consistently remains a challenge.
04:56
You might have seen this on Facebook.
04:58
My friends and I laugh all the time when we see other people
05:01
mislabeled in our photos.
05:04
But misidentifying a suspected criminal is no laughing matter,
05:09
nor is breaching civil liberties.
05:12
Machine learning is being used for facial recognition,
05:15
but it's also extending beyond the realm of computer vision.
05:21
In her book, "Weapons of Math Destruction,"
05:25
data scientist Cathy O'Neil talks about the rising new WMDs --
05:32
widespread, mysterious and destructive algorithms
05:36
that are increasingly being used to make decisions
05:39
that impact more aspects of our lives.
05:42
So who gets hired or fired?
05:44
Do you get that loan? Do you get insurance?
05:46
Are you admitted into the college you wanted to get into?
05:50
Do you and I pay the same price for the same product
05:53
purchased on the same platform?
05:56
Law enforcement is also starting to use machine learning
05:59
for predictive policing.
06:02
Some judges use machine-generated risk scores to determine
06:05
how long an individual is going to spend in prison.
06:10
So we really have to think about these decisions.
06:12
Are they fair?
06:13
And we've seen that algorithmic bias
06:16
doesn't necessarily always lead to fair outcomes.
06:20
So what can we do about it?
06:22
Well, we can start thinking about how we create more inclusive code
06:25
and employ inclusive coding practices.
06:28
It really starts with people.
06:31
So who codes matters.
06:33
Are we creating full-spectrum teams with diverse individuals
06:37
who can check each other's blind spots?
06:40
On the technical side, how we code matters.
06:43
Are we factoring in fairness as we're developing systems?
06:47
And finally, why we code matters.
06:50
We've used tools of computational creation to unlock immense wealth.
06:55
We now have the opportunity to unlock even greater equality
07:00
if we make social change a priority
07:03
and not an afterthought.
07:06
And so these are the three tenets that will make up the "incoding" movement.
07:10
Who codes matters,
07:12
how we code matters
07:13
and why we code matters.
07:15
So to go towards incoding, we can start thinking about
07:18
building platforms that can identify bias
07:22
by collecting people's experiences like the ones I shared,
07:25
but also auditing existing software.
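
What such an audit might compute, as a minimal sketch: detection rates disaggregated by group. The group labels and outcomes below are invented placeholders, not real audit data.

```python
# Illustrative disaggregated audit: detection rate per group.
from collections import defaultdict

# Placeholder results a real audit would gather by running a detector
# over a labeled benchmark: (group label, was this face detected?).
results = [
    ("group_a", True), ("group_a", True), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [detected, total]
for group, detected in results:
    counts[group][0] += int(detected)
    counts[group][1] += 1

for group, (hit, total) in sorted(counts.items()):
    print(f"{group}: {hit}/{total} detected ({hit / total:.0%})")
# A large gap between groups is the red flag, even if overall
# detection looks high.
```
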
07:28
We can also start to create more inclusive training sets.
07:32
Imagine a "Selfies for Inclusion" campaign
07:34
where you and I can help developers test and create
07:38
more inclusive training sets.
07:41
And we can also start thinking more conscientiously
07:44
about the social impact of the technology that we're developing.
07:49
To get the incoding movement started,
07:51
I've launched the Algorithmic Justice League,
07:54
where anyone who cares about fairness can help fight the coded gaze.
08:00
On codedgaze.com, you can report bias,
08:04
request audits, become a tester
08:06
and join the ongoing conversation,
08:09
#codedgaze.
08:12
So I invite you to join me
08:15
in creating a world where technology works for all of us,
08:18
not just some of us,
08:20
a world where we value inclusion and center social change.
08:25
Thank you.
08:26
(Applause)
08:32
But I have one question:
08:35
Will you join me in the fight?
08:37
(Laughter)
08:39
(Applause)



Data provided by TED.
