ABOUT THE SPEAKER
Kriti Sharma - AI technologist
Kriti Sharma creates AI technology to help address some of the toughest social challenges of our time -- from domestic violence to sexual health and inequality.

Why you should listen

Kriti Sharma is the Founder of AI for Good, an organization focused on building scalable technology solutions for social good. In 2018, she also launched rAInbow, a digital companion for women facing domestic violence in South Africa. This service reached nearly 200,000 conversations within the first 100 days, breaking down the stigma of gender-based violence. In 2019, she collaborated with the Population Foundation of India to launch Dr. Sneha, an AI-powered digital character to engage with young people about sexual health, an issue that is still considered a taboo in India. 

Sharma was recently named in the Forbes "30 Under 30" list for advancements in AI. She was appointed a United Nations Young Leader in 2018 and is an advisor to both the United Nations Technology Innovation Labs and to the UK Government’s Centre for Data Ethics and Innovation. 

TEDxWarwick

Kriti Sharma: How to keep human bias out of AI

2,050,106 views

AI algorithms make important decisions about you all the time -- like how much you should pay for car insurance or whether or not you get that job interview. But what happens when these machines are built with human bias coded into their systems? Technologist Kriti Sharma explores how the lack of diversity in tech is creeping into our AI, offering three ways we can start making more ethical algorithms.


00:12
How many decisions have been made about you today, or this week or this year, by artificial intelligence?

00:22
I build AI for a living so, full disclosure, I'm kind of a nerd. And because I'm kind of a nerd, whenever some new news story comes out about artificial intelligence stealing all our jobs, or robots getting citizenship of an actual country, I'm the person my friends and followers message, freaking out about the future.

00:45
We see this everywhere. This media panic that our robot overlords are taking over. We could blame Hollywood for that.

00:56
But in reality, that's not the problem we should be focusing on. There is a more pressing danger, a bigger risk with AI, that we need to fix first.

01:07
So we are back to this question: How many decisions have been made about you today by AI? And how many of these were based on your gender, your race or your background?

01:24
Algorithms are being used all the time to make decisions about who we are and what we want. Some of the women in this room will know what I'm talking about if you've been made to sit through those pregnancy test adverts on YouTube like 1,000 times. Or you've scrolled past adverts of fertility clinics on your Facebook feed. Or in my case, Indian marriage bureaus.

01:50
(Laughter)

01:51
But AI isn't just being used to make decisions about what products we want to buy or which show we want to binge watch next.

02:01
I wonder how you'd feel about someone who thought things like this: "A black or Latino person is less likely than a white person to pay off their loan on time." "A person called John makes a better programmer than a person called Mary." "A black man is more likely to be a repeat offender than a white man."

02:26
You're probably thinking, "Wow, that sounds like a pretty sexist, racist person," right? These are some real decisions that AI has made very recently, based on the biases it has learned from us, from the humans.

02:43
AI is being used to help decide whether or not you get that job interview; how much you pay for your car insurance; how good your credit score is; and even what rating you get in your annual performance review.

02:57
But these decisions are all being filtered through its assumptions about our identity, our race, our gender, our age.

03:08
How is that happening? Now, imagine an AI is helping a hiring manager find the next tech leader in the company. So far, the manager has been hiring mostly men. So the AI learns men are more likely to be programmers than women. And it's a very short leap from there to: men make better programmers than women. We have reinforced our own bias into the AI. And now, it's screening out female candidates.
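The feedback loop Sharma describes here can be sketched in a few lines of Python. This is a toy illustration, not anything from the talk: the hiring data and the scoring rule are invented for the example. A naive "model" that estimates hire probability from a skewed hiring history ends up scoring two identically qualified candidates differently, purely on gender.

```python
# Toy illustration of bias learned from skewed hiring history.
# All data below is invented for the example.

# Historical decisions the model learns from: (gender, was_hired)
history = (
    [("M", True)] * 8 + [("M", False)] * 2 +   # men: hired 8 of 10 times
    [("F", True)] * 1 + [("F", False)] * 4     # women: hired 1 of 5 times
)

def hire_score(gender):
    """Naive 'model': estimate P(hired | gender) from past decisions."""
    outcomes = [hired for g, hired in history if g == gender]
    return sum(outcomes) / len(outcomes)

# Two candidates with identical qualifications, differing only in gender:
print(hire_score("M"))  # 0.8
print(hire_score("F"))  # 0.2 -- the bias in the data became the model's rule
```

A real screening system would use many features, but the mechanism is the same: if gender (or a proxy for it) correlates with past outcomes, the model will reproduce that correlation as a rule.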
03:40
Hang on, if a human hiring manager did that, we'd be outraged, we wouldn't allow it. This kind of gender discrimination is not OK. And yet somehow, AI has become above the law, because a machine made the decision.

03:57
That's not it. We are also reinforcing our bias in how we interact with AI. How often do you use a voice assistant like Siri, Alexa or even Cortana? They all have two things in common: one, they can never get my name right, and second, they are all female. They are designed to be our obedient servants, turning your lights on and off, ordering your shopping.

04:27
You get male AIs too, but they tend to be more high-powered, like IBM Watson, making business decisions, Salesforce Einstein or ROSS, the robot lawyer. So poor robots, even they suffer from sexism in the workplace.

04:42
(Laughter)

04:44
Think about how these two things combine and affect a kid growing up in today's world around AI. So they're doing some research for a school project and they Google images of CEO. The algorithm shows them results of mostly men. And now, they Google personal assistant. As you can guess, it shows them mostly females. And then they want to put on some music, and maybe order some food, and now, they are barking orders at an obedient female voice assistant.

05:19
Some of our brightest minds are creating this technology today. Technology that they could have created in any way they wanted. And yet, they have chosen to create it in the style of a 1950s "Mad Men" secretary. Yay!

05:36
But OK, don't worry, this is not going to end with me telling you that we are all heading towards sexist, racist machines running the world.

05:44
The good news about AI is that it is entirely within our control. We get to teach the right values, the right ethics to AI.

05:56
So there are three things we can do. One, we can be aware of our own biases and the bias in machines around us. Two, we can make sure that diverse teams are building this technology. And three, we have to give it diverse experiences to learn from.

06:14
I can talk about the first two from personal experience. When you work in technology and you don't look like a Mark Zuckerberg or Elon Musk, your life is a little bit difficult, your ability gets questioned.

06:27
Here's just one example. Like most developers, I often join online tech forums and share my knowledge to help others. And I've found, when I log on as myself, with my own photo, my own name, I tend to get questions or comments like this: "What makes you think you're qualified to talk about AI?" "What makes you think you know about machine learning?"

06:53
So, as you do, I made a new profile, and this time, instead of my own picture, I chose a cat with a jet pack on it. And I chose a name that did not reveal my gender.

07:05
You can probably guess where this is going, right? So, this time, I didn't get any of those patronizing comments about my ability and I was able to actually get some work done.

07:19
And it sucks, guys. I've been building robots since I was 15, I have a few degrees in computer science, and yet, I had to hide my gender in order for my work to be taken seriously.

07:31
So, what's going on here? Are men just better at technology than women?

07:37
Another study found that when women coders on one platform hid their gender, like myself, their code was accepted four percent more often than men's.

07:48
So this is not about the talent. This is about an elitism in AI that says a programmer needs to look like a certain person.

07:59
What we really need to do to make AI better is bring in people from all kinds of backgrounds. We need people who can write and tell stories to help us create personalities of AI. We need people who can solve problems. We need people who face different challenges and we need people who can tell us what are the real issues that need fixing and help us find ways that technology can actually fix it.

08:29
Because, when people from diverse backgrounds come together, when we build things in the right way, the possibilities are limitless.

08:38
And that's what I want to end by talking to you about. Less racist robots, less machines that are going to take our jobs -- and more about what technology can actually achieve.

08:50
So, yes, some of the energy in the world of AI, in the world of technology is going to be about what ads you see on your stream. But a lot of it is going towards making the world so much better.

09:05
Think about a pregnant woman in the Democratic Republic of Congo, who has to walk 17 hours to her nearest rural prenatal clinic to get a checkup. What if she could get a diagnosis on her phone, instead?

09:19
Or think about what AI could do for those one in three women in South Africa who face domestic violence. If it wasn't safe to talk out loud, they could get an AI service to raise the alarm, get financial and legal advice.

09:35
These are all real examples of projects that people, including myself, are working on right now, using AI.

09:45
So, I'm sure in the next couple of days there will be yet another news story about the existential risk, robots taking over and coming for your jobs.

09:54
(Laughter)

09:55
And when something like that happens, I know I'll get the same messages worrying about the future. But I feel incredibly positive about this technology.

10:07
This is our chance to remake the world into a much more equal place. But to do that, we need to build it the right way from the get-go.

10:19
We need people of different genders, races, sexualities and backgrounds. We need women to be the makers and not just the machines who do the makers' bidding.

10:33
We need to think very carefully what we teach machines, what data we give them, so they don't just repeat our own past mistakes.

10:44
So I hope I leave you thinking about two things. First, I hope you leave thinking about bias today. And that the next time you scroll past an advert that assumes you are interested in fertility clinics or online betting websites, that you think and remember that the same technology is assuming that a black man will reoffend. Or that a woman is more likely to be a personal assistant than a CEO. And I hope that reminds you that we need to do something about it.

11:20
And second, I hope you think about the fact that you don't need to look a certain way or have a certain background in engineering or technology to create AI, which is going to be a phenomenal force for our future.

11:36
You don't need to look like a Mark Zuckerberg, you can look like me.

11:41
And it is up to all of us in this room to convince the governments and the corporations to build AI technology for everyone, including the edge cases. And for us all to get education about this phenomenal technology in the future.

11:58
Because if we do that, then we've only just scratched the surface of what we can achieve with AI.

12:05
Thank you.

12:06
(Applause)
Translated by Ivana Korom
Reviewed by Joanna Pietrulewicz



Data provided by TED.
