Kriti Sharma: How to keep human bias out of AI
Kriti Sharma creates AI technology to help address some of the toughest social challenges of our time -- from domestic violence to sexual health and inequality.
have been made about you today,
stealing all our jobs,
of an actual country,
concerned friends and followers message
our robot overlords are taking over.
we should be focusing on.
a bigger risk with AI,
have been made about you today by AI?
your race or your background?
and what we want.
will know what I'm talking about
had to sit through those pregnancy test adverts on YouTube
of fertility clinics
or Indian marriage bureaus.
to make decisions
who thought things like this:
to pay off their loan on time."
makes a better programmer
a repeat offender than a white man."
sexist, racist person," right?
that AI has made very recently,
it has learned from us,
whether or not you get that job interview;
what rating you should get in your annual performance review.
are all being filtered through
its assumptions about our identity,
our race, our gender, our age.
is helping a hiring manager
has been hiring mostly men.
more likely to be programmers than women.
our own bias into the AI.
female candidates.
hiring manager did that,
we wouldn't allow it.
discrimination is not OK.
AI has become above the law,
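A minimal sketch of the hiring-bias pattern described above, assuming a hypothetical model trained on historical hiring records. The toy data and feature names below are illustrative, not from the talk; they show how a classifier fit to male-dominated hiring outcomes can learn gender as a signal and repeat the discrimination on new candidates.

```python
# Illustrative only: hypothetical toy data, not from the talk.
# If past hiring favored men, a model trained on "who got hired"
# can pick up gender as a predictive signal and repeat the bias.
from sklearn.linear_model import LogisticRegression

# Features per candidate: [years_of_experience, is_male]; label: 1 = hired.
X_train = [
    [5, 1], [6, 1], [4, 1], [7, 1],   # men: mostly hired
    [5, 0], [6, 0], [4, 0], [7, 0],   # equally experienced women: mostly not
]
y_train = [1, 1, 1, 1, 0, 1, 0, 0]

model = LogisticRegression().fit(X_train, y_train)

# Two new candidates, identical except for gender.
print(model.predict([[6, 1], [6, 0]]))  # tends to favor the male candidate
print(model.coef_)  # inspect the weights; the gender column dominates this toy data
```

In real systems the signal is rarely an explicit gender column; it tends to leak in through proxies in the training data, which is part of why this kind of bias is hard to spot.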
in how we interact with AI.
how often do you use a voice assistant like Siri, Alexa or even Cortana?
our obedient servants,
ordering your shopping.
but they tend to be more high-powered,
making business decisions,
or ROSS, the robot lawyer.
from sexism in the workplace.
a kid growing up in today's world around AI.
doing some research for a school project
results of mostly men.
it shows them mostly females.
and maybe order some food,
barking orders at an obedient female voice assistant.
are creating this technology today.
in any way they wanted.
in the style of a 1950s "Mad Men" secretary.
with me telling you
sexist, racist machines running the world.
is that it is entirely within our control.
the right ethics to AI.
teams from diverse backgrounds are building this technology.
diverse experiences to learn from.
from personal experience.
a Mark Zuckerberg or Elon Musk,
your ability gets questioned.
I often join online tech forums
whenever I log on with my own photo, my own name,
or comments like this:
you're qualified to talk about AI?"
you know about machine learning?"
I chose a cat with a jet pack on it.
that did not reveal my gender.
where this is going, right?
patronizing comments about my ability
get some work done.
to be taken seriously.
at technology than women?
hid their gender, like myself,
four percent more than men.
needs to look like a certain person.
to make AI better
we need people from all kinds of backgrounds.
write and tell stories
who face different challenges
what are the real issues that need fixing
ways that technology can actually fix it.
from diverse backgrounds come together,
by talking to you about.
that are going to take our jobs --
can actually achieve.
in the world of AI,
what ads you see on your stream.
making the world so much better.
in the Democratic Republic of Congo,
to her nearest rural prenatal clinic
could get a diagnosis on her phone, instead?
in South Africa
an AI service to raise alarm,
that people, including myself,
are using AI right now.
there will be yet another news story
and coming for your jobs.
similar messages worrying about the future.
about this technology.
into a much more equal place.
the right way from the get-go.
races, sexualities and backgrounds.
who do the makers' bidding.
what we teach machines,
our own past mistakes.
thinking about two things.
thinking about bias today.
you scroll past an advert
in fertility clinics
that a black man will reoffend.
to be a personal assistant than a CEO.
that we need to do something about it.
in engineering or technology
a phenomenal force for our future.
like a Mark Zuckerberg,
and the corporations
technology in the future.
of what we can achieve with AI.
ABOUT THE SPEAKER
Kriti Sharma - AI technologist
Kriti Sharma creates AI technology to help address some of the toughest social challenges of our time -- from domestic violence to sexual health and inequality.
Why you should listen
Kriti Sharma is the Founder of AI for Good, an organization focused on building scalable technology solutions for social good. In 2018, she also launched rAInbow, a digital companion for women facing domestic violence in South Africa. The service handled nearly 200,000 conversations within the first 100 days, breaking down the stigma of gender-based violence. In 2019, she collaborated with the Population Foundation of India to launch Dr. Sneha, an AI-powered digital character to engage with young people about sexual health, an issue that is still considered taboo in India.
Sharma was recently named in the Forbes "30 Under 30" list for advancements in AI. She was appointed a United Nations Young Leader in 2018 and is an advisor to both the United Nations Technology Innovation Labs and to the UK Government’s Centre for Data Ethics and Innovation.