Kriti Sharma: How to keep human bias out of AI
Kriti Sharma creates AI technology to help address some of the toughest social challenges of our time -- from domestic violence to sexual health and inequality.
How many decisions have been made about you today by artificial intelligence? I build AI for a living, so whenever a new story comes out about AI stealing all our jobs, or robots getting citizenship of an actual country, my friends and followers message me, worried about the future. We see it everywhere: the media panic that our robot overlords are taking over. But that is not the problem we should be focusing on. There is a bigger risk with AI, one we need to fix first.
So let me ask again: how many decisions have been made about you today by AI? And how many of them were based on your gender, your race or your background? Algorithms are being used all the time to make decisions about who we are and what we want.
Many of the women in this room will know what I'm talking about if you have been shown those pregnancy test adverts on YouTube about a thousand times, or have scrolled past adverts of fertility clinics on your Facebook feed, or, in my case, adverts for Indian marriage bureaus.
But AI isn't just being used to make decisions about which products to show us. I wonder how you would feel about someone who thought things like this: "A black person is less likely than a white person to pay off their loan on time." "A person called John makes a better programmer than a person called Mary." "A black man is more likely to be a repeat offender than a white man." You're probably thinking, "Wow, that sounds like a pretty sexist, racist person," right?
These are some of the real decisions that AI has made very recently, based on the biases it has learned from us, from humans. AI is being used to help decide whether or not you get that job interview, and even what rating you get in your annual performance review. And these decisions are all being filtered through its assumptions about our identity: our race, our gender, our age.
How does that happen? Imagine an AI helping a hiring manager find the next tech leader for the company. So far, the manager has been hiring mostly men, so the AI learns that men are more likely to be programmers than women, and from there it is a very short leap to concluding that men make better programmers than women. We have reinforced our own bias into the AI, and now it is screening out female candidates. If a human hiring manager did that, we would be outraged; we would not allow it. That kind of gender discrimination is not OK. And yet somehow, AI has become above the law, because a machine made the decision. And we are also reinforcing our own bias in how we interact with AI.
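To make that mechanism concrete, here is a minimal, hypothetical Python sketch, not taken from the talk, of how a screening model that learns only from past hiring decisions inherits the manager's bias; the data, names and scoring rule are all invented for illustration.

from collections import defaultdict

# Invented, biased historical data: the manager has mostly hired men.
history = (
    [{"gender": "male", "hired": True}] * 80
    + [{"gender": "male", "hired": False}] * 20
    + [{"gender": "female", "hired": True}] * 5
    + [{"gender": "female", "hired": False}] * 20
)

# "Training": estimate the past hiring rate per gender.
counts = defaultdict(lambda: {"hired": 0, "total": 0})
for record in history:
    counts[record["gender"]]["total"] += 1
    counts[record["gender"]]["hired"] += record["hired"]
hire_rate = {g: c["hired"] / c["total"] for g, c in counts.items()}

# "Screening": two equally qualified candidates get very different scores,
# because the model has absorbed the bias baked into its training data.
for name, gender in [("John", "male"), ("Mary", "female")]:
    print(f"{name}: score {hire_rate[gender]:.2f}")
# Prints roughly: John: score 0.80 / Mary: score 0.20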
How often do you use a voice assistant like Siri, Alexa or even Cortana? They are all designed to be our obedient servants, turning your lights on and off, ordering your shopping. You get male AIs too, but they tend to be more high-powered: the ones making business decisions, or ROSS, the robot lawyer. So even poor robots suffer from sexism in the workplace.
Now think about how all of this affects a child growing up in today's world around AI. Say they are doing some research for a school project and they Google images of CEOs. The algorithm shows them results of mostly men. Then they Google personal assistants, and it shows them mostly females. Then they want to put on some music, and maybe order some food, and now they are barking orders at an obedient female voice assistant. Some of our brightest minds are creating this technology today, technology that they could have created in any way they wanted. And yet they have chosen to create it in the style of a 1950s "Mad Men" secretary.
But don't worry, this is not going to end with me telling you that we are all heading towards sexist, racist machines running the world. The good news about AI is that it is entirely within our control. We get to teach AI the right values, the right ethics. There are three things we can do. One, we can be aware that we ourselves have biases, and of the bias in the machines around us. Two, we can make sure that diverse teams are building this technology. And three, we have to give it diverse experiences to learn from.
I can talk about the first two from personal experience. When you work in technology and you don't look like a Mark Zuckerberg or Elon Musk, your ability gets questioned. Here is just one example. Like most developers, I often join online tech forums and share my knowledge to help others. And I have found that when I log on as myself, with my own photo, my own name, I tend to get questions or comments like this: "What makes you think you're qualified to talk about AI?" "What makes you think you know about machine learning?" So, as you do, I made a new profile, and this time, instead of my own picture, I chose a cat with a jet pack on it, and a name that did not reveal my gender. You can probably guess where this is going, right? This time, I didn't get any of those patronizing comments about my ability, and I was able to actually get some work done. I had to hide my gender for my work to be taken seriously.
So what is going on here? Are men just better at technology than women? One study found that when women coders hid their gender, like myself, their code was accepted four percent more than men's. So this is not about the talent. This is about a perception that a programmer needs to look like a certain person. What we really need to do to make AI better is to bring in people from all kinds of backgrounds.
We need people who can write and tell stories, to help us create the personalities of AI. We need people who face different challenges, and we need people who can tell us what the real issues are that need fixing and help us find ways that technology can actually fix them. Because when people from diverse backgrounds come together, when we build things in the right way, the possibilities are limitless.
And that is what I want to end by talking to you about: less about racist robots and machines that are going to take our jobs, and more about what technology can actually achieve. Yes, some of the energy in the world of AI goes into deciding what ads you see on your stream, but a lot of it is going towards making the world so much better.
Think about a pregnant woman in the Democratic Republic of Congo who has to walk for hours to reach her nearest rural prenatal clinic. What if she could get a diagnosis on her phone instead? Or think about the women in South Africa facing domestic violence, who could use an AI service to raise the alarm. These are real examples of projects that people, including myself, are working on right now using AI.
I'm sure that in the next few days there will be yet another news story about robots taking over and coming for your jobs, and when that happens, I know I'll get messages worrying about the future. But I feel incredibly positive about this technology. This is our chance to remake the world into a much more equal place. To do that, we need to build it the right way from the get-go.
We need people of different genders, races, sexualities and backgrounds. We need women to be the makers, not just the machines who do the makers' bidding. We need to think very carefully about what we teach machines, so they don't just repeat our own past mistakes.
I hope I leave you thinking about two things. First, I hope you leave thinking about bias today, so that the next time you scroll past an advert that assumes you are interested in fertility clinics, you remember that the same technology is also assuming that a black man will reoffend, or that a woman is more likely to be a personal assistant than a CEO, and that this reminds you that we need to do something about it.
And second, I hope you think about the fact that you don't need to look a certain way, or have a certain background in engineering or technology, to create AI, which is going to be a phenomenal force for our future. You don't need to look like a Mark Zuckerberg; you can look like me. And it is up to all of us to convince the governments and the corporations to build AI technology for everyone, and to get educated about this technology in the future. Because if we do that, we will have only just scratched the surface of what we can achieve with AI.
ABOUT THE SPEAKER
Kriti Sharma - AI technologistKriti Sharma creates AI technology to help address some of the toughest social challenges of our time -- from domestic violence to sexual health and inequality.
Why you should listen
Kriti Sharma is the Founder of AI for Good, an organization focused on building scalable technology solutions for social good. In 2018, she also launched rAInbow, a digital companion for women facing domestic violence in South Africa. This service reached nearly 200,000 conversations within its first 100 days, breaking down the stigma of gender-based violence. In 2019, she collaborated with the Population Foundation of India to launch Dr. Sneha, an AI-powered digital character that engages with young people about sexual health, an issue that is still considered taboo in India.
Sharma was recently named in the Forbes "30 Under 30" list for advancements in AI. She was appointed a United Nations Young Leader in 2018 and is an advisor to both the United Nations Technology Innovation Labs and to the UK Government’s Centre for Data Ethics and Innovation.