Joy Buolamwini: How I'm fighting bias in algorithms
Joy Buolamwini's research explores the intersection of social impact technology and inclusion.
an unseen force that's rising,
results in unfairness.
can spread bias on a massive scale
to exclusionary experiences
I've got a face.
to be detected by a cheap webcam?
fighting against the coded gaze
at the MIT Media Lab,
on all sorts of whimsical projects,
digital masks onto my reflection.
to feel powerful,
I might have a quote.
facial recognition software
unless I wore a white mask.
into this issue before.
at Georgia Tech studying computer science,
to play peek-a-boo,
and then uncover it saying, "Peek-a-boo!"
doesn't really work if I can't see you,
to get the project done,
somebody else will solve this problem.
for an entrepreneurship competition.
to take participants
on a tour of local start-ups.
until it got to me,
generic facial recognition software.
can travel as quickly
some files off of the internet.
Why isn't my face being detected?
at how we give machines sight.
machine learning techniques
a training set with examples of faces.
This is not a face.
how to recognize other faces.
aren't really that diverse,
from the established norm
there's some good news.
materialize out of nowhere.
full-spectrum training sets
portrait of humanity.
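The training-set mechanism described here can be sketched in a few lines. Below is a minimal illustration in Python using OpenCV's stock Haar-cascade face detector; the input filename is hypothetical. The cascade itself was learned from a training set of labeled face and non-face examples, so it can only find faces resembling that set's norm.

```python
# A minimal sketch of training-set-driven face detection, assuming the
# opencv-python package is installed; "photo.jpg" is a hypothetical file.
import cv2

# This cascade was itself learned from many labeled examples --
# "this is a face" / "this is not a face" -- so it only detects
# faces that resemble the ones in its training set.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("photo.jpg")                 # hypothetical image file
if image is None:
    raise SystemExit("image not found")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)  # cascades expect grayscale

# Each hit is a bounding box where the learned face patterns matched.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face(s)")  # 0 when a face deviates too far
                                         # from the training-set norm
```

Swapping in a detector trained on a fuller-spectrum set is exactly the remedy proposed here: the detector has no notion of "face" beyond the statistics of its training data.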
with algorithmic bias.
to discriminatory practices.
facial recognition software
in the fight against crime.
in the US -- that's 117 million people --
in facial recognition networks.
at these networks unregulated,
been audited for accuracy.
is not fail-proof,
remains a challenge.
when we see other people
is no laughing matter,
for facial recognition,
of computer vision.
of Math Destruction,"
talks about the rising new WMDs --
and destructive algorithms
to make decisions
of our lives.
Do you get insurance?
you wanted to get into?
for the same product
to use machine learning
risk scores to determine
is going to spend in prison.
about these decisions.
lead to fair outcomes.
how we create more inclusive code
and employ inclusive coding practices.
with diverse individuals
each other's blind spots?
how we code matters.
as we're developing systems?
and finally, why we code matters.
to unlock immense wealth.
to unlock even greater equality
that will make up the "incoding" movement.
we can start thinking about
building platforms that can identify bias
like the ones I shared,
but also auditing existing software.
more inclusive training sets.
"Selfies por la inclusión"
developers test and create
more conscientiously
of the technology that we're developing.
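A bias-identifying platform of the kind described could start with a very simple audit: run the detector under test over a benchmark labeled by demographic group and compare detection rates. The sketch below is illustrative only; the benchmark, group names, and detector are hypothetical stand-ins, not an existing tool or dataset.

```python
# A minimal sketch of a demographic bias audit for a face detector.
# The benchmark pairs and the detector are hypothetical stand-ins.
from collections import defaultdict

def audit(benchmark, detect_face):
    """Detection rate per demographic group in a labeled benchmark."""
    hits, totals = defaultdict(int), defaultdict(int)
    for image_path, group in benchmark:
        totals[group] += 1
        hits[group] += detect_face(image_path)  # 1 if a face was found
    return {g: hits[g] / totals[g] for g in totals}

# Toy stand-ins so the sketch runs end to end: a tiny labeled benchmark
# and a detector that only "works" on part of it.
toy_benchmark = [("a.jpg", "group_A"), ("b.jpg", "group_A"),
                 ("c.jpg", "group_B"), ("d.jpg", "group_B")]
toy_detector = lambda path: path in ("a.jpg", "b.jpg", "c.jpg")

for group, rate in audit(toy_benchmark, toy_detector).items():
    print(f"{group}: {rate:.0%} detected")  # group_A: 100%, group_B: 50%
```

Large gaps in detection rate between groups are the quantitative signature of the failures described earlier in the talk.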
To get the incoding movement started,
I've launched the Algorithmic
Justice League,
where anyone who cares about fairness
can help fight the coded gaze.
you can report bias,
become a tester,
works for all of us,
and center social change.
ABOUT THE SPEAKER
Joy Buolamwini - Poet of code
Joy Buolamwini's research explores the intersection of social impact technology and inclusion.
Why you should listen
Joy Buolamwini is a poet of code on a mission to show compassion through computation. As a graduate researcher at the MIT Media Lab, she leads the Algorithmic Justice League to fight coded bias. Her research explores the intersection of social impact technology and inclusion. In support of this work, Buolamwini was awarded a $50,000 grant as the Grand Prize winner of a national contest inspired by the critically acclaimed film Hidden Figures, based on the book by Margot Lee Shetterly.
Driven by an entrepreneurial spirit, Buolamwini's global interest in creating technology for social impact spans multiple industries and countries. As the inaugural Chief Technology Officer for Techturized Inc., a hair care technology company, and Swift Tech Solutions, a global health tech consultancy, she led software development for underserved communities in the United States, Ethiopia, Mali, Nigeria and Niger. In Zambia, she explored empowering citizens with skills to create their own technology through the Zamrize Project. In the United Kingdom, Buolamwini piloted a Service Year Initiative to launch Code4Rights which supports youth in creating meaningful technology for their communities in partnership with local organizations.
Through Filmmakers Collaborative, Buolamwini produces media that highlight diverse creators of technology. Her short documentary, The Coded Gaze: Unmasking Algorithmic Bias, debuted at the Museum of Fine Arts Boston and her pilot of the Code4Rights: Journey To Code training series debuted at the Vatican. She has presented keynote speeches and public talks at various forums including #CSforAll at the White House, Harvard University, Saïd Business School, Rutgers University, NCWIT, Grace Hopper Celebration and SXSWedu.
Buolamwini is a Rhodes Scholar, Fulbright Fellow, Google Anita Borg Scholar, Astronaut Scholar, a Stamps President's Scholar and Carter Center technical consultant recognized as a distinguished volunteer. She holds a master's degree in Learning and Technology from Oxford University and a bachelor's degree in Computer Science from the Georgia Institute of Technology. Buolamwini serves as a Harvard resident tutor at Adams House, where she mentors students seeking scholarships or pursuing entrepreneurship.