TEDxOslo

Andreas Ekström: The moral bias behind your search results

January 29, 2015

Search engines have become our most trusted sources of information and arbiters of truth. But can we ever get an unbiased search result? Swedish author and journalist Andreas Ekström argues that such a thing is a philosophical impossibility. In this thoughtful talk, he calls on us to strengthen the bonds between technology and the humanities, and he reminds us that behind every algorithm is a set of personal beliefs that no code can ever completely eradicate.

Andreas Ekström - Author and journalist
Andreas Ekström describes the power structures of the digital revolution.

So whenever I visit a school
and talk to students,
00:12
I always ask them the same thing:
00:15
Why do you Google?
00:18
Why is Google the search engine
of choice for you?
00:20
Strangely enough, I always get
the same three answers.
00:24
One, "Because it works,"
00:27
which is a great answer;
that's why I Google, too.
00:29
Two, somebody will say,
00:32
"I really don't know of any alternatives."
00:34
It's not an equally great answer
and my reply to that is usually,
00:37
"Try to Google the word 'search engine,'
00:40
you may find a couple
of interesting alternatives."
00:42
And last but not least, thirdly,
00:45
inevitably, one student will raise
her or his hand and say,
00:47
"With Google, I'm certain to always get
the best, unbiased search result."
00:50
Certain to always get the best,
unbiased search result.
00:56
Now, as a man of the humanities,
01:04
albeit a digital humanities man,
01:07
that just makes my skin curl,
01:09
even if I, too, realize that that trust,
that idea of the unbiased search result
01:11
is a cornerstone in our collective love
for and appreciation of Google.
01:16
I will show you why that, philosophically,
is almost an impossibility.
01:20
But let me first elaborate,
just a little bit, on a basic principle
01:24
behind each search query
that we sometimes seem to forget.
01:28
So whenever you set out
to Google something,
01:31
start by asking yourself this:
"Am I looking for an isolated fact?"
01:33
What is the capital of France?
01:38
What are the building blocks
of a water molecule?
01:41
Great -- Google away.
01:43
There's not a group of scientists
who are this close to proving
01:46
that it's actually London and H30.
01:49
You don't see a big conspiracy
among those things.
01:51
We agree, on a global scale,
01:53
what the answers are
to these isolated facts.
01:55
But if you complicate your question
just a little bit and ask something like,
01:58
"Why is there
an Israeli-Palestinian conflict?"
02:03
You're not exactly looking
for a singular fact anymore,
02:06
you're looking for knowledge,
02:09
which is something way more
complicated and delicate.
02:11
And to get to knowledge,
02:14
you have to bring 10 or 20
or 100 facts to the table
02:15
and acknowledge them and say,
"Yes, these are all true."
02:19
But because of who I am,
02:22
young or old, black or white,
gay or straight,
02:23
I will value them differently.
02:26
And I will say, "Yes, this is true,
02:27
but this is more important
to me than that."
02:29
And this is where it becomes interesting,
02:31
because this is where we become human.
02:33
This is when we start
to argue, to form society.
02:35
And to really get somewhere,
we need to filter all our facts here,
02:38
through friends and neighbors
and parents and children
02:42
and coworkers and newspapers
and magazines,
02:44
to finally be grounded in real knowledge,
02:46
which is something that a search engine
is a poor help to achieve.
02:49
So, I promised you an example
just to show you why it's so hard
02:55
to get to the point of true, clean,
objective knowledge --
03:01
as food for thought.
03:04
I will conduct a couple of simple
queries, search queries.
03:06
We'll start with "Michelle Obama,"
03:10
the First Lady of the United States.
03:14
And we'll click for pictures.
03:16
It works really well, as you can see.
03:18
It's a perfect search
result, more or less.
03:21
It's just her in the picture,
not even the President.
03:24
How does this work?
03:27
Quite simple.
03:29
Google uses a lot of smartness
to achieve this, but quite simply,
03:31
they look at two things
more than anything.
03:34
First, what does it say in the caption
under the picture on each website?
03:36
Does it say "Michelle Obama"
under the picture?
03:41
Pretty good indication
it's actually her on there.
03:43
Second, Google looks at the picture file,
03:46
the name of the file as such
uploaded to the website.
03:48
Again, is it called "MichelleObama.jpeg"?
03:51
Pretty good indication it's not
Clint Eastwood in the picture.
03:54
So, you've got those two and you get
a search result like this -- almost.
03:57
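The two signals just described -- the caption under the picture and the name of the image file -- can be sketched as a toy scorer. This is purely an illustration of the idea, not Google's actual ranking code, and every name in it is hypothetical:

```python
# Toy sketch of the two signals Ekström describes for image search:
# (1) does the query appear in the caption, (2) does it appear in the
# file name? A hypothetical illustration, not Google's real algorithm.

def relevance_score(query, caption, filename):
    """Score one image candidate using the two crude signals."""
    q_compact = query.lower().replace(" ", "")
    score = 0
    # Signal 1: the caption under the picture mentions the query.
    if query.lower() in caption.lower():
        score += 1
    # Signal 2: the file name (e.g. "MichelleObama.jpeg") matches the query.
    cleaned = filename.lower().replace("_", "").replace("-", "")
    if q_compact in cleaned:
        score += 1
    return score

def rank_images(query, candidates):
    """Order candidates (dicts with 'caption' and 'filename') by score."""
    return sorted(
        candidates,
        key=lambda c: relevance_score(query, c["caption"], c["filename"]),
        reverse=True,
    )
```

The Breivik campaign in the talk exploits exactly this: anyone who controls captions and file names at scale can push an arbitrary picture up such a ranking.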
Now, in 2009, Michelle Obama
was the victim of a racist campaign,
04:01
where people set out to insult her
through her search results.
04:08
There was a picture distributed
widely over the Internet
04:13
where her face was distorted
to look like a monkey.
04:15
And that picture was published all over.
04:18
And people published it
very, very purposefully,
04:21
to get it up there in the search results.
04:25
They made sure to write
"Michelle Obama" in the caption
04:27
and they made sure to upload the picture
as "MichelleObama.jpeg," or the like.
04:30
You get why -- to manipulate
the search result.
04:34
And it worked, too.
04:37
So when you picture-Googled
for "Michelle Obama" in 2009,
04:38
that distorted monkey picture
showed up among the first results.
04:41
Now, the results are self-cleansing,
04:44
and that's sort of the beauty of it,
04:48
because Google measures relevance
every hour, every day.
04:50
However, Google didn't settle
for that this time,
04:53
they just thought, "That's racist
and it's a bad search result
04:56
and we're going to go back
and clean that up manually.
04:59
We are going to write
some code and fix it,"
05:02
which they did.
05:05
And I don't think anyone in this room
thinks that was a bad idea.
05:07
Me neither.
05:11
But then, a couple of years go by,
05:14
and the world's most-Googled Anders,
05:17
Anders Behring Breivik,
05:20
did what he did.
05:22
This was July 22, 2011,
05:24
and a terrible day in Norwegian history.
05:26
This man, a terrorist, blew up
a couple of government buildings
05:29
walking distance from where we are
right now in Oslo, Norway
05:33
and then he traveled
to the island of Utøya
05:36
and shot and killed a group of kids.
05:38
Almost 80 people died that day.
05:40
And a lot of people would describe
this act of terror as two steps,
05:44
that he did two things: he blew up
the buildings and he shot those kids.
05:48
It's not true.
05:52
It was three steps.
05:54
He blew up those buildings,
he shot those kids,
05:56
and he sat down and waited
for the world to Google him.
05:58
And he prepared
all three steps equally well.
06:03
And if there was somebody
who immediately understood this,
06:06
it was a Swedish web developer,
06:09
a search engine optimization expert
in Stockholm, named Nikke Lindqvist.
06:10
He's also a very political guy
06:14
and he was right out there
in social media, on his blog and Facebook.
06:15
And he told everybody,
06:19
"If there's something that
this guy wants right now,
06:20
it's to control the image of himself.
06:22
Let's see if we can distort that.
06:26
Let's see if we, in the civilized world,
can protest against what he did
06:29
through insulting him
in his search results."
06:33
And how?
06:36
He told all of his readers the following,
06:38
"Go out there on the Internet,
06:40
find pictures of dog poop on sidewalks --
06:42
find pictures of dog poop on sidewalks --
06:46
publish them in your feeds,
on your websites, on your blogs.
06:48
Make sure to write the terrorist's
name in the caption,
06:52
make sure to name
the picture file "Breivik.jpeg."
06:55
Let's teach Google that that's
the face of the terrorist."
06:59
And it worked.
07:05
Two years after that campaign
against Michelle Obama,
07:07
this manipulation campaign
against Anders Behring Breivik worked.
07:10
If you picture-Googled for him weeks after
the July 22 events from Sweden,
07:13
you'd see that picture of dog poop
high up in the search results,
07:18
as a little protest.
07:22
Strangely enough, Google
didn't intervene this time.
07:25
They did not step in and manually
clean those search results up.
07:30
So the million-dollar question,
07:35
is there anything different
between these two happenings here?
07:37
Is there anything different between
what happened to Michelle Obama
07:40
and what happened
to Anders Behring Breivik?
07:44
Of course not.
07:46
It's the exact same thing,
07:48
yet Google intervened in one case
and not in the other.
07:50
Why?
07:53
Because Michelle Obama
is an honorable person, that's why,
07:55
and Anders Behring Breivik
is a despicable person.
07:58
See what happens there?
08:01
An evaluation of a person takes place
08:03
and there's only one
power-player in the world
08:06
with the authority to say who's who.
08:10
"We like you, we dislike you.
08:13
We believe in you,
we don't believe in you.
08:15
You're right, you're wrong.
You're true, you're false.
08:17
You're Obama, and you're Breivik."
08:20
That's power if I ever saw it.
08:22
So I'm asking you to remember
that behind every algorithm
08:27
is always a person,
08:30
a person with a set of personal beliefs
08:32
that no code can ever
completely eradicate.
08:35
And my message goes
out not only to Google,
08:37
but to all believers in the faith
of code around the world.
08:40
You need to identify
your own personal bias.
08:42
You need to understand that you are human
08:45
and take responsibility accordingly.
08:47
And I say this because I believe
we've reached a point in time
08:51
when it's absolutely imperative
08:54
that we tie those bonds
together again, tighter:
08:56
the humanities and the technology.
08:59
Tighter than ever.
09:02
And, if nothing else, to remind us
that that wonderfully seductive idea
09:04
of the unbiased, clean search result
09:07
is, and is likely to remain, a myth.
09:10
Thank you for your time.
09:13
(Applause)
09:14



Why you should listen

Andreas Ekström is staff writer at Sydsvenskan, a daily morning paper in Malmö, Sweden.

His passion is educating for digital equality, and he has a vision for a world in which we share the wealth -- not only financially, but also in terms of knowledge and power. Andreas is the author of six books, a columnist and a commentator, and he often lectures and leads seminars on the digital revolution.



Data provided by TED.
