TED2011

Eli Pariser: Beware online "filter bubbles"

As web companies strive to tailor their services (including news and search results) to our personal tastes, there's a dangerous unintended consequence: We get trapped in a "filter bubble" and don't get exposed to information that could challenge or broaden our worldview. Eli Pariser argues powerfully that this will ultimately prove to be bad for us and bad for democracy.

Mark Zuckerberg,
00:15
a journalist was asking him a question about the news feed.
00:17
And the journalist was asking him,
00:20
"Why is this so important?"
00:22
And Zuckerberg said,
00:24
"A squirrel dying in your front yard
00:26
may be more relevant to your interests right now
00:28
than people dying in Africa."
00:31
And I want to talk about
00:34
what a Web based on that idea of relevance might look like.
00:36
So when I was growing up
00:40
in a really rural area in Maine,
00:42
the Internet meant something very different to me.
00:44
It meant a connection to the world.
00:47
It meant something that would connect us all together.
00:49
And I was sure that it was going to be great for democracy
00:52
and for our society.
00:55
But there's this shift
00:58
in how information is flowing online,
01:00
and it's invisible.
01:02
And if we don't pay attention to it,
01:05
it could be a real problem.
01:07
So I first noticed this in a place I spend a lot of time --
01:10
my Facebook page.
01:13
I'm progressive, politically -- big surprise --
01:15
but I've always gone out of my way to meet conservatives.
01:18
I like hearing what they're thinking about;
01:20
I like seeing what they link to;
01:22
I like learning a thing or two.
01:24
And so I was surprised when I noticed one day
01:26
that the conservatives had disappeared from my Facebook feed.
01:29
And what it turned out was going on
01:33
was that Facebook was looking at which links I clicked on,
01:35
and it was noticing that, actually,
01:39
I was clicking more on my liberal friends' links
01:41
than on my conservative friends' links.
01:43
And without consulting me about it,
01:46
it had edited them out.
01:48
They disappeared.
01:50
So Facebook isn't the only place
01:54
that's doing this kind of invisible, algorithmic
01:56
editing of the Web.
01:58
Google's doing it too.
02:01
If I search for something, and you search for something,
02:03
even right now at the very same time,
02:06
we may get very different search results.
02:08
Even if you're logged out, one engineer told me,
02:11
there are 57 signals
02:14
that Google looks at --
02:16
everything from what kind of computer you're on
02:19
to what kind of browser you're using
02:22
to where you're located --
02:24
that it uses to personally tailor your query results.
02:26
Think about it for a second:
02:29
there is no standard Google anymore.
02:31
And you know, the funny thing about this is that it's hard to see.
02:35
You can't see how different your search results are
02:38
from anyone else's.
02:40
But a couple of weeks ago,
02:42
I asked a bunch of friends to Google "Egypt"
02:44
and to send me screen shots of what they got.
02:47
So here's my friend Scott's screen shot.
02:50
And here's my friend Daniel's screen shot.
02:54
When you put them side by side,
02:57
you don't even have to read the links
02:59
to see how different these two pages are.
03:01
But when you do read the links,
03:03
it's really quite remarkable.
03:05
Daniel didn't get anything about the protests in Egypt at all
03:09
in his first page of Google results.
03:12
Scott's results were full of them.
03:14
And this was the big story of the day at that time.
03:16
That's how different these results are becoming.
03:18
So it's not just Google and Facebook either.
03:21
This is something that's sweeping the Web.
03:24
There are a whole host of companies that are doing this kind of personalization.
03:26
Yahoo News, the biggest news site on the Internet,
03:29
is now personalized -- different people get different things.
03:32
Huffington Post, the Washington Post, the New York Times --
03:36
all flirting with personalization in various ways.
03:39
And this moves us very quickly
03:42
toward a world in which
03:45
the Internet is showing us what it thinks we want to see,
03:47
but not necessarily what we need to see.
03:51
As Eric Schmidt said,
03:54
"It will be very hard for people to watch or consume something
03:57
that has not in some sense
04:00
been tailored for them."
04:02
So I do think this is a problem.
04:05
And I think, if you take all of these filters together,
04:07
you take all these algorithms,
04:10
you get what I call a filter bubble.
04:12
And your filter bubble is your own personal,
04:16
unique universe of information
04:19
that you live in online.
04:21
And what's in your filter bubble
04:23
depends on who you are, and it depends on what you do.
04:26
But the thing is that you don't decide what gets in.
04:29
And more importantly,
04:33
you don't actually see what gets edited out.
04:35
So one of the problems with the filter bubble
04:38
was discovered by some researchers at Netflix.
04:40
And they were looking at the Netflix queues, and they noticed something kind of funny
04:43
that a lot of us probably have noticed,
04:46
which is there are some movies
04:48
that just sort of zip right up and out to our houses.
04:50
They enter the queue, they just zip right out.
04:53
So "Iron Man" zips right out,
04:56
and "Waiting for Superman"
04:58
can wait for a really long time.
05:00
What they discovered
05:02
was that in our Netflix queues
05:04
there's this epic struggle going on
05:06
between our future aspirational selves
05:09
and our more impulsive present selves.
05:12
You know we all want to be someone
05:15
who has watched "Rashomon,"
05:17
but right now
05:19
we want to watch "Ace Ventura" for the fourth time.
05:21
(Laughter)
05:24
So the best editing gives us a bit of both.
05:27
It gives us a little bit of Justin Bieber
05:29
and a little bit of Afghanistan.
05:31
It gives us some information vegetables;
05:33
it gives us some information dessert.
05:35
And the challenge with these kinds of algorithmic filters,
05:38
these personalized filters,
05:40
is that, because they're mainly looking
05:42
at what you click on first,
05:44
it can throw off that balance.
05:48
And instead of a balanced information diet,
05:52
you can end up surrounded
05:55
by information junk food.
05:57
What this suggests
05:59
is actually that we may have the story about the Internet wrong.
06:01
In a broadcast society --
06:04
this is how the founding mythology goes --
06:06
in a broadcast society,
06:08
there were these gatekeepers, the editors,
06:10
and they controlled the flows of information.
06:12
And along came the Internet and it swept them out of the way,
06:15
and it allowed all of us to connect together,
06:18
and it was awesome.
06:20
But that's not actually what's happening right now.
06:22
What we're seeing is more of a passing of the torch
06:26
from human gatekeepers
06:29
to algorithmic ones.
06:31
And the thing is that the algorithms
06:34
don't yet have the kind of embedded ethics
06:37
that the editors did.
06:40
So if algorithms are going to curate the world for us,
06:43
if they're going to decide what we get to see and what we don't get to see,
06:46
then we need to make sure
06:49
that they're not just keyed to relevance.
06:51
We need to make sure that they also show us things
06:54
that are uncomfortable or challenging or important --
06:56
this is what TED does --
06:59
other points of view.
07:01
And the thing is, we've actually been here before
07:03
as a society.
07:05
In 1915, it's not like newspapers were sweating a lot
07:08
about their civic responsibilities.
07:11
Then people noticed
07:14
that they were doing something really important.
07:16
That, in fact, you couldn't have
07:19
a functioning democracy
07:21
if citizens didn't get a good flow of information,
07:23
that the newspapers were critical because they were acting as the filter,
07:28
and then journalistic ethics developed.
07:31
It wasn't perfect,
07:33
but it got us through the last century.
07:35
And so now,
07:38
we're kind of back in 1915 on the Web.
07:40
And we need the new gatekeepers
07:44
to encode that kind of responsibility
07:47
into the code that they're writing.
07:49
I know that there are a lot of people here from Facebook and from Google --
07:51
Larry and Sergey --
07:54
people who have helped build the Web as it is,
07:56
and I'm grateful for that.
07:58
But we really need you to make sure
08:00
that these algorithms have encoded in them
08:03
a sense of the public life, a sense of civic responsibility.
08:06
We need you to make sure that they're transparent enough
08:09
that we can see what the rules are
08:12
that determine what gets through our filters.
08:14
And we need you to give us some control
08:17
so that we can decide
08:19
what gets through and what doesn't.
08:21
Because I think
08:24
we really need the Internet to be that thing
08:26
that we all dreamed of it being.
08:28
We need it to connect us all together.
08:30
We need it to introduce us to new ideas
08:33
and new people and different perspectives.
08:36
And it's not going to do that
08:40
if it leaves us all isolated in a Web of one.
08:42
Thank you.
08:45
(Applause)
08:47

About the speaker:

Eli Pariser - Organizer and author
Pioneering online organizer Eli Pariser is the author of "The Filter Bubble," about how personalized search might be narrowing our worldview.

Why you should listen

Shortly after the September 11, 2001, attacks, Eli Pariser created a website calling for a multilateral approach to fighting terrorism. In the following weeks, over half a million people from 192 countries signed on, and Pariser rather unexpectedly became an online organizer. The website merged with MoveOn.org in November 2001, and Pariser -- then 20 years old -- joined the group to direct its foreign policy campaigns. He led what the New York Times Magazine called the "mainstream arm of the peace movement" -- tripling MoveOn's member base and demonstrating how large numbers of small donations could be mobilized through online engagement.

In 2004, Pariser became executive director of MoveOn. Under his leadership, MoveOn.org Political Action has grown to 5 million members and raised over $120 million from millions of small donors to support advocacy campaigns and political candidates. Pariser focused MoveOn on online-to-offline organizing, developing phone-banking tools and precinct programs in 2004 and 2006 that laid the groundwork for Barack Obama's extraordinary web-powered campaign. In 2008, Pariser transitioned the Executive Director role at MoveOn to Justin Ruben and became President of MoveOn’s board; he's now a senior fellow at the Roosevelt Institute.

His book The Filter Bubble is set for release May 12, 2011. In it, he asks how modern search tools -- the filter through which many of us see the wider world -- are getting better and better at screening the wider world from us, by returning only the search results they "think" we want to see.
