ABOUT THE SPEAKER
Eli Pariser - Organizer and author
Pioneering online organizer Eli Pariser is the author of "The Filter Bubble," about how personalized search might be narrowing our worldview.

Why you should listen

Shortly after the September 11, 2001, attacks, Eli Pariser created a website calling for a multilateral approach to fighting terrorism. In the following weeks, over half a million people from 192 countries signed on, and Pariser rather unexpectedly became an online organizer. The website merged with MoveOn.org in November 2001, and Pariser -- then 20 years old -- joined the group to direct its foreign policy campaigns. He led what the New York Times Magazine called the "mainstream arm of the peace movement" -- tripling MoveOn's member base and demonstrating how large numbers of small donations could be mobilized through online engagement.

In 2004, Pariser became executive director of MoveOn. Under his leadership, MoveOn.org Political Action has grown to 5 million members and raised over $120 million from millions of small donors to support advocacy campaigns and political candidates. Pariser focused MoveOn on online-to-offline organizing, developing phone-banking tools and precinct programs in 2004 and 2006 that laid the groundwork for Barack Obama's extraordinary web-powered campaign. In 2008, Pariser transitioned the Executive Director role at MoveOn to Justin Ruben and became President of MoveOn’s board; he's now a senior fellow at the Roosevelt Institute.

His book The Filter Bubble is set for release May 12, 2011. In it, he asks how modern search tools -- the filter through which many of us see the wider world -- are getting better and better at screening that wider world from us, by returning only the search results they "think" we want to see.

TED2011

Eli Pariser: Beware online "filter bubbles"


5,309,238 views

As web companies strive to tailor their services (including news and search results) to our personal tastes, an unintended consequence emerges: we become trapped in a "filter bubble" and miss information that might broaden our view of the world. Eli Pariser argues powerfully that this filtering will ultimately be bad for us and bad for democracy.


00:15
Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a Web based on that idea of relevance might look like.
00:40
So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem.
01:10
So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.
01:54
So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore. And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's.
02:42
But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side by side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.
03:21
So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."
04:05
So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.
04:38
So one of the problems with the filter bubble was discovered by some researchers at Netflix. And they were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time. What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time. (Laughter)
05:27
So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, it can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
05:59
What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view.
07:03
And the thing is, we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important: that, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information; that the newspapers were critical because they were acting as the filter; and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing.
07:51
I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn't. Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.

08:45
Thank you.

(Applause)
Translated by Ana Choi
Reviewed by Felix Leung



Data provided by TED.
