ABOUT THE SPEAKER
Eli Pariser - Organizer and author
Pioneering online organizer Eli Pariser is the author of "The Filter Bubble," about how personalized search might be narrowing our worldview.

Why you should listen

Shortly after the September 11, 2001, attacks, Eli Pariser created a website calling for a multilateral approach to fighting terrorism. In the following weeks, over half a million people from 192 countries signed on, and Pariser rather unexpectedly became an online organizer. The website merged with MoveOn.org in November 2001, and Pariser -- then 20 years old -- joined the group to direct its foreign policy campaigns. He led what the New York Times Magazine called the "mainstream arm of the peace movement" -- tripling MoveOn's member base and demonstrating how large numbers of small donations could be mobilized through online engagement.

In 2004, Pariser became executive director of MoveOn. Under his leadership, MoveOn.org Political Action has grown to 5 million members and raised over $120 million from millions of small donors to support advocacy campaigns and political candidates. Pariser focused MoveOn on online-to-offline organizing, developing phone-banking tools and precinct programs in 2004 and 2006 that laid the groundwork for Barack Obama's extraordinary web-powered campaign. In 2008, Pariser transitioned the Executive Director role at MoveOn to Justin Ruben and became President of MoveOn’s board; he's now a senior fellow at the Roosevelt Institute.

His book The Filter Bubble is set for release May 12, 2011. In it, he asks how modern search tools -- the filter by which many of us see the wider world -- are getting better and better at screening the wider world from us, by returning only the search results they "think" we want to see.

TED2011

Eli Pariser: Beware online "filter bubbles"


5,309,238 views

As web companies strive to tailor their services (including news and search results) to our personal tastes, there's a dangerous unintended consequence: we get trapped inside a "filter bubble" and aren't exposed to information that could challenge or broaden our worldview. Eli Pariser argues passionately that this will ultimately prove to be bad for us and bad for democracy.


00:15
Mark Zuckerberg,
00:17
a journalist was asking him a question about the news feed.
00:20
And the journalist was asking him,
00:22
"Why is this so important?"
00:24
And Zuckerberg said,
00:26
"A squirrel dying in your front yard
00:28
may be more relevant to your interests right now
00:31
than people dying in Africa."
00:34
And I want to talk about
00:36
what a Web based on that idea of relevance might look like.
00:40
So when I was growing up
00:42
in a really rural area in Maine,
00:44
the Internet meant something very different to me.
00:47
It meant a connection to the world.
00:49
It meant something that would connect us all together.
00:52
And I was sure that it was going to be great for democracy
00:55
and for our society.
00:58
But there's this shift
01:00
in how information is flowing online,
01:02
and it's invisible.
01:05
And if we don't pay attention to it,
01:07
it could be a real problem.
01:10
So I first noticed this in a place I spend a lot of time --
01:13
my Facebook page.
01:15
I'm progressive, politically -- big surprise --
01:18
but I've always gone out of my way to meet conservatives.
01:20
I like hearing what they're thinking about;
01:22
I like seeing what they link to;
01:24
I like learning a thing or two.
01:26
And so I was surprised when I noticed one day
01:29
that the conservatives had disappeared from my Facebook feed.
01:33
And what it turned out was going on
01:35
was that Facebook was looking at which links I clicked on,
01:39
and it was noticing that, actually,
01:41
I was clicking more on my liberal friends' links
01:43
than on my conservative friends' links.
01:46
And without consulting me about it,
01:48
it had edited them out.
01:50
They disappeared.
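
[Editor's note: a minimal Python sketch of the kind of click-driven filtering Pariser describes above. It is a toy illustration, not Facebook's actual system; every name, field, and threshold in it is invented.]

    from collections import defaultdict

    # Hypothetical click-driven feed filter (not Facebook's real algorithm).
    # Each click on a friend's link raises that friend's score; sources
    # that fall below a cutoff are silently dropped from the feed.
    click_counts = defaultdict(int)

    def record_click(friend):
        click_counts[friend] += 1

    def visible_feed(posts, min_clicks=3):
        # The user is never shown this rule, or what it filters out.
        return [p for p in posts if click_counts[p["friend"]] >= min_clicks]

    # Clicks concentrated on like-minded friends...
    for _ in range(5):
        record_click("liberal_friend")
    record_click("conservative_friend")

    posts = [{"friend": "liberal_friend", "link": "story_a"},
             {"friend": "conservative_friend", "link": "story_b"}]
    print(visible_feed(posts))  # ...and story_b quietly disappears
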
01:54
So Facebook isn't the only place
01:56
that's doing this kind of invisible, algorithmic
01:58
editing of the Web.
02:01
Google's doing it too.
02:03
If I search for something, and you search for something,
02:06
even right now at the very same time,
02:08
we may get very different search results.
02:11
Even if you're logged out, one engineer told me,
02:14
there are 57 signals
02:16
that Google looks at --
02:19
everything from what kind of computer you're on
02:22
to what kind of browser you're using
02:24
to where you're located --
02:26
that it uses to personally tailor your query results.
02:29
Think about it for a second:
02:31
there is no standard Google anymore.
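
[Editor's note: a hypothetical sketch of signal-based personalization. Google's real ranking and its 57 signals are not public; the three signals below (click history, location, device) are stand-ins drawn from the talk's own examples.]

    # Hypothetical multi-signal personalization (illustrative only).
    def personalized_rank(results, user):
        def score(r):
            s = r["base_relevance"]
            if r["topic"] in user["click_history"]:
                s += 0.5          # past-clicks signal
            if r["region"] == user["location"]:
                s += 0.3          # location signal
            if r["mobile_friendly"] and user["device"] == "mobile":
                s += 0.1          # device signal
            return s
        return sorted(results, key=score, reverse=True)

    results = [
        {"title": "Egypt protests: latest news", "topic": "news",
         "region": "global", "base_relevance": 1.0, "mobile_friendly": True},
        {"title": "Travel guide: holidays in Egypt", "topic": "travel",
         "region": "global", "base_relevance": 1.0, "mobile_friendly": True},
    ]
    scott = {"location": "us", "click_history": {"news"}, "device": "desktop"}
    daniel = {"location": "us", "click_history": {"travel"}, "device": "mobile"}

    # Same query, same moment, different front pages.
    print(personalized_rank(results, scott)[0]["title"])   # the protests
    print(personalized_rank(results, daniel)[0]["title"])  # the holidays
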
02:35
And you know, the funny thing about this is that it's hard to see.
02:38
You can't see how different your search results are
02:40
from anyone else's.
02:42
But a couple of weeks ago,
02:44
I asked a bunch of friends to Google "Egypt"
02:47
and to send me screen shots of what they got.
02:50
So here's my friend Scott's screen shot.
02:54
And here's my friend Daniel's screen shot.
02:57
When you put them side-by-side,
02:59
you don't even have to read the links
03:01
to see how different these two pages are.
03:03
But when you do read the links,
03:05
it's really quite remarkable.
03:09
Daniel didn't get anything about the protests in Egypt at all
03:12
in his first page of Google results.
03:14
Scott's results were full of them.
03:16
And this was the big story of the day at that time.
03:18
That's how different these results are becoming.
03:21
So it's not just Google and Facebook either.
03:24
This is something that's sweeping the Web.
03:26
There are a whole host of companies that are doing this kind of personalization.
03:29
Yahoo News, the biggest news site on the Internet,
03:32
is now personalized -- different people get different things.
03:36
Huffington Post, the Washington Post, the New York Times --
03:39
all flirting with personalization in various ways.
03:42
And this moves us very quickly
03:45
toward a world in which
03:47
the Internet is showing us what it thinks we want to see,
03:51
but not necessarily what we need to see.
03:54
As Eric Schmidt said,
03:57
"It will be very hard for people to watch or consume something
04:00
that has not in some sense
04:02
been tailored for them."
04:05
So I do think this is a problem.
04:07
And I think, if you take all of these filters together,
04:10
you take all these algorithms,
04:12
you get what I call a filter bubble.
04:16
And your filter bubble is your own personal,
04:19
unique universe of information
04:21
that you live in online.
04:23
And what's in your filter bubble
04:26
depends on who you are, and it depends on what you do.
04:29
But the thing is that you don't decide what gets in.
04:33
And more importantly,
04:35
you don't actually see what gets edited out.
04:38
So one of the problems with the filter bubble
04:40
was discovered by some researchers at Netflix.
04:43
And they were looking at the Netflix queues, and they noticed something kind of funny
04:46
that a lot of us probably have noticed,
04:48
which is there are some movies
04:50
that just sort of zip right up and out to our houses.
04:53
They enter the queue, they just zip right out.
04:56
So "Iron Man" zips right out,
04:58
and "Waiting for Superman"
05:00
can wait for a really long time.
05:02
What they discovered
05:04
was that in our Netflix queues
05:06
there's this epic struggle going on
05:09
between our future aspirational selves
05:12
and our more impulsive present selves.
05:15
You know we all want to be someone
05:17
who has watched "Rashomon,"
05:19
but right now
05:21
we want to watch "Ace Ventura" for the fourth time.
05:24
(Laughter)
05:27
So the best editing gives us a bit of both.
05:29
It gives us a little bit of Justin Bieber
05:31
and a little bit of Afghanistan.
05:33
It gives us some information vegetables;
05:35
it gives us some information dessert.
05:38
And the challenge with these kinds of algorithmic filters,
05:40
these personalized filters,
05:42
is that, because they're mainly looking
05:44
at what you click on first,
05:48
it can throw off that balance.
05:52
And instead of a balanced information diet,
05:55
you can end up surrounded
05:57
by information junk food.
05:59
What this suggests
06:01
is actually that we may have the story about the Internet wrong.
06:04
In a broadcast society --
06:06
this is how the founding mythology goes --
06:08
in a broadcast society,
06:10
there were these gatekeepers, the editors,
06:12
and they controlled the flows of information.
06:15
And along came the Internet and it swept them out of the way,
06:18
and it allowed all of us to connect together,
06:20
and it was awesome.
06:22
But that's not actually what's happening right now.
06:26
What we're seeing is more of a passing of the torch
06:29
from human gatekeepers
06:31
to algorithmic ones.
06:34
And the thing is that the algorithms
06:37
don't yet have the kind of embedded ethics
06:40
that the editors did.
06:43
So if algorithms are going to curate the world for us,
06:46
if they're going to decide what we get to see and what we don't get to see,
06:49
then we need to make sure
06:51
that they're not just keyed to relevance.
06:54
We need to make sure that they also show us things
06:56
that are uncomfortable or challenging or important --
06:59
this is what TED does --
07:01
other points of view.
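
[Editor's note: a hypothetical sketch of what "not just keyed to relevance" could mean in code: reserving a fixed share of each feed for items outside the user's usual interests. The function and its parameters are invented for illustration.]

    import random

    # Hypothetical balanced ranker: relevance fills most slots, but a
    # fixed share is reserved for items outside the user's usual
    # interests -- the "information vegetables" alongside the dessert.
    def balanced_feed(items, user_interests, slots=10, explore_share=0.3):
        inside = [i for i in items if i["topic"] in user_interests]
        outside = [i for i in items if i["topic"] not in user_interests]
        n_explore = min(int(slots * explore_share), len(outside))
        feed = sorted(inside, key=lambda i: i["relevance"],
                      reverse=True)[:slots - n_explore]
        feed += random.sample(outside, n_explore)
        random.shuffle(feed)  # interleave the challenging items
        return feed
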
07:03
And the thing is, we've actually been here before
07:05
as a society.
07:08
In 1915, it's not like newspapers were sweating a lot
07:11
about their civic responsibilities.
07:14
Then people noticed
07:16
that they were doing something really important.
07:19
That, in fact, you couldn't have
07:21
a functioning democracy
07:23
if citizens didn't get a good flow of information,
07:28
that the newspapers were critical because they were acting as the filter,
07:31
and then journalistic ethics developed.
07:33
It wasn't perfect,
07:35
but it got us through the last century.
07:38
And so now,
07:40
we're kind of back in 1915 on the Web.
07:44
And we need the new gatekeepers
07:47
to encode that kind of responsibility
07:49
into the code that they're writing.
07:51
I know that there are a lot of people here from Facebook and from Google --
07:54
Larry and Sergey --
07:56
people who have helped build the Web as it is,
07:58
and I'm grateful for that.
08:00
But we really need you to make sure
08:03
that these algorithms have encoded in them
08:06
a sense of the public life, a sense of civic responsibility.
08:09
We need you to make sure that they're transparent enough
08:12
that we can see what the rules are
08:14
that determine what gets through our filters.
08:17
And we need you to give us some control
08:19
so that we can decide
08:21
what gets through and what doesn't.
08:24
Because I think
08:26
we really need the Internet to be that thing
08:28
that we all dreamed of it being.
08:30
We need it to connect us all together.
08:33
We need it to introduce us to new ideas
08:36
and new people and different perspectives.
08:40
And it's not going to do that
08:42
if it leaves us all isolated in a Web of one.
08:45
Thank you.
08:47
(Applause)

