ABOUT THE SPEAKER
Eli Pariser - Organizer and author
Pioneering online organizer Eli Pariser is the author of "The Filter Bubble," about how personalized search might be narrowing our worldview.

Why you should listen

Shortly after the September 11, 2001, attacks, Eli Pariser created a website calling for a multilateral approach to fighting terrorism. In the following weeks, over half a million people from 192 countries signed on, and Pariser rather unexpectedly became an online organizer. The website merged with MoveOn.org in November 2001, and Pariser -- then 20 years old -- joined the group to direct its foreign policy campaigns. He led what the New York Times Magazine called the "mainstream arm of the peace movement" -- tripling MoveOn's member base and demonstrating how large numbers of small donations could be mobilized through online engagement.

In 2004, Pariser became executive director of MoveOn. Under his leadership, MoveOn.org Political Action has grown to 5 million members and raised over $120 million from millions of small donors to support advocacy campaigns and political candidates. Pariser focused MoveOn on online-to-offline organizing, developing phone-banking tools and precinct programs in 2004 and 2006 that laid the groundwork for Barack Obama's extraordinary web-powered campaign. In 2008, Pariser transitioned the Executive Director role at MoveOn to Justin Ruben and became President of MoveOn’s board; he's now a senior fellow at the Roosevelt Institute.

His book The Filter Bubble is set for release May 12, 2011. In it, he asks how modern search tools -- the filter by which many of us see the wider world -- are getting better and better at screening that wider world from us, by returning only the search results they "think" we want to see.

TED2011

Eli Pariser: Beware online "filter bubbles"


Filmed:
5,309,238 views

As web companies strive to tailor their services (including news and search results) to our personal tastes, there's a dangerous unintended consequence: we get trapped in a "filter bubble" and don't get exposed to information that could challenge or broaden our worldview. Eli Pariser argues passionately that this will ultimately prove to be bad for us and bad for democracy.


00:15
Mark Zuckerberg, a journalist was asking him a question about the news feed. And the journalist was asking him, "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a Web based on that idea of relevance might look like.
00:40
So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society. But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem.
01:10
So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed. And what it turned out was going on was that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.
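To make the mechanism Pariser describes concrete, here is a minimal Python sketch of click-based feed filtering. Everything in it -- the threshold, the names, the data shapes -- is invented for illustration; Facebook's actual algorithm has never been published.

```python
def filter_feed(posts, click_history, min_click_rate=0.1):
    """Drop posts from friends whose links you rarely click.

    posts: list of (friend, link) pairs awaiting display.
    click_history: friend -> (clicks, impressions).
    A hypothetical stand-in for the behavior Pariser observed.
    """
    def click_rate(friend):
        clicks, impressions = click_history.get(friend, (0, 0))
        return clicks / impressions if impressions else 0.0

    # The user is never consulted, and never sees what was edited out.
    return [(friend, link) for friend, link in posts
            if click_rate(friend) >= min_click_rate]


history = {"liberal_friend": (9, 10),       # clicked 9 of 10 links shown
           "conservative_friend": (1, 20)}  # clicked 1 of 20 links shown
feed = [("liberal_friend", "example.com/a"),
        ("conservative_friend", "example.com/b")]
print(filter_feed(feed, history))
# -> only the liberal friend's post survives; the other quietly disappears
```

The point of the sketch is the silence of the last step: nothing in the output marks that a filter ran at all.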
01:54
So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.
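The 57 signals themselves were never made public, so any concrete example has to be guessed at. The sketch below illustrates only the general shape of signal-based personalization -- per-user attributes nudging a ranking -- with signal names and weights invented for the purpose:

```python
def personalize(results, signals, boosts):
    """Re-rank search results using a user's signals.

    results: result name -> base relevance score.
    signals: the user's attributes (device, browser, location, ...).
    boosts:  (signal key, signal value, result name) -> score bonus.
    Purely illustrative; real ranking systems are far more involved.
    """
    def score(item):
        name, base = item
        bonus = sum(boosts.get((key, value, name), 0.0)
                    for key, value in signals.items())
        return base + bonus

    return sorted(results.items(), key=score, reverse=True)


results = {"egypt-protests": 0.80, "egypt-travel": 0.78}
boosts = {("location", "vacation-town", "egypt-travel"): 0.10}

# Two users, same query, same moment -- different front pages.
print(personalize(results, {"location": "newsroom"}, boosts))
print(personalize(results, {"location": "vacation-town"}, boosts))
```

A tiny per-user nudge is enough to reorder the page, which is exactly why no two people see the "same" Google.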
02:35
And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's. But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screen shots of what they got. So here's my friend Scott's screen shot. And here's my friend Daniel's screen shot. When you put them side by side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.
03:21
So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see. As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."
04:05
So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.
04:38
So one of the problems with the filter bubble was discovered by some researchers at Netflix. They were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is that there are some movies that just sort of zip right up and out to our houses. They enter the queue, they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time. What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know, we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time. (Laughter)
05:27
So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert. And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, they can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
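A toy simulation shows how quickly click-driven feedback can tip that balance. Nothing here models any real recommender; the probabilities and the reinforcement step are assumptions chosen only to expose the loop:

```python
import random

random.seed(0)
weights = {"vegetables": 1.0, "dessert": 1.0}     # a balanced starting diet
CLICK_PROB = {"vegetables": 0.2, "dessert": 0.8}  # the impulsive present self

for _ in range(1000):
    # The filter shows items in proportion to their current weight...
    shown = random.choices(list(weights), weights=list(weights.values()))[0]
    # ...and whatever gets clicked is shown more often next time.
    if random.random() < CLICK_PROB[shown]:
        weights[shown] += 0.1

total = sum(weights.values())
for kind, w in sorted(weights.items()):
    print(f"{kind}: {w / total:.0%} of the feed")
# dessert ends up dominating even though the diet began perfectly balanced
```

The filter never decides to hide the vegetables; the skew falls out of optimizing on first clicks alone.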
05:59
What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did. So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view.
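What would "not just keyed to relevance" look like in code? One hedged possibility, with made-up weights and fields, is a scoring function where importance and unfamiliar viewpoints earn a seat alongside raw relevance:

```python
def curate(items, seen_viewpoints, top_k=2):
    """Rank by relevance plus importance plus viewpoint novelty.

    The 0.5 / 0.2 / 0.3 weights are arbitrary illustration values; the
    point is only that relevance is one input among several, not the
    whole score.
    """
    def score(item):
        novelty = 0.0 if item["viewpoint"] in seen_viewpoints else 0.3
        return 0.5 * item["relevance"] + 0.2 * item["importance"] + novelty

    return sorted(items, key=score, reverse=True)[:top_k]


items = [
    {"title": "squirrel dying in your front yard",
     "relevance": 0.9, "importance": 0.1, "viewpoint": "local"},
    {"title": "people dying in Africa",
     "relevance": 0.4, "importance": 0.9, "viewpoint": "world"},
    {"title": "op-ed from the other side",
     "relevance": 0.5, "importance": 0.6, "viewpoint": "opposing"},
]
print(curate(items, seen_viewpoints={"local"}))
# the squirrel no longer automatically beats the important and the unfamiliar
```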
07:03
And the thing is, we've actually been here before as a society. In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important -- that, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information; that the newspapers were critical because they were acting as the filter; and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing.
07:51
I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn't.
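Those two asks -- rules we can see, and a switch we control -- can be sketched in miniature. The class, its rule names, and its override mechanism below are all hypothetical; no platform exposed its filter this way at the time:

```python
class TransparentFilter:
    """A filter whose rules are inspectable data and whose user wins."""

    def __init__(self):
        self.rules = {"low_click_rate": "drop posts you rarely click",
                      "far_viewpoint": "drop posts unlike your history"}
        self.disabled_by_user = set()  # rules the user has switched off

    def explain(self):
        # "Transparent enough that we can see what the rules are."
        return [f"{name}: {why}" for name, why in self.rules.items()]

    def allow(self, post_flags):
        # "Give us some control so that we can decide what gets through."
        active = set(self.rules) - self.disabled_by_user
        return not (post_flags & active)


f = TransparentFilter()
print(f.explain())
print(f.allow({"low_click_rate"}))       # False: the rule filters it out
f.disabled_by_user.add("low_click_rate")
print(f.allow({"low_click_rate"}))       # True: the user's choice wins
```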
08:24
Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.

08:45
Thank you.

(Applause)
Translated by Matija Stepic
Reviewed by Tilen Pigac - EFZG




Data provided by TED.
