TEDxCERN
Hans Block and Moritz Riesewieck: The price of a "clean" internet
1,337,069 views
Millions of images and videos are uploaded to the internet each day, yet we rarely see shocking and disturbing content in our social media feeds. Who's keeping the internet "clean" for us? In this eye-opening talk, documentarians Hans Block and Moritz Riesewieck take us inside the shadowy world of online content moderators -- the people contracted by major platforms like Facebook, Twitter and Google to rid the internet of toxic material. Learn more about the psychological impact of this kind of work -- and how "digital cleaning" influences what all of us see and think.
Hans Block - Filmmaker, theater director, musician
Under the label Laokoon, Hans Block develops films, theatre productions, essays, lecture performances and radio plays that deal with the question of how our idea of humans and society changes or can be transformed in the digital era.
Moritz Riesewieck - Author, scriptwriter, theater and film director
Under the label Laokoon, Moritz Riesewieck develops films, theatre productions, essays, lecture performances and radio plays that deal with the question of how our idea of humans and society changes or can be transformed in the digital era.
00:12
[This talk contains mature content]

00:16
Moritz Riesewieck: On March 23, 2013, users worldwide discovered in their news feed a video of a young girl being raped by an older man. Before this video was removed from Facebook, it had already been shared 16,000 times, and it was even liked 4,000 times. This video went viral and infected the net.

00:49
Hans Block: And that was the moment we asked ourselves: How could something like this get on Facebook? And at the same time, why don't we see such content more often? After all, there's a lot of revolting material online, but why do we so rarely see such crap on Facebook, Twitter or Google?
01:08
MR: While image-recognition software can identify the outlines of sexual organs, blood or naked skin in images and videos, it has immense difficulty distinguishing pornographic content from holiday pictures, Adonis statues or breast-cancer screening campaigns. It can't distinguish Romeo and Juliet dying onstage from a real knife attack. It can't distinguish satire from propaganda, or irony from hatred, and so on and so forth.

01:50
Therefore, humans are needed to decide which of the suspicious content should be deleted and which should remain. Humans whom we know almost nothing about, because they work in secret. They sign nondisclosure agreements, which prohibit them from talking about and sharing what they see on their screens and what this work does to them. They are forced to use code words in order to hide who they work for. They are monitored by private security firms to ensure that they don't talk to journalists. And they are threatened with fines in case they speak. All of this sounds like a weird crime story, but it's true. These people exist, and they are called content moderators.
02:42
HB: We are the directors of the feature documentary film "The Cleaners," and we would like to take you to a world that many of you may not know yet. Here's a short clip of our film.

02:58
(Music)
03:04
(Video) Moderator: I need to be anonymous, because we have a contract signed. We are not allowed to declare whom we are working with. The reason why I speak to you is because the world should know that we are here. There is somebody who is checking the social media. We are doing our best to make this platform safe for all of them.
03:42
Delete.
Ignore.
Delete.
Ignore.
Delete.
Ignore.
Ignore.
Delete.
03:58
HB: The so-called content moderators don't get their paychecks from Facebook, Twitter or Google themselves, but from outsourcing firms around the world, in order to keep the wages low. Tens of thousands of young people looking at everything we are not supposed to see. And we are talking about decapitations, mutilations, executions, necrophilia, torture, child abuse. Thousands of images in one shift -- ignore, delete, day and night.

04:27
And much of this work is done in Manila, where the analog toxic waste from the Western world was transported for years by container ships; now the digital waste is dumped there via fiber-optic cable. And just as the so-called scavengers rummage through gigantic tips on the edge of the city, the content moderators click their way through an endless toxic ocean of images and videos and all manner of intellectual garbage, so that we don't have to look at it.
04:58
MR: But unlike the wounds of the scavengers, those of the content moderators remain invisible. Full of shocking and disturbing content, these pictures and videos burrow into their memories, where, at any time, they can have unpredictable effects: eating disorders, loss of libido, anxiety disorders, alcoholism, depression, which can even lead to suicide. The pictures and videos infect them, and often never let them go again. If they are unlucky, they develop post-traumatic stress disorder, like soldiers after war missions.

05:39
In our film, we tell the story of a young man who had to monitor livestreams of self-mutilations and suicide attempts, again and again, and who eventually committed suicide himself. It's not an isolated case, as we've been told. This is the price all of us pay for our so-called clean and safe and "healthy" environments on social media.
06:10
Never before in the history of mankind has it been easier to reach millions of people around the globe in a few seconds. What is posted on social media spreads so quickly, goes viral and excites the minds of people all around the globe. Before it is deleted, it is often already too late. Millions of people have already been infected with hatred and anger, and they either become active online, by spreading or amplifying hatred, or they take to the streets and take up arms.
06:45
HB: Therefore, an army of content moderators sits in front of a screen to avoid new collateral damage. And they are deciding, as soon as possible, whether the content stays on the platform -- ignore; or disappears -- delete. But not every decision is as clear as the decision about a child-abuse video. What about controversial content, ambivalent content, uploaded by civil rights activists or citizen journalists? The content moderators often decide on such cases at the same speed as the [clear] cases.
07:21
MR: We will show you a video now, and we would like to ask you to decide: Would you delete it, or would you not delete it?

07:31
(Video) (Air strike sounds)
(Explosion)
(People speaking in Arabic)
07:46
MR: Yeah, we did some blurring for you. A child would potentially be dangerously disturbed and extremely frightened by such content. So, would you rather delete it? But what if this video could help investigate the war crimes in Syria? What if nobody had heard about this air strike, because Facebook, YouTube and Twitter had decided to take it down?
08:12
Airwars, a nongovernmental organization based in London, tries to find those videos as quickly as possible whenever they are uploaded to social media, in order to archive them, because they know that, sooner or later, Facebook, YouTube and Twitter will take such content down. People armed with their mobile phones can make visible what journalists often do not have access to. Civil rights groups often do not have any better option to quickly make their recordings accessible to a large audience than by uploading them to social media.

08:47
Wasn't this the empowering potential the World Wide Web was supposed to have? Weren't these the dreams people had about the World Wide Web in its early stages? Can't pictures and videos like these persuade people who have become insensitive to facts to rethink?
09:09
HB: But instead, everything that might be disturbing is deleted. And there's a general shift in society. The media, for example, more and more often use trigger warnings at the top of articles which some people may perceive as offensive or troubling. Or more and more students at universities in the United States demand the banishment from the curriculum of ancient classics which depict sexual violence or assault. But how far should we go with that?

09:37
Physical integrity is guaranteed as a human right in constitutions worldwide. In the Charter of Fundamental Rights of the European Union, this right expressly applies to mental integrity. But even if the potentially traumatic effect of images and videos is hard to predict, do we want to become so cautious that we risk losing social awareness of injustice?
10:03
So what to do? Mark Zuckerberg recently stated that in the future, the users, we, or almost everybody, will decide individually what they would like to see on the platform, by personal filter settings. So everyone could easily claim to remain undisturbed by images of war or other violent conflicts, like ...
10:25
MR: I'm the type of guy who doesn't mind seeing breasts, and I'm very interested in global warming, but I don't like war so much.

10:37
HB: Yeah, I'm more the opposite: I have zero interest in naked breasts or naked bodies at all. But why not guns? I like guns, yes.
10:46
MR: Come on, if we don't share a similar social consciousness, how shall we discuss social problems? How shall we call people to action? Even more isolated bubbles would emerge. One of the central questions is how, in the future, freedom of expression will be weighed against people's need for protection. It's a matter of principle. Do we want to design an open or a closed society for the digital space? At the heart of the matter is "freedom versus security."
11:24
Facebook has always wanted to be a "healthy" platform. Above all, users should feel safe and secure. It's the same choice of words the content moderators in the Philippines used in a lot of our interviews.
11:40
(Video) The world that we are living in right now, I believe, is not really healthy.
(Music)
In this world, there is really an evil who exists.
(Music)
We need to watch for it.
(Music)
We need to control it -- good or bad.
(Music)

12:10
[Look up, Young man! --God]
12:14
MR: For the young content moderators in the strictly Catholic Philippines, this is linked to a Christian mission: to counter the sins of the world which spread across the web. "Cleanliness is next to godliness" is a saying everybody in the Philippines knows.
12:36
HB: And others motivate themselves by comparing themselves with their president, Rodrigo Duterte. He has been ruling the Philippines since 2016, and he won the election with the promise: "I will clean up." And what that means is eliminating all kinds of problems by literally killing people on the streets who are supposed to be criminals, whatever that means. And since he was elected, an estimated 20,000 people have been killed.

13:03
And one moderator in our film says, "What Duterte does on the streets, I do for the internet." And here they are, our self-proclaimed superheroes, who enforce law and order in our digital world. They clean up, they polish everything clean, they free us from everything evil. Tasks formerly reserved for state authorities have been taken over by college graduates in their early 20s, equipped with three- to five-day training -- this is the qualification -- who work on nothing less than the world's rescue.
13:38
MR: National sovereignties have been outsourced to private companies, and they pass on their responsibilities to third parties. It's an outsourcing of the outsourcing of the outsourcing that takes place. With social networks, we are dealing with a completely new infrastructure, with its own mechanisms, its own logic of action and, therefore, also its own new dangers, which had not yet existed in the predigitalized public sphere.
14:08
HB: When Mark Zuckerberg was at the US Congress or at the European Parliament, he was confronted with all kinds of criticism. And his reaction was always the same: "We will fix that, and I will follow up on that with my team." But such a debate shouldn't be held in the back rooms of Facebook, Twitter or Google -- it should be openly discussed in new, cosmopolitan parliaments, in new institutions that reflect the diversity of people contributing to a utopian project of a global network. And while it may seem impossible to consider the values of users worldwide, it's worth believing that there's more that connects us than separates us.
14:53
MR: Yeah, at a time when populism is gaining strength, it becomes popular to justify the symptoms, to eradicate them, to make them invisible. This ideology is spreading worldwide, analog as well as digital, and it's our duty to stop it before it's too late. The question of freedom and democracy must not have only these two options.
15:25
HB: Delete.

15:26
MR: Or ignore.

15:29
HB: Thank you very much.

15:30
(Applause)
ABOUT THE SPEAKERS
Hans Block - Filmmaker, theater director, musician
Under the label Laokoon, Hans Block develops films, theatre productions, essays, lecture performances and radio plays that deal with the question of how our idea of humans and society changes or can be transformed in the digital era.
Why you should listen
Hans Block is a German theater director, filmmaker and musician. He studied music (drums) at the University of Arts in Berlin and theater directing at Ernst Busch Academy of Dramatic Arts in Berlin. His research, together with Moritz Riesewieck, about freedom of expression in the age of social media brought them international attention. Their debut film, The Cleaners, celebrated its world premiere at the Sundance Film Festival in 2018 and has since been screened at more than 70 international film festivals, in cinemas and on TV worldwide. It was nominated for an Emmy and the German Television Award and has received numerous international awards, including the "Prix Europa" for the Best European TV documentary film 2018 and the Grimme Audience Award 2019.
Moritz Riesewieck - Author, scriptwriter, theater and film director
Under the label Laokoon, Moritz Riesewieck develops films, theatre productions, essays, lecture performances and radio plays that deal with the question of how our idea of humans and society changes or can be transformed in the digital era.
Why you should listen
Moritz Riesewieck is a German essay author, scriptwriter, theater and film director. He studied directing at Ernst Busch Academy of Dramatic Arts in Berlin and some semesters of economics as a fellow of the German Academic Scholarship Foundation. His research, together with Hans Block, about freedom of expression in the age of social media brought them international attention. Their debut film The Cleaners celebrated its world premiere at the Sundance Film Festival in 2018 and has since been screened at more than 70 international film festivals, in cinemas and on TV worldwide. It was nominated for an Emmy and the German Television Award and has received numerous international awards, including the "Prix Europa" for the Best European TV documentary film 2018 and the Grimme Audience Award 2019. His essay "Digital Dirt Work" was published by German publishing house dtv in September 2017.