ABOUT THE SPEAKER
Andrew Marantz - Writer
Andrew Marantz writes narrative journalism about politics, the internet and the way we understand our world.

Why you should listen

Since 2016, Andrew Marantz has been at work on a book about the perils of virality, the myth of linear progress and the American far right. To report the book, he spent several years embedded with some of the conspiracists, white supremacists and nihilist trolls who have become experts at using social media to advance their corrosive agendas. He also watched as some of social media's earliest and most influential founders started to reckon with the forces they'd unleashed. The book, forthcoming in October from Viking Press, is called Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation.

Marantz became a staff writer at the New Yorker in 2017. Prior to that, he worked on the magazine's editorial staff, splitting his time between writing stories (about such topics as hip-hop purism and the Truman Show delusion) and editing stories (about Las Vegas night clubs, Liberian warlords and many other things). Ultimately, Marantz's main interest lies not in any particular subject matter, but in how people form beliefs -- and under what circumstances those beliefs can change for the better.

Marantz is also a contributor to Radiolab and The New Yorker Radio Hour, and has written for Harper's, Mother Jones, the New York Times and many other outlets. He holds an undergraduate degree in religion from Brown University and a master's degree in literary nonfiction from New York University. He lives in Brooklyn with his wife, who is a criminal-justice reformer; his two-year-old son, who is an excellent dancer; and an endless supply of peanut butter.

TED2019

Andrew Marantz: Inside the bizarre world of internet trolls and propagandists

1,676,355 views

Journalist Andrew Marantz spent three years embedded in the world of internet trolls and social media propagandists, seeking out the people who are propelling fringe talking points into the heart of conversation online and trying to understand how they make their ideas spread. Go down the rabbit hole of online propaganda and misinformation -- and learn how we can start to make the internet less toxic.

00:12
I spent the past three years
00:14
talking to some of the worst people on the internet.
00:18
Now, if you've been online recently,
00:20
you may have noticed that there's a lot of toxic garbage out there:
00:24
racist memes, misogynist propaganda, viral misinformation.
00:29
So I wanted to know who was making this stuff.
00:31
I wanted to understand how they were spreading it.
00:34
Ultimately, I wanted to know
00:35
what kind of impact it might be having on our society.
00:38
So in 2016, I started tracing some of these memes back to their source,
00:42
back to the people who were making them or who were making them go viral.
00:45
I'd approach those people and say,
00:47
"Hey, I'm a journalist. Can I come watch you do what you do?"
00:50
Now, often the response would be,
00:51
"Why in hell would I want to talk to
00:53
some low-t soy-boy Brooklyn globalist Jew cuck
00:55
who's in cahoots with the Democrat Party?"
00:57
(Laughter)
00:59
To which my response would be, "Look, man, that's only 57 percent true."
01:03
(Laughter)
01:04
But often I got the opposite response.
01:06
"Yeah, sure, come on by."
01:08
So that's how I ended up in the living room
01:10
of a social media propagandist in Southern California.
01:14
He was a married white guy in his late 30s.
01:16
He had a table in front of him with a mug of coffee,
01:19
a laptop for tweeting,
01:21
a phone for texting
01:23
and an iPad for livestreaming to Periscope and YouTube.
01:27
That was it.
01:28
And yet, with those tools,
01:30
he was able to propel his fringe, noxious talking points
01:34
into the heart of the American conversation.
01:37
For example, one of the days I was there,
01:39
a bomb had just exploded in New York,
01:42
and the guy accused of planting the bomb had a Muslim-sounding name.
01:45
Now, to the propagandist in California, this seemed like an opportunity,
01:50
because one of the things he wanted
01:51
was for the US to cut off almost all immigration,
01:54
especially from Muslim-majority countries.
01:57
So he started livestreaming,
01:59
getting his followers worked up into a frenzy
02:01
about how the open borders agenda was going to kill us all
02:04
and asking them to tweet about this,
02:06
and use specific hashtags,
02:07
trying to get those hashtags trending.
02:09
And tweet they did --
02:10
hundreds and hundreds of tweets,
02:12
a lot of them featuring images like this one.
02:15
So that's George Soros.
02:17
He's a Hungarian billionaire and philanthropist,
02:19
and in the minds of some conspiracists online,
02:22
George Soros is like a globalist bogeyman,
02:24
one of a few elites who is secretly manipulating all of global affairs.
02:29
Now, just to pause here: if this idea sounds familiar to you,
02:32
that there are a few elites who control the world
02:35
and a lot of them happen to be rich Jews,
02:37
that's because it is one of the most anti-Semitic tropes in existence.
02:42
I should also mention that the guy in New York who planted that bomb,
02:45
he was an American citizen.
02:47
So whatever else was going on there,
02:50
immigration was not the main issue.
02:53
And the propagandist in California, he understood all this.
02:56
He was a well-read guy. He was actually a lawyer.
02:58
He knew the underlying facts,
03:00
but he also knew that facts do not drive conversation online.
03:03
What drives conversation online
03:05
is emotion.
03:07
See, the original premise of social media
03:09
was that it was going to bring us all together,
03:11
make the world more open and tolerant and fair ...
03:14
And it did some of that.
03:16
But the social media algorithms have never been built
03:19
to distinguish between what's true or false,
03:21
what's good or bad for society, what's prosocial and what's antisocial.
03:25
That's just not what those algorithms do.
03:28
A lot of what they do is measure engagement:
03:30
clicks, comments, shares, retweets, that kind of thing.
03:33
And if you want your content to get engagement,
03:36
it has to spark emotion,
03:38
specifically, what behavioral scientists call "high-arousal emotion."
03:42
Now, "high arousal" doesn't only mean sexual arousal,
03:44
although it's the internet, obviously that works.
03:47
It means anything, positive or negative, that gets people's hearts pumping.
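
[Editor's note: the mechanism Marantz describes here -- ranking by engagement, with no input for truth or decency -- can be made concrete with a toy sketch. This is purely illustrative, assuming invented signals and weights; it is not any platform's actual code, and every name in it is hypothetical.]

    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        clicks: int
        comments: int
        shares: int
        is_true: bool       # known to us for the demo; invisible to the ranker
        is_prosocial: bool  # likewise

    def engagement_score(post: Post) -> float:
        # A weighted sum of engagement signals (weights invented here).
        # Note what is NOT an input: post.is_true, post.is_prosocial.
        return 1.0 * post.clicks + 2.0 * post.comments + 3.0 * post.shares

    feed = [
        Post("calm, accurate explainer", 120, 4, 6, True, True),
        Post("outrage-bait conspiracy meme", 400, 90, 150, False, False),
    ]

    # The high-arousal post wins the ranking on engagement alone.
    for post in sorted(feed, key=engagement_score, reverse=True):
        print(f"{engagement_score(post):7.1f}  {post.text}")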
03:51
So I would sit with these propagandists,
03:53
not just the guy in California, but dozens of them,
03:56
and I would watch as they did this again and again successfully,
03:59
not because they were Russian hackers, not because they were tech prodigies,
04:03
not because they had unique political insights --
04:05
just because they understood how social media worked,
04:08
and they were willing to exploit it to their advantage.
04:10
Now, at first I was able to tell myself this was a fringe phenomenon,
04:14
something that was relegated to the internet.
04:16
But there's really no separation anymore between the internet and everything else.
04:20
This is an ad that ran on multiple TV stations
04:22
during the 2018 congressional elections,
04:25
alleging with very little evidence that one of the candidates
04:28
was in the pocket of international manipulator George Soros,
04:31
who is awkwardly photoshopped here next to stacks of cash.
04:35
This is a tweet from the President of the United States,
04:37
alleging, again with no evidence,
04:39
that American politics is being manipulated by George Soros.
04:43
This stuff that once seemed so shocking and marginal and, frankly, just ignorable,
04:47
it's now so normalized that we hardly even notice it.
04:50
So I spent about three years in this world.
04:52
I talked to a lot of people.
04:53
Some of them seemed to have no core beliefs at all.
04:56
They just seemed to be betting, perfectly rationally,
04:58
that if they wanted to make some money online
05:01
or get some attention online,
05:02
they should just be as outrageous as possible.
05:04
But I talked to other people who were true ideologues.
05:08
And to be clear, their ideology was not traditional conservatism.
05:12
These were people who wanted to revoke female suffrage.
05:15
These were people who wanted to go back to racial segregation.
05:18
Some of them wanted to do away with democracy altogether.
05:21
Now, obviously these people were not born believing these things.
05:24
They didn't pick them up in elementary school.
05:26
A lot of them, before they went down some internet rabbit hole,
05:29
they had been libertarian or they had been socialist
05:32
or they had been something else entirely.
05:34
So what was going on?
05:36
Well, I can't generalize about every case,
05:39
but a lot of the people I spoke to,
05:40
they seem to have a combination of a high IQ and a low EQ.
05:44
They seem to take comfort in anonymous, online spaces
05:48
rather than connecting in the real world.
05:50
So often they would retreat to these message boards
05:53
or these subreddits,
05:54
where their worst impulses would be magnified.
05:56
They might start out saying something just as a sick joke,
06:00
and then they would get so much positive reinforcement for that joke,
06:03
so many meaningless "internet points," as they called it,
06:06
that they might start believing their own joke.
06:10
I talked a lot with one young woman who grew up in New Jersey,
06:13
and then after high school, she moved to a new place
06:16
and suddenly she just felt alienated and cut off
06:18
and started retreating into her phone.
06:20
She found some of these spaces on the internet
06:23
where people would post the most shocking, heinous things.
06:25
And she found this stuff really off-putting
06:28
but also kind of engrossing,
06:30
kind of like she couldn't look away from it.
06:33
She started interacting with people in these online spaces,
06:36
and they made her feel smart, they made her feel validated.
06:38
She started feeling a sense of community,
06:40
started wondering if maybe some of these shocking memes
06:43
might actually contain a kernel of truth.
06:46
A few months later, she was in a car with some of her new internet friends
06:49
headed to Charlottesville, Virginia,
06:51
to march with torches in the name of the white race.
06:55
She'd gone, in a few months, from Obama supporter
06:57
to fully radicalized white supremacist.
07:01
Now, in her particular case,
07:03
she actually was able to find her way out of the cult of white supremacy.
07:08
But a lot of the people I spoke to were not.
07:10
And just to be clear:
07:12
I was never so convinced that I had to find common ground
07:15
with every single person I spoke to
07:17
that I was willing to say,
07:18
"You know what, man, you're a fascist propagandist, I'm not,
07:21
whatever, let's just hug it out, all our differences will melt away."
07:25
No, absolutely not.
07:28
But I did become convinced that we cannot just look away from this stuff.
07:31
We have to try to understand it, because only by understanding it
07:34
can we even start to inoculate ourselves against it.
07:39
In my three years in this world, I got a few nasty phone calls,
07:42
even some threats,
07:44
but it wasn't a fraction of what female journalists get on this beat.
07:48
And yeah, I am Jewish,
07:50
although, weirdly, a lot of the Nazis couldn't tell I was Jewish,
07:53
which I frankly just found kind of disappointing.
07:56
(Laughter)
07:58
Seriously, like, your whole job is being a professional anti-Semite.
08:03
Nothing about me is tipping you off at all?
08:05
Nothing?
08:06
(Laughter)
08:09
This is not a secret.
08:11
My name is Andrew Marantz, I write for "The New Yorker,"
08:13
my personality type is like if a Seinfeld episode
08:16
was taped at the Park Slope Food Coop.
08:18
Nothing?
08:19
(Laughter)
08:24
Anyway, look -- ultimately, it would be nice
08:27
if there were, like, a simple formula:
08:29
smartphone plus alienated kid equals 12 percent chance of Nazi.
08:33
It's obviously not that simple.
08:36
And in my writing,
08:37
I'm much more comfortable being descriptive, not prescriptive.
08:41
But this is TED,
08:43
so let's get practical.
08:45
I want to share a few suggestions
08:47
of things that citizens of the internet like you and I
08:50
might be able to do to make things a little bit less toxic.
08:54
So the first one is to be a smart skeptic.
08:57
So, I think there are two kinds of skepticism.
09:00
And I don't want to drown you in technical epistemological information here,
09:04
but I call them smart and dumb skepticism.
09:08
So, smart skepticism:
09:10
thinking for yourself,
09:11
questioning every claim,
09:13
demanding evidence --
09:14
great, that's real skepticism.
09:17
Dumb skepticism: it sounds like skepticism,
09:20
but it's actually closer to knee-jerk contrarianism.
09:23
Everyone says the earth is round,
09:25
you say it's flat.
09:26
Everyone says racism is bad,
09:28
you say, "I dunno, I'm skeptical about that."
09:31
I cannot tell you how many young white men I have spoken to in the last few years
09:35
who have said,
09:37
"You know, the media, my teachers, they're all trying to indoctrinate me
09:40
into believing in male privilege and white privilege,
09:42
but I don't know about that, man, I don't think so."
09:45
Guys -- contrarian white teens of the world --
09:48
look:
09:50
if you are being a round earth skeptic and a male privilege skeptic
09:54
and a racism is bad skeptic,
09:56
you're not being a skeptic, you're being a jerk.
09:59
(Applause)
10:04
It's great to be independent-minded, we all should be independent-minded,
10:07
but just be smart about it.
10:09
So this next one is about free speech.
10:11
You will hear smart, accomplished people who will say, "Well, I'm pro-free speech,"
10:15
and they say it in this way that it's like they're settling a debate,
10:19
when actually, that is the very beginning of any meaningful conversation.
10:23
All the interesting stuff happens after that point.
10:26
OK, you're pro-free speech. What does that mean?
10:28
Does it mean that David Duke and Richard Spencer
10:30
need to have active Twitter accounts?
10:32
Does it mean that anyone can harass anyone else online
10:35
for any reason?
10:37
You know, I looked through the entire list of TED speakers this year.
10:40
I didn't find a single round earth skeptic.
10:42
Is that a violation of free speech norms?
10:45
Look, we're all pro-free speech, it's wonderful to be pro-free speech,
10:48
but if that's all you know how to say again and again,
10:51
you're standing in the way of a more productive conversation.
10:56
Making decency cool again, so ...
10:59
Great!
11:00
(Applause)
11:02
Yeah. I don't even need to explain it.
11:04
So in my research, I would go to Reddit or YouTube or Facebook,
11:08
and I would search for "sharia law"
11:10
or I would search for "the Holocaust,"
11:12
and you might be able to guess what the algorithms showed me, right?
11:16
"Is sharia law sweeping across the United States?"
11:19
"Did the Holocaust really happen?"
11:22
Dumb skepticism.
11:24
So we've ended up in this bizarre dynamic online,
11:27
where some people see bigoted propaganda
11:29
as being edgy or being dangerous and cool,
11:32
and people see basic truth and human decency as pearl-clutching
11:35
or virtue-signaling or just boring.
11:38
And the social media algorithms, whether intentionally or not,
11:41
they have incentivized this,
11:43
because bigoted propaganda is great for engagement.
11:46
Everyone clicks on it, everyone comments on it,
11:48
whether they love it or they hate it.
11:51
So the number one thing that has to happen here
11:53
is social networks need to fix their platforms.
11:57
(Applause)
12:01
So if you're listening to my voice and you work at a social media company
12:05
or you invest in one or, I don't know, own one,
12:08
this tip is for you.
12:10
If you have been optimizing for maximum emotional engagement
12:14
and maximum emotional engagement turns out to be actively harming the world,
12:18
it's time to optimize for something else.
12:20
(Applause)
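
[Editor's note: one hypothetical way to read "optimize for something else" concretely is to keep the same ranking machinery but change the objective. The predicted_harm signal and all weights below are assumptions invented for this sketch, not features of any real platform.]

    def engagement(clicks: int, comments: int, shares: int) -> float:
        # Raw engagement, blind to truth and decency (as in the earlier sketch).
        return clicks + 2 * comments + 3 * shares

    def adjusted_score(clicks: int, comments: int, shares: int,
                       predicted_harm: float, harm_weight: float = 1200.0) -> float:
        # "Optimize for something else": the same signals, minus a penalty
        # proportional to how harmful some classifier judges the post to be.
        # predicted_harm is a made-up stand-in for such a signal, in [0, 1].
        return engagement(clicks, comments, shares) - harm_weight * predicted_harm

    # With the penalty, the outrage-bait post from the earlier sketch
    # now ranks below the calm explainer.
    print(adjusted_score(400, 90, 150, predicted_harm=0.9))   # -50.0
    print(adjusted_score(120, 4, 6, predicted_harm=0.05))     # 86.0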
12:26
But in addition to putting pressure on them to do that
12:30
and waiting for them and hoping that they'll do that,
12:33
there's some stuff that the rest of us can do, too.
12:35
So, we can create some better pathways or suggest some better pathways
12:40
for angsty teens to go down.
12:42
If you see something that you think is really creative and thoughtful
12:45
and you want to share that thing, you can share that thing,
12:48
even if it's not flooding you with high arousal emotion.
12:51
Now that is a very small step, I realize,
12:53
but in the aggregate, this stuff does matter,
12:55
because these algorithms, as powerful as they are,
12:57
they are taking their behavioral cues from us.
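
[Editor's note: "taking their behavioral cues from us" describes a feedback loop -- recommenders are trained on what users actually click and share. A minimal sketch, with all names and numbers invented for illustration:]

    from collections import Counter

    def update_preferences(prefs: Counter, interactions: list[str]) -> Counter:
        # Every click or share nudges the system toward serving more of
        # that kind of content; there is no editorial judgment here at all.
        prefs.update(interactions)
        return prefs

    prefs = Counter()
    # Feed the loop outrage, and outrage is what it learns to amplify...
    update_preferences(prefs, ["outrage", "outrage", "conspiracy"])
    # ...deliberately engage with thoughtful material, and the very same
    # mechanism amplifies that instead.
    update_preferences(prefs, ["explainer", "explainer", "explainer", "art"])
    print(prefs.most_common())  # [('explainer', 3), ('outrage', 2), ...]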
13:01
So let me leave you with this.
13:03
You know, a few years ago it was really fashionable
13:06
to say that the internet was a revolutionary tool
13:08
that was going to bring us all together.
13:11
It's now more fashionable to say
13:12
that the internet is a huge, irredeemable dumpster fire.
13:16
Neither caricature is really true.
13:18
We know the internet is just too vast and complex
13:21
to be all good or all bad.
13:22
And the danger with these ways of thinking,
13:24
whether it's the utopian view that the internet will inevitably save us
13:28
or the dystopian view that it will inevitably destroy us,
13:31
either way, we're letting ourselves off the hook.
13:35
There is nothing inevitable about our future.
13:38
The internet is made of people.
13:40
People make decisions at social media companies.
13:43
People make hashtags trend or not trend.
13:46
People make societies progress or regress.
13:51
When we internalize that fact,
13:52
we can stop waiting for some inevitable future to arrive
13:55
and actually get to work now.
13:58
You know, we've all been taught that the arc of the moral universe is long
14:02
but that it bends toward justice.
14:06
Maybe.
14:08
Maybe it will.
14:10
But that has always been an aspiration.
14:13
It is not a guarantee.
14:15
The arc doesn't bend itself.
14:18
It's not bent inevitably by some mysterious force.
14:21
The real truth,
14:23
which is scarier and also more liberating,
14:26
is that we bend it.
14:28
Thank you.
14:30
(Applause)
