ABOUT THE SPEAKER
Sebastián Bortnik - Information security specialist
Sebastián Bortnik's work is focused on preventing cyber attacks.

Why you should listen

Sebastián Bortnik is an information security specialist with more than ten years of experience in the field. He has worked on research and education about cyber attacks since the beginning of his career. He is currently head of research at Onapsis, a company dedicated to protecting ERP and business-critical applications at large companies. Previously, he led a research and technology team for a Latin American antivirus company, focusing on malware analysis and cyber attack research.

Bortnik is a founding member of Argentina Cibersegura, a non-profit organization focused on raising community awareness about preventing cyber attacks. As its leader, he has helped build a network of more than 200 volunteers around the country who teach thousands of kids every year how to use technology more safely.

Bortnik has written numerous articles and delivered talks at conferences around the world. He is passionate about education and believes that everyone needs to understand the impact technology has on our lives and safety.

(Photo: Monstruo Estudio)

TEDxRiodelaPlata

Sebastián Bortnik: The conversation we're not having about digital child abuse

518,275 views

We need to talk to kids about the risks they face online, says information security expert Sebastián Bortnik. In this talk, Bortnik discusses the issue of "grooming" -- the sexual predation of children by adults on the internet -- and outlines the conversations we need to start having about technology to keep our kids safe. (In Spanish with English subtitles)


00:12 [This talk contains graphic content. Viewer discretion is advised.]
00:16 This is Nina Rodríguez's Facebook profile.
00:21 This person had three different profiles
00:23 and 890 kids between 8 and 13 years old on her friends list.
00:30 These are excerpts of a chat with one of those kids.
00:37 This is an exact copy of the chat.
00:40 It's part of the case file.
00:44 This kid started sending private photos
00:48 until his family realized what was going on.
00:51 The police report and subsequent investigation led them to a house.
00:57 This was the girl's bedroom.
01:02 Nina Rodríguez was actually a 24-year-old man
01:07 who used to do this with lots of kids.
01:13 Micaela Ortega was 12 years old
01:16 when she went to meet her new Facebook friend,
01:19 also 12.
01:21 "Rochi de River" was her name.
01:26 She actually met Jonathan Luna, who was 26 years old.
01:30 When they finally caught him,
01:32 he confessed that he killed the girl because she refused to have sex with him.
01:38 He had four Facebook profiles
01:41 and 1,700 women on his contact list;
01:47 90 percent of them were under 13 years old.
01:52 These are two different cases of "grooming":
01:56 an adult contacts a kid through the internet,
02:00 and through manipulation or lying, leads that kid into sexual territory --
02:04 from talking about sex
02:06 to sharing private photos,
02:08 recording the kid using a webcam
02:10 or arranging an in-person meeting.
02:14 This is grooming.
02:16 This is happening, and it's on the rise.
02:21 The question is: What are we going to do?
02:24 Because, in the meantime, kids are alone.
02:28 They finish dinner, go to their rooms,
02:30 close the door,
02:32 get on their computer, their cell phones,
02:35 and get into a bar,
02:38 into a club.
02:42 Think for one second about what I've just said:
02:46 they're in a place full of strangers
02:49 in an uninhibited environment.
02:53 The internet broke physical boundaries.
02:56 When we're alone in our bedroom and we go online,
03:00 we're not really alone.
03:05 There are at least two reasons why we're not taking care of this,
03:10 or at least not in the right way.
03:13 First, we're sure that everything that happens online is "virtual."
03:18 In fact, we call it "the virtual world."
03:23 If you look it up in the dictionary,
03:25 something virtual is something that seems to exist
03:28 but is not real.
03:31 And we use that word to talk about the internet:
03:35 something not real.
03:38 And that's the problem with grooming.
03:41 It is real.
03:43 Degenerate, perverted adults use the internet to abuse boys and girls
03:49 and take advantage of, among other things,
03:51 the fact that the kids and their parents think that what happens online
03:55 doesn't actually happen.
03:59 Several years ago, some colleagues and I founded an NGO
04:03 called "Argentina Cibersegura,"
04:05 dedicated to raising awareness about online safety.
04:10 In 2013, we attended meetings at the House of Legislature
04:14 to discuss a law about grooming.
04:19 I remember that a lot of people thought
04:21 that grooming was strictly a precursor
04:24 to arranging an in-person meeting with a kid to have sex with them.
04:30 But they didn't think about what happened to the kids who were exposed
04:33 by talking about sex with an adult without knowing it,
04:37 or who shared intimate photos thinking only another kid would see them,
04:41 or even worse,
04:43 who had exposed themselves using their webcam.
04:46 Nobody considered that rape.
04:51 I'm sure lots of you find it odd to think one person can abuse another
04:55 without physical contact.
04:57 We're programmed to think that way.
05:01 I know, because I used to think that way.
05:03 I was just an IT security guy
05:07 until this happened to me.
05:11 At the end of 2011,
05:14 in a little town in Buenos Aires Province,
05:16 I heard about a case for the first time.
05:20 After giving a talk,
05:23 I met the parents of an 11-year-old girl who had been a victim of grooming.
05:30 A man had manipulated her into masturbating in front of her webcam,
05:34 and recorded it.
05:36 And the video was on several websites.
05:40 That day, her parents asked us, in tears,
05:43 to tell them the magic formula
05:45 for how to delete those videos from the internet.
05:50 It broke my heart and changed me forever
05:53 to be their last disappointment, telling them it was too late:
05:57 once content is online,
06:00 we've already lost control.
06:04 Since that day, I think about that girl
06:09 waking up in the morning, having breakfast with her family,
06:12 who had seen the video,
06:15 and then walking to school, meeting people that had seen her naked,
06:20 arriving at school, playing with her friends, who had also seen her.
06:26 That was her life.
06:29 Exposed.
06:33 Of course, nobody raped her body.
06:37 But hadn't her sexuality been abused?
06:42 We clearly use different standards to measure physical and digital things.
06:49 And we get angry at social networks
06:51 because being angry with ourselves is more painful and more true.
06:56 And this brings us to the second reason why
06:58 we aren't paying proper attention to this issue.
07:02 We're convinced that kids don't need our help,
07:06 that they "know everything" about technology.
07:12 When I was a kid,
07:14 at one point, my parents started letting me walk to school alone.
07:19 After years of taking me by the hand and walking me to school,
07:24 one day they sat me down,
07:26 gave me the house keys
07:27 and said, "Be very careful with these; don't give them to anyone,
07:31 take the route we showed you, be at home at the time we said,
07:36 cross at the corner, and look both ways before you cross,
07:38 and no matter what, don't talk to strangers."
07:44 I knew everything about walking,
07:47 and yet, there was a responsible adult there taking care of me.
07:52 Knowing how to do something is one thing,
07:54 knowing how to take care of yourself is another.
07:58 Imagine this situation:
08:00 I'm 10 or 11 years old, I wake up in the morning,
08:03 my parents toss me the keys and say,
08:05 "Seba, now you can walk to school alone."
08:08 And when I come back late,
08:11 they say, "No, you need to be home at the time we said."
08:16 And two weeks later,
08:18 when it comes up, they say, "You know what?
08:21 You have to cross at the corner, and look both ways before crossing."
08:26 And two years later, they say,
08:29 "And also, don't talk to strangers."
08:34 It sounds absurd, right?
08:38 We have the same absurd behavior in relation to technology.
08:42 We give kids total access
08:44 and we see if one day, sooner or later,
08:47 they learn how to take care of themselves.
08:50 Knowing how to do something is one thing,
08:52 knowing how to take care of yourself is another.
08:56 Along those same lines, when we talk to parents,
08:59 they often say they don't care about technology and social networks.
09:06 I always respond by asking if they care about their kids.
09:09 As adults, being interested or not in technology
09:12 is the same as being interested or not in our kids.
09:15 The internet is part of their lives.
09:18 Technology forces us to rethink the relationship between adults and kids.
09:24 Education was always based on two main concepts:
09:28 experience and knowledge.
09:32 How do we teach our kids to be safe online when we don't have either?
09:40 Nowadays, we adults have to guide our children
09:42 through what is often for us unfamiliar territory --
09:45 territory much more inviting for them.
09:49 It's impossible to find an answer
09:52 without doing new things -- things that make us uncomfortable,
09:55 things we're not used to.
09:59 A lot of you may think it's easy for me,
10:02 because I'm relatively young.
10:04 And it used to be that way.
10:07 Used to.
10:10 Until last year,
10:12 when I felt the weight of my age on my shoulders
10:17 the first time I opened Snapchat.
10:22 (Laughter)
10:26 (Applause)
10:31 I didn't understand a thing!
10:35 I found it unnecessary,
10:37 useless, hard to understand;
10:40 it looked like a camera!
10:42 It didn't have menu options!
10:46 It was the first time I felt the gap
10:49 that sometimes exists between kids and adults.
10:53 But it was also an opportunity to do the right thing,
10:57 to leave my comfort zone, to force myself.
11:00 I never thought I'd ever use Snapchat,
11:04 but then I asked my teenage cousin to show me how to use it.
11:10 I also asked why she used it.
11:12 What was fun about it?
11:15 We had a really nice talk.
11:17 She showed me her Snapchat, she told me things,
11:20 we got closer, we laughed.
11:24 Today, I use it.
11:26 (Laughter)
11:28 I don't know if I do it right,
11:30 but the most important thing is that I know it and I understand it.
11:36 The key was to overcome the initial shock
11:40 and do something new.
11:43 Something new.
11:45 Today, we have the chance to create new conversations.
11:49 What's the last app you downloaded?
11:51 Which social network do you use to contact your friends?
11:55 What kind of information do you share?
11:58 Have you ever been approached by strangers?
12:02 Could we have these conversations between kids and adults?
12:07 We have to force ourselves to do it. All of us.
12:10 Today, lots of kids are listening to us.
12:16 Sometimes when we go to schools to give our talks,
12:19 or through social networks,
12:21 kids ask or tell us things
12:23 they haven't told their parents or their teachers.
12:27 They tell us -- they don't even know us.
12:32 Those kids need to know
12:36 what the risks of being online are,
12:40 how to take care of themselves,
12:41 but also that, fundamentally, as with almost everything else,
12:45 kids can learn this from any adult.
12:51 Online safety needs to be a conversation topic
12:55 in every house and every classroom in the country.
12:59 We did a survey this year that showed that 15 percent of schools said
13:04 they knew of cases of grooming at their school.
13:07 And this number is growing.
13:11 Technology changed every aspect of our lives,
13:15 including the risks we face
13:18 and how we take care of ourselves.
13:20 Grooming shows us this in the most painful way:
13:25 by involving our kids.
13:28 Are we going to do something to avoid this?
13:32 The solution starts with something as easy as:
13:36 talking about it.
13:37 Thank you.
13:38 (Applause)
Translated by Romina Pol
Reviewed by Sebastian Betti
