ABOUT THE SPEAKER
Amy Adele Hasinoff - Communications researcher
Amy Adele Hasinoff studies gender, sexuality, privacy and consent in new media.

Why you should listen

Amy Adele Hasinoff investigates how we think about communication technologies as both the cause of and solution to social problems. She wrote a book, Sexting Panic, about the well-intentioned but problematic responses to sexting in mass media, law and education. The National Communication Association described it as "[T]he rare book that advances scholarly conversations while also promising to enrich family conversations around the dinner table."

Hasinoff is an Assistant Professor in the communication department at the University of Colorado Denver. She publishes regularly in scholarly journals and books and wrote an op-ed about sexting for the New York Times.

TEDxMileHigh

Amy Adele Hasinoff: How to practice safe sexting

1,570,366 views

Sexting, like anything that's fun, runs its risks -- but a serious violation of privacy shouldn't be one of them. Amy Adele Hasinoff looks at problematic responses to sexting in mass media, law and education, offering practical solutions for how individuals and tech companies can protect sensitive (and, ahem, potentially scandalous) digital files.


00:12
People have been using media to talk about sex for a long time. Love letters, phone sex, racy Polaroids. There's even a story of a girl who eloped with a man that she met over the telegraph in 1886.

00:30
Today we have sexting, and I am a sexting expert. Not an expert sexter. Though, I do know what this means -- I think you do too.

00:43
[it's a penis]

00:44
(Laughter)
00:48
I have been studying sexting since the media attention to it began in 2008. I wrote a book on the moral panic about sexting. And here's what I found: most people are worrying about the wrong thing. They're trying to just prevent sexting from happening entirely. But let me ask you this: As long as it's completely consensual, what's the problem with sexting? People are into all sorts of things that you may not be into, like blue cheese or cilantro.

01:19
(Laughter)

01:22
Sexting is certainly risky, like anything that's fun, but as long as you're not sending an image to someone who doesn't want to receive it, there's no harm. What I do think is a serious problem is when people share private images of others without their permission. And instead of worrying about sexting, what I think we need to do is think a lot more about digital privacy.

01:50
The key is consent.
01:53
Right now most people are thinking about sexting without really thinking about consent at all. Did you know that we currently criminalize teen sexting? It can be a crime because it counts as child pornography, if there's an image of someone under 18, and it doesn't even matter if they took that image of themselves and shared it willingly. So we end up with this bizarre legal situation where two 17-year-olds can legally have sex in most US states but they can't photograph it.

02:28
Some states have also tried passing sexting misdemeanor laws, but these laws repeat the same problem because they still make consensual sexting illegal. It doesn't make sense to try to ban all sexting to try to address privacy violations. This is kind of like saying, let's solve the problem of date rape by just making dating completely illegal.

02:55
Most teens don't get arrested for sexting, but can you guess who does? It's often teens who are disliked by their partner's parents. And this can be because of class bias, racism or homophobia. Most prosecutors are, of course, smart enough not to use child pornography charges against teenagers, but some do. According to researchers at the University of New Hampshire, seven percent of all child pornography possession arrests are teens, sexting consensually with other teens.

03:33
Child pornography is a serious crime, but it's just not the same thing as teen sexting.
03:41
Parents and educators are also responding to sexting without really thinking too much about consent. Their message to teens is often: just don't do it. And I totally get it -- there are serious legal risks and, of course, that potential for privacy violations. And when you were a teen, I'm sure you did exactly as you were told, right?

04:05
You're probably thinking, my kid would never sext. And that's true, your little angel may not be sexting, because only 33 percent of 16- and 17-year-olds are sexting. But, sorry, by the time they're older, odds are they will be sexting. Every study I've seen puts the rate above 50 percent for 18- to 24-year-olds. And most of the time, nothing goes wrong.

04:33
People ask me all the time things like, isn't sexting just so dangerous, though? It's like you wouldn't leave your wallet on a park bench and you expect it's going to get stolen if you do that, right? Here's how I think about it: sexting is like leaving your wallet at your boyfriend's house. If you come back the next day and all the money is just gone, you really need to dump that guy.

04:59
(Laughter)
05:03
So instead of criminalizing sexting to try to prevent these privacy violations, we need to make consent central to how we think about the circulation of our private information.

05:16
Every new media technology raises privacy concerns. In fact, in the US the very first major debates about privacy were in response to technologies that were relatively new at the time. In the late 1800s, people were worried about cameras, which were just suddenly more portable than ever before, and newspaper gossip columns. They were worried that the camera would capture information about them, take it out of context and widely disseminate it. Does this sound familiar? It's exactly what we're worrying about now with social media and drone cameras, and, of course, sexting. And these fears about technology, they make sense, because technologies can amplify and bring out our worst qualities and behaviors.
06:06
But there are solutions. And we've been here before with a dangerous new technology. In 1908, Ford introduced the Model T car. Traffic fatality rates were rising. It was a serious problem -- it looks so safe, right?

06:24
Our first response was to try to change drivers' behavior, so we developed speed limits and enforced them through fines. But over the following decades, we started to realize the technology of the car itself is not just neutral. We could design the car to make it safer. So in the 1920s, we got shatter-resistant windshields. In the 1950s, seat belts. And in the 1990s, airbags.

06:52
All three of these areas -- laws, individuals and industry -- came together over time to help solve the problem that a new technology causes. And we can do the same thing with digital privacy. Of course, it comes back to consent.
07:10
Here's the idea. Before anyone can distribute your private information, they should have to get your permission. This idea of affirmative consent comes from anti-rape activists who tell us that we need consent for every sexual act. And we have really high standards for consent in a lot of other areas. Think about having surgery. Your doctor has to make sure that you are meaningfully and knowingly consenting to that medical procedure. This is not the type of consent like with an iTunes Terms of Service, where you just scroll to the bottom and you're like, agree, agree, whatever.

07:46
(Laughter)

07:49
If we think more about consent, we can have better privacy laws. Right now, we just don't have that many protections. If your ex-husband or your ex-wife is a terrible person, they can take your nude photos and upload them to a porn site. It can be really hard to get those images taken down. And in a lot of states, you're actually better off if you took the images of yourself, because then you can file a copyright claim.

08:17
(Laughter)

08:19
Right now, if someone violates your privacy, whether that's an individual or a company or the NSA, you can try filing a lawsuit, though you may not be successful, because many courts assume that digital privacy is just impossible. So they're not willing to punish anyone for violating it.

08:41
I still hear people asking me all the time, isn't a digital image somehow blurring the line between public and private because it's digital, right? No! No! Everything digital is not just automatically public. That doesn't make any sense. As NYU legal scholar Helen Nissenbaum tells us, we have laws and policies and norms that protect all kinds of information that's private, and it doesn't make a difference if it's digital or not. All of your health records are digitized, but your doctor can't just share them with anyone. All of your financial information is held in digital databases, but your credit card company can't just post your purchase history online.
09:27
Better laws could help address privacy violations after they happen, but one of the easiest things we can all do is make personal changes to help protect each other's privacy.

09:40
We're always told that privacy is our own, sole, individual responsibility. We're told, constantly monitor and update your privacy settings. We're told, never share anything you wouldn't want the entire world to see. This doesn't make sense. Digital media are social environments, and we share things with people we trust all day, every day. As Princeton researcher Janet Vertesi argues, our data and our privacy, they're not just personal, they're actually interpersonal. And so one thing you can do that's really easy is just start asking for permission before you share anyone else's information. If you want to post a photo of someone online, ask for permission. If you want to forward an email thread, ask for permission. And if you want to share someone's nude selfie, obviously, ask for permission.

10:37
These individual changes can really help us protect each other's privacy, but we need technology companies on board as well. These companies have very little incentive to help protect our privacy, because their business models depend on us sharing everything with as many people as possible. Right now, if I send you an image, you can forward that to anyone that you want. But what if I got to decide if that image was forwardable or not? This would tell you, you don't have my permission to send this image out. We do this kind of thing all the time to protect copyright. If you buy an e-book, you can't just send it out to as many people as you want. So why not try this with mobile phones?

11:22
What we can do is demand that tech companies add these protections to our devices and our platforms as the default. After all, you can choose the color of your car, but the airbags are always standard.
11:40
If we don't think more about digital privacy and consent, there can be serious consequences.

11:47
There was a teenager from Ohio -- let's call her Jennifer, for the sake of her privacy. She shared nude photos of herself with her high school boyfriend, thinking she could trust him. Unfortunately, he betrayed her and sent her photos around the entire school. Jennifer was embarrassed and humiliated, but instead of being compassionate, her classmates harassed her. They called her a slut and a whore and they made her life miserable. Jennifer started missing school and her grades dropped. Ultimately, Jennifer decided to end her own life.

12:26
Jennifer did nothing wrong. All she did was share a nude photo with someone she thought that she could trust. And yet our laws tell her that she committed a horrible crime equivalent to child pornography. Our gender norms tell her that by producing this nude image of herself, she somehow did the most horrible, shameful thing. And when we assume that privacy is impossible in digital media, we completely write off and excuse her boyfriend's bad, bad behavior.

13:01
People are still saying all the time to victims of privacy violations, "What were you thinking? You should have never sent that image." If you're trying to figure out what to say instead, try this. Imagine you've run into your friend who broke their leg skiing. They took a risk to do something fun, and it didn't end well. But you're probably not going to be the jerk who says, "Well, I guess you shouldn't have gone skiing then."

13:32
If we think more about consent, we can see that victims of privacy violations deserve our compassion, not criminalization, shaming, harassment or punishment. We can support victims, and we can prevent some privacy violations by making these legal, individual and technological changes. Because the problem is not sexting, the issue is digital privacy. And one solution is consent.

14:02
So the next time a victim of a privacy violation comes up to you, instead of blaming them, let's do this instead: let's shift our ideas about digital privacy, and let's respond with compassion.

14:16
Thank you.

14:17
(Applause)

