ABOUT THE SPEAKER
Alessandro Acquisti - Privacy economist
What motivates you to share your personal information online? Alessandro Acquisti studies the behavioral economics of privacy (and information security) in social networks.

Why you should listen

Online, we humans are paradoxical: We cherish privacy, but freely disclose our personal information in certain contexts. Privacy economics offers a powerful lens to understand this paradox, and the field has been spearheaded by Alessandro Acquisti and his colleagues' analyses of how we decide what to share online and what we get in return.

His team's surprising studies on facial recognition software showed that it can connect an anonymous human face to an online name -- and then to a Facebook account -- in about 3 seconds. Other work shows how easy it can be to find a US citizen's Social Security number using basic pattern matching on public data. Work like this earned him an invitation to testify before a US Senate committee on the impact technology has on civil liberties.

Read about his work in the New York Times.

TEDGlobal 2013

Alessandro Acquisti: What will a future without secrets look like?

1,423,103 views

The line between public and private has blurred in the past decade, both online and in real life, and Alessandro Acquisti is here to explain what this means and why it matters. In this thought-provoking, slightly chilling talk, he shares details of recent and ongoing research -- including a project that shows how easy it is to match a photograph of a stranger with their sensitive personal information.


00:12
I would like to tell you a story connecting the notorious privacy incident involving Adam and Eve, and the remarkable shift in the boundaries between public and private which has occurred in the past 10 years. You know the incident. Adam and Eve one day in the Garden of Eden realize they are naked. They freak out. And the rest is history. Nowadays, Adam and Eve would probably act differently.

[@Adam Last nite was a blast! loved dat apple LOL]

[@Eve yep.. babe, know what happened to my pants tho?]
00:48
We do reveal so much more information about ourselves online than ever before, and so much information about us is being collected by organizations. Now there is much to gain and benefit from this massive analysis of personal information, or big data, but there are also complex tradeoffs that come from giving away our privacy. And my story is about these tradeoffs.
01:15
We start with an observation which, in my mind, has become clearer and clearer in the past few years: any personal information can become sensitive information. Back in the year 2000, about 100 billion photos were shot worldwide, but only a minuscule proportion of them were actually uploaded online. In 2010, on Facebook alone, in a single month, 2.5 billion photos were uploaded, most of them identified. In the same span of time, computers' ability to recognize people in photos improved by three orders of magnitude.
01:55
What happens when you combine these technologies: the increasing availability of facial data; computers' improving ability to recognize faces; but also cloud computing, which gives anyone in this theater the kind of computational power which a few years ago was only the domain of three-letter agencies; and ubiquitous computing, which allows my phone, which is not a supercomputer, to connect to the Internet and perform there hundreds of thousands of face metrics in a few seconds? Well, we conjecture that the result of this combination of technologies will be a radical change in our very notions of privacy and anonymity.
02:35
To test that, we did an experiment on the Carnegie Mellon University campus. We asked students who were walking by to participate in a study, we took a shot with a webcam, and we asked them to fill out a survey on a laptop. While they were filling out the survey, we uploaded their shot to a cloud-computing cluster, and we started using a facial recognizer to match that shot to a database of some hundreds of thousands of images which we had downloaded from Facebook profiles. By the time the subject reached the last page of the survey, the page had been dynamically updated with the 10 best matching photos which the recognizer had found, and we asked the subjects to indicate whether they found themselves in the photos.

03:20
Do you see the subject? Well, the computer did, and in fact did so for one out of three subjects. So essentially, we can start from an anonymous face, offline or online, and we can use facial recognition to give a name to that anonymous face thanks to social media data.
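The matching step described here can be sketched with off-the-shelf tools. Below is a minimal illustration in Python using the open-source face_recognition library; it is not the pipeline the Carnegie Mellon team actually built, and the image paths and gallery folder are hypothetical placeholders.

```python
# Sketch of the matching step described above, using the open-source
# face_recognition library -- not the system the CMU team built.
# The folder and file names are hypothetical placeholders.
import os
import face_recognition

GALLERY_DIR = "profile_photos"  # hypothetical folder of downloaded gallery images

# Encode every face found in the gallery once (128-dimensional embeddings).
gallery_encodings, gallery_names = [], []
for fname in os.listdir(GALLERY_DIR):
    image = face_recognition.load_image_file(os.path.join(GALLERY_DIR, fname))
    encodings = face_recognition.face_encodings(image)
    if encodings:  # skip images with no detectable face
        gallery_encodings.append(encodings[0])
        gallery_names.append(fname)

# Encode the webcam shot of the anonymous subject.
probe = face_recognition.load_image_file("webcam_shot.jpg")
probe_encoding = face_recognition.face_encodings(probe)[0]

# Rank gallery faces by distance to the probe and keep the 10 closest,
# mirroring the "10 best matching photos" shown to each subject.
distances = face_recognition.face_distance(gallery_encodings, probe_encoding)
best_matches = sorted(zip(distances, gallery_names))[:10]
for dist, name in best_matches:
    print(f"{name}: distance {dist:.3f}")
```

A smaller distance means a closer match; a real deployment would also need a threshold below which a match is considered a positive identification.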
03:40
But a few years back, we did something else. We started from social media data, we combined it statistically with U.S. government Social Security data, and we ended up predicting Social Security numbers, which in the United States are extremely sensitive information.

03:56
Do you see where I'm going with this? So if you combine the two studies, then the question becomes: can you start from a face and, using facial recognition, find a name and publicly available information about that name and that person, and from that publicly available information infer non-publicly available, much more sensitive information, which you link back to the face? And the answer is, yes, we can, and we did. Of course, the accuracy keeps getting worse.

[27% of subjects' first 5 SSN digits identified (with 4 attempts)]
04:25
But in fact, we even decided to develop an iPhone app which uses the phone's internal camera to take a shot of a subject, upload it to a cloud, and then do what I just described to you in real time: looking for a match, finding public information, trying to infer sensitive information, and then sending it back to the phone so that it is overlaid on the face of the subject, an example of augmented reality, probably a creepy example of augmented reality. In fact, we didn't develop the app to make it available, just as a proof of concept.

04:57
Now take these technologies and push them to their logical extreme. Imagine a future in which strangers around you will look at you through their Google Glasses or, one day, their contact lenses, and use seven or eight data points about you to infer anything else which may be known about you. What will this future without secrets look like? And should we care?
05:24
We may like to believe that a future with such a wealth of data would be a future with no more biases, but in fact, having so much information doesn't mean that we will make decisions which are more objective. In another experiment, we presented our subjects with information about a potential job candidate. We included in this information some references to some funny, absolutely legal, but perhaps slightly embarrassing information that the subject had posted online. Now interestingly, among our subjects, some had posted comparable information, and some had not. Which group do you think was more likely to judge our subject harshly? Paradoxically, it was the group who had posted similar information, an example of moral dissonance.
06:15
Now you may be thinking, this does not apply to me, because I have nothing to hide. But in fact, privacy is not about having something negative to hide. Imagine that you are the H.R. director of a certain organization, you receive résumés, and you decide to find more information about the candidates. Therefore, you Google their names, and in a certain universe, you find this information. Or in a parallel universe, you find this information.

06:46
Do you think that you would be equally likely to call either candidate for an interview? If you think so, then you are not like the U.S. employers who were, in fact, part of our experiment, meaning we did exactly that. We created Facebook profiles, manipulating traits, then we started sending out résumés to companies in the U.S., and we detected, we monitored, whether they were searching for our candidates, and whether they were acting on the information they found on social media. And they were. Discrimination was happening through social media for equally skilled candidates.
07:19
Now, marketers would like us to believe that all information about us will always be used in a manner which is in our favor. But think again. Why should that always be the case? In a movie which came out a few years ago, "Minority Report," a famous scene had Tom Cruise walking through a mall as holographic personalized advertising appeared around him. Now, that movie is set in 2054, about 40 years from now, and as exciting as that technology looks, it already vastly underestimates the amount of information that organizations can gather about you, and how they can use it to influence you in a way that you will not even detect.
08:04
So as an example, this is another experiment we are actually running, not yet completed. Imagine that an organization has access to your list of Facebook friends, and through some kind of algorithm they can detect the two friends that you like the most. And then they create, in real time, a facial composite of these two friends. Now, studies prior to ours have shown that people no longer recognize even themselves in facial composites, but they react to those composites in a positive manner. So next time you are looking for a certain product, and there is an ad suggesting that you buy it, it will not be just a standard spokesperson. It will be one of your friends, and you will not even know that this is happening.
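A crude way to get a feel for such a composite is simply to blend two face photos. The minimal sketch below uses the Pillow imaging library; the input file names are hypothetical, and real composite systems align facial landmarks before averaging rather than blending raw pixels.

```python
# Rough illustration of a two-face composite by pixel blending with Pillow.
# Real morphing systems align facial landmarks first; this is only a sketch,
# and the input file names are hypothetical placeholders.
from PIL import Image

friend_a = Image.open("friend_a.jpg").convert("RGB")
friend_b = Image.open("friend_b.jpg").convert("RGB")

# Resize the second photo to match the first so the two can be blended.
friend_b = friend_b.resize(friend_a.size)

# 50/50 blend of the two faces.
composite = Image.blend(friend_a, friend_b, alpha=0.5)
composite.save("composite.jpg")
```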
08:49
Now the problem is that the current policy mechanisms we have to protect ourselves from the abuses of personal information are like bringing a knife to a gunfight. One of these mechanisms is transparency, telling people what you are going to do with their data. And in principle, that's a very good thing. It's necessary, but it is not sufficient. Transparency can be misdirected. You can tell people what you are going to do, and then you still nudge them to disclose arbitrary amounts of personal information.
09:23
So in yet another experiment, this one with students, we asked them to provide information about their campus behavior, including pretty sensitive questions, such as this one.

[Have you ever cheated in an exam?]

09:34
To one group of subjects, we told them, "Only other students will see your answers." To another group of subjects, we told them, "Students and faculty will see your answers." Transparency. Notification. And sure enough, this worked, in the sense that the first group of subjects were much more likely to disclose than the second. It makes sense, right? But then we added the misdirection. We repeated the experiment with the same two groups, this time adding a delay between the time we told subjects how we would use their data and the time they actually started answering the questions.

10:09
How long a delay do you think we had to add in order to nullify the inhibitory effect of knowing that faculty would see your answers? Ten minutes? Five minutes? One minute? How about 15 seconds? Fifteen seconds was sufficient to have the two groups disclose the same amount of information, as if the second group no longer cared about faculty reading their answers.
10:36
Now I have to admit that this talk so far may sound exceedingly gloomy, but that is not my point. In fact, I want to share with you that there are alternatives. The way we are doing things now is not the only way they can be done, and certainly not the best way they can be done. When someone tells you, "People don't care about privacy," consider whether the game has been designed and rigged so that they cannot care about privacy, and coming to the realization that these manipulations occur is already halfway through the process of being able to protect yourself.
11:12
When someone tells you that privacy is incompatible with the benefits of big data, consider that in the last 20 years, researchers have created technologies to allow virtually any electronic transaction to take place in a more privacy-preserving manner. We can browse the Internet anonymously. We can send emails that can only be read by the intended recipient, not even the NSA. We can even have privacy-preserving data mining. In other words, we can have the benefits of big data while protecting privacy.
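One concrete example of the privacy-preserving data mining he alludes to is differential privacy, in which analysts only ever see deliberately noised aggregates. The sketch below shows the Laplace mechanism for a simple count query on synthetic data; it is offered as one illustration, not as a technique named in the talk.

```python
# Minimal illustration of privacy-preserving analysis via differential privacy:
# answer a count query with Laplace noise so that no single person's record
# can be inferred from the output. Illustrative only; the dataset is synthetic.
import numpy as np

rng = np.random.default_rng(0)

def dp_count(values, predicate, epsilon=0.5):
    """Return a differentially private count of records matching `predicate`.

    A count query has sensitivity 1 (adding or removing one record changes
    the true count by at most 1), so Laplace noise with scale 1/epsilon
    gives epsilon-differential privacy for this query.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Synthetic example: roughly how many of 1,000 users are over 40?
ages = rng.integers(18, 80, size=1000)
print(dp_count(ages, lambda age: age > 40, epsilon=0.5))
```

Smaller values of epsilon add more noise and give stronger privacy at the cost of accuracy, which is exactly the kind of tradeoff between data holders and data subjects mentioned next.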
11:47
Of course, these technologies imply a shifting of costs and revenues between data holders and data subjects, which is why, perhaps, you don't hear more about them.
11:58
Which brings me back to the Garden of Eden. There is a second privacy interpretation of the story of the Garden of Eden which doesn't have to do with the issue of Adam and Eve feeling naked and feeling ashamed. You can find echoes of this interpretation in John Milton's "Paradise Lost." In the garden, Adam and Eve are materially content. They're happy. They are satisfied. However, they also lack knowledge and self-awareness. The moment they eat the aptly named fruit of knowledge, that's when they discover themselves. They become aware. They achieve autonomy. The price to pay, however, is leaving the garden. So privacy, in a way, is both the means and the price to pay for freedom.
12:50
Again, marketers tell us that big data and social media are not just a paradise of profit for them, but a Garden of Eden for the rest of us. We get free content. We get to play Angry Birds. We get targeted apps. But in fact, in a few years, organizations will know so much about us, they will be able to infer our desires before we even form them, and perhaps buy products on our behalf before we even know we need them.
13:20
Now there was one English author who anticipated this kind of future, where we would trade away our autonomy and freedom for comfort. Even more so than George Orwell, the author is, of course, Aldous Huxley. In "Brave New World," he imagines a society where technologies that we created originally for freedom end up coercing us. However, in the book, he also offers us a way out of that society, similar to the path that Adam and Eve had to follow to leave the garden. In the words of the Savage, regaining autonomy and freedom is possible, although the price to pay is steep.
14:06
So I do believe that one of the defining fights of our times will be the fight for the control over personal information, the fight over whether big data will become a force for freedom, rather than a force which will covertly manipulate us. Right now, many of us do not even know that the fight is going on, but it is, whether you like it or not. And at the risk of playing the serpent, I will tell you that the tools for the fight are here, the awareness of what is going on, and in your hands, just a few clicks away.

14:48
Thank you.

(Applause)
