TEDGlobal 2013

Alessandro Acquisti: What will a future without secrets look like?

The line between public and private has blurred in the past decade, both online and in real life, and Alessandro Acquisti is here to explain what this means and why it matters. In this thought-provoking, slightly chilling talk, he shares details of recent and ongoing research -- including a project that shows how easy it is to match a photograph of a stranger with their sensitive personal information.


I would like to tell you a story
00:12
connecting the notorious privacy incident
00:14
involving Adam and Eve,
00:18
and the remarkable shift in the boundaries
00:20
between public and private which has occurred
00:24
in the past 10 years.
00:27
You know the incident.
00:28
Adam and Eve one day in the Garden of Eden
00:30
realize they are naked.
00:33
They freak out.
00:35
And the rest is history.
00:36
Nowadays, Adam and Eve
00:39
would probably act differently.
00:41
[@Adam Last nite was a blast! loved dat apple LOL]
00:44
[@Eve yep.. babe, know what happened to my pants tho?]
00:46
We do reveal so much more information
00:48
about ourselves online than ever before,
00:50
and so much information about us
00:54
is being collected by organizations.
00:55
Now, there is much to be gained
00:58
from this massive analysis of personal information,
01:01
or big data,
01:03
but there are also complex tradeoffs that come
01:05
from giving away our privacy.
01:08
And my story is about these tradeoffs.
01:11
We start with an observation which, in my mind,
01:15
has become clearer and clearer in the past few years,
01:18
that any personal information
01:21
can become sensitive information.
01:23
Back in the year 2000, about 100 billion photos
01:25
were shot worldwide,
01:30
but only a minuscule proportion of them
01:31
were actually uploaded online.
01:34
In 2010, on Facebook alone, in a single month,
01:36
2.5 billion photos were uploaded,
01:40
most of them identified.
01:43
In the same span of time,
01:45
computers' ability to recognize people in photos
01:47
improved by three orders of magnitude.
01:52
What happens when you combine
01:55
these technologies:
01:57
increasing availability of facial data;
01:59
improving facial recognition capabilities of computers;
02:01
but also cloud computing,
02:05
which gives anyone in this theater
02:07
the kind of computational power
02:09
which a few years ago was only the domain
02:11
of three-letter agencies;
02:12
and ubiquitous computing,
02:14
which allows my phone, which is not a supercomputer,
02:16
to connect to the Internet
02:18
and perform hundreds of thousands
02:20
of face metrics in a few seconds?
02:23
Well, we conjecture that the result
02:25
of this combination of technologies
02:28
will be a radical change in our very notions
02:30
of privacy and anonymity.
02:33
To test that, we did an experiment
02:35
on Carnegie Mellon University campus.
02:37
We asked students who were walking by
02:39
to participate in a study,
02:41
and we took a shot with a webcam,
02:43
and we asked them to fill out a survey on a laptop.
02:46
While they were filling out the survey,
02:48
we uploaded their shot to a cloud-computing cluster,
02:50
and we started using a facial recognizer
02:53
to match that shot to a database
02:55
of some hundreds of thousands of images
02:57
which we had downloaded from Facebook profiles.
03:00
By the time the subject reached the last page
03:03
of the survey, the page had been dynamically updated
03:06
with the 10 best matching photos
03:10
which the recognizer had found,
03:12
and we asked the subjects to indicate
03:14
whether they found themselves in the photos.
03:16
Do you see the subject?
03:20
Well, the computer did, and in fact did so
03:24
for one out of three subjects.
03:27
So essentially, we can start from an anonymous face,
03:29
offline or online, and we can use facial recognition
03:32
to give a name to that anonymous face
03:36
thanks to social media data.
03:38
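[The matching step described above can be pictured as nearest-neighbor search over face embeddings. The following is a minimal, hypothetical sketch of that logic, not the actual CMU system: the function top_matches, the 128-dimensional embeddings, and the random stand-in data are illustrative assumptions.]

```python
# Hypothetical sketch of the face-matching step the talk describes: given an
# embedding of the probe photo and a gallery of embeddings computed from public
# profile photos (by any off-the-shelf face-recognition model, assumed to exist),
# return the closest gallery identities.
import numpy as np

def top_matches(probe: np.ndarray, gallery: np.ndarray, names: list[str], k: int = 10):
    """Return the k gallery identities whose embeddings are closest to the probe."""
    # Normalize so that cosine similarity reduces to a dot product.
    probe = probe / np.linalg.norm(probe)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    similarities = gallery @ probe                    # one score per gallery photo
    best = np.argsort(similarities)[::-1][:k]         # indices of the k highest scores
    return [(names[i], float(similarities[i])) for i in best]

# Example with random stand-in embeddings (real ones would come from a model).
rng = np.random.default_rng(0)
gallery = rng.normal(size=(1000, 128))                # 1,000 profile photos, 128-d embeddings
names = [f"profile_{i}" for i in range(1000)]
probe = gallery[42] + rng.normal(scale=0.1, size=128) # a noisy new shot of profile_42
print(top_matches(probe, gallery, names)[:3])         # profile_42 should rank first
```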
But a few years back, we did something else.
03:40
We started from social media data,
03:42
we combined it statistically with data
03:44
from U.S. government social security,
03:47
and we ended up predicting social security numbers,
03:49
which in the United States
03:52
are extremely sensitive information.
03:54
Do you see where I'm going with this?
03:56
So if you combine the two studies together,
03:58
then the question becomes,
04:01
can you start from a face and,
04:02
using facial recognition, find a name
04:05
and publicly available information
04:07
about that name and that person,
04:10
and from that publicly available information
04:12
infer non-publicly available information,
04:14
much more sensitive information,
04:16
which you link back to the face?
04:18
And the answer is, yes, we can, and we did.
04:19
Of course, the accuracy keeps getting worse.
04:21
[27% of subjects' first 5 SSN digits identified (with 4 attempts)]
04:24
But in fact, we even decided to develop an iPhone app
04:25
which uses the phone's internal camera
04:29
to take a shot of a subject
04:31
and then upload it to a cloud
04:33
and then do what I just described to you in real time:
04:34
looking for a match, finding public information,
04:37
trying to infer sensitive information,
04:39
and then sending back to the phone
04:41
so that it is overlaid on the face of the subject,
04:44
an example of augmented reality,
04:47
probably a creepy example of augmented reality.
04:49
In fact, we developed the app not to make it available,
04:51
but only as a proof of concept.
04:55
Now, take these technologies
04:57
and push them to their logical extreme.
04:59
Imagine a future in which strangers around you
05:01
will look at you through their Google Glasses
05:04
or, one day, their contact lenses,
05:06
and use seven or eight data points about you
05:08
to infer anything else
05:12
which may be known about you.
05:15
What will this future without secrets look like?
05:17
And should we care?
05:22
We may like to believe
05:24
that a future with such a wealth of data
05:26
would be a future with no more biases,
05:29
but in fact, having so much information
05:32
doesn't mean that we will make decisions
05:35
which are more objective.
05:37
In another experiment, we presented to our subjects
05:39
information about a potential job candidate.
05:42
We included in this information some references
05:44
to some funny, absolutely legal,
05:47
but perhaps slightly embarrassing information
05:50
that the candidate had posted online.
05:52
Now interestingly, among our subjects,
05:54
some had posted comparable information,
05:57
and some had not.
06:00
Which group do you think
06:02
was more likely to judge the candidate harshly?
06:04
Paradoxically, it was the group
06:09
who had posted similar information,
06:10
an example of moral dissonance.
06:12
Now you may be thinking,
06:15
this does not apply to me,
06:17
because I have nothing to hide.
06:19
But in fact, privacy is not about
06:21
having something negative to hide.
06:23
Imagine that you are the H.R. director
06:27
of a certain organization, and you receive résumés,
06:29
and you decide to find more information about the candidates.
06:32
Therefore, you Google their names
06:35
and in a certain universe,
06:37
you find this information.
06:39
Or in a parallel universe, you find this information.
06:41
Do you think that you would be equally likely
06:46
to call either candidate for an interview?
06:49
If you think so, then you are not
06:51
like the U.S. employers who were, in fact,
06:54
part of our experiment, meaning we did exactly that.
06:56
We created Facebook profiles, manipulating traits,
07:00
then we started sending out résumés to companies in the U.S.,
07:03
and we detected, we monitored,
07:06
whether they were searching for our candidates,
07:07
and whether they were acting on the information
07:10
they found on social media. And they were.
07:12
Discrimination was happening through social media
07:14
for equally skilled candidates.
07:16
Now, marketers would like us to believe
07:19
that all information about us will always
07:23
be used in a manner which is in our favor.
07:26
But think again. Why should that be always the case?
07:29
In a movie which came out a few years ago,
07:33
"Minority Report," a famous scene
07:35
had Tom Cruise walk in a mall
07:38
and holographic personalized advertising
07:40
would appear around him.
07:44
Now, that movie is set in 2054,
07:46
about 40 years from now,
07:49
and as exciting as that technology looks,
07:51
it already vastly underestimates
07:54
the amount of information that organizations
07:56
can gather about you, and how they can use it
07:59
to influence you in a way that you will not even detect.
08:01
So as an example, this is another experiment
08:04
that we are actually running, not yet completed.
08:07
Imagine that an organization has access
08:09
to your list of Facebook friends,
08:11
and through some kind of algorithm
08:13
they can detect the two friends that you like the most.
08:15
And then they create, in real time,
08:19
a facial composite of these two friends.
08:21
Now studies prior to ours have shown that people
08:24
no longer recognize even themselves
08:27
in facial composites, but they react
08:30
to those composites in a positive manner.
08:32
So next time you are looking for a certain product,
08:34
and there is an ad suggesting you buy it,
08:38
it will not be just a standard spokesperson.
08:40
It will be one of your friends,
08:43
and you will not even know that this is happening.
08:46
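[As a rough illustration of what a "facial composite" of two photos could mean, here is a naive sketch: a 50/50 pixel blend of two images using Pillow. Real composites align facial landmarks first, and the file names are placeholders, not data from the study.]

```python
# Naive, illustrative composite of two face photos: a simple 50/50 pixel blend.
# A realistic composite would align facial landmarks before averaging.
from PIL import Image

def naive_composite(path_a: str, path_b: str, out_path: str) -> None:
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB").resize(a.size)  # crude: only match sizes
    Image.blend(a, b, alpha=0.5).save(out_path)            # average the two faces

# Placeholder file names for the two "most liked" friends.
naive_composite("friend_one.jpg", "friend_two.jpg", "composite.jpg")
```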
Now the problem is that
08:49
the current policy mechanisms we have
08:51
to protect ourselves from the abuses of personal information
08:54
are like bringing a knife to a gunfight.
08:57
One of these mechanisms is transparency,
09:00
telling people what you are going to do with their data.
09:03
And in principle, that's a very good thing.
09:06
It's necessary, but it is not sufficient.
09:08
Transparency can be misdirected.
09:12
You can tell people what you are going to do,
09:16
and then you still nudge them to disclose
09:18
arbitrary amounts of personal information.
09:20
So in yet another experiment, this one with students,
09:23
we asked them to provide information
09:26
about their campus behavior,
09:29
including pretty sensitive questions, such as this one.
09:31
[Have you ever cheated in an exam?]
09:34
Now, we told one group of subjects,
09:34
"Only other students will see your answers."
09:36
We told another group of subjects,
09:39
"Students and faculty will see your answers."
09:41
Transparency. Notification. And sure enough, this worked,
09:44
in the sense that the first group of subjects
09:47
was much more likely to disclose than the second.
09:48
It makes sense, right?
09:51
But then we added the misdirection.
09:52
We repeated the experiment with the same two groups,
09:54
this time adding a delay
09:57
between the time we told subjects
09:59
how we would use their data
10:02
and the time they actually started answering the questions.
10:04
How long a delay do you think we had to add
10:09
in order to nullify the inhibitory effect
10:11
of knowing that faculty would see your answers?
10:16
Ten minutes?
10:19
Five minutes?
10:21
One minute?
10:23
How about 15 seconds?
10:25
Fifteen seconds were sufficient to have the two groups
10:27
disclose the same amount of information,
10:29
as if the second group no longer cared
10:31
that faculty would read their answers.
10:34
Now I have to admit that this talk so far
10:36
may sound exceedingly gloomy,
10:40
but that is not my point.
10:42
In fact, I want to share with you that
10:44
there are alternatives.
10:46
The way we are doing things now is not the only way
10:48
they can be done, and certainly not the best way
10:51
they can be done.
10:54
When someone tells you, "People don't care about privacy,"
10:56
consider whether the game has been designed
11:00
and rigged so that they cannot care about privacy,
11:03
and coming to the realization that these manipulations occur
11:05
is already halfway through the process
11:09
of being able to protect yourself.
11:10
When someone tells you that privacy is incompatible
11:12
with the benefits of big data,
11:16
consider that in the last 20 years,
11:18
researchers have created technologies
11:20
to allow virtually any electronic transaction
11:22
to take place in a more privacy-preserving manner.
11:26
We can browse the Internet anonymously.
11:29
We can send emails that can only be read
11:32
by the intended recipient, not even the NSA.
11:35
We can even have privacy-preserving data mining.
11:38
In other words, we can have the benefits of big data
11:41
while protecting privacy.
11:45
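[One concrete flavor of privacy-preserving data mining is differential privacy. The sketch below is an illustrative assumption, not anything the speaker built: it releases a noisy count (the Laplace mechanism) so that no single person's answer can be inferred, using the hypothetical function dp_count and made-up example data.]

```python
# Minimal sketch of the Laplace mechanism for a differentially private count.
import numpy as np

def dp_count(values: list[bool], epsilon: float = 0.5) -> float:
    """Return a differentially private count of True entries."""
    true_count = sum(values)
    sensitivity = 1.0  # adding or removing one person changes the count by at most 1
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# e.g., "how many students admitted cheating?" released without exposing anyone
answers = [True, False, False, True, True] * 100
print(round(dp_count(answers), 1))
```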
Of course, these technologies imply a shift
11:47
of costs and revenues
11:51
between data holders and data subjects,
11:53
which is why, perhaps, you don't hear more about them.
11:55
Which brings me back to the Garden of Eden.
11:58
There is a second privacy interpretation
12:02
of the story of the Garden of Eden
12:05
which doesn't have to do with the issue
12:07
of Adam and Eve feeling naked
12:09
and feeling ashamed.
12:11
You can find echoes of this interpretation
12:13
in John Milton's "Paradise Lost."
12:16
In the garden, Adam and Eve are materially content.
12:19
They're happy. They are satisfied.
12:23
However, they also lack knowledge
12:25
and self-awareness.
12:27
The moment they eat the aptly named
12:29
fruit of knowledge,
12:32
that's when they discover themselves.
12:34
They become aware. They achieve autonomy.
12:36
The price to pay, however, is leaving the garden.
12:40
So privacy, in a way, is both the means
12:43
and the price to pay for freedom.
12:47
Again, marketers tell us
12:50
that big data and social media
12:53
are not just a paradise of profit for them,
12:56
but a Garden of Eden for the rest of us.
12:59
We get free content.
13:02
We get to play Angry Birds. We get targeted apps.
13:03
But in fact, in a few years, organizations
13:06
will know so much about us,
13:09
they will be able to infer our desires
13:10
before we even form them, and perhaps
13:13
buy products on our behalf
13:15
before we even know we need them.
13:18
Now there was one English author
13:20
who anticipated this kind of future
13:23
where we would trade away
13:26
our autonomy and freedom for comfort.
13:28
Even more so than George Orwell,
13:31
the author is, of course, Aldous Huxley.
13:33
In "Brave New World," he imagines a society
13:36
where technologies that we created
13:39
originally for freedom
13:41
end up coercing us.
13:43
However, in the book, he also offers us a way out
13:46
of that society, similar to the path
13:50
that Adam and Eve had to follow to leave the garden.
13:54
In the words of the Savage,
13:58
regaining autonomy and freedom is possible,
14:00
although the price to pay is steep.
14:03
So I do believe that one of the defining fights
14:06
of our times will be the fight
14:11
for the control over personal information,
14:14
the fight over whether big data will become a force
14:16
for freedom,
14:20
rather than a force that will covertly manipulate us.
14:21
Right now, many of us
14:26
do not even know that the fight is going on,
14:29
but it is, whether you like it or not.
14:31
And at the risk of playing the serpent,
14:34
I will tell you that the tools for the fight
14:37
are here, the awareness of what is going on,
14:40
and in your hands,
14:43
just a few clicks away.
14:44
Thank you.
14:48
(Applause)
14:49


About the Speaker:

Alessandro Acquisti - Privacy economist
What motivates you to share your personal information online? Alessandro Acquisti studies the behavioral economics of privacy (and information security) in social networks.

Why you should listen

Online, we humans are paradoxical: We cherish privacy, but freely disclose our personal information in certain contexts. Privacy economics offers a powerful lens to understand this paradox, and the field has been spearheaded by Alessandro Acquisti and his colleagues' analyses of how we decide what to share online and what we get in return.

His team's surprising studies on facial recognition software showed that it can connect an anonymous human face to an online name -- and then to a Facebook account -- in about 3 seconds. Other work shows how easy it can be to find a US citizen's Social Security number using basic pattern matching on public data. Work like this earned him an invitation to testify before a US Senate committee on the impact technology has on civil liberties.

