TEDGlobal 2013

Daniel Suarez: The kill decision shouldn't belong to a robot

June 11, 2013

As a novelist, Daniel Suarez spins dystopian tales of the future. But on the TEDGlobal stage, he talks us through a real-life scenario we all need to know more about: the rise of autonomous robotic weapons of war. Advanced drones, automated weapons and AI-powered intelligence-gathering tools, he suggests, could take the decision to make war out of the hands of humans.

Daniel Suarez - Sci-fi author
Daniel Suarez concocts thrilling reads from terrifying (and not-so-farfetched) near-future scenarios.

I write fiction: sci-fi thrillers. So if I say "killer robots," you'd probably think something like this. But I'm actually not here to talk about fiction. I'm here to talk about very real killer robots, autonomous combat drones. Now, I'm not referring to Predator and Reaper drones, which have a human making targeting decisions. I'm talking about fully autonomous robotic weapons that make lethal decisions about human beings all on their own. There's actually a technical term for this: lethal autonomy. Now, lethally autonomous killer robots would take many forms -- flying, driving, or just lying in wait. And actually, they're very quickly becoming a reality.
These are two automatic sniper stations currently deployed in the DMZ between North and South Korea. Both of these machines are capable of automatically identifying a human target and firing on it -- the one on the left at a distance of over a kilometer. Now, in both cases, there's still a human in the loop to make that lethal firing decision, but it's not a technological requirement. It's a choice. And it's that choice that I want to focus on, because as we migrate lethal decision-making from humans to software, we risk not only taking the humanity out of war, but also changing our social landscape entirely, far from the battlefield.
That's because the way humans resolve conflict shapes our social landscape. And this has always been the case, throughout history. For example, these were state-of-the-art weapons systems in 1400 A.D. Now, they were both very expensive to build and maintain, but with these you could dominate the populace, and the distribution of political power in feudal society reflected that. Power was focused at the very top. And what changed? Technological innovation. Gunpowder, cannon. And pretty soon, armor and castles were obsolete, and it mattered less who you brought to the battlefield than how many people you brought to the battlefield. And as armies grew in size, the nation-state arose as a political and logistical requirement of defense. And as leaders had to rely on more of their populace, they began to share power. Representative government began to form.
So again, the tools we use to resolve conflict shape our social landscape. Autonomous robotic weapons are such a tool, except that, by requiring very few people to go to war, they risk re-centralizing power into very few hands, possibly reversing a five-century trend toward democracy. Now, I think, knowing this, we can take decisive steps to preserve our democratic institutions, to do what humans do best, which is adapt. But time is a factor.
Seventy nations are developing remotely piloted combat drones of their own, and as you'll see, remotely piloted combat drones are the precursors to autonomous robotic weapons. That's because once you've deployed remotely piloted drones, there are three powerful factors pushing decision-making away from humans and onto the weapon platform itself.
The first of these is the deluge of video that drones produce. For example, in 2004, the U.S. drone fleet produced a grand total of 71 hours of video surveillance for analysis. By 2011, this had gone up to 300,000 hours, outstripping human ability to review it all. But even that number is about to go up drastically. The Pentagon's Gorgon Stare and ARGUS programs will put up to 65 independently operated camera eyes on each drone platform, and this would vastly outstrip human ability to review it. And that means visual intelligence software will need to scan it for items of interest. And that means very soon drones will tell humans what to look at, not the other way around.
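To put that 300,000-hour figure in perspective, a quick back-of-the-envelope calculation shows why human review can't keep up. The staffing assumptions here (one analyst watching footage in real time, 8 hours a day, 250 working days a year) are illustrative and mine, not figures from the talk:

```python
# Back-of-the-envelope: analysts needed to watch 2011's drone footage
# in real time. Assumptions below are illustrative, not from the talk.
annual_footage_hours = 300_000          # U.S. drone fleet output, 2011
hours_per_analyst_per_year = 8 * 250    # 8-hour days, 250 days/year

analysts_needed = annual_footage_hours / hours_per_analyst_per_year
print(analysts_needed)  # 150.0 full-time analysts doing nothing else
```

And that is before 65-camera platforms multiply the footage per drone, which is why the talk argues triage must move into software.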
But there's a second powerful incentive pushing decision-making away from humans and onto machines, and that's electromagnetic jamming, severing the connection between the drone and its operator. Now, we saw an example of this in 2011, when an American RQ-170 Sentinel drone got a bit confused over Iran due to a GPS spoofing attack. But any remotely piloted drone is susceptible to this type of attack, and that means drones will have to shoulder more decision-making. They'll know their mission objective, and they'll react to new circumstances without human guidance. They'll ignore external radio signals and send very few of their own.
Which brings us to, really, the third and most powerful incentive pushing decision-making away from humans and onto weapons: plausible deniability. Now, we live in a global economy. High-tech manufacturing is occurring on most continents. Cyber espionage is spiriting away advanced designs to parts unknown, and in that environment it is very likely that a successful drone design will be knocked off in contract factories and proliferate on the gray market. And in that situation, sifting through the wreckage of a suicide drone attack, it will be very difficult to say who sent that weapon.

This raises the very real possibility of anonymous war. It could tilt the geopolitical balance on its head, making it very difficult for a nation to turn its firepower against an attacker, and that could shift the balance in the 21st century away from defense and toward offense. It could make military action a viable option not just for small nations, but for criminal organizations, private enterprise, even powerful individuals. It could create a landscape of rival warlords undermining rule of law and civil society. Now, if responsibility and transparency are two of the cornerstones of representative government, autonomous robotic weapons could undermine both.
Now, you might be thinking that citizens of high-tech nations would have the advantage in any robotic war, that citizens of those nations would be less vulnerable, particularly against developing nations. But I think the truth is the exact opposite. I think citizens of high-tech societies are more vulnerable to robotic weapons, and the reason can be summed up in one word: data. Data powers high-tech societies. Cell phone geolocation, telecom metadata, social media, email, text, financial transaction data, transportation data -- it's a wealth of real-time data on the movements and social interactions of people. In short, we are more visible to machines than any people in history, and this perfectly suits the targeting needs of autonomous weapons.
What you're looking at here is a link analysis map of a social group. Lines indicate social connectedness between individuals. And these types of maps can be automatically generated based on the data trail modern people leave behind. Now, it's typically used to market goods and services to targeted demographics, but it's a dual-use technology, because targeting is used in another context. Notice that certain individuals are highlighted. These are the hubs of social networks. These are organizers, opinion-makers, leaders, and these people also can be automatically identified from their communication patterns. Now, if you're a marketer, you might then target them with product samples, try to spread your brand through their social group. But if you're a repressive government searching for political enemies, you might instead remove them, eliminate them, disrupt their social group, and those who remain behind lose social cohesion and organization.
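The hub-identification step Suarez describes can be sketched in a few lines of graph code. This is a minimal illustration, not any agency's actual tooling: it builds a toy social graph (all names and links below are invented) and ranks people by degree centrality, one simple proxy for "hub-ness" in link analysis:

```python
from collections import defaultdict

# Toy social graph: each edge is an observed communication link.
# All names and connections are invented for illustration.
edges = [
    ("ana", "ben"), ("ana", "cho"), ("ana", "dev"), ("ana", "eli"),
    ("ben", "cho"), ("dev", "eli"), ("fay", "ana"), ("fay", "gus"),
]

# Count how many distinct people each person is linked to (degree).
neighbors = defaultdict(set)
for a, b in edges:
    neighbors[a].add(b)
    neighbors[b].add(a)

# Rank by degree: the highest-degree nodes are the network's hubs --
# the organizers and opinion-makers the talk refers to.
hubs = sorted(neighbors, key=lambda person: len(neighbors[person]), reverse=True)
print(hubs[0], len(neighbors[hubs[0]]))  # "ana" is linked to 5 others
```

Real link-analysis tools use richer centrality measures (betweenness, eigenvector) over vastly larger graphs, but the principle is the same: the hubs fall out of the connection pattern automatically, with no human analyst required.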
Now, in a world of cheap, proliferating robotic weapons, borders would offer very little protection to critics of distant governments or transnational criminal organizations. Popular movements agitating for change could be detected early and their leaders eliminated before their ideas achieve critical mass. And ideas achieving critical mass is what political activism in popular government is all about. Anonymous lethal weapons could make lethal action an easy choice for all sorts of competing interests. And this would put a chill on free speech and popular political action, the very heart of democracy.
And this is why we need an international treaty on robotic weapons, and in particular a global ban on the development and deployment of killer robots. Now, we already have international treaties on nuclear and biological weapons, and, while imperfect, these have largely worked. But robotic weapons might be every bit as dangerous, because they will almost certainly be used, and they would also be corrosive to our democratic institutions.
Now, in November 2012 the U.S. Department of Defense issued a directive requiring that a human being be present in all lethal decisions. This effectively banned autonomous weapons in the U.S. military for the time being, but that directive needs to be made permanent. And it could set the stage for global action.
Because we need an international legal framework for robotic weapons. And we need it now, before there's a devastating attack or a terrorist incident that causes nations of the world to rush to adopt these weapons before thinking through the consequences. Autonomous robotic weapons concentrate too much power in too few hands, and they would imperil democracy itself.
Now, don't get me wrong, I think there are tons of great uses for unarmed civilian drones: environmental monitoring, search and rescue, logistics. If we have an international treaty on robotic weapons, how do we gain the benefits of autonomous drones and vehicles while still protecting ourselves against illegal robotic weapons? I think the secret will be transparency. No robot should have an expectation of privacy in a public place.

(Applause)
Each robot and drone should have a cryptographically signed I.D. burned in at the factory that can be used to track its movement through public spaces. We have license plates on cars, tail numbers on aircraft. This is no different. And every citizen should be able to download an app that shows the population of drones and autonomous vehicles moving through public spaces around them, both right now and historically.
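The signed-ID scheme Suarez proposes can be sketched in miniature. This is an illustrative stand-in, not a real registration protocol: for brevity it uses an HMAC with a manufacturer key from Python's standard library, whereas a real factory-burned I.D. would use an asymmetric signature (e.g. Ed25519) so that verifiers never hold the signing key. All identifiers and keys below are invented:

```python
import hashlib
import hmac

# Hypothetical manufacturer signing key. A real scheme would use a
# public/private key pair, not a shared secret like this.
FACTORY_KEY = b"demo-manufacturer-key"

def sign_id(serial: str) -> str:
    """Produce the tag 'burned in' alongside a drone's serial number."""
    return hmac.new(FACTORY_KEY, serial.encode(), hashlib.sha256).hexdigest()

def verify_id(serial: str, tag: str) -> bool:
    """Check a drone's broadcast (serial, tag) pair against the key."""
    return hmac.compare_digest(sign_id(serial), tag)

drone_id = "DRN-000123"
tag = sign_id(drone_id)
print(verify_id(drone_id, tag))      # True: legitimate factory-signed I.D.
print(verify_id("DRN-999999", tag))  # False: tag doesn't match this serial
```

A citizen-facing app like the one described would then only need the verification half: any drone broadcasting an I.D. that fails this check is, by definition, rogue.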
And civic leaders should deploy sensors and civic drones to detect rogue drones, and instead of sending killer drones of their own up to shoot them down, they should alert humans to their presence. And in certain very high-security areas, perhaps civic drones would snare them and drag them off to a bomb disposal facility. But notice, this is more an immune system than a weapons system. It would allow us to avail ourselves of the use of autonomous vehicles and drones while still preserving our open, civil society.
We must ban the deployment and development of killer robots. Let's not succumb to the temptation to automate war. Autocratic governments and criminal organizations undoubtedly will, but let's not join them. Autonomous robotic weapons would concentrate too much power in too few unseen hands, and that would be corrosive to representative government. Let's make sure that, for democracies at least, killer robots remain fiction.

Thank you.

(Applause)

Thank you. (Applause)
Translator: Joseph Geni
Reviewer: Morton Bast


Daniel Suarez - Sci-fi author
Daniel Suarez concocts thrilling reads from terrifying (and not-so-farfetched) near-future scenarios.

Why you should listen

While working as a software developer, Daniel Suarez self-published Daemon, a cyber-thriller depicting a future where society is radically reshaped by disruptive technologies. It struck a chord -- and so did the sequel, Freedom (TM) -- rocketing Suarez into the pantheon of sci-fi prophets.

In his 2012 novel Kill Decision, Suarez digs into the consequences of technology that’s here to stay: autonomous bots and drones programmed to be lethal. Suarez argues that as we cede more control to software, we gamble with the very essence of democracy itself. How can we establish sane guidelines for technology that could easily outstrip our control?



Data provided by TED.
