TED2006

Bill Joy: What I'm worried about, what I'm excited about

Technologist and futurist Bill Joy talks about several big worries for humanity -- and several big hopes in the fields of health, education and future tech.

What technology can we really apply to reducing global poverty? And what I found was quite surprising. We started looking at things like death rates in the 20th century, and how they'd been improved, and some very simple things turned out to matter. You'd think maybe antibiotics made more difference than clean water, but it's actually the opposite. And so very simple things -- off-the-shelf technologies that we could easily find on the then-early Web -- would clearly make a huge difference to that problem.

But I also, in looking at more powerful technologies -- nanotechnology, genetic engineering and other new, emerging kinds of digital technologies -- became very concerned about the potential for abuse.

If you think about it, in history, a long, long time ago we dealt with the problem of an individual abusing another individual. We came up with something -- the Ten Commandments: Thou shalt not kill. That's kind of a one-on-one thing. We organized into cities. We had many people. And to keep the many from tyrannizing the one, we came up with concepts like individual liberty. And then, to deal with large groups -- say, at the nation-state level -- we had to have mutual non-aggression, or, through a series of conflicts, we eventually came to a rough international bargain to largely keep the peace.

But now we have a new situation, really what people call an asymmetric situation, where technology is so powerful that it extends beyond a nation-state. It's not the nation-states that have potential access to mass destruction, but individuals. And this is a consequence of the fact that these new technologies tend to be digital.

We saw genome sequences. You can download the gene sequences of pathogens off the Internet if you want to, and clearly someone recently -- I saw in a science magazine -- said, well, the 1918 flu is too dangerous to FedEx around. If people want to use it in their labs for working on research, just reconstruct it yourself, because, you know, it might break in FedEx. So the fact that it's possible to do this is not deniable.

So individuals in small groups, super-empowered by access to these kinds of self-replicating technologies, whether biological or other, are clearly a danger in our world. And the danger is that they can cause roughly what's a pandemic. And we really don't have experience with pandemics, and we're also not very good as a society at acting on things we don't have direct and sort of gut-level experience with. So it's not in our nature to pre-act. And in this case, piling on more technology doesn't solve the problem, because it only super-empowers people more.

So the solution has to be, as people like Russell and Einstein and others imagined in a conversation that existed in a much stronger form, I think, early in the 20th century, that the solution had to be not just the head but the heart. You know, public policy and moral progress. The bargain that gives us civilization is a bargain to not use power. We get our individual rights by society protecting us from others not doing everything they can do, but largely doing only what is legal.

And so to limit the danger of these new things, we have to limit, ultimately, the ability of individuals to have access, essentially, to pandemic power. We also have to have sensible defense, because no limitation is going to prevent a crazy person from doing something. And you know, the troubling thing is that it's much easier to do something bad than to defend against all possible bad things, so the offensive uses really have an asymmetric advantage.

So these are the kinds of thoughts I was thinking in 1999 and 2000, and my friends told me I was getting really depressed, and they were really worried about me. And then I signed a book contract to write more gloomy thoughts about this and moved into a hotel room in New York with one room full of books on the Plague, and, you know, nuclear bombs exploding in New York where I would be within the circle, and so on.

And then I was there on September 11th, and I stood in the streets with everyone. And it was quite an experience to be there. I got up the next morning and walked out of the city, and all the sanitation trucks were parked on Houston Street, ready to go down and start taking the rubble away. And I walked down the middle, up to the train station, and everything below 14th Street was closed. It was quite a compelling experience, but not really, I suppose, a surprise to someone who'd had his room full of the books. It was always a surprise that it happened then and there, but it wasn't a surprise that it happened at all.

And everyone then started writing about this. Thousands of people started writing about this. And I eventually abandoned the book, and then Chris called me to talk at the conference. I really don't talk about this anymore, because, you know, there's enough frustrating and depressing things going on. But I agreed to come and say a few things about this.

And I would say that we can't give up the rule of law to fight an asymmetric threat, which is what we seem to be doing because of the people that are presently in power, because that's to give up the thing that makes civilization. And we can't fight the threat in the kind of stupid way we're doing, because a million-dollar act causes a billion dollars of damage, which causes a trillion-dollar response that is largely ineffective and arguably -- probably, almost certainly -- has made the problem worse. So we can't fight this thing at a million-to-one cost, a one-to-a-million cost-benefit ratio.

So after giving up on the book, I had the great honor to be able to join Kleiner Perkins about a year ago, to work through venture capital on the innovative side and to try to find some innovations that could address what I saw as some of these big problems -- things where, you know, a factor of 10 difference can make a factor of 1,000 difference in the outcome. I've been amazed in the last year at the incredible quality and excitement of the innovations that have come across my desk. It's overwhelming at times. I'm very thankful for Google and Wikipedia, so I can understand at least a little of what the people who come through the doors are talking about.

But I wanted to share with you three areas that I'm particularly excited about and that relate to the problems that I was talking about in the Wired article.

The first is this whole area of education, and it really relates to what Nicholas was talking about with a $100 computer. And that is to say that there's a lot of legs left in Moore's Law. The most advanced transistors today are at 65 nanometers, and we've seen, and I've had the pleasure to invest in, companies that give me great confidence that we'll extend Moore's Law all the way down to roughly the 10 nanometer scale. Another factor of, say, six in dimensional reduction, which should give us about another factor of 100 in raw improvement in what the chips can do. And so, to put that in practical terms: if something costs about 1,000 dollars today -- say, the best personal computer you can buy, that might be its cost -- I think we can have that in 2020 for 10 dollars. Okay? Now, just imagine what that $100 computer will be in 2020 as a tool for education.

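To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The only figures taken from the talk are the ones just stated -- 65 nanometers today, roughly 10 nanometers eventually, about a factor of 100 in raw improvement, and $1,000 dropping to about $10 by 2020; the quadratic area-scaling assumption is added here purely for illustration.

```python
# Back-of-the-envelope sketch of the Moore's Law arithmetic in the talk.
# Stated in the talk: 65 nm today, ~10 nm eventually, ~100x raw improvement,
# and a ~$1,000 machine becoming a ~$10 machine by 2020.
# Assumption added here: transistor density scales roughly with the inverse
# square of the feature size.

feature_now_nm = 65.0       # most advanced transistors at the time of the talk
feature_future_nm = 10.0    # roughly where Moore's Law is expected to extend

linear_shrink = feature_now_nm / feature_future_nm   # ~6.5x ("a factor of, say, six")
density_gain = linear_shrink ** 2                     # ~42x more transistors per unit area

claimed_raw_gain = 100.0    # the talk's rounder figure for overall improvement
cost_now = 1_000.0          # roughly the best personal computer you can buy
cost_2020 = cost_now / claimed_raw_gain               # ~$10

print(f"linear shrink ~{linear_shrink:.1f}x, density gain ~{density_gain:.0f}x")
print(f"${cost_now:,.0f} machine at ~{claimed_raw_gain:.0f}x improvement -> ~${cost_2020:.0f}")
```

The point is only that a roughly sixfold linear shrink compounds quadratically in density, which, together with other gains, is the kind of reasoning behind the factor-of-100 and $1,000-to-$10 claims.
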
I think the challenge for us is -- I'm very certain that that will happen; the challenge is, will we develop the kind of educational tools and things with the net to let us take advantage of that device? I'd argue today that we have incredibly powerful computers, but we don't have very good software for them. It's only in retrospect, after the better software comes along and you take it and run it on a ten-year-old machine, that you say: God, the machine was that fast? I remember when they took the Apple Mac interface and put it back on the Apple II. The Apple II was perfectly capable of running that kind of interface; we just didn't know how to do it at the time.

So given that we know and should believe -- because Moore's Law has been, like, a constant; I mean, it's just been very predictable progress over the last 40 years or whatever -- we can know what the computers are going to be like in 2020. It's great that we have initiatives to say, let's go create the education and educate people in the world, because that's a great force for peace. And we can give everyone in the world a $100 computer or a $10 computer in the next 15 years.

The second area that I'm focusing on is the environmental problem, because that's clearly going to put a lot of pressure on this world. We'll hear a lot more about that from Al Gore very shortly. The thing that we see as the kind of Moore's Law trend that's driving improvement in our ability to address the environmental problem is new materials. We have a challenge, because the urban population is growing in this century from two billion to six billion in a very short amount of time. People are moving to the cities. They all need clean water, they need energy, they need transportation, and we want them to develop in a green way. We're reasonably efficient in the industrial sectors. We've made improvements in energy and resource efficiency, but the consumer sector, especially in America, is very inefficient. But these new materials bring such incredible innovations that there's a strong basis for hope that these things will be so profitable that they can be brought to the market.

And I want to give you a specific example of a new material that was discovered 15 years ago. If we take carbon nanotubes -- you know, Iijima discovered them in 1991 -- they just have incredible properties. And these are the kinds of things we're going to discover as we start to engineer at the nano scale.

Their strength: they're almost the strongest material known in terms of tensile strength. They're very, very stiff. They stretch very, very little. In two dimensions, if you make, like, a fabric out of them, they're 30 times stronger than Kevlar. And if you make a three-dimensional structure, like a buckyball, they have all sorts of incredible properties. If you shoot a particle at them and knock a hole in them, they repair themselves; they go zip and they repair the hole in femtoseconds, which is -- really quick. (Laughter) If you shine a light on them, they produce electricity. In fact, if you flash them with a camera they catch on fire. If you put electricity on them, they emit light. If you run current through them, you can run 1,000 times more current through one of these than through a piece of metal. You can make both p- and n-type semiconductors, which means you can make transistors out of them. They conduct heat along their length but not across -- well, there is no width, but not in the other direction if you stack them up; that's a property of carbon fiber also. If you put particles in them, they go shooting out the tip -- they're like miniature linear accelerators or electron guns. The inside of the nanotubes is so small -- the smallest ones are 0.7 nanometers -- that it's basically a quantum world. It's a strange place inside a nanotube.

And so we begin to see -- and we've seen business plans already -- where the kinds of things Lisa Randall's talking about are in there. I had one business plan where I was trying to learn more about Witten's cosmic dimension strings to try to understand what phenomenon was going on in this proposed nanomaterial. So inside of a nanotube, we're really at the limit here.

So what we see is that with these and other new materials we can do things with different properties -- lighter, stronger -- and apply these new materials to the environmental problems. New materials that can make water, new materials that can make fuel cells work better, new materials that catalyze chemical reactions, that cut pollution, and so on. Ethanol -- new ways of making ethanol. New ways of making electric transportation. The whole green dream -- because it can be profitable. And we've dedicated -- we've just raised a new fund; we dedicated 100 million dollars to these kinds of investments. We believe that the Genentech, the Compaq, the Lotus, the Sun, the Netscape, the Amazon, the Google of these fields are yet to be found, because this materials revolution will drive these things forward.

The third area that we're working on we just announced last week, when we were all in New York: we raised 200 million dollars in a specialty fund to work on pandemic and biodefense. And to give you an idea, the last fund that Kleiner raised was a $400 million fund, so this for us is a very substantial fund. And what we did, over the last few months -- well, a few months ago, Ray Kurzweil and I wrote an op-ed in the New York Times about how publishing the 1918 genome was very dangerous. And John Doerr and Brook and others got concerned, [unclear], and we started looking around at what the world was doing about being prepared for a pandemic. And we saw a lot of gaps. And so we asked ourselves, you know, can we find innovative things that will go fill these gaps? And Brook told me in a break here that he's found so much stuff he can't sleep, because there are so many great technologies out there, we're essentially buried. And we need them, you know.

We have one antiviral that people are talking about stockpiling that still works, roughly. That's Tamiflu. But Tamiflu -- the virus is resistant. It is resistant to Tamiflu. We've discovered with AIDS that we need cocktails to work well against viral resistance -- we need several antivirals. We need better surveillance. We need networks that can find out what's going on. We need rapid diagnostics so that we can tell if somebody has a strain of flu which we have only identified very recently. We've got to be able to make the rapid diagnostics quickly. We need new antivirals and cocktails. We need new kinds of vaccines. Vaccines that are broad spectrum. Vaccines that we can manufacture quickly. Cocktails, more polyvalent vaccines. You normally get a trivalent vaccine against three possible strains. We need -- we don't know where this thing is going. We believe that if we could fill these 10 gaps, we have a chance to help really reduce the risk of a pandemic.

And the difference between a normal flu season and a pandemic is about a factor of 1,000 in deaths, and certainly an enormous economic impact. So we're very excited, because we think we can fund, or speed up, 10 projects and see them come to market in the next couple of years that will address this.

So if we can use technology to help address education, help address the environment, help address the pandemic, does that solve the larger problem that I was talking about in the Wired article? And I'm afraid the answer is really no, because you can't solve a problem with the management of technology with more technology. If we let an unlimited amount of power loose, then a very small number of people will be able to abuse it. We can't fight at a million-to-one disadvantage.

So what we need to do is, we need better policy. For example, some things we could do that would be policy solutions, which are not really in the political air right now but perhaps with the change of administration would be: use markets. Markets are a very strong force. For example, rather than trying to regulate away problems, which probably won't work, we could price the cost of catastrophe into the cost of doing business, so that people who are doing things with a higher cost of catastrophe would have to take insurance against that risk. So if you wanted to put a drug on the market, you could put it on, but it wouldn't have to be approved by regulators; you'd have to convince an actuary that it would be safe. And if you apply the notion of insurance more broadly, you can use a more powerful force, a market force, to provide feedback.

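To illustrate the mechanism being proposed -- pricing the cost of catastrophe into the cost of doing business -- here is a minimal sketch of how an actuarially fair premium scales with both the likelihood and the size of a catastrophe. Every number and activity name below is hypothetical, chosen only to show the shape of the feedback.

```python
# Minimal sketch: catastrophe risk priced as an insurance premium.
# All probabilities, losses, and activity names are hypothetical illustrations.

def fair_premium(p_catastrophe: float, catastrophe_cost: float, loading: float = 0.2) -> float:
    """Expected loss plus a loading factor for the insurer's risk and overhead."""
    return p_catastrophe * catastrophe_cost * (1.0 + loading)

activities = {
    "conventional drug launch":  {"p": 1e-6, "cost": 1e9},    # low probability, large loss
    "risky pathogen experiment": {"p": 1e-4, "cost": 1e11},   # higher probability, huge loss
}

for name, a in activities.items():
    print(f"{name}: annual premium ~${fair_premium(a['p'], a['cost']):,.0f}")
```

In this toy example the riskier activity faces a premium four orders of magnitude higher; that price signal is the market feedback doing the work that up-front regulatory approval would otherwise do.
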
How could you keep the law? I think the law would be a really good thing to keep. Well, you have to hold people accountable. The law requires accountability. Today scientists, technologists, businessmen, engineers don't have any personal responsibility for the consequences of their actions. So you have to tie that back with the law.

And finally, I think we have to do something that's not really -- it's almost unacceptable to say this -- which is, we have to begin to design the future. We can't pick the future, but we can steer the future. Our investment in trying to prevent pandemic flu is affecting the distribution of possible outcomes. We may not be able to stop it, but the likelihood that it will get past us is lower if we focus on that problem. So we can design the future if we choose what kinds of things we want to have happen and not have happen, and steer ourselves to a lower-risk place.

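The claim here is probabilistic: prevention spending does not guarantee an outcome, it shifts the distribution of outcomes. A minimal sketch of that idea, with entirely hypothetical numbers (the talk gives none):

```python
# Sketch of "steering the distribution of outcomes": prevention lowers the
# probability that a catastrophe gets past us. All numbers are hypothetical.

def expected_loss(p_event: float, loss_if_event: float) -> float:
    return p_event * loss_if_event

loss_if_pandemic = 1e12   # hypothetical economic loss of a severe pandemic
p_unprepared = 0.03       # hypothetical annual probability with today's gaps
p_prepared = 0.01         # hypothetical annual probability after filling the gaps

avoided = expected_loss(p_unprepared, loss_if_pandemic) - expected_loss(p_prepared, loss_if_pandemic)
print(f"expected annual loss avoided: ~${avoided:,.0f}")   # ~$20,000,000,000
```
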
Vice President Gore will talk about how we could steer the climate trajectory into a lower probability of catastrophic risk.

But above all, what we have to do is help the good guys, the people on the defensive side, have an advantage over the people who want to abuse things. And what we have to do to do that is limit access to certain information. And growing up as we have, and holding very high the value of free speech, this is a hard thing for us to accept -- for all of us to accept. It's especially hard for the scientists to accept, who still remember, you know, Galileo essentially locked up, and who are still fighting this battle against the church. But that's the price of having a civilization. The price of retaining the rule of law is to limit the access to the great and kind of unbridled power.

Thank you.

(Applause)


About the Speaker:

Bill Joy - Technologist and futurist
The co-founder of Sun Microsystems, Bill Joy has, in recent years, turned his attention to the biggest questions facing humanity: Where are we going? What could go wrong? What's the next great thing?

Why you should listen

In 2003, Bill Joy left Sun Microsystems, the computer company he cofounded, with no definite plans. He'd spent the late 1970s and early 1980s working on Berkeley UNIX (he wrote the vi editor), and the next decades building beautiful high-performance workstations at Sun. Always, he'd been a kind of polite engineer-gadfly -- refusing to settle for subpar code or muddled thinking.

In 2000, with a landmark cover story in Wired called "Why the Future Doesn't Need Us," Joy began to share his larger concerns with the world. A careful observer of the nanotech industry that was growing up around his own industry, Joy saw a way forward that, frankly, frightened him. He saw a very plausible future in which our own creations supplanted us -- if not out and out killed us (e.g., the gray goo problem). His proposed solution: Proceed with caution.

Joy is now a partner at Kleiner Perkins Caufield & Byers, where he reviews business plans in education, environmental improvement and pandemic defense.
