ABOUT THE SPEAKER
Bill Joy - Technologist and futurist
The co-founder of Sun Microsystems, Bill Joy has, in recent years, turned his attention to the biggest questions facing humanity: Where are we going? What could go wrong? What's the next great thing?

Why you should listen

In 2003, Bill Joy left Sun Microsystems, the computer company he cofounded, with no definite plans. He'd spent the late 1970s and early 1980s working on Berkeley UNIX (he wrote the vi editor), and the next decades building beautiful high-performance workstations at Sun. Always, he'd been a kind of polite engineer-gadfly -- refusing to settle for subpar code or muddled thinking.

In 2000, with a landmark cover story in Wired called "Why the Future Doesn't Need Us," Joy began to share his larger concerns with the world. A careful observer of the nanotech industry that was growing up around his own industry, Joy saw a way forward that, frankly, frightened him. He saw a very plausible future in which our own creations supplanted us -- if not out and out killed us (e.g., the gray goo problem). His proposed solution: Proceed with caution.

Joy's now a partner at Kleiner Perkins Caufield & Byers (KPCB), where he reviews business plans in education, environmental improvement and pandemic defense.

TED2006

Bill Joy: What I'm worried about, what I'm excited about

553,237 views

Technologist and futurist Bill Joy talks about several big worries for humanity -- and several big hopes in the fields of health, education and future tech.


00:18
What technology can we really apply to reducing global poverty? And what I found was quite surprising. We started looking at things like death rates in the 20th century, and how they'd been improved, and very simple things turned out. You'd think maybe antibiotics made more difference than clean water, but it's actually the opposite. And so very simple things -- off-the-shelf technologies that we could easily find on the then-early Web -- would clearly make a huge difference to that problem. But I also, in looking at more powerful technologies and nanotechnology and genetic engineering and other new emerging kind of digital technologies, became very concerned about the potential for abuse.
01:10
If you think about it, in history, a long, long time ago we dealt with the problem of an individual abusing another individual. We came up with something -- the Ten Commandments: Thou shalt not kill. That's a kind of one-on-one thing. We organized into cities. We had many people. And to keep the many from tyrannizing the one, we came up with concepts like individual liberty. And then, to deal with large groups, say, at the nation-state level, we had to have mutual non-aggression, or, through a series of conflicts, we eventually came to a rough international bargain to largely keep the peace.
01:51
But now we have a new situation, really what people call an asymmetric situation, where technology is so powerful that it extends beyond a nation-state. It's not the nation-states that have potential access to mass destruction, but individuals. And this is a consequence of the fact that these new technologies tend to be digital. We saw genome sequences. You can download the gene sequences of pathogens off the Internet if you want to, and clearly someone recently -- I saw in a science magazine -- they said, well, the 1918 flu is too dangerous to FedEx around. If people want to use it in their labs for working on research, just reconstruct it yourself, because, you know, it might break in FedEx. That this is possible to do is not deniable.
02:50
So individuals in small groups super-empowered by access to these kinds of self-replicating technologies, whether it be biological or other, are clearly a danger in our world. And the danger is that they can cause roughly what's a pandemic. And we really don't have experience with pandemics, and we're also not very good as a society at reacting to things we don't have direct and sort of gut-level experience with. So it's not in our nature to pre-act. And in this case, piling on more technology doesn't solve the problem, because it only super-empowers people more. So the solution has to be, as people like Russell and Einstein and others imagined in a conversation that existed in a much stronger form, I think, early in the 20th century, that the solution had to be not just the head but the heart. You know, public policy and moral progress.
03:47
The bargain that gives us civilization is a bargain to not use power. We get our individual rights by society protecting us from others not doing everything they can do but largely doing only what is legal. And so to limit the danger of these new things, we have to limit, ultimately, the ability of individuals to have access, essentially, to pandemic power. We also have to have sensible defense, because no limitation is going to prevent a crazy person from doing something. And you know, the troubling thing is that it's much easier to do something bad than to defend against all possible bad things, so the offensive uses really have an asymmetric advantage.
04:28
So these are the kind of thoughts I was thinking in 1999 and 2000, and my friends told me I was getting really depressed, and they were really worried about me. And then I signed a book contract to write more gloomy thoughts about this and moved into a hotel room in New York with one room full of books on the Plague, and you know, nuclear bombs exploding in New York where I would be within the circle, and so on. And then I was there on September 11th, and I stood in the streets with everyone. And it was quite an experience to be there. I got up the next morning and walked out of the city, and all the sanitation trucks were parked on Houston Street and ready to go down and start taking the rubble away. And I walked down the middle, up to the train station, and everything below 14th Street was closed. It was quite a compelling experience, but not really, I suppose, a surprise to someone who'd had his room full of the books. It was always a surprise that it happened then and there, but it wasn't a surprise that it happened at all.
05:26
And everyone then started writing about this. Thousands of people started writing about this. And I eventually abandoned the book, and then Chris called me to talk at the conference. I really don't talk about this anymore because, you know, there are enough frustrating and depressing things going on. But I agreed to come and say a few things about this. And I would say that we can't give up the rule of law to fight an asymmetric threat, which is what we seem to be doing at present because of the people that are in power, because that's to give up the thing that makes civilization. And we can't fight the threat in the kind of stupid way we're doing, because a million-dollar act causes a billion dollars of damage, causes a trillion-dollar response which is largely ineffective and arguably, probably almost certainly, has made the problem worse. So we can't fight the thing at a one-to-a-million cost-benefit ratio.
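The arithmetic behind that ratio, using only the dollar figures quoted above; this is a rough illustrative sketch, not an official estimate:

    # The dollar figures are the ones quoted in the talk, rounded to powers of ten;
    # this only illustrates why the ratio is described as one-to-a-million.
    attack_cost = 1e6      # "a million-dollar act"
    damage = 1e9           # "causes a billion dollars of damage"
    response_cost = 1e12   # "causes a trillion dollar response"

    ratio = response_cost / attack_cost
    print(f"defender spends about {ratio:,.0f} dollars per attacker dollar")  # 1,000,000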
06:24
So after giving up on the book, I had the great honor to be able to join Kleiner Perkins about a year ago, and to work through venture capital on the innovative side, and to try to find some innovations that could address what I saw as some of these big problems. Things where, you know, a factor of 10 difference can make a factor of 1,000 difference in the outcome. I've been amazed in the last year at the incredible quality and excitement of the innovations that have come across my desk. It's overwhelming at times. I'm very thankful for Google and Wikipedia so I can understand at least a little of what the people who come through the doors are talking about.
07:10
But I wanted to share with you three areas that I'm particularly excited about and that relate to the problems that I was talking about in the Wired article. The first is this whole area of education, and it really relates to what Nicholas was talking about with a $100 computer. And that is to say that there's a lot of legs left in Moore's Law. The most advanced transistors today are at 65 nanometers, and we've seen, and I've had the pleasure to invest in, companies that give me great confidence that we'll extend Moore's Law all the way down to roughly the 10 nanometer scale. Another factor of, say, six in dimensional reduction, which should give us about another factor of 100 in raw improvement in what the chips can do. And so, to put that in practical terms: if something costs about 1,000 dollars today -- say, the best personal computer you can buy, that might be its cost -- I think we can have that in 2020 for 10 dollars. Okay?
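A rough sketch of the arithmetic behind those numbers: the 65 nm and 10 nm feature sizes and the $1,000-to-$10 claim come from the talk itself, while the step of squaring the linear shrink to estimate transistor density is a standard simplification assumed here only for illustration.

    # Back-of-the-envelope sketch of the scaling argument above.
    feature_today_nm = 65.0     # "the most advanced transistors today are at 65 nanometers"
    feature_future_nm = 10.0    # "all the way down to roughly the 10 nanometer scale"

    linear_shrink = feature_today_nm / feature_future_nm   # ~6.5, the "factor of, say, six"
    density_gain = linear_shrink ** 2                       # ~42x more transistors per unit area
    raw_improvement = 100                                    # the talk's rounded "factor of 100" (density plus speed)

    price_today = 1000.0        # "the best personal computer you can buy"
    implied_price_2020 = price_today / raw_improvement       # same capability at 1/100th the cost
    print(f"linear shrink ~{linear_shrink:.1f}x, density ~{density_gain:.0f}x, "
          f"implied price ~${implied_price_2020:.0f}")        # ~$10, matching the claim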
08:18
Now, just imagine what that $100 computer will be in 2020 as a tool for education. I think the challenge for us is -- I'm very certain that that will happen -- the challenge is: will we develop the kind of educational tools and things with the net to let us take advantage of that device? I'd argue today that we have incredibly powerful computers, but we don't have very good software for them. And it's only in retrospect, after the better software comes along, and you take it and you run it on a ten-year-old machine, that you say, God, the machine was that fast? I remember when they took the Apple Mac interface and they put it back on the Apple II. The Apple II was perfectly capable of running that kind of interface; we just didn't know how to do it at the time. So given that we know, and should believe -- because Moore's Law has been, like, a constant, I mean, it's just been very predictable progress over the last 40 years or whatever -- we can know what the computers are going to be like in 2020. It's great that we have initiatives to say, let's go create the education and educate people in the world, because that's a great force for peace. And we can give everyone in the world a $100 computer or a $10 computer in the next 15 years.
09:31
The second area that I'm focusing on is the environmental problem, because that's clearly going to put a lot of pressure on this world. We'll hear a lot more about that from Al Gore very shortly. The thing that we see as the kind of Moore's Law trend that's driving improvement in our ability to address the environmental problem is new materials. We have a challenge, because the urban population is growing in this century from two billion to six billion in a very short amount of time. People are moving to the cities. They all need clean water, they need energy, they need transportation, and we want them to develop in a green way. We're reasonably efficient in the industrial sectors. We've made improvements in energy and resource efficiency, but the consumer sector, especially in America, is very inefficient. But these new materials bring such incredible innovations that there's a strong basis for hope that these things will be so profitable that they can be brought to the market. And I want to give you a specific example of a new material that was discovered 15 years ago.
10:35
If we take carbon nanotubes -- you know, Iijima discovered them in 1991 -- they just have incredible properties. And these are the kinds of things we're going to discover as we start to engineer at the nano scale. Their strength: they're almost the strongest material known in terms of tensile strength. They're very, very stiff. They stretch very, very little. In two dimensions, if you make, like, a fabric out of them, they're 30 times stronger than Kevlar. And if you make a three-dimensional structure, like a buckyball, they have all sorts of incredible properties. If you shoot a particle at them and knock a hole in them, they repair themselves; they go zip and they repair the hole in femtoseconds, which is not -- is really quick.

(Laughter)

11:20
If you shine a light on them, they produce electricity. In fact, if you flash them with a camera they catch on fire. If you put electricity on them, they emit light. If you run current through them, you can run 1,000 times more current through one of these than through a piece of metal. You can make both p- and n-type semiconductors, which means you can make transistors out of them. They conduct heat along their length but not across -- well, there is no width, but not in the other direction if you stack them up; that's a property of carbon fiber also. If you put particles in them, they go shooting out the tip -- they're like miniature linear accelerators or electron guns. The inside of the nanotubes is so small -- the smallest ones are 0.7 nanometers -- that it's basically a quantum world. It's a strange place inside a nanotube.
12:10
And so we begin to see, and we've seen business plans already, where the kind of things Lisa Randall's talking about are in there. I had one business plan where I was trying to learn more about Witten's cosmic dimension strings to try to understand what phenomenon was going on in this proposed nanomaterial. So inside of a nanotube, we're really at the limit here. So what we see is that with these and other new materials we can do things with different properties -- lighter, stronger -- and apply these new materials to the environmental problems. New materials that can make water, new materials that can make fuel cells work better, new materials that catalyze chemical reactions, that cut pollution and so on. Ethanol -- new ways of making ethanol. New ways of making electric transportation. The whole green dream -- because it can be profitable. And we've dedicated -- we've just raised a new fund; we dedicated 100 million dollars to these kinds of investments. We believe that the Genentech, the Compaq, the Lotus, the Sun, the Netscape, the Amazon, the Google in these fields are yet to be found, because this materials revolution will drive these things forward.
13:24
The third area that we're working on we just announced last week -- we were all in New York. We raised 200 million dollars in a specialty fund to work on pandemic and biodefense. And to give you an idea, the last fund that Kleiner raised was a $400 million fund, so this for us is a very substantial fund. And what we did, over the last few months -- well, a few months ago, Ray Kurzweil and I wrote an op-ed in the New York Times about how publishing the 1918 genome was very dangerous. And John Doerr and Brook and others got concerned, [unclear], and we started looking around at what the world was doing about being prepared for a pandemic. And we saw a lot of gaps. And so we asked ourselves, you know, can we find innovative things that will go fill these gaps? And Brook told me in a break here, he said he's found so much stuff he can't sleep, because there are so many great technologies out there, we're essentially buried. And we need them, you know. We have one antiviral that people are talking about stockpiling that still works, roughly. That's Tamiflu. But Tamiflu -- the virus is resistant. It is resistant to Tamiflu. We've discovered with AIDS that we need cocktails to work well so that the viral resistance -- we need several antivirals. We need better surveillance. We need networks that can find out what's going on. We need rapid diagnostics so that we can tell if somebody has a strain of flu which we have only identified very recently. We've got to be able to make the rapid diagnostics quickly. We need new antivirals and cocktails. We need new kinds of vaccines. Vaccines that are broad spectrum. Vaccines that we can manufacture quickly. Cocktails, more polyvalent vaccines. You normally get a trivalent vaccine against three possible strains. We need -- we don't know where this thing is going. We believe that if we could fill these 10 gaps, we have a chance to help really reduce the risk of a pandemic.
15:26
And the difference between a normal flu season and a pandemic is about a factor of 1,000 in deaths, and certainly an enormous economic impact. So we're very excited, because we think we can fund 10 projects, or speed up 10 projects, and see them come to market in the next couple of years that will address this. So if we can use technology to help address education, help address the environment, help address the pandemic, does that solve the larger problem that I was talking about in the Wired article? And I'm afraid the answer is really no, because you can't solve a problem with the management of technology with more technology. If we let an unlimited amount of power loose, then a very small number of people will be able to abuse it. We can't fight at a million-to-one disadvantage.
16:19
So what we need to do is, we need better policy. And for example, some things we could do that would be policy solutions -- which are not really in the political air right now, but perhaps would be with a change of administration -- use markets. Markets are a very strong force. For example, rather than trying to regulate away problems, which probably won't work, if we could price the cost of catastrophe into the cost of doing business, then people who are doing things that had a higher cost of catastrophe would have to take out insurance against that risk. So if you wanted to put a drug on the market you could put it on, but it wouldn't have to be approved by regulators; you'd have to convince an actuary that it would be safe. And if you apply the notion of insurance more broadly, you can use a more powerful force, a market force, to provide feedback.
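Read one way, that proposal is ordinary actuarial pricing: the expected cost of a catastrophe (its probability times its damage) gets folded into what an activity costs, so riskier activities pay more. The sketch below only illustrates that expected-value idea; the talk proposes the mechanism but gives no formula, and every number here is invented for the example.

    # Illustrative sketch: pricing catastrophe risk into the cost of doing business
    # as an actuarially fair premium (expected loss plus a loading). All figures
    # are made up for illustration; none come from the talk.
    def premium(p_catastrophe: float, damage: float, loading: float = 0.2) -> float:
        """Expected catastrophe loss per year, plus a margin for the insurer."""
        return p_catastrophe * damage * (1.0 + loading)

    risky = premium(p_catastrophe=1e-4, damage=1e9)   # expected loss $100,000/yr -> premium ~$120,000
    safer = premium(p_catastrophe=1e-6, damage=1e9)   # expected loss $1,000/yr   -> premium ~$1,200
    print(f"riskier activity pays ${risky:,.0f}/yr, safer one ${safer:,.0f}/yr")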
17:07
How could you keep the law? I think the law would be a really good thing to keep. Well, you have to hold people accountable. The law requires accountability. Today scientists, technologists, businessmen, engineers don't have any personal responsibility for the consequences of their actions. So if you tie that -- you have to tie that back with the law.
17:25
And finally, I think we have to do something that's not really -- it's almost unacceptable to say this -- which is, we have to begin to design the future. We can't pick the future, but we can steer the future. Our investment in trying to prevent pandemic flu is affecting the distribution of possible outcomes. We may not be able to stop it, but the likelihood that it will get past us is lower if we focus on that problem. So we can design the future if we choose what kind of things we want to have happen and not have happen, and steer us to a lower-risk place. Vice President Gore will talk about how we could steer the climate trajectory into a lower probability of catastrophic risk.
18:08
But above all, what we have to do is we have to help the good guys, the people on the defensive side, have an advantage over the people who want to abuse things. And what we have to do to do that is we have to limit access to certain information. And growing up as we have, and holding very high the value of free speech, this is a hard thing for us to accept -- for all of us to accept. It's especially hard to accept for the scientists who still remember, you know, Galileo essentially locked up, and who are still fighting this battle against the church. But that's the price of having a civilization. The price of retaining the rule of law is to limit the access to the great and kind of unbridled power.

18:53
Thank you.

(Applause)
