ABOUT THE SPEAKER
Susan Blackmore - Memeticist
Susan Blackmore studies memes -- those self-replicating "life forms" that spread themselves via human consciousness. We're now headed, she believes, toward a new form of meme, spread by the technology we've created.

Why you should listen

Susan Blackmore is dedicated to understanding the scientific nature of consciousness. Her latest work centers on the existence of memes -- little bits of knowledge, lore, habit that seem to spread themselves using human brains as mere carriers. She's exploring the existence of a new class of meme, spread by human technology. It's temporarily named the "teme."

She has written about memes, consciousness, and near-death experiences; has appeared on the British Big Brother to discuss the psychology of the participants; and writes for the Guardian UK.

TED2008

Susan Blackmore: Memes and "temes"

Susan Blackmore studies memes: ideas that replicate themselves from brain to brain like a virus. She makes a bold new argument: Humanity has spawned a new kind of meme, the teme, which spreads itself via technology -- and invents ways to keep itself alive.

00:18
Cultural evolution is a dangerous child for any species to let loose on its planet. By the time you realize what's happening, the child is a toddler, up and causing havoc, and it's too late to put it back. We humans are Earth's Pandoran species. We're the ones who let the second replicator out of its box, and we can't push it back in. We're seeing the consequences all around us.

00:48
Now that, I suggest, is the view that comes out of taking memetics seriously. And it gives us a new way of thinking about not only what's going on on our planet, but what might be going on elsewhere in the cosmos. So first of all, I'd like to say something about memetics and the theory of memes, and secondly, how this might answer questions about who's out there, if indeed anyone is.

01:14
So, memetics: memetics is founded on the principle of Universal Darwinism. Darwin had this amazing idea. Indeed, some people say it's the best idea anybody ever had. Isn't that a wonderful thought, that there could be such a thing as a best idea anybody ever had? Do you think there could?

01:35
Audience: No.

(Laughter)

01:37
Susan Blackmore: Someone says no, very loudly, from over there. Well, I say yes, and if there is, I give the prize to Darwin. Why? Because the idea was so simple, and yet it explains all design in the universe. I would say not just biological design, but all of the design that we think of as human design. It's all just the same thing happening.

02:00
What did Darwin say? I know you know the idea, natural selection, but let me just paraphrase "The Origin of Species," 1859, in a few sentences. What Darwin said was something like this: if you have creatures that vary, and that can't be doubted -- I've been to the Galapagos, and I've measured the size of the beaks and the size of the turtle shells and so on, and so on. And 100 pages later.

(Laughter)

02:27
And if there is a struggle for life, such that nearly all of these creatures die -- and this can't be doubted, I've read Malthus and I've calculated how long it would take for elephants to cover the whole world if they bred unrestricted, and so on and so on. And another 100 pages later. And if the very few that survive pass on to their offspring whatever it was that helped them survive, then those offspring must be better adapted to the circumstances in which all this happened than their parents were.

03:01
You see the idea? If, if, if, then. He had no concept of the idea of an algorithm, but that's what he described in that book, and this is what we now know as the evolutionary algorithm. The principle is you just need those three things -- variation, selection and heredity. And as Dan Dennett puts it, if you have those, then you must get evolution. Or design out of chaos, without the aid of mind.

03:31
There's one word I love on that slide. What do you think my favorite word is?

Audience: Chaos.

SB: Chaos? No. What? Mind? No.

Audience: Without.

SB: No, not without.

(Laughter)

You try them all in order: Mmm...?

Audience: Must.

03:45
SB: Must, at "must." Must, must. This is what makes it so amazing. You don't need a designer, or a plan, or foresight, or anything else. If there's something that is copied with variation and it's selected, then you must get design appearing out of nowhere. You can't stop it. Must is my favorite word there.
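That "if, if, if, then" really is an algorithm, so it can be written down as one. Here is a minimal sketch in Python (my illustration, not Blackmore's; it is in the spirit of Dawkins's "weasel" program, with the target string merely standing in for the environment that does the selecting):

    import random

    ALPHABET = "abcdefghijklmnopqrstuvwxyz "
    TARGET = "methinks it is like a weasel"  # plays the role of the environment

    def fitness(candidate):
        # Selection criterion: how well adapted the candidate is.
        return sum(a == b for a, b in zip(candidate, TARGET))

    def copy_with_variation(parent, rate=0.04):
        # Heredity plus variation: copying is faithful, but not perfect.
        return "".join(random.choice(ALPHABET) if random.random() < rate else ch
                       for ch in parent)

    # Start from chaos: a random string, no plan, no foresight.
    parent = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
    generation = 0
    while fitness(parent) < len(TARGET):
        offspring = [copy_with_variation(parent) for _ in range(100)]
        # The best-adapted variant (or the parent itself) survives to breed.
        parent = max(offspring + [parent], key=fitness)
        generation += 1
    print(f"Design appeared in generation {generation}: {parent!r}")

Variation is the imperfect copying, selection is the max(), heredity is the copying itself; nothing inside the loop plans the outcome, yet the design must appear.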
04:11
Now, what's this to do with memes? Well, the principle here applies to anything that is copied with variation and selection. We're so used to thinking in terms of biology, we think about genes this way. Darwin didn't, of course; he didn't know about genes. He talked mostly about animals and plants, but also about languages evolving and becoming extinct. But the principle of Universal Darwinism is that any information that is varied and selected will produce design.

04:40
And this is what Richard Dawkins was on about in his 1976 bestseller, "The Selfish Gene." The information that is copied, he called the replicator. It selfishly copies. Not meaning it kind of sits around inside cells going, "I want to get copied." But that it will get copied if it can, regardless of the consequences. It doesn't care about the consequences because it can't, because it's just information being copied.

05:06
And he wanted to get away from everybody thinking all the time about genes, and so he said, "Is there another replicator out there on the planet?" Ah, yes, there is. Look around you -- here will do, in this room. All around us, still clumsily drifting about in its primeval soup of culture, is another replicator. Information that we copy from person to person, by imitation, by language, by talking, by telling stories, by wearing clothes, by doing things. This is information copied with variation and selection. This is a design process going on.

05:42
He wanted a name for the new replicator. So, he took the Greek word "mimeme," which means that which is imitated. Remember that, that's the core definition: that which is imitated. And he abbreviated it to "meme," just because it sounds good and made a good meme, an effective spreading meme. So that's how the idea came about.
06:03
It's important to stick with that definition. The whole science of memetics is much maligned, much misunderstood, much feared. But a lot of these problems can be avoided by remembering the definition. A meme is not equivalent to an idea. It's not an idea. It's not equivalent to anything else, really. Stick with the definition. It's that which is imitated, or information which is copied from person to person.

06:30
So, let's see some memes. Well, you sir, you've got those glasses hung around your neck in that particularly fetching way. I wonder whether you invented that idea for yourself, or copied it from someone else? If you copied it from someone else, it's a meme. And what about, oh, I can't see any interesting memes here. All right everyone, who's got some interesting memes for me? Oh, well, your earrings, I don't suppose you invented the idea of earrings. You probably went out and bought them. There are plenty more in the shops. That's something that's passed on from person to person. All the stories that we're telling -- well, of course, TED is a great meme-fest, masses of memes.

07:06
The way to think about memes, though, is to think, why do they spread? They're selfish information, they will get copied, if they can. But some of them will be copied because they're good, or true, or useful, or beautiful. Some of them will be copied even though they're not. Some, it's quite hard to tell why.

07:24
There's one particular curious meme which I rather enjoy. And I'm glad to say, as I expected, I found it when I came here, and I'm sure all of you found it, too. You go to your nice, posh, international hotel somewhere, and you come in and you put down your clothes and you go to the bathroom, and what do you see?

07:41
Audience: Bathroom soap.

SB: Pardon?

Audience: Soap.

SB: Soap, yeah. What else do you see?

Audience: (Inaudible)

SB: Mmm mmm.

Audience: Sink, toilet!

07:49
SB: Sink, toilet, yes, these are all memes, they're all memes, but they're sort of useful ones, and then there's this one.

(Laughter)

07:58
What is this one doing?

(Laughter)

08:01
This has spread all over the world. It's not surprising that you all found it when you arrived in your bathrooms here. But I took this photograph in a toilet at the back of a tent in the eco-camp in the jungle in Assam.

(Laughter)

08:16
Who folded that thing up there, and why?

(Laughter)

08:20
Some people get carried away.

(Laughter)

08:26
Other people are just lazy and make mistakes. Some hotels exploit the opportunity to put even more memes with a little sticker.

(Laughter)

08:35
What is this all about? I suppose it's there to tell you that somebody's cleaned the place, and it's all lovely. And you know, actually, all it tells you is that another person has potentially spread germs from place to place.

(Laughter)

08:48
So, think of it this way. Imagine a world full of brains and far more memes than can possibly find homes. The memes are all trying to get copied -- trying, in inverted commas -- i.e., that's the shorthand for, if they can get copied, they will. They're using you and me as their propagating, copying machinery, and we are the meme machines.
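That picture -- far more memes than homes, so most copies die -- is itself a selection pressure, and you can watch it operate in a toy simulation (an illustration of mine, not from the talk; the numbers and the "catchiness" value are assumptions):

    import random

    random.seed(1)
    CAPACITY = 1000  # meme "homes": limited brain space across the population
    # Each meme is just a catchiness value: how readily it gets itself copied.
    memes = [random.random() for _ in range(CAPACITY)]

    for generation in range(10):
        offspring = []
        for m in memes:
            # Heredity: catchier memes leave more copies (one to three of them).
            n_copies = 1 + int(random.random() < m) + int(random.random() < m)
            # Variation: each copy is slightly imperfect.
            offspring += [min(1.0, max(0.0, m + random.gauss(0.0, 0.02)))
                          for _ in range(n_copies)]
        # Selection: far more copies than homes, so most copies perish.
        if len(offspring) > CAPACITY:
            memes = random.sample(offspring, CAPACITY)
        else:
            memes = offspring
        print(f"generation {generation}: mean catchiness "
              f"{sum(memes) / len(memes):.3f}")

No meme chooses anything, yet mean catchiness climbs generation after generation: the pool of copies is already biased toward whatever copies well, so even random culling leaves the catchier memes occupying the brains.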
09:13
Now, why is this important? Why is this useful, or what does it tell us? It gives us a completely new view of human origins and what it means to be human. All conventional theories of cultural evolution, of the origin of humans, and what makes us so different from other species -- all other theories explaining the big brain, and language, and tool use, and all these things that make us unique -- are based upon genes. Language must have been useful for the genes. Tool use must have enhanced our survival, mating and so on. It always comes back, as Richard Dawkins complained all that long time ago, it always comes back to genes.

09:51
The point of memetics is to say, "Oh no, it doesn't." There are two replicators now on this planet. From the moment that our ancestors, perhaps two and a half million years ago or so, began imitating, there was a new copying process. Copying with variation and selection. A new replicator was let loose, and it could never be -- right from the start -- it could never be that the human beings who let loose this new creature could just copy the useful, beautiful, true things, and not copy the other things. While their brains were having an advantage from being able to copy -- lighting fires, keeping fires going, new techniques of hunting, these kinds of things -- inevitably they were also copying putting feathers in their hair, or wearing strange clothes, or painting their faces, or whatever.

10:41
So, you get an arms race between the genes, which are trying to get the humans to have small economical brains and not waste their time copying all this stuff, and the memes themselves, like the sounds that people made and copied -- in other words, what turned out to be language -- competing to get the brains to get bigger and bigger. So, the big brain, on this theory, is driven by the memes. This is why, in "The Meme Machine," I called it memetic drive. As the memes evolve, as they inevitably must, they drive a bigger brain that is better at copying the memes that are doing the driving. This is why we've ended up with such peculiar brains, that we like religion, and music, and art.

11:25
Language is a parasite that we've adapted to, not something that was there originally for our genes, on this view. And like most parasites, it can start out dangerous, but then it coevolves and adapts, and we end up with a symbiotic relationship with this new parasite. And so, from our perspective, we don't realize that that's how it began.

11:46
So, this is a view of what humans are. All other species on this planet are gene machines only; they don't imitate at all well, hardly at all. We alone are gene machines and meme machines as well. The memes took a gene machine and turned it into a meme machine.
12:04
But that's not all. We have a new kind of meme now. I've been wondering for a long time, since I've been thinking about memes a lot, is there a difference between the memes that we copy -- the words we speak to each other, the gestures we copy, the human things -- and all these technological things around us? I have always, until now, called them all memes, but I do honestly think now we need a new word for technological memes. Let's call them techno-memes or temes. Because the processes are getting different.

12:37
We began, perhaps 5,000 years ago, with writing. We put the storage of memes out there on a clay tablet, but in order to get true temes and true teme machines, you need to get the variation, the selection and the copying, all done outside of humans. And we're getting there. We're at this extraordinary point where we're nearly there, that there are machines like that. And indeed, in the short time I've already been at TED, I see we're even closer than I thought we were before.

13:05
So actually, now the temes are forcing our brains to become more like teme machines. Our children are growing up very quickly learning to read, learning to use machinery. We're going to have all kinds of implants, drugs that force us to stay awake all the time. We'll think we're choosing these things, but the temes are making us do it. So, we're at this cusp now of having a third replicator on our planet.
13:34
Now, what about what else is going on out there in the universe? Is there anyone else out there? People have been asking this question for a long time. We've been asking it here at TED already. In 1961, Frank Drake made his famous equation, but I think he concentrated on the wrong things. It's been very productive, that equation. He wanted to estimate N, the number of communicative civilizations out there in our galaxy, and he included in there the rate of star formation, the rate of planets, but crucially, intelligence.
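For reference, this is the standard textbook form of Drake's equation (the talk gives it only in words, so the notation here is the conventional one, not her slide's):

    N = R_* \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot L

where R_* is the rate of star formation in our galaxy, f_p the fraction of stars with planets, n_e the number of potentially habitable planets per star that has planets, f_l the fraction of those on which life appears, f_i the fraction of those that develop intelligence, f_c the fraction that produce detectable communication, and L the length of time such civilizations keep communicating.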
14:08
I think that's the wrong way to think about it. Intelligence appears all over the place, in all kinds of guises. Human intelligence is only one kind of a thing. But what's really important is the replicators you have and the levels of replicators, one feeding on the one before. So, I would suggest that we don't think intelligence, we think replicators.

14:31
And on that basis, I've suggested a different kind of equation. A very simple equation. N, the same thing, the number of communicative civilizations out there [that] we might expect in our galaxy. Just start with the number of planets there are in our galaxy. The fraction of those which get a first replicator. The fraction of those that get the second replicator. The fraction of those that get the third replicator. Because it's only the third replicator that's going to reach out -- sending information, sending probes, getting out there, and communicating with anywhere else.
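In symbols, her equation multiplies a much shorter chain of factors (the variable names below are my own shorthand; the talk states the equation only in words):

    N = N_p \cdot f_{r1} \cdot f_{r2} \cdot f_{r3}

where N_p is the number of planets in our galaxy, f_{r1} the fraction of those that get a first replicator (genes, roughly), f_{r2} the fraction of those that then get a second (memes), and f_{r3} the fraction that go on to get a third (temes) -- the only level, on her argument, that reaches out across space.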
15:06
OK, so if we take that equation, why haven't we heard from anybody out there? Because every step is dangerous. Getting a new replicator is dangerous. You can pull through, we have pulled through, but it's dangerous. Take the first step, as soon as life appeared on this earth. We may take the Gaian view. I loved Peter Ward's talk yesterday -- it's not Gaian all the time. Actually, life forms produce things that kill themselves. Well, we did pull through on this planet. But then, a long time later, billions of years later, we got the second replicator, the memes. That was dangerous, all right.

15:46
Think of the big brain. How many mothers do we have here? You know all about big brains. They are dangerous to give birth to, agonizing to give birth to.

(Laughter)

15:59
My cat gave birth to four kittens, purring all the time. Ah, mm -- slightly different.

(Laughter)

16:05
But not only is it painful, it kills lots of babies, it kills lots of mothers, and it's very expensive to produce. The genes are forced into producing all this myelin, all the fat to myelinate the brain. Do you know, sitting here, your brain is using about 20 percent of your body's energy output for two percent of your body weight? It's a really expensive organ to run. Why? Because it's producing the memes.

16:28
Now, it could have killed us off. It could have killed us off, and maybe it nearly did, but you see, we don't know. But maybe it nearly did. Has it been tried before? What about all those other species? Louise Leakey talked yesterday about how we're the only one in this branch left. What happened to the others? Could it be that this experiment in imitation, this experiment in a second replicator, is dangerous enough to kill people off? Well, we did pull through, and we adapted.

16:56
But now, we're hitting, as I've just described, we're hitting the third replicator point. And this is even more dangerous -- well, it's dangerous again. Why? Because the temes are selfish replicators and they don't care about us, or our planet, or anything else. They're just information, why would they? They are using us to suck up the planet's resources to produce more computers, and more of all these amazing things we're hearing about here at TED. Don't think, "Oh, we created the Internet for our own benefit." That's how it seems to us. Think, temes spreading because they must. We are the old machines.

17:36
Now, are we going to pull through? What's going to happen? What does it mean to pull through? Well, there are kind of two ways of pulling through. One that is obviously happening all around us now, is that the temes turn us into teme machines, with these implants, with the drugs, with us merging with the technology. And why would they do that? Because we are self-replicating. We have babies. We make new ones, and so it's convenient to piggyback on us, because we're not yet at the stage on this planet where the other option is viable.

18:11
Although it's closer, I heard this morning, it's closer than I thought it was: where the teme machines themselves will replicate themselves. That way, it wouldn't matter if the planet's climate was utterly destabilized, and it was no longer possible for humans to live here. Because those teme machines, they wouldn't need -- they're not squishy, wet, oxygen-breathing, warmth-requiring creatures. They could carry on without us.

18:35
So, those are the two possibilities. The second, I don't think we're that close. It's coming, but we're not there yet. The first, it's coming too. But the damage that is already being done to the planet is showing us how dangerous the third point is, that third danger point, getting a third replicator. And will we get through this third danger point, like we got through the second and like we got through the first? Maybe we will, maybe we won't. I have no idea.

19:13
(Applause)

19:24
Chris Anderson: That was an incredible talk.

SB: Thank you. I scared myself.

(Laughter)
