ABOUT THE SPEAKER
Clay Shirky - Social Media Theorist
Clay Shirky argues that the history of the modern world could be rendered as the history of ways of arguing, where changes in media change what sort of arguments are possible -- with deep social and political implications.

Why you should listen

Clay Shirky's work focuses on the rising usefulness of networks -- using decentralized technologies such as peer-to-peer sharing, wireless, software for social creation, and open-source development. New technologies are enabling new kinds of cooperative structures to flourish as a way of getting things done in business, science, the arts and elsewhere, as an alternative to centralized and institutional structures, which he sees as self-limiting. In his writings and speeches he has argued that "a group is its own worst enemy."

Shirky is an adjunct professor in New York University's graduate Interactive Telecommunications Program, where he teaches a course named "Social Weather." He's the author of several books. This spring at the TED headquarters in New York, he gave an impassioned talk against SOPA/PIPA that saw 1 million views in 48 hours.

TEDGlobal 2005

Clay Shirky: Institutions vs. collaboration

1,321,687 views

In this prescient 2005 talk, Clay Shirky shows how closed groups and companies will give way to looser networks where small contributors have big roles and fluid cooperation replaces rigid planning.

00:12
How do groups get anything done? Right? How do you organize a group of individuals so that the output of the group is something coherent and of lasting value, instead of just being chaos? And the economic framing of that problem is called coordination costs. And a coordination cost is essentially all of the financial or institutional difficulties in arranging group output. And we've had a classic answer for coordination costs, which is: if you want to coordinate the work of a group of people, you start an institution, right? You raise some resources. You found something. It can be private or public. It can be for-profit or nonprofit. It can be large or small. But you get these resources together. You found an institution, and you use the institution to coordinate the activities of the group.
00:57
More recently, because the cost of letting groups communicate with each other has fallen through the floor -- and communication costs are one of the big inputs to coordination -- there has been a second answer, which is to put the cooperation into the infrastructure, to design systems that coordinate the output of the group as a by-product of the operating of the system, without regard to institutional models.
01:23
So, that's what I want to talk about today. I'm going to illustrate it with some fairly concrete examples, but always pointing to the broader themes. So, I'm going to start by trying to answer a question that I know each of you will have asked yourself at some point or other, and which the Internet is purpose-built to answer, which is: where can I get a picture of a roller-skating mermaid?
01:41
So, in New York City, on the first Saturday of every summer, Coney Island, our local, charmingly run-down amusement park, hosts the Mermaid Parade. It's an amateur parade; people come from all over the city; people get all dressed up. Some people get less dressed up. Young and old, dancing in the streets. Colorful characters, and a good time is had by all.
02:02
And what I want to call your attention to is not the Mermaid Parade itself, charming though it is, but rather to these photos. I didn't take them. How did I get them? And the answer is: I got them from Flickr.
02:12
Flickr is a photo-sharing service that allows people to take photos, upload them, share them over the Web and so forth. Recently, Flickr has added an additional function called tagging. Tagging was pioneered by Delicious and Joshua Schachter. Delicious is a social bookmarking service. Tagging is a cooperative infrastructure answer to classification. Right? If I had given this talk last year, I couldn't do what I just did, because I couldn't have found those photos. But instead of saying, we need to hire a professional class of librarians to organize these photos once they're uploaded, Flickr simply turned over to the users the ability to characterize the photos.
02:50
So, I was able to go in and draw down photos that had been tagged "Mermaid Parade." There were 3,100 photos taken by 118 photographers, all aggregated and then put under this nice, neat name, shown in reverse chronological order. And I was then able to go and retrieve them to give you that little slideshow.
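For readers who want to see the mechanics, tag-based retrieval like this takes only a few lines against Flickr's public REST API. Here is a minimal sketch, assuming today's "flickr.photos.search" method rather than the 2005-era interface; YOUR_API_KEY is a placeholder for a registered key, and the third-party Python "requests" library is assumed.

import requests  # third-party HTTP library

API_URL = "https://api.flickr.com/services/rest/"

def search_by_tag(tag, api_key, per_page=10):
    """Fetch photos that users themselves have tagged with `tag`."""
    params = {
        "method": "flickr.photos.search",
        "api_key": api_key,
        "tags": tag,
        "sort": "date-posted-desc",  # reverse chronological, as in the talk
        "per_page": per_page,
        "format": "json",
        "nojsoncallback": 1,
    }
    resp = requests.get(API_URL, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()["photos"]["photo"]

# Usage (requires a real key), e.g.:
# for photo in search_by_tag("mermaidparade", "YOUR_API_KEY"):
#     print(photo["title"])

The point of the sketch is how little machinery sits between "users tagged their own photos" and "anyone can aggregate them": nobody had to plan the classification in advance.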
03:08
Now, what hard problem is being solved here? And it's -- in the most schematic possible view, it's a coordination problem, right? There are a large number of people on the Internet; a very small fraction of them have photos of the Mermaid Parade. How do we get those people together to contribute that work? The classic answer is to form an institution, right? To draw those people into some prearranged structure that has explicit goals.
03:34
And I want to call your attention to some of the side effects of going the institutional route. First of all, when you form an institution, you take on a management problem, right? It's no good just hiring employees; you also have to hire other employees to manage those employees and to enforce the goals of the institution and so forth. Secondly, you have to bring structure into place. Right? You have to have economic structure. You have to have legal structure. You have to have physical structure. And that creates additional costs. Third, forming an institution is inherently exclusionary. You notice we haven't got everybody who has a photo. You can't hire everyone in a company, right? You can't recruit everyone into a governmental organization. You have to exclude some people.
04:20
And fourth, as a result of that exclusion, you end up with a professional class. Look at the change here. We've gone from people with photos to photographers. Right? We've created a professional class of photographers whose goal is to go out and photograph the Mermaid Parade, or whatever else they're sent out to photograph.
04:40
When you build cooperation into the infrastructure, which is the Flickr answer, you can leave the people where they are, and you take the problem to the individuals rather than moving the individuals to the problem. You arrange the coordination in the group, and by doing that you get the same outcome without the institutional difficulties. You lose the institutional imperative. You lose the right to shape people's work when it's volunteer effort, but you also shed the institutional cost, which gives you greater flexibility.
05:14
What Flickr does is it replaces planning with coordination. And this is a general aspect of these cooperative systems. Right. You'll have experienced this in your life when you bought your first mobile phone and you stopped making plans. You just said, "I'll call you when I get there." "Call me when you get off work." Right? That is a point-to-point replacement of planning with coordination. Right. We're now able to do that kind of thing with groups. Instead of saying, we must make an advance plan, we must have a five-year projection of where the Wikipedia is going to be, or whatever, you can just say: let's coordinate the group effort, and let's deal with it as we go, because we're now well-enough coordinated that we don't have to take on the problems of deciding in advance what to do.
06:00
So here's another example. This one's somewhat more somber. These are photos on Flickr tagged "Iraq." And everything that was hard about the coordination cost with the Mermaid Parade is even harder here. There are more pictures. There are more photographers. It's taken over a wider geographic area. The photos are spread out over a longer period of time. And worst of all, that figure at the bottom, approximately ten photos per photographer, is a lie. It's mathematically true, but it doesn't really talk about anything important, because in these systems the average isn't really what matters. What matters is this.
06:43
This is a graph of photographs tagged Iraq as taken by the 529 photographers who contributed the 5,445 photos. And it's ranked in order of number of photos taken per photographer. You can see here, over at the end, our most prolific photographer has taken around 350 photos, and you can see there's a few people who have taken hundreds of photos. Then there's dozens of people who've taken dozens of photos. And by the time we get around here, we get ten or fewer photos, and then there's this long, flat tail. And by the time you get to the middle, you've got hundreds of people who have contributed only one photo each.
07:25
This is called a power-law distribution. It appears often in unconstrained social systems where people are allowed to contribute as much or as little as they like -- this is often what you get. Right?
07:38
The math behind the power-law distribution is that whatever's in the nth position is doing about one-nth of whatever's being measured, relative to the person in the first position. So, we'd expect the tenth most prolific photographer to have contributed about a tenth of the photos, and the hundredth most prolific photographer to have contributed only about a hundredth as many photos as the most prolific photographer did. So, the head of the curve can be sharper or flatter. But that basic math accounts both for the steep slope and for the long, flat tail.
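As a quick numeric sketch of that one-nth rule -- an idealized 1/n ("Zipf") curve, not the actual Flickr data -- if the most prolific photographer took about 350 photos, the nth-ranked photographer takes about 350/n:

# Idealized 1/n (Zipf) contributions, scaled so rank 1 matches
# the ~350 photos of the most prolific photographer in the talk.
top = 350
for rank in (1, 2, 10, 100, 529):
    print(f"rank {rank:>3}: ~{top / rank:.0f} photos")
# rank   1: ~350 photos
# rank   2: ~175 photos
# rank  10: ~35 photos   <- about a tenth of the leader
# rank 100: ~4 photos    <- about a hundredth
# rank 529: ~1 photo     <- the long, flat tail of one-photo contributors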
08:07
And curiously, in these systems, as they grow larger, the systems don't converge; they diverge more. In bigger systems, the head gets bigger and the tail gets longer, so the imbalance increases.
08:21
You can see the curve is obviously heavily left-weighted. Here's how heavily: if you take the top 10 percent of photographers contributing to this system, they account for three quarters of the photos taken -- just the top 10 percent most prolific photographers. If you go down to five percent, you're still accounting for 60 percent of the photos. If you go down to one percent, exclude 99 percent of the group effort, you're still accounting for almost a quarter of the photos.
08:50
And because of this left weighting, the average is actually here, way to the left. And that sounds strange to our ears, but what ends up happening is that 80 percent of the contributors have contributed a below-average amount. That sounds strange because we expect average and middle to be about the same, but they're not at all.
09:10
This is the math underlying the 80/20 rule. Right? Whenever you hear anybody talking about the 80/20 rule, this is what's going on. Right? 20 percent of the merchandise accounts for 80 percent of the revenue, 20 percent of the users use 80 percent of the resources -- this is the shape people are talking about when that happens.
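A short computation shows how that kind of concentration falls out of the 1/n math alone. This uses the same idealized Zipf assumption as the sketch above, so it lands only in the ballpark of the talk's real Flickr figures:

# Cumulative shares under an idealized 1/n distribution,
# with N = 529 contributors as in the Iraq-tag example.
N = 529
weights = [1 / rank for rank in range(1, N + 1)]
total = sum(weights)

def top_share(fraction):
    """Fraction of all output produced by the top `fraction` of contributors."""
    k = max(1, int(N * fraction))
    return sum(weights[:k]) / total

print(f"top 10%: {top_share(0.10):.0%} of photos")  # ~66% (talk's real data: ~75%)
print(f"top  1%: {top_share(0.01):.0%} of photos")  # ~33% (talk: almost a quarter)

# Most contributors sit below the average contribution.
mean = total / N
below = sum(w < mean for w in weights) / N
print(f"below-average contributors: {below:.0%}")   # ~85% (talk: ~80%)

The exact percentages differ because the real distribution isn't a perfect 1/n, but the qualitative facts survive: the top decile dominates, and most contributors are below average.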
09:29
Institutions only have two tools: carrots and sticks. And the 80 percent zone is a no-carrot and no-stick zone. The costs of running the institution mean that you cannot take on the work of those people easily in an institutional frame. The institutional model always pushes leftwards, treating these people as employees. The institutional response is: I can get 75 percent of the value for 10 percent of the hires -- great, that's what I'll do. The cooperative infrastructure model says: why do you want to give up a quarter of the value? If your system is designed so that you have to give up a quarter of the value, re-engineer the system. Don't take on the cost that prevents you from getting to the contributions of these people. Build the system so that anybody can contribute at any amount.
10:24
So the coordination response asks not, how are these people as employees, but rather, what is their contribution like? Right? We have over here Psycho Milt, a Flickr user, who has contributed one, and only one, photo titled "Iraq." And here's the photo. Right. Labeled, "Bad Day at Work." Right? So the question is: do you want that photo? Yes or no. The question is not: is Psycho Milt a good employee?
10:57
And the tension here is between institution as enabler and institution as obstacle. When you're dealing with the left-hand edge of one of these distributions, when you're dealing with the people who spend a lot of time producing a lot of the material you want, that's an institution-as-enabler world. You can hire those people as employees, you can coordinate their work and you can get some output. But when you're down here, where the Psycho Milts of the world are adding one photo at a time, that's institution as obstacle.
11:27
Institutions hate being told they're obstacles. One of the first things that happens when you institutionalize a problem is that the first goal of the institution immediately shifts from whatever the nominal goal was to self-preservation. And the actual goal of the institution goes to two through n. Right? So, when institutions are told they are obstacles, and that there are other ways of coordinating the value, they go through something a little bit like the Kübler-Ross stages -- (Laughter) -- of reaction on being told you have a fatal illness: denial, anger, bargaining, acceptance.
12:04
Most of the cooperative systems we've seen haven't been around long enough to have gotten to the acceptance phase. Many, many institutions are still in denial, but we're seeing recently a lot of both anger and bargaining. There's a wonderful, small example going on right now. In France, a bus company is suing people for forming a carpool, right, because the fact that they have coordinated themselves to create cooperative value is depriving the company of revenue. You can follow this in the Guardian. It's actually quite entertaining.
12:38
The bigger question is: what do you do about the value down here? Right? How do you capture that? And institutions, as I've said, are prevented from capturing that.
12:50
Steve Ballmer, now CEO of Microsoft, was criticizing Linux a couple of years ago, and he said, "Oh, this business of thousands of programmers contributing to Linux, this is a myth. We've looked at who's contributed to Linux, and most of the patches have been produced by programmers who've only done one thing." Right? You can hear this distribution under that complaint. And you can see why, from Ballmer's point of view, that's a bad idea, right? We hired this programmer, he came in, he drank our Cokes and played Foosball for three years and he had one idea. (Laughter) Right? Bad hire. Right? (Laughter)
13:26
The Psycho Milt question is: was it a good idea? What if it was a security patch? What if it was a security patch for a buffer overflow exploit, of which Windows has not some, [but] several? Do you want that patch, right? The fact that a single programmer can, without having to move into a professional relation to an institution, improve Linux once and never be heard from again, should terrify Ballmer.
13:55
Because this kind of value is unreachable in classic institutional frameworks, but is part of cooperative systems of open-source software, of file sharing, of the Wikipedia. I've used a lot of examples from Flickr, but there are actually stories about this from all over.
14:10
Take Meetup, a service founded so that users could find people in their local area who share their interests and affinities and actually have a real-world meeting offline, in a cafe or a pub or what have you. When Scott Heiferman founded Meetup, he thought it would be used for, you know, train spotters and cat fanciers -- classic affinity groups. The inventors don't know what the invention is.
14:30
Number one group on Meetup right now, most chapters in most cities with most members, most active? Stay-at-home moms. Right? In the suburbanized, dual-income United States, stay-at-home moms are actually missing the social infrastructure that comes from extended family and local, small-scale neighborhoods. So they're reinventing it, using these tools. Meetup is the platform, but the value here is in social infrastructure.
14:56
If you want to know what technology is going to change the world, don't pay attention to 13-year-old boys -- pay attention to young mothers, because they have got not an ounce of support for technology that doesn't materially make their lives better. This is so much more important than Xbox, but it's a lot less glitzy.
15:13
I think this is a revolution. I think that this is a really profound change in the way human affairs are arranged. And I use that word advisedly. It's a revolution in that it's a change in equilibrium. It's a whole new way of doing things, which includes new downsides.
15:30
In the United States right now, a woman named Judith Miller is in jail for not having given her sources to a federal grand jury -- she's a reporter for the New York Times -- in a very abstract and hard-to-follow case. And journalists are in the street rallying to improve the shield laws. The shield laws are our laws -- pretty much a patchwork of state laws -- that prevent a journalist from having to betray a source.
15:52
This is happening, however, against the background of the rise of Web logging. Web logging is a classic example of mass amateurization. It has de-professionalized publishing. Want to publish globally anything you think today? It is a one-button operation that you can do for free. That has sent the professional class of publishing down into the ranks of mass amateurization. And so the shield law, as much as we want it -- we want a professional class of truth-tellers -- it is becoming increasingly incoherent, because the institution is becoming incoherent.
16:28
There are people in the States right now tying themselves into knots, trying to figure out whether or not bloggers are journalists. And the answer to that question is: it doesn't matter, because that's not the right question. Journalism was an answer to an even more important question, which is: how will society be informed? How will it share ideas and opinions? And if there is an answer to that that happens outside the professional framework of journalism, it makes no sense to take a professional metaphor and apply it to this distributed class.
17:02
So as much as we want the shield laws, the background -- the institution to which they were attached -- is becoming incoherent.
17:10
Here's another example. Pro-ana, the pro-ana groups. These are groups of teenage girls who have taken up Web logs, bulletin boards and other kinds of cooperative infrastructure, and have used them to set up support groups for remaining anorexic by choice. They post pictures of thin models, which they call "thinspiration." They have little slogans, like "Salvation through Starvation." They even have Lance Armstrong-style bracelets, these red bracelets, which signify, in the small group, "I am trying to maintain my eating disorder." They trade tips, like, "If you feel like eating something, clean a toilet or the litter box. The feeling will pass."
17:46
We're used to support groups being beneficial. We have an attitude that support groups are inherently beneficial. But it turns out that the logic of the support group is value neutral. A support group is simply a small group that wants to maintain a way of living in the context of a larger group. Now, when the larger group is a bunch of drunks, and the small group wants to stay sober, then we think, that's a great support group. But when the small group is teenage girls who want to stay anorexic by choice, then we're horrified.
18:15
What's happened is that the normative goals of the support groups that we're used to came from the institutions that were framing them, and not from the infrastructure. Once the infrastructure becomes generically available, the logic of the support group is revealed to be accessible to anyone, including people pursuing these kinds of goals.
18:35
So, there are significant downsides to these changes as well as upsides. And of course, in the current environment, one need allude only lightly to the work of non-state actors trying to influence global affairs, and taking advantage of these tools.
This is a social map of the hijackers and their associates
383
1116000
3000
18:51
who perpetrated the 9/11 attack.
384
1119000
4000
18:55
It was produced by analyzing their communications patterns
385
1123000
4000
18:59
using a lot of these tools. And doubtless the intelligence communities of the world
386
1127000
3000
19:02
are doing the same work today for the attacks of last week.
387
1130000
4000
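The mapping technique itself is ordinary graph analysis. Here is a toy sketch using the third-party networkx library and made-up communication records; it illustrates the general method, not the analysts' actual tools or data.

import networkx as nx  # third-party graph library

# Hypothetical records: each pair (a, b) means "a was observed
# communicating with b". These names are invented for illustration.
contacts = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]

G = nx.Graph()
G.add_edges_from(contacts)

# Degree centrality: who sits at the hub of the communication pattern?
for person, score in sorted(nx.degree_centrality(G).items(),
                            key=lambda kv: -kv[1]):
    print(f"{person}: {score:.2f}")
# "C" scores highest here: it touches the most other nodes in the toy graph.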
19:06
Now, this is the part of the talk where I tell you what's going to come as a result of all of this, but I'm running out of time, which is good, because I don't know. (Laughter)
19:17
Right. As with the printing press, if it's really a revolution, it doesn't take us from Point A to Point B. It takes us from Point A to chaos. The printing press precipitated 200 years of chaos, moving from a world where the Catholic Church was the sort of organizing political force to the Treaty of Westphalia, when we finally knew what the new unit was: the nation state.
19:40
Now, I'm not predicting 200 years of chaos as a result of this. Fifty. Fifty years in which loosely coordinated groups are going to be given increasingly high leverage, and the more those groups forgo traditional institutional imperatives -- like deciding in advance what's going to happen, or the profit motive -- the more leverage they'll get. And institutions are going to come under an increasing degree of pressure, and the more rigidly managed, and the more they rely on information monopolies, the greater the pressure is going to be. And that's going to happen one arena at a time, one institution at a time. The forces are general, but the results are going to be specific.
20:19
And so the point here is not, "This is wonderful," or "We're going to see a transition from only institutions to only cooperative frameworks." It's going to be much more complicated than that. But the point is that it's going to be a massive readjustment. And since we can see it in advance and know it's coming, my argument is essentially: we might as well get good at it.

20:37
Thank you very much. (Applause)
