ABOUT THE SPEAKER
Bruce Bueno de Mesquita - Political scientist
A consultant to the CIA and the Department of Defense, Bruce Bueno de Mesquita has built an intricate computer model that can predict the outcomes of international conflicts with bewildering accuracy.

Why you should listen

Every motive has a number, says Bruce Bueno de Mesquita. A specialist in foreign policy, international relations and state building, he is also a leading -- if controversial -- scholar of rational choice theory, which says math underlies the nation-scale consequences of individuals acting for personal benefit. He created forecasting technology that has, time and again, exceeded the accuracy of old-school analysis, even with thorny quarrels charged by obscure contenders, and often against odds. (One example: He called the second Intifada two years in advance.)

Bueno de Mesquita's company, Mesquita & Roundell, sells his system's predictions and analysis to influential government and private institutions that need heads-ups on policy. He teaches at NYU and is a senior fellow at the Hoover Institution.

More profile about the speaker
Bruce Bueno de Mesquita | Speaker | TED.com
TED2009

Bruce Bueno de Mesquita: A prediction for the future of Iran

1,045,301 views

Bruce Bueno de Mesquita uses mathematical analysis to predict (very often correctly) such messy human events as war, political power shifts, Intifada ... After a crisp explanation of how he does it, he offers three predictions on the future of Iran.


00:18
What I'm going to try to do is explain to you quickly how to predict, and illustrate it with some predictions about what Iran is going to do in the next couple of years.
00:30
In order to predict effectively, we need to use science. And the reason that we need to use science is because then we can reproduce what we're doing; it's not just wisdom or guesswork. And if we can predict, then we can engineer the future. So if you are concerned to influence energy policy, or you are concerned to influence national security policy, or health policy, or education, science -- and a particular branch of science -- is a way to do it, not the way we've been doing it, which is seat-of-the-pants wisdom.
01:09
Now before I get into how to do it, let me give you a little truth in advertising, because I'm not engaged in the business of magic. There are lots of things that the approach I take can predict, and there are some that it can't. It can predict complex negotiations or situations involving coercion -- that is in essence everything that has to do with politics, much of what has to do with business. But sorry, if you're looking to speculate in the stock market, I don't predict stock markets -- OK, it's not going up any time really soon. But I'm not engaged in doing that. I'm not engaged in predicting random number generators. I actually get phone calls from people who want to know what lottery numbers are going to win. I don't have a clue.
02:00
I engage in the use of game theory. Game theory is a branch of mathematics, and that means, sorry, that even in the study of politics, math has come into the picture. We can no longer pretend that we just speculate about politics; we need to look at this in a rigorous way.
02:18
Now, what is game theory about? It assumes that people are looking out for what's good for them. That doesn't seem terribly shocking -- although it's controversial for a lot of people -- that we are self-interested. In order to look out for what's best for them, or what they think is best for them, people have values: they identify what they want, and what they don't want. And they have beliefs about what other people want and what other people don't want, how much power other people have, how much those people could get in the way of whatever it is that you want. And they face limitations, constraints. They may be weak; they may be located in the wrong part of the world; they may be Einstein, stuck away farming someplace in a rural village in India, not being noticed, as was the case for Ramanujan for a long time, a great mathematician but nobody noticed.
03:12
Now who is rational? A lot of people are worried about what rationality is about. You know, what if people are rational? Mother Teresa, she was rational. Terrorists, they're rational. Pretty much everybody is rational. I think there are only two exceptions that I'm aware of: two-year-olds are not rational -- they have very fickle preferences, they switch what they think all the time -- and schizophrenics are probably not rational. But pretty much everybody else is rational. That is, they are just trying to do what they think is in their own best interest.
03:51
Now in order to work out what people are going to do to pursue their interests, we have to think about who has influence in the world. If you're trying to influence corporations to change their behavior with regard to producing pollutants, one approach, the common approach, is to exhort them to be better, to explain to them what damage they're doing to the planet. And many of you may have noticed that doesn't have as big an effect as perhaps you would like it to have. But if you show them that it's in their interest, then they're responsive.
04:23
So, we have to work out who influences problems. If we're looking at Iran, the president of the United States, we would like to think, may have some influence -- certainly the president in Iran has some influence -- but we make a mistake if we just pay attention to the person at the top of the power ladder, because that person doesn't know much about Iran, or about energy policy, or about health care, or about any particular policy. That person surrounds himself or herself with advisers. If we're talking about national security problems, maybe it's the Secretary of State, maybe it's the Secretary of Defense, the Director of National Intelligence, maybe the ambassador to the United Nations, or somebody else who they think is going to know more about the particular problem. But let's face it, the Secretary of State doesn't know much about Iran. The Secretary of Defense doesn't know much about Iran. Each of those people in turn has advisers who advise them, so they can advise the president. There are lots of people shaping decisions, and so if we want to predict correctly, we have to pay attention to everybody who is trying to shape the outcome, not just the people at the pinnacle of the decision-making pyramid. Unfortunately, a lot of times we don't do that. There's a good reason that we don't do that, and there's a good reason that, using game theory and computers, we can overcome the limitation of just looking at a few people.
05:52
Imagine a problem with just five decision-makers. Imagine for example that Sally over here wants to know what Harry, and Jane, and George and Frank are thinking, and sends messages to those people. Sally's giving her opinion to them, and they're giving their opinion to Sally. But Sally also wants to know what Harry is saying to these three, and what they're saying to Harry. And Harry wants to know what each of those people is saying to each other, and so on, and Sally would like to know what Harry thinks those people are saying. That's a complicated problem; that's a lot to know. With five decision-makers there are a lot of linkages -- 120, as a matter of fact, if you remember your factorials. Five factorial is 120. Now you may be surprised to know that smart people can keep 120 things straight in their head. Suppose we double the number of influencers from five to 10. Does that mean we've doubled the number of pieces of information we need to know, from 120 to 240? No. How about 10 times? To 1,200? No. We've increased it to 3.6 million. Nobody can keep that straight in their head.
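The jump he quotes is factorial, not linear, which is why doubling the number of players overwhelms human analysts. A quick check of the two figures in the talk:

```python
from math import factorial

# Linkages among n decision-makers, counted (as in the talk) as n!.
for n in (5, 10):
    print(n, factorial(n))

# 5!  = 120        -- a smart analyst can hold this in their head
# 10! = 3,628,800  -- the "3.6 million" nobody can track unaided
```

So going from five to ten influencers multiplies the bookkeeping by 30,240, not by two.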
07:09
But computers, they can. They don't need coffee breaks, they don't need vacations, they don't need to go to sleep at night, and they don't ask for raises either. They can keep this information straight, and that means that we can process the information. So I'm going to talk to you about how to process it, and I'm going to give you some examples out of Iran, and you're going to be wondering, "Why should we listen to this guy? Why should we believe what he's saying?"
07:40
So I'm going to show you a factoid. This is an assessment by the Central Intelligence Agency of the percentage of time that the model I'm talking about is right in predicting things whose outcome is not yet known, when the experts who provided the data inputs got it wrong. That's not my claim; that's a CIA claim -- you can read it, it was declassified a while ago. You can read it in a volume edited by H. Bradford Westerfield, Yale University Press.
08:09
So, what do we need to know in order to predict? You may be surprised to find out we don't need to know very much. We do need to know who has a stake in trying to shape the outcome of a decision. We need to know what they say they want -- not what they want in their heart of hearts, not what they think they can get, but what they say they want -- because that is a strategically chosen position, and we can work backwards from that to draw inferences about important features of their decision-making. We need to know how focused they are on the problem at hand. That is, how willing are they to drop what they're doing when the issue comes up, and attend to it instead of something else that's on their plate -- how big a deal is it to them? And how much clout could they bring to bear if they chose to engage on the issue?
09:02
If we know those things, we can predict their behavior by assuming that everybody cares about two things on any decision. They care about the outcome: they'd like an outcome as close to what they are interested in as possible. And they're careerists, so they also care about getting credit -- there's ego involvement; they want to be seen as important in shaping the outcome, or as important, if it's their druthers, in blocking an outcome. And so we have to figure out how they balance those two things. Different people trade off between standing by their outcome, faithfully holding to it, going down in a blaze of glory, or giving it up, putting their finger in the wind, and doing whatever they think is going to be a winning position. Most people fall in between, and if we can work out where they fall, we can work out how to negotiate with them to change their behavior. So with just that little bit of input, we can work out what the choices are that people have, what the chances are that they're willing to take, what they're after, what they value, what they want, and what they believe about other people.
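The three inputs he names per player -- stated position, clout, and salience -- are enough to sketch the simplest version of this kind of forecast: a power- and salience-weighted average of positions, roughly where a negotiation settles before any strategic maneuvering. The players and numbers below are invented for illustration; this is a minimal sketch, not Bueno de Mesquita's actual model, which additionally simulates rounds of bargaining among all the pairs.

```python
# Minimal sketch of a one-shot forecast from the three inputs in the talk.
# Hypothetical players on a 0-200 policy scale (echoing the 100/115/130
# scale used later for Iran); values are made up for illustration.

players = {
    # name: (stated position, clout, salience)
    "hardliner":  (180, 0.6, 0.9),
    "pragmatist": (115, 0.8, 0.7),
    "moderate":   (100, 0.5, 0.4),
}

def weighted_forecast(players):
    """Weighted mean position: sum(pos*clout*salience) / sum(clout*salience)."""
    num = sum(p * c * s for p, c, s in players.values())
    den = sum(c * s for _, c, s in players.values())
    return num / den

print(round(weighted_forecast(players), 1))  # → 139.7
```

Note that the forecast sits between the stated positions, pulled toward whoever combines the most clout with the most focus -- which is why stated positions, not heart-of-hearts desires, are the right input.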
10:10
You might notice what we don't need to know: there's no history in here. How they got to where they are may be important in shaping the input information, but once we know where they are, we're worried about where they're going to be headed in the future. How they got there turns out not to be terribly critical in predicting. I remind you of that 90 percent accuracy rate.
10:33
So where are we going to get this information? We can get this information from the Internet, from The Economist, The Financial Times, The New York Times, U.S. News and World Report, lots of sources like that, or we can get it from asking experts who spend their lives studying places and problems, because those experts know this information. If they don't know who the people are that are trying to influence the decision, how much clout they have, how much they care about the issue, and what they say they want, are they experts? That's what it means to be an expert; that's the basic stuff an expert needs to know.
11:10
All right, let's turn to Iran. Let me make three important predictions -- you can check this out; time will tell. What is Iran going to do about its nuclear weapons program? How secure is the theocratic regime in Iran? What's its future? And everybody's best friend, Ahmadinejad. How are things going for him? How are things going to be working out for him in the next year or two?
11:43
Take a look at this. This is not based on statistics. I want to be very clear here: I'm not projecting some past data into the future. I've taken inputs on positions and so forth, run it through a computer model that simulated the dynamics of interaction, and these are the simulated dynamics, the predictions about the path of policy.
12:04
So you can see here, on the vertical axis -- I haven't shown it all the way down to zero; there are lots of other options, but here I'm just showing you the prediction, so I've narrowed the scale -- up at the top of the axis is "Build the Bomb." At 130, we start somewhere above 130, between building a bomb and making enough weapons-grade fuel so that you could build a bomb. That's where, according to my analyses, the Iranians were at the beginning of this year. And then the model makes predictions down the road. At 115 they would only produce enough weapons-grade fuel to show that they know how, but they wouldn't build a weapon; they would build a research quantity. It would achieve some national pride, but they would not go ahead and build a weapon. And down at 100 they would build civilian nuclear energy, which is what they say is their objective.
12:54
The yellow line shows us the most likely path. The yellow line includes an analysis of 87 decision makers in Iran, and a vast number of outside influencers trying to pressure Iran into changing its behavior: various players in the United States, and Egypt, and Saudi Arabia, and Russia, the European Union, Japan, so on and so forth. The white line reproduces the analysis if the international environment just left Iran to make its own internal decisions, under its own domestic political pressures. That's not going to be happening, but you can see that the line comes down faster if they're not put under international pressure, if they're allowed to pursue their own devices. But in any event, by the end of this year, beginning of next year, we get to a stable equilibrium outcome.
13:42
And that equilibrium is not what the United States would like, but it's probably an equilibrium that the United States can live with, and that a lot of others can live with. And that is that Iran will achieve that nationalist pride by making enough weapons-grade fuel, through research, so that they could show that they know how to make weapons-grade fuel, but not enough to actually build a bomb.
14:08
How is this happening? Over here you can see this is the distribution of power in favor of civilian nuclear energy today; this is what that power bloc is predicted to be like by the late parts of 2010, early parts of 2011. Just about nobody supports research on weapons-grade fuel today, but by 2011 that gets to be a big bloc, and you put these two together, that's the controlling influence in Iran. Out here today, there are a bunch of people -- Ahmadinejad, for example -- who would like not only to build a bomb, but to test a bomb. That power disappears completely; nobody supports that by 2011. These guys are all shrinking; the power is all drifting out here, so the outcome is going to be the weapons-grade fuel.
15:01
Who are the winners and who are the losers in Iran? Take a look at these guys: they're growing in power, and by the way, this was done a while ago, before the current economic crisis, and that's probably going to get steeper. These folks are the moneyed interests in Iran: the bankers, the oil people, the bazaaris. They are growing in political clout as the mullahs are isolating themselves -- with the exception of one group of mullahs, who are not well known to Americans. That's this line here, growing in power; these are what the Iranians call the quietists. These are the ayatollahs, mostly based in Qom, who have great clout in the religious community, have been quiet on politics, and are going to be getting louder, because they see Iran going in an unhealthy direction, a direction contrary to what Khomeini had in mind. Here is Mr. Ahmadinejad. Two things to notice: he's getting weaker, and while he gets a lot of attention in the United States, he is not a major player in Iran. He is on the way down.
16:05
OK, so I'd like you to take a little away from this. Everything is not predictable: the stock market is, at least for me, not predictable, but most complicated negotiations are predictable. Again, whether we're talking health policy, education, environment, energy, litigation, mergers, all of these are complicated problems that are predictable, that this sort of technology can be applied to. And the reason that being able to predict those things is important is not just because you might run a hedge fund and make money off of it, but because if you can predict what people will do, you can engineer what they will do. And if you engineer what they do, you can change the world; you can get a better result.
16:54
I would like to leave you with one thought, which is, for me, the dominant theme of this gathering, and the dominant theme of this way of thinking about the world. When people say to you, "That's impossible," you say back to them, "When you say 'That's impossible,' you're confusing it with 'I don't know how to do it.'" Thank you.

(Applause)
17:25
Chris Anderson: One question for you. That was fascinating. I love that you put it out there. I got very nervous halfway through the talk, though, just panicking whether you'd included in your model the possibility that putting this prediction out there might change the result. We've got 800 people in Tehran who watch TEDTalks.
17:45
Bruce Bueno de Mesquita: I've thought about that, and since I've done a lot of work for the intelligence community, they've also pondered that. It would be a good thing if people paid more attention, took it seriously, and engaged in the same sorts of calculations, because it would change things. But it would change things in two beneficial ways. It would hasten how quickly people arrive at an agreement, and so it would save everybody a lot of grief and time. And it would arrive at an agreement that everybody was happy with, without having to manipulate them so much -- which is basically what I do; I manipulate them. So it would be a good thing.
18:26
CA: So you're kind of trying to say, "People of Iran, this is your destiny; let's go there."

18:30
BBM: Well, people of Iran, this is what many of you are going to evolve to want, and we could get there a lot sooner, and you would suffer a lot less trouble from economic sanctions, and we would suffer a lot less fear of the use of military force on our end, and the world would be a better place.

18:49
CA: Here's hoping they hear it that way. Thank you very much, Bruce.

BBM: Thank you.

(Applause)
