ABOUT THE SPEAKER
Ray Dalio - Hedge fund chair
Ray Dalio is the founder, chair and co-chief investment officer of Bridgewater Associates, a global leader in institutional portfolio management and the largest hedge fund in the world.

Why you should listen

Dalio started Bridgewater out of his two-bedroom apartment in New York City in 1975 and has grown it into the fifth most important private company in the U.S. (according to Fortune magazine). Because of the firm’s many industry-changing innovations over its 40-year history, he has been called the “Steve Jobs of investing” by aiCIO magazine and named one of TIME magazine’s "100 Most Influential People."

Dalio attributes Bridgewater’s success to its unique culture. He describes it as “a believability-weighted idea meritocracy” in which the people strive for “meaningful work and meaningful relationships through radical truth and radical transparency.” He has explained this approach in his book Principles, which has been downloaded more than three million times and has produced considerable curiosity and controversy.

TED2017

Ray Dalio: How to build a company where the best ideas win

Filmed:
3,449,614 views

What if you knew what your coworkers really thought about you and what they were really like? Ray Dalio makes the business case for using radical transparency and algorithmic decision-making to create an idea meritocracy where people can speak up and say what they really think -- even calling out the boss is fair game. Learn more about how these strategies helped Dalio create one of the world's most successful hedge funds and how you might harness the power of data-driven group decision-making.

00:12
Whether you like it or not, radical transparency and algorithmic decision-making is coming at you fast, and it's going to change your life. That's because it's now easy to take algorithms and embed them into computers and gather all that data that you're leaving on yourself all over the place, and know what you're like, and then direct the computers to interact with you in ways that are better than most people can. Well, that might sound scary. I've been doing this for a long time and I have found it to be wonderful. My objective has been to have meaningful work and meaningful relationships with the people I work with, and I've learned that I couldn't have that unless I had that radical transparency and that algorithmic decision-making. I want to show you why that is, I want to show you how it works. And I warn you that some of the things that I'm going to show you probably are a little bit shocking.

01:05
Since I was a kid,
I've had a terrible rote memory.
20
53760
3480
01:10
And I didn't like following instructions,
21
58120
2176
01:12
I was no good at following instructions.
22
60320
2416
01:14
But I loved to figure out
how things worked for myself.
23
62760
3000
01:18
When I was 12,
24
66680
1376
01:20
I hated school but I fell in love
with trading the markets.
25
68080
3280
01:23
I caddied at the time,
26
71920
1656
01:25
earned about five dollars a bag.
27
73600
1576
01:27
And I took my caddying money,
and I put it in the stock market.
28
75200
3200
01:31
And that was just because
the stock market was hot at the time.
29
79240
3376
01:34
And the first company I bought
30
82640
1456
01:36
was a company by the name
of Northeast Airlines.
31
84120
2600
01:39
Northeast Airlines was
the only company I heard of
32
87360
2736
01:42
that was selling for less
than five dollars a share.
33
90120
2696
01:44
(Laughter)
34
92840
1976
01:46
And I figured I could buy more shares,
35
94840
1856
01:48
and if it went up, I'd make more money.
36
96720
2096
01:50
So, it was a dumb strategy, right?
37
98840
2840
01:54
But I tripled my money,
38
102360
1456
01:55
and I tripled my money
because I got lucky.
39
103840
2120
01:58
The company was about to go bankrupt,
40
106520
1816
02:00
but some other company acquired it,
41
108360
2096
02:02
and I tripled my money.
42
110480
1456
02:03
And I was hooked.
43
111960
1200
02:05
And I thought, "This game is easy."
44
113720
2280
02:09
With time,
45
117200
1216
02:10
I learned this game is anything but easy.
46
118440
1960
02:12
In order to be an effective investor,
47
120880
2136
02:15
one has to bet against the consensus
48
123040
2896
02:17
and be right.
49
125960
1256
02:19
And it's not easy to bet
against the consensus and be right.
50
127240
2856
02:22
One has to bet against
the consensus and be right
51
130120
2336
02:24
because the consensus
is built into the price.
52
132480
2640
02:28
And in order to be an entrepreneur,
53
136120
2456
02:30
a successful entrepreneur,
54
138600
1616
02:32
one has to bet against
the consensus and be right.
55
140240
3480
02:37
I had to be an entrepreneur
and an investor --
56
145400
2936
02:40
and what goes along with that
is making a lot of painful mistakes.
57
148360
4200
02:45
So I made a lot of painful mistakes,
58
153440
2816
02:48
and with time,
59
156280
1256
02:49
my attitude about those mistakes
began to change.
60
157560
2960
02:53
I began to think of them as puzzles.
61
161160
2096
02:55
That if I could solve the puzzles,
62
163280
1936
02:57
they would give me gems.
63
165240
1440
02:59
And the puzzles were:
64
167160
1656
03:00
What would I do differently in the future
so I wouldn't make that painful mistake?
65
168840
3880
03:05
And the gems were principles
66
173280
2576
03:07
that I would then write down
so I would remember them
67
175880
3136
03:11
that would help me in the future.
68
179040
1572
03:13
And because I wrote them down so clearly,
69
181000
2696
03:15
I could then --
70
183720
1336
03:17
eventually discovered --
71
185080
1576
03:18
I could then embed them into algorithms.
72
186680
3760
03:23
And those algorithms
would be embedded in computers,
73
191400
3456
03:26
and the computers would
make decisions along with me;
74
194880
3336
03:30
and so in parallel,
we would make these decisions.
75
198240
3136
03:33
And I could see how those decisions
then compared with my own decisions,
76
201400
3976
03:37
and I could see that
those decisions were a lot better.
77
205400
3096
03:40
And that was because the computer
could make decisions much faster,
78
208520
4736
03:45
it could process a lot more information
79
213280
2256
03:47
and it can process decisions much more --
80
215560
3400
03:51
less emotionally.
81
219880
1200
03:54
So it radically improved
my decision-making.
82
222760
3920
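To make that concrete: the talk doesn't show Bridgewater's actual systems, but the idea he describes -- write a principle down as an explicit rule, run it in parallel with your own judgment, and compare the two decision tracks -- can be sketched in a few lines of Python. Everything here (the rule, the data fields, the numbers) is invented for illustration; this is a sketch of the pattern, not the real thing.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """Hypothetical market snapshot; all fields invented."""
    price: float
    consensus_forecast: float
    my_forecast: float

def bet_against_consensus(obs: Observation) -> str:
    """A written-down principle as an explicit, testable rule: only a
    view that differs from the consensus built into the price is
    worth acting on. (Invented rule, not Bridgewater's.)"""
    if obs.my_forecast > obs.consensus_forecast:
        return "buy"
    if obs.my_forecast < obs.consensus_forecast:
        return "sell"
    return "hold"

def decide_in_parallel(obs: Observation, human_decision: str) -> dict:
    """Run the algorithm alongside the human and record both
    decisions so the two tracks can be compared over time."""
    machine_decision = bet_against_consensus(obs)
    return {"machine": machine_decision,
            "human": human_decision,
            "agree": machine_decision == human_decision}

# One logged decision pair; disagreements are where the learning is.
log = [decide_in_parallel(Observation(5.0, 5.2, 4.8), "hold")]
print(log[0])  # {'machine': 'sell', 'human': 'hold', 'agree': False}
```
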
04:00
Eight years after I started Bridgewater,
83
228440
4896
04:05
I had my greatest failure,
84
233360
1536
04:06
my greatest mistake.
85
234920
1200
04:09
It was late 1970s,
86
237680
2136
04:11
I was 34 years old,
87
239840
1976
04:13
and I had calculated that American banks
88
241840
3656
04:17
had lent much more money
to emerging countries
89
245520
2856
04:20
than those countries
were going to be able to pay back
90
248400
2816
04:23
and that we would have
the greatest debt crisis
91
251240
2696
04:25
since the Great Depression.
92
253960
1360
04:28
And with it, an economic crisis
93
256200
2216
04:30
and a big bear market in stocks.
94
258440
2040
04:33
It was a controversial view at the time.
95
261680
2000
04:36
People thought it was
kind of a crazy point of view.
96
264160
2440
04:39
But in August 1982,
97
267480
2216
04:41
Mexico defaulted on its debt,
98
269720
1960
04:44
and a number of other countries followed.
99
272520
2256
04:46
And we had the greatest debt crisis
since the Great Depression.
100
274800
3400
04:51
And because I had anticipated that,
101
279080
2776
04:53
I was asked to testify to Congress
and appear on "Wall Street Week,"
102
281880
4336
04:58
which was the show of the time.
103
286240
1976
05:00
Just to give you a flavor of that,
I've got a clip here,
104
288240
2936
05:03
and you'll see me in there.
105
291200
1920
05:06
(Video) Mr. Chairman, Mr. Mitchell,
106
294480
1696
05:08
it's a great pleasure and a great honor
to be able to appear before you
107
296200
3376
05:11
in examination with what
is going wrong with our economy.
108
299600
3480
05:15
The economy is now flat --
109
303640
1936
05:17
teetering on the brink of failure.
110
305600
2136
05:19
Martin Zweig: You were recently
quoted in an article.
111
307760
2496
05:22
You said, "I can say this
with absolute certainty
112
310280
2336
05:24
because I know how markets work."
113
312640
1616
05:26
Ray Dalio: I can say
with absolute certainty
114
314280
2096
05:28
that if you look at the liquidity base
115
316400
1856
05:30
in the corporations
and the world as a whole,
116
318280
3376
05:33
that there's such reduced
level of liquidity
117
321680
2096
05:35
that you can't return
to an era of stagflation."
118
323800
3216
05:39
I look at that now, I think,
"What an arrogant jerk!"
119
327040
3096
05:42
(Laughter)
120
330160
2000
05:45
I was so arrogant, and I was so wrong.
121
333760
2456
05:48
I mean, while the debt crisis happened,
122
336240
2576
05:50
the stock market and the economy
went up rather than going down,
123
338840
3976
05:54
and I lost so much money
for myself and for my clients
124
342840
5016
05:59
that I had to shut down
my operation pretty much,
125
347880
3416
06:03
I had to let almost everybody go.
126
351320
1880
06:05
And these were like extended family,
127
353640
1736
06:07
I was heartbroken.
128
355400
1616
06:09
And I had lost so much money
129
357040
1816
06:10
that I had to borrow
4,000 dollars from my dad
130
358880
3336
06:14
to help to pay my family bills.
131
362240
1920
06:16
It was one of the most painful
experiences of my life ...
132
364840
3160
06:21
but it turned out to be
one of the greatest experiences of my life
133
369240
3776
06:25
because it changed my attitude
about decision-making.
134
373040
2680
06:28
Rather than thinking, "I'm right,"
135
376360
3056
06:31
I started to ask myself,
136
379440
1576
06:33
"How do I know I'm right?"
137
381040
1800
06:36
I gained a humility that I needed
138
384480
1936
06:38
in order to balance my audacity.
139
386440
2560
06:41
I wanted to find the smartest
people who would disagree with me
140
389880
4216
06:46
to try to understand their perspective
141
394120
1896
06:48
or to have them
stress test my perspective.
142
396040
2600
06:51
I wanted to make an idea meritocracy.
143
399400
2776
06:54
In other words,
144
402200
1216
06:55
not an autocracy in which
I would lead and others would follow
145
403440
3816
06:59
and not a democracy in which everybody's
points of view were equally valued,
146
407280
3616
07:02
but I wanted to have an idea meritocracy
in which the best ideas would win out.
147
410920
5096
07:08
And in order to do that,
148
416040
1256
07:09
I realized that we would need
radical truthfulness
149
417320
3576
07:12
and radical transparency.
150
420920
1616
07:14
What I mean by radical truthfulness
and radical transparency
151
422560
3856
07:18
is people needed to say
what they really believed
152
426440
2656
07:21
and to see everything.
153
429120
2000
07:23
And we literally
tape almost all conversations
154
431480
3936
07:27
and let everybody see everything,
155
435440
1616
07:29
because if we didn't do that,
156
437080
1416
07:30
we couldn't really have
an idea meritocracy.
157
438520
3080
07:34
In order to have an idea meritocracy,
158
442760
3696
07:38
we have let people speak
and say what they want.
159
446480
2376
07:40
Just to give you an example,
160
448880
1376
07:42
this is an email from Jim Haskel --
161
450280
2696
07:45
somebody who works for me --
162
453000
1376
07:46
and this was available
to everybody in the company.
163
454400
3376
07:49
"Ray, you deserve a 'D-'
164
457800
2536
07:52
for your performance
today in the meeting ...
165
460360
2256
07:54
you did not prepare at all well
166
462640
1696
07:56
because there is no way
you could have been that disorganized."
167
464360
3560
08:01
Isn't that great?
168
469520
1216
08:02
(Laughter)
169
470760
1216
08:04
That's great.
170
472000
1216
08:05
It's great because, first of all,
I needed feedback like that.
171
473240
2936
08:08
I need feedback like that.
172
476200
1616
08:09
And it's great because if I don't let Jim,
and people like Jim,
173
477840
3456
08:13
to express their points of view,
174
481320
1576
08:14
our relationship wouldn't be the same.
175
482920
2056
08:17
And if I didn't make that public
for everybody to see,
176
485000
3056
08:20
we wouldn't have an idea meritocracy.
177
488080
1960
08:23
So for that last 25 years
that's how we've been operating.
178
491760
3280
08:27
We've been operating
with this radical transparency
179
495640
3056
08:30
and then collecting these principles,
180
498720
2296
08:33
largely from making mistakes,
181
501040
2056
08:35
and then embedding
those principles into algorithms.
182
503120
4416
08:39
And then those algorithms provide --
183
507560
2696
08:42
we're following the algorithms
184
510280
2016
08:44
in parallel with our thinking.
185
512320
1440
08:47
That has been how we've run
the investment business,
186
515280
3176
08:50
and it's how we also deal
with the people management.
187
518480
2736
08:53
In order to give you a glimmer
into what this looks like,
188
521240
3736
08:57
I'd like to take you into a meeting
189
525000
2336
08:59
and introduce you to a tool of ours
called the "Dot Collector"
190
527360
3136
09:02
that helps us do this.
191
530520
1280
09:07
A week after the US election,
192
535640
2176
09:09
our research team held a meeting
193
537840
2096
09:11
to discuss what a Trump presidency
would mean for the US economy.
194
539960
3320
09:16
Naturally, people had
different opinions on the matter
195
544000
2856
09:18
and how we were
approaching the discussion.
196
546880
2040
09:21
The "Dot Collector" collects these views.
197
549840
2776
09:24
It has a list of a few dozen attributes,
198
552640
2296
09:26
so whenever somebody thinks something
about another person's thinking,
199
554960
4016
09:31
it's easy for them
to convey their assessment;
200
559000
2936
09:33
they simply note the attribute
and provide a rating from one to 10.
201
561960
4520
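The Dot Collector itself is proprietary, but the data model just described -- a "dot" is one person's one-to-10 rating of another person on a named attribute, and the dots accumulate into a per-person profile -- is simple to sketch. In the hypothetical Python below, the attribute names are invented; the talk only says there are a few dozen of them.

```python
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

# A few of the "few dozen attributes" -- names invented here.
ATTRIBUTES = {"open-mindedness", "assertiveness", "synthesis"}

@dataclass(frozen=True)
class Dot:
    """One assessment: rater scores ratee on one attribute, 1-10."""
    rater: str
    ratee: str
    attribute: str
    score: int

class DotCollector:
    """Collects dots during a meeting and aggregates them per person."""
    def __init__(self) -> None:
        self.dots: list[Dot] = []

    def add(self, dot: Dot) -> None:
        if dot.attribute not in ATTRIBUTES or not 1 <= dot.score <= 10:
            raise ValueError("unknown attribute or score out of range")
        self.dots.append(dot)

    def profile(self, ratee: str) -> dict[str, float]:
        """Average score per attribute for one person."""
        by_attr: dict[str, list[int]] = defaultdict(list)
        for d in self.dots:
            if d.ratee == ratee:
                by_attr[d.attribute].append(d.score)
        return {attr: mean(scores) for attr, scores in by_attr.items()}

# Mirroring the talk: Jen rates Ray a 3 on one attribute.
collector = DotCollector()
collector.add(Dot(rater="Jen", ratee="Ray", attribute="open-mindedness", score=3))
print(collector.profile("Ray"))  # {'open-mindedness': 3}
```
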
09:39
For example, as the meeting began,
202
567520
2256
09:41
a researcher named Jen rated me a three --
203
569800
3120
09:45
in other words, badly --
204
573640
2016
09:47
(Laughter)
205
575680
1376
09:49
for not showing a good balance
of open-mindedness and assertiveness.
206
577080
4160
09:54
As the meeting transpired,
207
582080
1456
09:55
Jen's assessments of people
added up like this.
208
583560
3240
09:59
Others in the room
have different opinions.
209
587920
2176
10:02
That's normal.
210
590120
1216
10:03
Different people are always
going to have different opinions.
211
591360
2920
10:06
And who knows who's right?
212
594800
1400
10:09
Let's look at just what people thought
about how I was doing.
213
597240
3440
10:13
Some people thought I did well,
214
601600
2216
10:15
others, poorly.
215
603840
1200
10:18
With each of these views,
216
606080
1336
10:19
we can explore the thinking
behind the numbers.
217
607440
2320
10:22
Here's what Jen and Larry said.
218
610520
2160
10:25
Note that everyone
gets to express their thinking,
219
613760
2616
10:28
including their critical thinking,
220
616400
1656
10:30
regardless of their position
in the company.
221
618080
2120
10:33
Jen, who's 24 years old
and right out of college,
222
621120
3096
10:36
can tell me, the CEO,
that I'm approaching things terribly.
223
624240
2840
10:40
This tool helps people
both express their opinions
224
628480
3776
10:44
and then separate themselves
from their opinions
225
632280
3096
10:47
to see things from a higher level.
226
635400
2040
10:50
When Jen and others shift their attentions
from inputting their own opinions
227
638640
4896
10:55
to looking down on the whole screen,
228
643560
2576
10:58
their perspective changes.
229
646160
1720
11:00
They see their own opinions
as just one of many
230
648680
3136
11:03
and naturally start asking themselves,
231
651840
2536
11:06
"How do I know my opinion is right?"
232
654400
2000
11:09
That shift in perspective is like going
from seeing in one dimension
233
657480
4056
11:13
to seeing in multiple dimensions.
234
661560
2256
11:15
And it shifts the conversation
from arguing over our opinions
235
663840
4096
11:19
to figuring out objective criteria
for determining which opinions are best.
236
667960
4400
11:24
Behind the "Dot Collector"
is a computer that is watching.
237
672920
3600
11:29
It watches what all
these people are thinking
238
677120
2176
11:31
and it correlates that
with how they think.
239
679320
2576
11:33
And it communicates advice
back to each of them based on that.
240
681920
3520
11:38
Then it draws the data
from all the meetings
241
686520
3416
11:41
to create a pointilist painting
of what people are like
242
689960
3216
11:45
and how they think.
243
693200
1240
11:47
And it does that guided by algorithms.
244
695160
2720
11:50
Knowing what people are like helps
to match them better with their jobs.
245
698800
3760
11:55
For example,
246
703120
1216
11:56
a creative thinker who is unreliable
247
704360
1736
11:58
might be matched up with someone
who's reliable but not creative.
248
706120
3080
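The talk doesn't say how that matching is computed. One plausible toy version, assuming each person's dots have already been averaged into a per-attribute profile like the one above, is to score a pair by how well one person's strengths cover the other's weaknesses. The heuristic below is invented purely for illustration.

```python
def complementarity(a: dict[str, float], b: dict[str, float]) -> float:
    """Score a pair by taking the higher of the two scores on each
    shared attribute and summing: strengths cover weaknesses.
    (Invented heuristic, for illustration only.)"""
    shared = a.keys() & b.keys()
    return sum(max(a[attr], b[attr]) for attr in shared)

# A creative-but-unreliable profile pairs better with a
# reliable-but-uncreative one than with another creative one.
creative = {"creativity": 9.0, "reliability": 3.0}
reliable = {"creativity": 3.0, "reliability": 9.0}
another_creative = {"creativity": 8.0, "reliability": 2.0}
print(complementarity(creative, reliable))          # 18.0
print(complementarity(creative, another_creative))  # 12.0
```
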
12:02
Knowing what people are like
also allows us to decide
249
710280
3336
12:05
what responsibilities to give them
250
713640
2256
12:07
and to weigh our decisions
based on people's merits.
251
715920
3480
12:12
We call it their believability.
252
720040
1600
12:14
Here's an example of a vote that we took
253
722560
1976
12:16
where the majority
of people felt one way ...
254
724560
2840
12:20
but when we weighed the views
based on people's merits,
255
728920
2936
12:23
the answer was completely different.
256
731880
1840
12:26
This process allows us to make decisions
not based on democracy,
257
734920
4576
12:31
not based on autocracy,
258
739520
2136
12:33
but based on algorithms that take
people's believability into consideration.
259
741680
5240
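The formula isn't spelled out on stage, but the minimal reading of "weighing the views based on people's merits" is a weighted vote: each ballot counts in proportion to the voter's believability on that kind of question, rather than one person, one vote. A sketch, with all of the names and numbers invented:

```python
def weighted_vote(votes: dict[str, str], believability: dict[str, float]) -> str:
    """Tally a vote where each ballot counts in proportion to the
    voter's believability, instead of one person, one vote."""
    totals: dict[str, float] = {}
    for voter, choice in votes.items():
        totals[choice] = totals.get(choice, 0.0) + believability.get(voter, 0.0)
    return max(totals, key=totals.get)

# Three of four people vote "yes", but the believability-weighted
# answer comes out "no" -- the situation described on the slide.
votes = {"Ann": "yes", "Bob": "yes", "Cy": "yes", "Dee": "no"}
believability = {"Ann": 0.2, "Bob": 0.3, "Cy": 0.4, "Dee": 1.5}
print(weighted_vote(votes, believability))  # no
```
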
12:41
Yup, we really do this.
260
749520
1696
12:43
(Laughter)
261
751240
3296
12:46
We do it because it eliminates
262
754560
2856
12:49
what I believe to be
one of the greatest tragedies of mankind,
263
757440
4456
12:53
and that is people arrogantly,
264
761920
2160
12:56
naïvely holding opinions
in their minds that are wrong,
265
764760
4456
13:01
and acting on them,
266
769240
1256
13:02
and not putting them out there
to stress test them.
267
770520
2760
13:06
And that's a tragedy.
268
774000
1336
13:07
And we do it because it elevates ourselves
above our own opinions
269
775360
5416
13:12
so that we start to see things
through everybody's eyes,
270
780800
2896
13:15
and we see things collectively.
271
783720
1920
13:18
Collective decision-making is so much
better than individual decision-making
272
786360
4336
13:22
if it's done well.
273
790720
1200
13:24
It's been the secret sauce
behind our success.
274
792360
2616
13:27
It's why we've made
more money for our clients
275
795000
2176
13:29
than any other hedge fund in existence
276
797200
1936
13:31
and made money
23 out of the last 26 years.
277
799160
2720
13:35
So what's the problem
with being radically truthful
278
803880
4536
13:40
and radically transparent with each other?
279
808440
2240
13:45
People say it's emotionally difficult.
280
813400
2080
13:48
Critics say it's a formula
for a brutal work environment.
281
816240
4240
13:53
Neuroscientists tell me it has to do
with how are brains are prewired.
282
821400
4856
13:58
There's a part of our brain
that would like to know our mistakes
283
826280
3216
14:01
and like to look at our weaknesses
so we could do better.
284
829520
3960
14:06
I'm told that that's
the prefrontal cortex.
285
834120
2440
14:09
And then there's a part of our brain
which views all of this as attacks.
286
837040
4856
14:13
I'm told that that's the amygdala.
287
841920
1960
14:16
In other words,
there are two you's inside you:
288
844440
3056
14:19
there's an emotional you
289
847520
1416
14:20
and there's an intellectual you,
290
848960
1776
14:22
and often they're at odds,
291
850760
1776
14:24
and often they work against you.
292
852560
1920
14:27
It's been our experience
that we can win this battle.
293
855160
3736
14:30
We win it as a group.
294
858920
1320
14:33
It takes about 18 months typically
295
861000
2336
14:35
to find that most people
prefer operating this way,
296
863360
3056
14:38
with this radical transparency
297
866440
2016
14:40
than to be operating
in a more opaque environment.
298
868480
3336
14:43
There's not politics,
there's not the brutality of --
299
871840
4296
14:48
you know, all of that hidden,
behind-the-scenes --
300
876160
2376
14:50
there's an idea meritocracy
where people can speak up.
301
878560
2936
14:53
And that's been great.
302
881520
1256
14:54
It's given us more effective work,
303
882800
1656
14:56
and it's given us
more effective relationships.
304
884480
2400
14:59
But it's not for everybody.
305
887400
1320
15:01
We found something like
25 or 30 percent of the population
306
889680
2936
15:04
it's just not for.
307
892640
1736
15:06
And by the way,
308
894400
1216
15:07
when I say radical transparency,
309
895640
1816
15:09
I'm not saying transparency
about everything.
310
897480
2336
15:11
I mean, you don't have to tell somebody
that their bald spot is growing
311
899840
3816
15:15
or their baby's ugly.
312
903680
1616
15:17
So, I'm just talking about --
313
905320
2096
15:19
(Laughter)
314
907440
1216
15:20
talking about the important things.
315
908680
2176
15:22
So --
316
910880
1216
15:24
(Laughter)
317
912120
3200
15:28
So when you leave this room,
318
916600
1416
15:30
I'd like you to observe yourself
in conversations with others.
319
918040
4440
15:35
Imagine if you knew
what they were really thinking,
320
923360
3680
15:39
and imagine if you knew
what they were really like ...
321
927760
2600
15:43
and imagine if they knew
what you were really thinking
322
931840
3976
15:47
and what were really like.
323
935840
1840
15:50
It would certainly clear things up a lot
324
938160
2576
15:52
and make your operations
together more effective.
325
940760
2856
15:55
I think it will improve
your relationships.
326
943640
2240
15:58
Now imagine that you can have algorithms
327
946600
3296
16:01
that will help you gather
all of that information
328
949920
3816
16:05
and even help you make decisions
in an idea-meritocratic way.
329
953760
4560
16:12
This sort of radical transparency
is coming at you
330
960640
4336
16:17
and it is going to affect your life.
331
965000
1960
16:19
And in my opinion,
332
967600
2056
16:21
it's going to be wonderful.
333
969680
1336
16:23
So I hope it is as wonderful for you
334
971040
2336
16:25
as it is for me.
335
973400
1200
16:27
Thank you very much.
336
975160
1256
16:28
(Applause)
337
976440
4360
