ABOUT THE SPEAKER
Atul Gawande - Surgeon, writer, public health innovator
Surgeon and public health professor by day, writer by night, Atul Gawande explores how doctors can dramatically improve their practice using approaches as simple as a checklist – or coaching.

Why you should listen

Atul Gawande is the author of several best-selling books, including Complications: A Surgeon's Notes on an Imperfect Science; Better: A Surgeon's Notes on Performance; Being Mortal: Medicine and What Matters in the End; and The Checklist Manifesto.

He is also a surgeon at Brigham and Women’s Hospital in Boston, a staff writer for The New Yorker and a professor at Harvard Medical School and the Harvard School of Public Health. He has won the Lewis Thomas Prize for Writing about Science, a MacArthur Fellowship and two National Magazine Awards. In his work in public health, he is Executive Director of Ariadne Labs, a joint center for health systems innovation, and chair of Lifebox, a nonprofit organization making surgery safer globally.

In June 2018, Gawande was chosen to lead the new healthcare company set up by Amazon, JPMorgan and Berkshire Hathaway.

Photo: Aubrey Calo

TED2012

Atul Gawande: How do we heal medicine?

2,001,183 views

Our medical systems are broken. Doctors are capable of extraordinary (and expensive) treatments, but they are losing their core focus: actually treating people. Doctor and writer Atul Gawande suggests we take a step back and look at new ways to do medicine -- with fewer cowboys and more pit crews.

00:15
I got my start in writing and research as a surgical trainee, as someone who was a long ways away from becoming any kind of an expert at anything. So the natural question you ask then at that point is, how do I get good at what I'm trying to do? And it became a question of, how do we all get good at what we're trying to do? It's hard enough to learn to get the skills, try to learn all the material you have to absorb at any task you're taking on. I had to think about how I sew and how I cut, but then also how I pick the right person to come to an operating room. And then in the midst of all this came this new context for thinking about what it meant to be good.

01:02
In the last few years we realized we were in the deepest crisis of medicine's existence due to something you don't normally think about when you're a doctor concerned with how you do good for people, which is the cost of health care. There's not a country in the world that now is not asking whether we can afford what doctors do. The political fight that we've developed has become one around whether it's the government that's the problem or is it insurance companies that are the problem. And the answer is yes and no; it's deeper than all of that. The cause of our troubles is actually the complexity that science has given us.

01:52
And in order to understand this, I'm going to take you back a couple of generations. I want to take you back to a time when Lewis Thomas was writing in his book, "The Youngest Science." Lewis Thomas was a physician-writer, one of my favorite writers. And he wrote this book to explain, among other things, what it was like to be a medical intern at the Boston City Hospital in the pre-penicillin year of 1937. It was a time when medicine was cheap and very ineffective. If you were in a hospital, he said, it was going to do you good only because it offered you some warmth, some food, shelter, and maybe the caring attention of a nurse. Doctors and medicine made no difference at all.

02:50
That didn't seem to prevent the doctors from being frantically busy in their days, as he explained. What they were trying to do was figure out whether you might have one of the diagnoses for which they could do something. And there were a few. You might have a lobar pneumonia, for example, and they could give you an antiserum, an injection of rabbit antibodies to the bacterium streptococcus, if the intern sub-typed it correctly. If you had an acute congestive heart failure, they could bleed a pint of blood from you by opening up an arm vein, giving you a crude leaf preparation of digitalis and then giving you oxygen by tent. If you had early signs of paralysis and you were really good at asking personal questions, you might figure out that this paralysis someone has is from syphilis, in which case you could give this nice concoction of mercury and arsenic -- as long as you didn't overdose them and kill them. Beyond these sorts of things, a medical doctor didn't have a lot that they could do.

04:08
This was when the core structure of medicine was created -- what it meant to be good at what we did and how we wanted to build medicine to be. It was at a time when what was known you could know, you could hold it all in your head, and you could do it all. If you had a prescription pad, if you had a nurse, if you had a hospital that would give you a place to convalesce, maybe some basic tools, you really could do it all. You set the fracture, you drew the blood, you spun the blood, looked at it under the microscope, you plated the culture, you injected the antiserum. This was a life as a craftsman. As a result, we built it around a culture and set of values that said what you were good at was being daring, at being courageous, at being independent and self-sufficient. Autonomy was our highest value.

05:12
Go a couple generations forward to where we are, though, and it looks like a completely different world. We have now found treatments for nearly all of the tens of thousands of conditions that a human being can have. We can't cure it all. We can't guarantee that everybody will live a long and healthy life. But we can make it possible for most. But what does it take? Well, we've now discovered 4,000 medical and surgical procedures. We've discovered 6,000 drugs that I'm now licensed to prescribe. And we're trying to deploy this capability, town by town, to every person alive -- in our own country, let alone around the world. And we've reached the point where we've realized, as doctors, we can't know it all. We can't do it all by ourselves.

06:15
There was a study where they looked at how many clinicians it took to take care of you if you came into a hospital, as it changed over time. And in the year 1970, it took just over two full-time equivalents of clinicians. That is to say, it took basically the nursing time and then just a little bit of time for a doctor who more or less checked in on you once a day. By the end of the 20th century, it had become more than 15 clinicians for the same typical hospital patient -- specialists, physical therapists, the nurses. We're all specialists now, even the primary care physicians. Everyone just has a piece of the care. But holding onto that structure we built around the daring, independence, self-sufficiency of each of those people has become a disaster. We have trained, hired and rewarded people to be cowboys. But it's pit crews that we need, pit crews for patients.

07:26
There's evidence all around us: 40 percent of our coronary artery disease patients in our communities receive incomplete or inappropriate care. 60 percent of our asthma, stroke patients receive incomplete or inappropriate care. Two million people come into hospitals and pick up an infection they didn't have because someone failed to follow the basic practices of hygiene. Our experience as people who get sick, need help from other people, is that we have amazing clinicians that we can turn to -- hardworking, incredibly well-trained and very smart -- that we have access to incredible technologies that give us great hope, but little sense that it consistently all comes together for you from start to finish in a successful way.

08:30
There's another sign that we need pit crews, and that's the unmanageable cost of our care. Now we in medicine, I think, are baffled by this question of cost. We want to say, "This is just the way it is. This is just what medicine requires." When you go from a world where you treated arthritis with aspirin, that mostly didn't do the job, to one where, if it gets bad enough, we can do a hip replacement, a knee replacement that gives you years, maybe decades, without disability, a dramatic change, well is it any surprise that that $40,000 hip replacement replacing the 10-cent aspirin is more expensive? It's just the way it is. But I think we're ignoring certain facts that tell us something about what we can do.

09:28
As we've looked at the data about the results that have come as the complexity has increased, we found that the most expensive care is not necessarily the best care. And vice versa, the best care often turns out to be the least expensive -- it has fewer complications, the people get more efficient at what they do. And what that means is there's hope. Because if, to have the best results, you really needed the most expensive care in the country, or in the world, then we really would be talking about rationing who we're going to cut off from Medicare. That would be really our only choice. But when we look at the positive deviants -- the ones who are getting the best results at the lowest costs -- we find the ones that look the most like systems are the most successful. That is to say, they found ways to get all of the different pieces, all of the different components, to come together into a whole.

10:41
Having great components is not enough, and yet we've been obsessed in medicine with components. We want the best drugs, the best technologies, the best specialists, but we don't think too much about how it all comes together. It's a terrible design strategy actually. There's a famous thought experiment that touches exactly on this that said, what if you built a car from the very best car parts? Well it would lead you to put in Porsche brakes, a Ferrari engine, a Volvo body, a BMW chassis. And you put it all together and what do you get? A very expensive pile of junk that does not go anywhere. And that is what medicine can feel like sometimes. It's not a system.

11:36
Now a system, however, when things start to come together, you realize it has certain skills for acting and looking that way. Skill number one is the ability to recognize success and the ability to recognize failure. When you are a specialist, you can't see the end result very well. You have to become really interested in data, unsexy as that sounds. One of my colleagues is a surgeon in Cedar Rapids, Iowa, and he got interested in the question of, well how many CT scans did they do for their community in Cedar Rapids? He got interested in this because there had been government reports, newspaper reports, journal articles saying that there had been too many CT scans done. He didn't see it in his own patients. And so he asked the question, "How many did we do?" and he wanted to get the data. It took him three months. No one had asked this question in his community before. And what he found was that, for the 300,000 people in their community, in the previous year they had done 52,000 CT scans. They had found a problem.

12:51
Which brings us to skill number two a system has. Skill one, find where your failures are. Skill two is devise solutions. I got interested in this when the World Health Organization came to my team asking if we could help with a project to reduce deaths in surgery. The volume of surgery had spread around the world, but the safety of surgery had not. Now our usual tactics for tackling problems like these are to do more training, give people more specialization or bring in more technology. Well in surgery, you couldn't have people who are more specialized and you couldn't have people who are better trained. And yet we see unconscionable levels of death, disability that could be avoided. And so we looked at what other high-risk industries do. We looked at skyscraper construction, we looked at the aviation world, and we found that they have technology, they have training, and then they have one other thing: They have checklists.

14:02
I did not expect to be spending a significant part of my time as a Harvard surgeon worrying about checklists. And yet, what we found was that these were tools to help make experts better. We got the lead safety engineer for Boeing to help us. Could we design a checklist for surgery? Not for the lowest people on the totem pole, but for the folks who were all the way around the chain, the entire team including the surgeons. And what they taught us was that designing a checklist to help people handle complexity actually involves more difficulty than I had understood. You have to think about things like pause points. You need to identify the moments in a process when you can actually catch a problem before it's a danger and do something about it. You have to identify that this is a before-takeoff checklist. And then you need to focus on the killer items. An aviation checklist, like this one for a single-engine plane, isn't a recipe for how to fly a plane, it's a reminder of the key things that get forgotten or missed if they're not checked.

15:15
So we did this. We created a 19-item two-minute checklist for surgical teams. We had the pause points immediately before anesthesia is given, immediately before the knife hits the skin, immediately before the patient leaves the room. And we had a mix of dumb stuff on there -- making sure an antibiotic is given in the right time frame because that cuts the infection rate by half -- and then interesting stuff, because you can't make a recipe for something as complicated as surgery. Instead, you can make a recipe for how to have a team that's prepared for the unexpected. And we had items like making sure everyone in the room had introduced themselves by name at the start of the day, because you get half a dozen people or more who are sometimes coming together as a team for the very first time that day that you're coming in.

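To make that structure concrete, here is a minimal sketch in Python of one way a checklist organized around pause points and "killer items" could be represented. It is an illustration under assumptions, not the actual 19-item checklist: the class names, the run logic and the item wordings are hypothetical, drawn loosely from the examples in the talk.

```python
# Illustrative sketch only -- NOT the WHO Surgical Safety Checklist.
# Models the idea of pause points, each with a few confirmations,
# where missing a "killer item" is a hard stop.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Item:
    prompt: str            # the confirmation the team states out loud
    killer: bool = False   # True if skipping it is a known source of serious harm


@dataclass
class PausePoint:
    name: str                                   # when the team stops to run the items
    items: List[Item] = field(default_factory=list)

    def run(self, answers: Dict[str, bool]) -> bool:
        """Print each item's status; return False if a killer item was not confirmed."""
        ok = True
        print(f"--- Pause point: {self.name} ---")
        for item in self.items:
            confirmed = answers.get(item.prompt, False)
            print(f"[{'ok' if confirmed else 'MISSING'}] {item.prompt}")
            if item.killer and not confirmed:
                ok = False
        return ok


# The three pause points described in the talk; items echo examples mentioned there.
checklist = [
    PausePoint("Before anesthesia is given", [
        Item("Has everyone in the room introduced themselves by name?"),
    ]),
    PausePoint("Before the knife hits the skin", [
        Item("Was the antibiotic given in the right time frame?", killer=True),
    ]),
    PausePoint("Before the patient leaves the room", [
        Item("Has the team reviewed the plan for recovery aloud?"),
    ]),
]

if __name__ == "__main__":
    # Hypothetical run: introductions were done, but the antibiotic was forgotten.
    answers = {"Has everyone in the room introduced themselves by name?": True}
    for pause in checklist:
        if not pause.run(answers):
            print("STOP: resolve the missing killer item before proceeding.\n")
```
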
16:05
We implemented this checklist in eight hospitals around the world, deliberately in places from rural Tanzania to the University of Washington in Seattle. We found that after they adopted it the complication rates fell 35 percent. It fell in every hospital it went into. The death rates fell 47 percent. This was bigger than a drug.

(Applause)

16:38
And that brings us to skill number three, the ability to implement this, to get colleagues across the entire chain to actually do these things. And it's been slow to spread. This is not yet our norm in surgery -- let alone making checklists to go onto childbirth and other areas. There's a deep resistance because using these tools forces us to confront that we're not a system, forces us to behave with a different set of values. Just using a checklist requires you to embrace different values from the ones we've had, like humility, discipline, teamwork. This is the opposite of what we were built on: independence, self-sufficiency, autonomy.

17:35
I met an actual cowboy, by the way. I asked him, what was it like to actually herd a thousand cattle across hundreds of miles? How did you do that? And he said, "We have the cowboys stationed at distinct places all around." They communicate electronically constantly, and they have protocols and checklists for how they handle everything -- (Laughter) -- from bad weather to emergencies or inoculations for the cattle. Even the cowboys are pit crews now. And it seemed like time that we become that way ourselves. Making systems work is the great task of my generation of physicians and scientists. But I would go further and say that making systems work, whether in health care, education, climate change, making a pathway out of poverty, is the great task of our generation as a whole.

18:33
In every field, knowledge has exploded, but it has brought complexity, it has brought specialization. And we've come to a place where we have no choice but to recognize, as individualistic as we want to be, complexity requires group success. We all need to be pit crews now.

Thank you.

(Applause)
