ABOUT THE SPEAKER
Kathryn Schulz - Wrongologist
Kathryn Schulz is a staff writer for the New Yorker and is the author of "Being Wrong: Adventures in the Margin of Error."

Why you should listen

Kathryn Schulz is a journalist, author, and public speaker with a credible (if not necessarily enviable) claim to being the world's leading wrongologist.  She is the author of Being Wrong: Adventures in the Margin of Error. She was previously the book critic for New York Magazine; her writing has also appeared in the New York Times Magazine, Rolling Stone, TIME Magazine, the Boston Globe, the "Freakonomics" blog of The New York Times, The Nation, Foreign Policy, and the New York Times Book Review, among other publications. She is the former editor of the online environmental magazine Grist, and a former reporter and editor for The Santiago Times, of Santiago, Chile, where she covered environmental, labor, and human rights issues. She was a 2004 recipient of the Pew Fellowship in International Journalism (now the International Reporting Project), and has reported from throughout Central and South America, Japan, and, most recently, the Middle East. A graduate of Brown University and a former Ohioan, Oregonian and Brooklynite, she currently lives in New York's Hudson Valley.

Kathryn Schulz | Speaker | TED.com
TED2011

Kathryn Schulz: On being wrong

4,826,828 views

Most of us will do anything to avoid being wrong. But what if we're wrong about that? "Wrongologist" Kathryn Schulz makes a compelling case for not just admitting but embracing our fallibility.


00:15
So it's 1995, I'm in college, and a friend and I go on a road trip from Providence, Rhode Island to Portland, Oregon. And you know, we're young and unemployed, so we do the whole thing on back roads through state parks and national forests -- basically the longest route we can possibly take. And somewhere in the middle of South Dakota, I turn to my friend and I ask her a question that's been bothering me for 2,000 miles: "What's up with the Chinese character I keep seeing by the side of the road?"

01:02
My friend looks at me totally blankly. There's actually a gentleman in the front row who's doing a perfect imitation of her look. (Laughter) And I'm like, "You know, all the signs we keep seeing with the Chinese character on them." She just stares at me for a few moments, and then she cracks up, because she figures out what I'm talking about. And what I'm talking about is this. (Laughter) Right, the famous Chinese character for picnic area. (Laughter)

01:45
I've spent the last five years of my life thinking about situations exactly like this -- why we sometimes misunderstand the signs around us, and how we behave when that happens, and what all of this can tell us about human nature. In other words, as you heard Chris say, I've spent the last five years thinking about being wrong. This might strike you as a strange career move, but it actually has one great advantage: no job competition. (Laughter)
02:22
In fact, most of us do everything we can to avoid thinking about being wrong, or at least to avoid thinking about the possibility that we ourselves are wrong. We get it in the abstract. We all know everybody in this room makes mistakes. The human species, in general, is fallible -- okay, fine. But when it comes down to me, right now, to all the beliefs I hold, here in the present tense, suddenly all of this abstract appreciation of fallibility goes out the window -- and I can't actually think of anything I'm wrong about.

03:00
And the thing is, the present tense is where we live. We go to meetings in the present tense; we go on family vacations in the present tense; we go to the polls and vote in the present tense. So effectively, we all kind of wind up traveling through life, trapped in this little bubble of feeling very right about everything.

03:21
I think this is a problem. I think it's a problem for each of us as individuals, in our personal and professional lives, and I think it's a problem for all of us collectively as a culture. So what I want to do today is, first of all, talk about why we get stuck inside this feeling of being right. And second, why it's such a problem. And finally, I want to convince you that it is possible to step outside of that feeling, and that if you can do so, it is the single greatest moral, intellectual and creative leap you can make.
03:57
So why do we get stuck in this feeling of being right? One reason, actually, has to do with a feeling of being wrong. So let me ask you guys something -- or actually, let me ask you guys something, because you're right here: How does it feel -- emotionally -- how does it feel to be wrong?

04:16
Dreadful. Thumbs down. Embarrassing. Okay, wonderful, great. Dreadful, thumbs down, embarrassing -- thank you, these are great answers, but they're answers to a different question. You guys are answering the question: How does it feel to realize you're wrong? (Laughter) Realizing you're wrong can feel like all of that and a lot of other things, right? I mean, it can be devastating, it can be revelatory, it can actually be quite funny, like my stupid Chinese character mistake. But just being wrong doesn't feel like anything.
04:54
I'll give you an analogy. Do you remember that Looney Tunes cartoon where there's this pathetic coyote who's always chasing and never catching a roadrunner? In pretty much every episode of this cartoon, there's a moment where the coyote is chasing the roadrunner and the roadrunner runs off a cliff, which is fine -- he's a bird, he can fly. But the thing is, the coyote runs off the cliff right after him. And what's funny -- at least if you're six years old -- is that the coyote's totally fine too. He just keeps running -- right up until the moment that he looks down and realizes that he's in mid-air. That's when he falls.

05:34
When we're wrong about something -- not when we realize it, but before that -- we're like that coyote after he's gone off the cliff and before he looks down. You know, we're already wrong, we're already in trouble, but we feel like we're on solid ground. So I should actually correct something I said a moment ago: it does feel like something to be wrong; it feels like being right. (Laughter)

06:07
So this is one reason, a structural reason, why we get stuck inside this feeling of rightness. I call this error blindness. Most of the time, we don't have any kind of internal cue to let us know that we're wrong about something, until it's too late.
06:24
But there's a second reason that we get stuck inside this feeling as well -- and this one is cultural. Think back for a moment to elementary school. You're sitting there in class, and your teacher is handing back quiz papers, and one of them looks like this. This is not mine, by the way. (Laughter) So there you are in grade school, and you know exactly what to think about the kid who got this paper. It's the dumb kid, the troublemaker, the one who never does his homework. So by the time you are nine years old, you've already learned, first of all, that people who get stuff wrong are lazy, irresponsible dimwits -- and second of all, that the way to succeed in life is to never make any mistakes.

07:16
We learn these really bad lessons really well. And a lot of us -- and I suspect, especially a lot of us in this room -- deal with them by just becoming perfect little A students, perfectionists, over-achievers. Right, Mr. CFO, astrophysicist, ultra-marathoner? (Laughter) You're all CFOs, astrophysicists, ultra-marathoners, it turns out. Okay, so fine. Except that then we freak out at the possibility that we've gotten something wrong. Because according to this, getting something wrong means there's something wrong with us. So we just insist that we're right, because it makes us feel smart and responsible and virtuous and safe.
08:14
So let me tell you a story. A couple of years ago, a woman comes into Beth Israel Deaconess Medical Center for a surgery. Beth Israel's in Boston. It's the teaching hospital for Harvard -- one of the best hospitals in the country. So this woman comes in, and she's taken into the operating room. She's anesthetized, the surgeon does his thing -- stitches her back up, sends her out to the recovery room. Everything seems to have gone fine. And she wakes up, and she looks down at herself, and she says, "Why is the wrong side of my body in bandages?" Well, the wrong side of her body is in bandages because the surgeon has performed a major operation on her left leg instead of her right one.

08:54
When the vice president for health care quality at Beth Israel spoke about this incident, he said something very interesting. He said, "For whatever reason, the surgeon simply felt that he was on the correct side of the patient." (Laughter)

09:15
The point of this story is that trusting too much in the feeling of being on the correct side of anything can be very dangerous. This internal sense of rightness that we all experience so often is not a reliable guide to what is actually going on in the external world. And when we act like it is, and we stop entertaining the possibility that we could be wrong, well, that's when we end up doing things like dumping 200 million gallons of oil into the Gulf of Mexico, or torpedoing the global economy.
09:52
So this is a huge practical problem. But it's also a huge social problem. Think for a moment about what it means to feel right. It means that you think that your beliefs just perfectly reflect reality. And when you feel that way, you've got a problem to solve, which is: how are you going to explain all of those people who disagree with you? It turns out, most of us explain those people the same way, by resorting to a series of unfortunate assumptions.

10:23
The first thing we usually do when someone disagrees with us is we just assume they're ignorant. They don't have access to the same information that we do, and when we generously share that information with them, they're going to see the light and come on over to our team. When that doesn't work, when it turns out those people have all the same facts that we do and they still disagree with us, then we move on to a second assumption, which is that they're idiots. (Laughter) They have all the right pieces of the puzzle, and they are too moronic to put them together correctly. And when that doesn't work, when it turns out that people who disagree with us have all the same facts we do and are actually pretty smart, then we move on to a third assumption: they know the truth, and they are deliberately distorting it for their own malevolent purposes.

11:17
So this is a catastrophe. This attachment to our own rightness keeps us from preventing mistakes when we absolutely need to, and causes us to treat each other terribly.
11:30
But to me, what's most baffling and most tragic about this is that it misses the whole point of being human. It's like we want to imagine that our minds are just these perfectly translucent windows and we just gaze out of them and describe the world as it unfolds. And we want everybody else to gaze out of the same window and see the exact same thing. That is not true, and if it were, life would be incredibly boring. The miracle of your mind isn't that you can see the world as it is. It's that you can see the world as it isn't.

12:09
We can remember the past, and we can think about the future, and we can imagine what it's like to be some other person in some other place. And we all do this a little differently, which is why we can all look up at the same night sky and see this and also this and also this. And yeah, it is also why we get things wrong.

12:34
1,200 years before Descartes said his famous thing about "I think therefore I am," this guy, St. Augustine, sat down and wrote "Fallor ergo sum" -- "I err therefore I am." Augustine understood that our capacity to screw up, it's not some kind of embarrassing defect in the human system, something we can eradicate or overcome. It's totally fundamental to who we are. Because, unlike God, we don't really know what's going on out there. And unlike all of the other animals, we are obsessed with trying to figure it out. To me, this obsession is the source and root of all of our productivity and creativity.
13:20
Last year, for various reasons, I found myself listening to a lot of episodes of the public radio show This American Life. And so I'm listening and I'm listening, and at some point, I start feeling like all the stories are about being wrong. And my first thought was, "I've lost it. I've become the crazy wrongness lady. I just imagined it everywhere," which has happened. But a couple of months later, I actually had a chance to interview Ira Glass, who's the host of the show. And I mentioned this to him, and he was like, "No actually, that's true. In fact," he says, "as a staff, we joke that every single episode of our show has the same crypto-theme. And the crypto-theme is: 'I thought this one thing was going to happen and something else happened instead.' And the thing is," says Ira Glass, "we need this. We need these moments of surprise and reversal and wrongness to make these stories work."

14:22
And for the rest of us, audience members, as listeners, as readers, we eat this stuff up. We love things like plot twists and red herrings and surprise endings. When it comes to our stories, we love being wrong.
14:41
But, you know, our stories are like this because our lives are like this. We think this one thing is going to happen and something else happens instead. George Bush thought he was going to invade Iraq, find a bunch of weapons of mass destruction, liberate the people and bring democracy to the Middle East. And something else happened instead. And Hosni Mubarak thought he was going to be the dictator of Egypt for the rest of his life, until he got too old or too sick and could pass the reins of power on to his son. And something else happened instead. And maybe you thought you were going to grow up and marry your high school sweetheart and move back to your hometown and raise a bunch of kids together. And something else happened instead. And I have to tell you that I thought I was writing an incredibly nerdy book about a subject everybody hates for an audience that would never materialize. And something else happened instead. (Laughter)

15:40
I mean, this is life. For good and for ill, we generate these incredible stories about the world around us, and then the world turns around and astonishes us.
15:55
No offense, but this entire conference is an unbelievable monument to our capacity to get stuff wrong. We just spent an entire week talking about innovations and advancements and improvements, but you know why we need all of those innovations and advancements and improvements? Because half the stuff that's the most mind-boggling and world-altering -- TED 1998 -- eh. (Laughter) Didn't really work out that way, did it? (Laughter) Where's my jet pack, Chris? (Laughter) (Applause)

16:42
So here we are again. And that's how it goes. We come up with another idea. We tell another story. We hold another conference. The theme of this one, as you guys have now heard seven million times, is the rediscovery of wonder. And to me, if you really want to rediscover wonder, you need to step outside of that tiny, terrified space of rightness and look around at each other and look out at the vastness and complexity and mystery of the universe and be able to say, "Wow, I don't know. Maybe I'm wrong."

17:35
Thank you. (Applause) Thank you guys. (Applause)


Data provided by TED.
