ABOUT THE SPEAKER
Susan Etlinger - Data analyst
Susan Etlinger promotes the smart, well-considered and ethical use of data.

Why you should listen

Susan Etlinger is an industry analyst with Altimeter Group, where she focuses on data and analytics. She conducts independent research and has authored two reports: “The Social Media ROI Cookbook” and “A Framework for Social Analytics.” She also advises global clients on how to work measurement into their organizational structure and how to extract insights from the social web that can lead to tangible action. In addition, she works with technology innovators to help them refine their roadmaps and strategies.

Etlinger is on the board of The Big Boulder Initiative, an industry organization dedicated to promoting the successful and ethical use of social data. She is regularly interviewed and asked to speak on data strategy and best practices, and has been quoted in media outlets like The Wall Street Journal, The New York Times, and the BBC.

TED@IBM

Susan Etlinger: What do we do with all this big data?

1,344,301 views

Does a set of data make you feel more comfortable? More successful? Then your interpretation of it is likely wrong. In a surprisingly moving talk, Susan Etlinger explains why, as we receive more and more data, we need to deepen our critical thinking skills. Because it's hard to move beyond counting things to really understanding them.

00:13
Technology has brought us so much: the moon landing, the Internet, the ability to sequence the human genome. But it also taps into a lot of our deepest fears, and about 30 years ago, the culture critic Neil Postman wrote a book called "Amusing Ourselves to Death," which lays this out really brilliantly. And here's what he said, comparing the dystopian visions of George Orwell and Aldous Huxley. He said, Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture. Orwell feared the truth would be concealed from us, and Huxley feared we would be drowned in a sea of irrelevance. In a nutshell, it's a choice between Big Brother watching you and you watching Big Brother. (Laughter)

01:08
But it doesn't have to be this way. We are not passive consumers of data and technology. We shape the role it plays in our lives and the way we make meaning from it, but to do that, we have to pay as much attention to how we think as to how we code. We have to ask questions, and hard questions, to move past counting things to understanding them.

01:33
We're constantly bombarded with stories about how much data there is in the world, but when it comes to big data and the challenges of interpreting it, size isn't everything. There's also the speed at which it moves, and the many varieties of data types. Here are just a few examples: images, text, video, audio. And what unites these disparate types of data is that they're created by people and they require context.
02:09
Now, there's a group of data scientists out of the University of Illinois-Chicago, and they're called the Health Media Collaboratory, and they've been working with the Centers for Disease Control to better understand how people talk about quitting smoking, how they talk about electronic cigarettes, and what they can do collectively to help them quit. The interesting thing is, if you want to understand how people talk about smoking, first you have to understand what they mean when they say "smoking." And on Twitter, there are four main categories: number one, smoking cigarettes; number two, smoking marijuana; number three, smoking ribs; and number four, smoking hot women. (Laughter)

02:58
So then you have to think about, well, how do people talk about electronic cigarettes? And there are so many different ways that people do this, and you can see from the slide it's a complex kind of a query. And what it reminds us is that language is created by people, and people are messy and we're complex and we use metaphors and slang and jargon and we do this 24/7 in many, many languages, and then as soon as we figure it out, we change it up.
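The slide with the full query isn't reproduced here, but the disambiguation problem it illustrates can be sketched in a few lines of code. What follows is a minimal, hypothetical Python sketch, not the Health Media Collaboratory's actual query: it buckets tweets that mention "smoking" into the four senses above using simple keyword rules, and every keyword list and sample tweet is invented for the example.

import re

# Hypothetical keyword rules for each sense of "smoking" seen on Twitter.
# A real query would be far richer, and because slang shifts constantly,
# these lists would need continual curation.
SENSE_RULES = {
    "cigarettes": re.compile(r"\b(cigarettes?|cigs?|nicotine|quit(ting)?)\b", re.I),
    "marijuana": re.compile(r"\b(weed|marijuana|joint|blunt|420)\b", re.I),
    "ribs": re.compile(r"\b(ribs|brisket|bbq|barbecue|grill)\b", re.I),
    "smoking hot": re.compile(r"\bsmoking\s+hot\b", re.I),
}

def classify(tweet):
    """Return the first sense whose rule matches, or 'unknown'."""
    for sense, pattern in SENSE_RULES.items():
        if pattern.search(tweet):
            return sense
    return "unknown"

for tweet in [
    "day 20 since quitting, still want a cigarette",
    "been smoking ribs all afternoon",
    "that jacket is smoking hot",
]:
    print(classify(tweet), "<-", tweet)

Even this toy version shows the fragility the talk describes: the rules encode yesterday's slang, and they quietly go stale the moment people change it up.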
03:27
So did these ads that the CDC put on, these television ads that featured a woman with a hole in her throat and that were very graphic and very disturbing, did they actually have an impact on whether people quit? And the Health Media Collaboratory respected the limits of their data, but they were able to conclude that those advertisements — and you may have seen them — that they had the effect of jolting people into a thought process that may have an impact on future behavior. And what I admire and appreciate about this project, including the fact that it's based on real human need, is that it's a fantastic example of courage in the face of a sea of irrelevance.

04:16
And so it's not just big data that causes challenges of interpretation, because, let's face it, we human beings have a very rich history of taking any amount of data, no matter how small, and screwing it up. So many years ago, you may remember that former President Ronald Reagan was widely criticized for making a statement that facts are stupid things. And it was a slip of the tongue, let's be fair. He actually meant to quote John Adams' defense of British soldiers in the Boston Massacre trials, that facts are stubborn things. But I actually think there's a bit of accidental wisdom in what he said, because facts are stubborn things, but sometimes they're stupid, too.
05:03
I want to tell you a personal story about why this matters a lot to me. I need to take a breath. My son Isaac, when he was two, was diagnosed with autism, and he was this happy, hilarious, loving, affectionate little guy, but the metrics on his developmental evaluations, which looked at things like the number of words — at that point, none — communicative gestures and minimal eye contact, put his developmental level at that of a nine-month-old baby. And the diagnosis was factually correct, but it didn't tell the whole story.

05:45
And about a year and a half later, when he was almost four, I found him in front of the computer one day running a Google image search on women, spelled "w-i-m-e-n." And I did what any obsessed parent would do, which was to immediately start hitting the "back" button to see what else he'd been searching for. And they were, in order: men, school, bus and computer. And I was stunned, because we didn't know that he could spell, much less read, and so I asked him, "Isaac, how did you do this?" And he looked at me very seriously and said, "Typed in the box."

06:31
He was teaching himself to communicate, but we were looking in the wrong place, and this is what happens when assessments and analytics overvalue one metric — in this case, verbal communication — and undervalue others, such as creative problem-solving. Communication was hard for Isaac, and so he found a workaround to find out what he needed to know. And when you think about it, it makes a lot of sense, because forming a question is a really complex process, but he could get himself a lot of the way there by putting a word in a search box.
07:11
And so this little moment had a really profound impact on me and our family, because it helped us change our frame of reference for what was going on with him, and worry a little bit less and appreciate his resourcefulness more. Facts are stupid things. And they're vulnerable to misuse, willful or otherwise. I have a friend, Emily Willingham, who's a scientist, and she wrote a piece for Forbes not long ago entitled "The 10 Weirdest Things Ever Linked to Autism." It's quite a list. The Internet, blamed for everything, right? And of course mothers, because. And actually, wait, there's more, there's a whole bunch in the "mother" category here. And you can see it's a pretty rich and interesting list. I'm a big fan of being pregnant near freeways, personally. The final one is interesting, because the term "refrigerator mother" was actually the original hypothesis for the cause of autism, and that meant somebody who was cold and unloving.

08:23
And at this point, you might be thinking, "Okay, Susan, we get it, you can take data, you can make it mean anything." And this is true, it's absolutely true, but the challenge is that we have this opportunity to try to make meaning out of it ourselves, because frankly, data doesn't create meaning. We do.
08:48
So as businesspeople, as consumers, as patients, as citizens, we have a responsibility, I think, to spend more time focusing on our critical thinking skills. Why? Because at this point in our history, as we've heard many times over, we can process exabytes of data at lightning speed, and we have the potential to make bad decisions far more quickly, efficiently, and with far greater impact than we did in the past. Great, right?

09:23
And so what we need to do instead is spend a little bit more time on things like the humanities and sociology, and the social sciences, rhetoric, philosophy, ethics, because they give us context that is so important for big data, and because they help us become better critical thinkers. Because after all, if I can spot a problem in an argument, it doesn't much matter whether it's expressed in words or in numbers.

09:54
And this means teaching ourselves to find those confirmation biases and false correlations and being able to spot a naked emotional appeal from 30 yards, because something that happens after something doesn't mean it happened because of it, necessarily, and if you'll let me geek out on you for a second, the Romans called this "post hoc ergo propter hoc": after which, therefore because of which.
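For a concrete sense of how easily "after which" masquerades as "because of which," here is a toy demonstration in Python, unrelated to any data from the talk: two series generated completely independently (imagine ad impressions and quit attempts, both trending upward over time) still show a near-perfect correlation.

import random

random.seed(42)

def trending_series(n=100, drift=1.0):
    """An upward-drifting random walk, standing in for any growing metric."""
    value, series = 0.0, []
    for _ in range(n):
        value += drift + random.gauss(0, 1)
        series.append(value)
    return series

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

a = trending_series()  # generated with no knowledge of b
b = trending_series()  # shares nothing with a except the upward drift
print("correlation of two unrelated series: %.2f" % pearson(a, b))

The two series share nothing but a trend, yet the correlation lands near 1.0; counting alone cannot distinguish this from a causal relationship, which is why the next step is to interrogate the result rather than enjoy it.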
10:22
And it means questioning disciplines like demographics. Why? Because they're based on assumptions about who we all are based on our gender and our age and where we live, as opposed to data on what we actually think and do. And since we have this data, we need to treat it with appropriate privacy controls and consumer opt-in, and beyond that, we need to be clear about our hypotheses, the methodologies that we use, and our confidence in the result. As my high school algebra teacher used to say, show your math, because if I don't know what steps you took, I don't know what steps you didn't take, and if I don't know what questions you asked, I don't know what questions you didn't ask. And it means asking ourselves, really, the hardest question of all: Did the data really show us this, or does the result make us feel more successful and more comfortable?

11:23
So the Health Media Collaboratory, at the end of their project, they were able to find that 87 percent of tweets about those very graphic and disturbing anti-smoking ads expressed fear, but did they conclude that they actually made people stop smoking? No. It's science, not magic.

11:44
So if we are to unlock the power of data, we don't have to go blindly into Orwell's vision of a totalitarian future, or Huxley's vision of a trivial one, or some horrible cocktail of both. What we have to do is treat critical thinking with respect and be inspired by examples like the Health Media Collaboratory, and as they say in the superhero movies, let's use our powers for good. Thank you. (Applause)
