ABOUT THE SPEAKER
Jer Thorp - Data artist
Jer Thorp’s work focuses on adding meaning and narrative to huge amounts of data as a way to help people take control of the information that surrounds them.

Why you should listen

Currently the data artist in residence at the New York Times, Jer has had his software-based art featured all over the world. His former career as a geneticist explains why his art often brings big data sets to life and is deeply influenced by science. Originally from Vancouver, he lives in New York City, where, along with his work at the New York Times, he teaches in NYU’s ITP program.

More profile about the speaker
Jer Thorp | Speaker | TED.com
TEDxVancouver

Jer Thorp: Make data more human

Filmed:
300,699 views

Jer Thorp creates beautiful data visualizations to put abstract data into a human context. At TEDxVancouver, he shares his moving projects, from graphing an entire year's news cycle to mapping the way people share articles across the internet.


00:10
I want to talk to you about two of the most exciting possible things.

00:16
You've probably guessed what they are --

00:18
data and history.

00:21
Right?

00:24
So, I'm not a historian.

00:26
I'm not going to give you a definition of history.

00:29
But let's instead think of history within a framework.

00:32
So, when we're making history,

00:33
or when we're creating historical documents,

00:36
we're taking things that have happened in the past,

00:39
and we're stitching them together into a story.

00:41
So let me start with a little bit of my own story.

00:44
Like anybody my age who works creatively with computers,

00:48
I was a popular, socially well-adjusted young man --

00:52
(Laughter)

00:53
And sporty!

00:56
Sporty young man.

00:58
And like a lot of people my age in the type of business that I'm in,

01:03
I was influenced tremendously by Apple.

01:07
But notice my choice of logo here, right?

01:10
The Apple on the left, not the Apple on the right.

01:15
I'm influenced as much by the Apple on the right

01:17
as the next person,

01:19
but the Apple on the left -- I mean, look at that logo!

01:22
It's a rainbow. It's not even in the right order!

01:24
(Laughter)

01:25
That's how crazy Apple was.

01:28
(Laughter)

01:29
But I don't want to talk too much about the company.

01:32
I'll start talking about a machine, though.

01:34
How amazing it is to think about this. I go back and I think about this.

01:38
Wednesday -- one Wednesday, when I was about 12 years old,

01:41
I didn't have a computer.

01:44
On Thursday, I had a computer.

01:48
Can you imagine that change?

01:50
It's so drastic.

01:52
I can't even think about anything that could change our lives that way.

01:56
But I'm actually not even going to talk about the computer.

01:58
I'm going to talk about a program that came loaded on that computer.
02:02
And it was built by, not the guy on the left,

02:04
but the guy on the right.

02:05
Does anybody know who the guy on the right is?

02:09
Nobody ever knows the answer to this question.

02:12
This is Bill Atkinson.

02:13
And Bill Atkinson was responsible for tons of things

02:16
that you see on your computer every day.

02:19
But I want to talk about one program that Bill Atkinson wrote,

02:22
called HyperCard.

02:25
Someone's cheering over there.

02:27
(Laughter)

02:28
HyperCard was a program that shipped with the Mac,

02:31
and it was designed for users of the computer

02:34
to make programs on their computers.

02:38
Crazy idea today.

02:39
And these programs were not the apps that we think about today,

02:42
with their large budgets and their big distribution.

02:45
These were small things,

02:46
people making applications to keep track of their local basketball team scores

02:50
or to organize their research

02:53
or to teach people about classical music

02:56
or to calculate weird astronomical dates.

03:00
And then, of course, there were some art projects.

03:02
This is my favorite one.

03:03
It's called "If Monks Had Macs,"

03:05
and it's a nonlinear kind of exploratory environment.

03:10
I thank the stars for HyperCard all of the time.

03:16
And I thank the stars for putting me in this era

03:18
where I got to use HyperCard.

03:20
HyperCard was the last program to ship on a public computer

03:25
that was designed for the users of the computer to make programs with it.
03:30
If you talked to the people who invented the computer

03:33
and you told them there would be a day, a magical day,

03:36
when everybody had a computer but none of them knew how to program,

03:41
they would think you were crazy.

03:43
So let's skip forward a few years.

03:45
I'm starting my career as an artist,

03:48
and I'm building things with my computer, small-scale things,

03:52
investigating things like the growth systems of plants.

03:55
Or, in this example, I'm building a simulated economy

03:58
in which pixels are trading color with one another,

04:02
trying to investigate how these types of systems work,

04:05
and just kind of having fun.

04:06
And then this project led me to start working with data.

04:09
So I'm building graphics like this,

04:12
which compare "communism" --

04:15
the frequency of usage of the word "communism" in the New York Times --

04:18
to "terrorism," at the top.
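The graphs described here are, at heart, term-frequency time series: count how often a word appears in each year's coverage, then plot the two series against each other. A minimal sketch, using a hypothetical handful of (year, headline) rows rather than the actual Times archive:

```python
from collections import Counter

# Hypothetical corpus: (year, headline) pairs standing in for an archive query.
headlines = [
    (1985, "Communism and the Cold War"),
    (1991, "The fall of communism in Europe"),
    (2002, "Terrorism and the new security state"),
    (2006, "Terrorism fears reshape air travel"),
]

def term_frequency_by_year(rows, term):
    """Count how many headlines mention `term` in each year."""
    counts = Counter()
    for year, text in rows:
        if term in text.lower():
            counts[year] += 1
    return dict(counts)

print(term_frequency_by_year(headlines, "communism"))  # {1985: 1, 1991: 1}
print(term_frequency_by_year(headlines, "terrorism"))  # {2002: 1, 2006: 1}
```

Plotting the two resulting series on a shared time axis gives exactly the kind of crossover ("terrorism" rising as "communism" fades) that the graphic shows.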
04:20
You see "terrorism" kind of appears as "communism" is going away.

04:25
And with these graphics, I was really interested in the aesthetic of the graphs.

04:29
This is Iran and Iraq.

04:30
It reads like a clock. It's called a "timepiece graph."

04:34
This is another timepiece graph, overlaying "despair" over "hope."

04:39
And there's only three times -- actually, it's "crisis" over "hope" --

04:43
there's only three times when "crisis" eclipses "hope."

04:45
We're in the middle of one of them right now.

04:48
But don't think about that too much.

04:49
(Laughter)

04:51
And finally, the culmination of this work with the New York Times data

04:55
a few years ago

04:56
was the attempt to combine an entire year's news cycle

05:00
into a single graphic.

05:01
So these graphics actually show us a full year of news, all the people,

05:05
and how they're connected, in a single graphic.

05:08
And from there, I started to be interested again in more active systems.
05:12
Here's a project called "Just Landed,"

05:14
where I'm looking at people tweeting on Twitter.

05:17
"Hey! I just landed in Hawaii!" -- you know,

05:19
how people just casually try to sneak that into their Twitter conversation.

05:23
"I'm not showing off. Really. But I did just land in Hawaii."

05:26
And then I'm plotting those people's trips,

05:29
in the hopes that maybe we can use social networks

05:32
and the data they leave behind

05:34
to provide a model of how people move,

05:36
which would be valuable to epidemiologists, among other people.

05:39
And, more fun -- this is a similar project,

05:42
looking at people saying "Good morning" to each other

05:44
all around the world.

05:45
Which taught me, by the way,

05:47
that it is true that people in Vancouver on the West Coast wake up much later

05:51
and say "Good morning" much later

05:53
than the people on the East Coast,

05:55
who are more adventurous.

05:57
Here's a more useful -- maybe -- project,

05:59
where I took all the information from the Kepler Project

06:02
and tried to put it into some visual form that made sense to me.

06:05
And I should say that everything I've shown you up to now --

06:08
these are all things that I just did for fun.

06:10
It may seem weird, but this comes back from HyperCard.

06:13
I'm building tools for myself.

06:15
I may share them with a few other people,

06:17
but they're for fun, they're for me.
06:21
So, all these tools I show you kind of occupy this weird space

06:25
somewhere between science, art and design.

06:28
That's where my practice lies.

06:30
And still today, from my experience with HyperCard,

06:33
what I'm doing is building visual tools to help me understand systems.

06:38
So today, I work at the New York Times.

06:40
I'm the data artist in residence at the New York Times.

06:43
And I've had an opportunity at the Times

06:45
to work on a variety of really interesting projects,

06:48
two of which I'm going to share with you today.

06:50
The first one, I've been working on in conjunction with Mark Hansen.

06:53
Mark Hansen is a professor of statistics at UCLA. He's also a media artist.

06:58
And Mark came to the Times with a very interesting question

07:01
about what may seem like an obvious problem:

07:04
When people share content on the internet,

07:07
how does that content get from person A to person B?

07:11
Or maybe, person A to person B to person C to person D?

07:16
We know that people share content on the internet,

07:18
but what we don't know is what happens in that gap

07:21
between one person and the other.

07:23
So we decided to build a tool to explore that,

07:25
and this tool is called Cascade.
07:27
If we look at these systems

07:30
that start with one event that leads to other events,

07:35
we call that structure a cascade.

07:37
And these cascades actually happen over time.

07:39
So we can model these things over time.

07:41
Now, the New York Times has a lot of people who share our content,

07:45
so the cascades do not look like that one, they look more like this.

07:49
Here's a typical cascade.

07:50
At the bottom left, the very first event.

07:54
And then as people are sharing the content from one person to another,

07:59
we go up the Y axis, in degrees of separation,

08:02
and along the X axis, in time.
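The coordinates described here are simple to compute: each share sits at x = when it happened and y = how many hops it is from the original event. A minimal sketch, assuming shares are recorded as hypothetical (id, parent, timestamp) tuples with parents listed before children:

```python
# Hypothetical share records: (id, parent, timestamp).
# The root event has no parent; every other record points at who it came from.
shares = [
    ("article", None, 0),        # the original piece of content
    ("alice",   "article", 5),
    ("bob",     "alice",   9),
    ("carol",   "article", 12),
    ("dave",    "bob",     20),
]

def cascade_coordinates(events):
    """Map each event to (x=time, y=degrees of separation from the root)."""
    depth = {}
    coords = {}
    for name, parent, t in events:  # assumes parents appear before children
        depth[name] = 0 if parent is None else depth[parent] + 1
        coords[name] = (t, depth[name])
    return coords

print(cascade_coordinates(shares))
```

Here "dave" lands at (20, 3): twenty time units after the article appeared, three degrees of separation away from it.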
08:05
So we're able to look at that conversation in a couple of different views:

08:09
this one, which shows us the threads of conversation,

08:11
and this one, which combines that stacked view

08:15
with a view that lets us see the threads.

08:18
Now, the Times publishes about 7,000 pieces of content

08:21
every month.

08:23
So it was important for us, when we were building this tool,

08:25
to make it an exploratory one,

08:27
so that people could dig through this vast terrain of data.

08:31
I think of it as a vehicle that we're giving people

08:34
to traverse this really big terrain of data.

08:37
So here's what it really looks like,

08:39
and here's the cascade playing in real time.

08:42
I have to say, this was a tremendous moment.

08:44
We had been working with canned data, fake data, for so long,

08:48
that when we saw this for the first time,

08:51
it was like an archaeologist who had just dusted off these dinosaur bones.

08:56
We discovered this thing, and we were seeing it for the first time,

09:00
these sharing structures that underlie the internet.
09:04
And maybe the dinosaur analogy is a good one,

09:07
because we're actually making some probabilistic guesses

09:10
about how these things link.

09:11
We're looking at some of these pieces and making some guesses,

09:14
but we try to make sure that those are as statistically rigorous as possible.

09:19
Now tweets, in this case, they become parts of stories.

09:23
They become parts of narratives.

09:25
So we are building histories here,

09:28
but they're very short-term histories.

09:30
And sometimes these very large cascades are the most interesting ones,

09:34
but sometimes the small ones are also interesting.

09:37
This is one of my favorites. We call this the "Rabbi Cascade."

09:41
It's a conversation amongst rabbis about this article in the New York Times,

09:46
about the fact that religious workers don't get a lot of time off.

09:49
I guess Saturdays and Sundays are bad days for them to take off.

09:54
So, in this cascade, there's a group of rabbis having a conversation

09:57
about a New York Times story.

09:59
One of them has the best Twitter name ever --

10:01
he's called "The Velveteen Rabbi."

10:03
(Laughter)

10:05
But we would have never found this if it weren't for this exploratory tool.

10:10
This would just be sitting somewhere,

10:11
and we would have never been able to see that.

10:14
But this exercise of taking single pieces of information

10:18
and building narrative structures, building histories out of them,

10:22
I find tremendously interesting.

10:24
You know, I moved to New York about two years ago.

10:27
And in New York, everybody has a story

10:29
that surrounds this tremendously impactful event

10:32
that happened on September 11 of 2001.

10:35
And my own story with September 11 has really become a more intricate one,

10:42
because I spent a great deal of time

10:44
working on a piece of the 9/11 Memorial in Manhattan.
10:49
The central idea of the 9/11 Memorial

10:51
is that the names in the memorial are not laid out in alphabetical order

10:56
or chronological order,

10:57
but instead, they're laid out in a way

10:59
in which the relationships between the people who were killed

11:03
are embodied in the memorial.

11:05
Brothers are placed next to brothers,

11:08
coworkers are placed together.

11:10
So this memorial actually considers all of these myriad connections

11:15
that were part of these people's lives.

11:18
I worked with a company called Local Projects

11:22
to work on an algorithm and a software tool

11:24
to help the architects build the layout for the memorial:

11:28
almost 3,000 names

11:30
and almost 1,500 of these adjacency requests,

11:34
these requests for connection --

11:35
so a very dense story, a very dense narrative,

11:39
that becomes an embodied part of this memorial.

11:42
Working with Jake Barton, we produced the software tool,

11:46
which allowed the architects to, first of all, generate a layout

11:50
that satisfied all of those adjacency requests,

11:53
but then second, make little adjustments where they needed to

11:56
to tell the stories that they wanted to tell.
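The core of a layout task like this is a constraint problem: names linked by adjacency requests should end up next to each other. The real tool handled almost 3,000 names, roughly 1,500 requests, and the memorial's actual panel geometry; the toy sketch below only shows the kernel of the idea, walking each group of connected names so it lands contiguously in a single row (the names and requests are hypothetical):

```python
from collections import defaultdict

# Hypothetical names and adjacency requests ("place X near Y").
names = ["A", "B", "C", "D", "E"]
requests = [("A", "B"), ("B", "C"), ("D", "E")]

def layout(names, requests):
    """Place names in one row so that each group of names linked by
    adjacency requests ends up contiguous -- a loose version of
    'brothers next to brothers, coworkers together'."""
    neighbors = defaultdict(set)
    for a, b in requests:
        neighbors[a].add(b)
        neighbors[b].add(a)
    placed, row = set(), []
    for name in names:
        if name in placed:
            continue
        stack = [name]          # depth-first walk of one connected group
        while stack:
            n = stack.pop()
            if n in placed:
                continue
            placed.add(n)
            row.append(n)
            stack.extend(neighbors[n] - placed)
        # the next group starts only after this one, so groups stay contiguous
    return row

print(layout(names, requests))
```

The production version then let the architects hand-adjust a satisfying layout, which is much easier than producing one from scratch.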
11:59
So this memorial, I think, has an incredibly timely concept

12:03
in our era of social networks,

12:06
because these networks -- these real-life networks that make up people's lives --

12:10
are actually embodied inside of the memorial.

12:13
And one of the most tremendously moving experiences

12:17
is to go to the memorial

12:18
and see how these people are placed next to each other,

12:23
so that this memorial is representing their own lives.

12:27
How does this affect our lives?

12:29
Well, I don't know if you remember,

12:31
but in the spring, there was a controversy,

12:34
because it was discovered that on the iPhone

12:36
and, actually, on your computer,

12:37
we were storing a tremendous amount of location data.

12:41
So Apple responded, saying, this was not location data about you,

12:45
it was location data about wireless networks

12:48
that were in the area where you are.

12:50
So it's not about you,

12:52
but it's about where you are.

12:53
(Laughter)

12:55
This is very valuable data.

12:58
It's like gold to researchers, this human-mobility data.

13:02
So we thought, "Man! How many people have iPhones?"

13:06
How many of you have iPhones?

13:09
So in this room, we have this tremendous database of location data

13:14
that researchers would really, really like.

13:18
So we built this system called Open Paths,

13:20
which lets people upload their iPhone data

13:23
and broker relationships with researchers to share that data,

13:26
to donate that data to people who can actually put it to use.

13:30
Open Paths was a great success as a prototype.

13:33
We received thousands of data sets,

13:36
and we built this interface,

13:37
which allows people to actually see their lives unfolding

13:41
from these traces that are left behind on their devices.
13:45
Now, what we didn't expect was how moving this experience would be.

13:50
When I uploaded my data, I thought, "Big deal.

13:52
I know where I live. I know where I work. What am I going to see here?"

13:56
Well, it turns out, what I saw was that moment I got off the plane

13:59
to start my new life in New York;

14:02
the restaurant where I had Thai food that first night,

14:04
thinking about this new experience of being in New York;

14:07
the day that I met my girlfriend.

14:11
This is LaGuardia Airport.

14:13
(Laughter)

14:14
This is this Thai restaurant on Amsterdam Avenue.

14:19
This is the moment I met my girlfriend.

14:22
See how that changes between the first time I told you those stories

14:26
and the second time I told you those stories?

14:28
Because what we do in the tool, inadvertently,

14:31
is we put these pieces of data into a human context.

14:35
And by placing data into a human context,

14:37
it gains meaning.

14:39
And I think this is tremendously, tremendously important,

14:42
because these are our histories that are being stored on these devices.

14:49
And by thinking about them that way,

14:52
putting them in a human context --

14:53
first of all, what we do with our own data is get a better understanding

14:57
of the type of information that we're sharing.

15:00
But if we can do this with other data, if we can put data into a human context,

15:04
I think we can change a lot of things,

15:07
because it builds, automatically, empathy for the people involved in these systems.

15:14
And that, in turn, results in a fundamental respect,

15:17
which, I believe, is missing in a large part of technology,

15:20
when we start to deal with issues like privacy,

15:25
by understanding that these numbers are not just numbers,

15:28
but instead they're attached, tethered to, pieces of the real world.

15:31
They carry weight.

15:33
By understanding that, the dialogue becomes a lot different.

15:38
How many of you have ever clicked a button

15:40
that enables a third party to access your location data on your phone?

15:46
Lots of you.

15:47
So the third party is the developer,

15:49
the second party is Apple.

15:52
The only party that never gets access to this information is the first party!

15:58
And I think that's because we think about these pieces of data

16:01
in this stranded, abstract way.

16:03
We don't put them into a context --

16:05
which, I think, would make them a lot more important.

16:08
So what I'm asking you to do is really simple:

16:10
start to think about data in a human context.

16:13
It doesn't really take anything.

16:15
When you read stock prices, think about them in a human context.

16:18
When you think about mortgage reports, think about them in a human context.

16:22
There's no doubt that big data is big business.

16:26
There's an industry being developed here.

16:30
Think about how well we've done

16:31
in previous industries that we've developed involving resources.

16:34
Not very well at all.

16:36
I think part of that problem is, we've had a lack of participation in these dialogues

16:40
from multiple parts of human society.

16:45
So the other thing that I'm asking for

16:48
is the inclusion in this dialogue of artists, of poets, of writers --

16:52
of people who can bring a human element into this discussion.

16:57
Because I believe that this world of data

16:59
is going to be transformative for us.

17:03
And unlike our attempts with the resource industry

17:06
and our attempts with the financial industry,

17:08
by bringing the human element into this story,

17:11
I think we can take it to tremendous places.

17:14
Thank you.

17:15
(Applause)
Translated by Camille Martínez
Reviewed by Brian Greene



Data provided by TED.
