ABOUT THE SPEAKER
Tricia Wang - Technology ethnographer
With astronaut eyes and ethnographer curiosity, Tricia Wang helps corporations grow by discovering the unknown about their customers.

Why you should listen

For Tricia Wang, human behavior generates some of the most perplexing questions of our times. She has taught global organizations how to identify new customers and markets hidden behind their data, amplified IDEO's design thinking practice as an expert-in-residence, researched the social evolution of the Chinese internet, and written about the "elastic self," an emergent form of interaction in a virtual world. Wang is the co-founder of Sudden Compass, a consulting firm that helps companies unlock new growth opportunities by putting customer obsession into practice.

Wang's work has been featured in The Atlantic, Al Jazeera, and The Guardian. Fast Company spotlighted her work in China: "What Twitter Can Learn From Weibo: Field Notes From Global Tech Ethnographer Tricia Wang." In her latest op-ed on Slate, she discusses how attempts to stop terrorists on social media can harm our privacy and anonymity. Her Medium post, "Why Big Data Needs Thick Data," is a frequently cited industry piece on the importance of an integrated data approach. One of her favorite essays documents her day in the life of working as a street vendor in China.

Known for her lively presentations that are grounded in her research and observations about human behavior and data, Wang has spoken at organizations such as Procter & Gamble, Nike, Wrigley, 21st Century Fox and Tumblr. Her most recent talk at Enterprise UX examined why corporate innovation usually doesn't work and what to do about it. She delivered the opening keynote at The Conference to a crowd of marketers and creatives, delving into the wild history of linear perspective and its influence on how we think and form organizations.

Wang holds affiliate positions at Data & Society, Harvard University's Berkman Klein Center for Internet & Society and New York University's Interactive Telecommunications Program. She oversees Ethnography Matters, a site that publishes articles about applied ethnography and technology. She co-started a Slack community for anyone who uses ethnographic methods in industry.

Wang began her career as a documentary filmmaker at NASA, an HIV/AIDS activist, and an educator specializing in culturally responsive pedagogy. She is also proud to have co-founded the first national hip-hop education initiative, which turned into the Hip Hop Education Center at New York University, and to have built after-school technology and arts programs for low-income youth at New York City public schools and the Queens Museum of Art. Her life philosophy is that you have to go to the edge to discover what's really happening. She's the proud companion of her internet-famous dog, #ellethedog.

More profile about the speaker
Tricia Wang | Speaker | TED.com
TEDxCambridge

Tricia Wang: The human insights missing from big data


Filmed:
1,688,539 views

Why do so many companies choose wrong, even with unprecedented amounts of data? Through stories from Nokia, Netflix and the oracles of ancient Greece, Tricia Wang plainly explains what big data is and reveals its hidden pitfalls. She urges us to value "thick data" -- the precious, unquantifiable insights gathered from talking with real people -- in order to make the right business decisions and conquer the unknown.


00:12
In ancient Greece,

00:15
when anyone from slaves to soldiers, poets and politicians,

00:19
needed to make a big decision on life's most important questions,

00:23
like, "Should I get married?"

00:24
or "Should we embark on this voyage?"

00:26
or "Should our army advance into this territory?"

00:29
they all consulted the oracle.

00:33
So this is how it worked:

00:34
you would bring her a question and you would get on your knees,

00:37
and then she would go into this trance.

00:39
It would take a couple of days,

00:41
and then eventually she would come out of it,

00:43
giving you her predictions as your answer.

00:46
From the oracle bones of ancient China

00:49
to ancient Greece to Mayan calendars,

00:51
people have craved prophecy

00:54
in order to find out what's going to happen next.

00:58
And that's because we all want to make the right decision.

01:01
We don't want to miss something.

01:03
The future is scary,

01:05
so it's much nicer knowing that we can make a decision

01:08
with some assurance of the outcome.

01:11
Well, we have a new oracle,

01:12
and its name is big data,

01:14
or we call it "Watson" or "deep learning" or "neural net."

01:19
And these are the kinds of questions we ask of our oracle now,

01:23
like, "What's the most efficient way to ship these phones

01:27
from China to Sweden?"

01:29
Or, "What are the odds

01:30
of my child being born with a genetic disorder?"

01:34
Or, "What's the sales volume we can predict for this product?"
01:40
I have a dog. Her name is Elle, and she hates the rain.

01:44
And I have tried everything to untrain her.

01:47
But because I have failed at this,

01:50
I also have to consult an oracle, called Dark Sky,

01:53
every time before we go on a walk,

01:55
for very accurate weather predictions in the next 10 minutes.

02:01
She's so sweet.

02:03
So because of all of this, our oracle is a $122 billion industry.

02:10
Now, despite the size of this industry,

02:13
the returns are surprisingly low.

02:16
Investing in big data is easy,

02:18
but using it is hard.
02:21
Over 73 percent of big data projects aren't even profitable,

02:26
and I have executives coming up to me saying,

02:28
"We're experiencing the same thing.

02:30
We invested in some big data system,

02:32
and our employees aren't making better decisions.

02:35
And they're certainly not coming up with more breakthrough ideas."

02:38
So this is all really interesting to me,

02:42
because I'm a technology ethnographer.

02:44
I study and I advise companies

02:47
on the patterns of how people use technology,

02:49
and one of my interest areas is data.

02:52
So why is having more data not helping us make better decisions,

02:57
especially for companies who have all these resources

03:00
to invest in these big data systems?

03:02
Why isn't it getting any easier for them?
03:05
So, I've witnessed the struggle firsthand.

03:09
In 2009, I started a research position with Nokia.

03:13
And at the time,

03:14
Nokia was one of the largest cell phone companies in the world,

03:17
dominating emerging markets like China, Mexico and India --

03:20
all places where I had done a lot of research

03:23
on how low-income people use technology.

03:26
And I spent a lot of extra time in China

03:28
getting to know the informal economy.

03:31
So I did things like working as a street vendor

03:33
selling dumplings to construction workers.

03:36
Or I did fieldwork,

03:37
spending nights and days in internet cafés,

03:40
hanging out with Chinese youth, so I could understand

03:42
how they were using games and mobile phones

03:45
and using them as they moved from the rural areas to the cities.
03:50
Through all of this qualitative evidence that I was gathering,

03:54
I was starting to see so clearly

03:57
that a big change was about to happen among low-income Chinese people.

04:03
Even though they were surrounded by advertisements for luxury products

04:07
like fancy toilets -- who wouldn't want one? --

04:10
and apartments and cars,

04:13
through my conversations with them,

04:15
I found out that the ads that actually enticed them the most

04:19
were the ones for iPhones,

04:21
promising them this entry into this high-tech life.

04:25
And even when I was living with them in urban slums like this one,

04:28
I saw people investing over half of their monthly income

04:31
into buying a phone,

04:33
and increasingly, they were "shanzhai,"

04:35
which are affordable knock-offs of iPhones and other brands.

04:40
They're very usable.

04:42
Does the job.
04:44
And after years of living with migrants and working with them

04:50
and just really doing everything that they were doing,

04:54
I started piecing all these data points together --

04:57
from the things that seem random, like me selling dumplings,

05:00
to the things that were more obvious,

05:02
like tracking how much they were spending on their cell phone bills.

05:05
And I was able to create this much more holistic picture

05:08
of what was happening.

05:09
And that's when I started to realize

05:11
that even the poorest in China would want a smartphone,

05:14
and that they would do almost anything to get their hands on one.
05:21
You have to keep in mind,

05:23
iPhones had just come out, it was 2009,

05:26
so this was, like, eight years ago,

05:28
and Androids had just started looking like iPhones.

05:30
And a lot of very smart and realistic people said,

05:33
"Those smartphones -- that's just a fad.

05:36
Who wants to carry around these heavy things

05:39
where batteries drain quickly and they break every time you drop them?"

05:44
But I had a lot of data,

05:46
and I was very confident about my insights,

05:48
so I was very excited to share them with Nokia.

05:53
But Nokia was not convinced,

05:55
because it wasn't big data.

05:59
They said, "We have millions of data points,

06:01
and we don't see any indicators of anyone wanting to buy a smartphone,

06:05
and your data set of 100, as diverse as it is, is too weak

06:10
for us to even take seriously."
06:12
And I said, "Nokia, you're right.

06:14
Of course you wouldn't see this,

06:16
because you're sending out surveys assuming that people don't know

06:19
what a smartphone is,

06:20
so of course you're not going to get any data back

06:23
about people wanting to buy a smartphone in two years.

06:25
Your surveys, your methods have been designed

06:27
to optimize an existing business model,

06:29
and I'm looking at these emergent human dynamics

06:32
that haven't happened yet.

06:33
We're looking outside of market dynamics

06:36
so that we can get ahead of it."

06:39
Well, you know what happened to Nokia?

06:41
Their business fell off a cliff.

06:44
This -- this is the cost of missing something.

06:49
It was unfathomable.

06:52
But Nokia's not alone.
06:54
I see organizations throwing out data all the time

06:56
because it didn't come from a quant model

06:59
or it doesn't fit in one.

07:02
But it's not big data's fault.

07:04
It's the way we use big data; it's our responsibility.

07:09
Big data's reputation for success

07:11
comes from quantifying very specific environments,

07:15
like electricity power grids or delivery logistics or genetic code,

07:20
when we're quantifying in systems that are more or less contained.

07:24
But not all systems are as neatly contained.

07:27
When you're quantifying and systems are more dynamic,

07:31
especially systems that involve human beings,

07:34
forces are complex and unpredictable,

07:37
and these are things that we don't know how to model so well.

07:41
Once you predict something about human behavior,

07:44
new factors emerge,

07:45
because conditions are constantly changing.

07:48
That's why it's a never-ending cycle.
07:50
You think you know something,

07:51
and then something unknown enters the picture.

07:53
And that's why just relying on big data alone

07:57
increases the chance that we'll miss something,

08:00
while giving us this illusion that we already know everything.

08:04
And what makes it really hard to see this paradox

08:08
and even wrap our brains around it

08:10
is that we have this thing that I call the quantification bias,

08:14
which is the unconscious belief of valuing the measurable

08:18
over the immeasurable.

08:21
And we often experience this at our work.

08:24
Maybe we work alongside colleagues who are like this,

08:27
or even our whole entire company may be like this,

08:29
where people become so fixated on that number,

08:32
that they can't see anything outside of it,

08:34
even when you present them evidence right in front of their face.

08:39
And this is a very appealing message,

08:42
because there's nothing wrong with quantifying;

08:44
it's actually very satisfying.

08:46
I get a great sense of comfort from looking at an Excel spreadsheet,

08:50
even very simple ones.
08:52
(Laughter)

08:53
It's just kind of like,

08:54
"Yes! The formula worked. It's all OK. Everything is under control."

08:58
But the problem is

09:01
that quantifying is addictive.

09:03
And when we forget that

09:05
and when we don't have something to kind of keep that in check,

09:08
it's very easy to just throw out data

09:10
because it can't be expressed as a numerical value.

09:13
It's very easy just to slip into silver-bullet thinking,

09:16
as if some simple solution existed.

09:19
Because this is a great moment of danger for any organization,

09:23
because oftentimes, the future we need to predict --

09:26
it isn't in that haystack,

09:28
but it's that tornado that's bearing down on us

09:31
outside of the barn.
09:34
There is no greater risk

09:37
than being blind to the unknown.

09:39
It can cause you to make the wrong decisions.

09:41
It can cause you to miss something big.

09:43
But we don't have to go down this path.

09:47
It turns out that the oracle of ancient Greece

09:50
holds the secret key that shows us the path forward.

09:55
Now, recent geological research has shown

09:58
that the Temple of Apollo, where the most famous oracle sat,

10:01
was actually built over two earthquake faults.

10:04
And these faults would release these petrochemical fumes

10:07
from underneath the Earth's crust,

10:09
and the oracle literally sat right above these faults,

10:13
inhaling enormous amounts of ethylene gas from these fissures.

10:17
(Laughter)

10:18
It's true.

10:19
(Laughter)

10:20
It's all true, and that's what made her babble and hallucinate

10:23
and go into this trance-like state.

10:25
She was high as a kite!

10:27
(Laughter)
10:31
So how did anyone --

10:34
How did anyone get any useful advice out of her

10:37
in this state?

10:39
Well, you see those people surrounding the oracle?

10:41
You see those people holding her up,

10:43
because she's, like, a little woozy?

10:45
And you see that guy on your left-hand side

10:47
holding the orange notebook?

10:50
Well, those were the temple guides,

10:51
and they worked hand in hand with the oracle.

10:56
When inquisitors would come and get on their knees,

10:58
that's when the temple guides would get to work,

11:00
because after they asked her questions,

11:02
they would observe their emotional state,

11:04
and then they would ask them follow-up questions,

11:07
like, "Why do you want to know this prophecy? Who are you?

11:10
What are you going to do with this information?"

11:12
And then the temple guides would take this more ethnographic,

11:15
this more qualitative information,

11:17
and interpret the oracle's babblings.

11:21
So the oracle didn't stand alone,

11:23
and neither should our big data systems.
11:26
Now to be clear,

11:27
I'm not saying that big data systems are huffing ethylene gas,

11:31
or that they're even giving invalid predictions.

11:33
The total opposite.

11:34
But what I am saying

11:36
is that in the same way that the oracle needed her temple guides,

11:40
our big data systems need them, too.

11:43
They need people like ethnographers and user researchers

11:47
who can gather what I call thick data.

11:50
This is precious data from humans,

11:53
like stories, emotions and interactions that cannot be quantified.

11:57
It's the kind of data that I collected for Nokia

11:59
that comes in the form of a very small sample size,

12:02
but delivers incredible depth of meaning.

12:05
And what makes it so thick and meaty

12:10
is the experience of understanding the human narrative.

12:14
And that's what helps to see what's missing in our models.
12:18
Thick data grounds our business questions in human questions,

12:22
and that's why integrating big and thick data

12:26
forms a more complete picture.

12:28
Big data is able to offer insights at scale

12:31
and leverage the best of machine intelligence,

12:34
whereas thick data can help us rescue the context loss

12:37
that comes from making big data usable,

12:40
and leverage the best of human intelligence.

12:42
And when you actually integrate the two, that's when things get really fun,

12:45
because then you're no longer just working with data

12:48
you've already collected.

12:49
You get to also work with data that hasn't been collected.

12:52
You get to ask questions about why:

12:54
Why is this happening?
12:55
Now, when Netflix did this,

12:57
they unlocked a whole new way to transform their business.

13:01
Netflix is known for their really great recommendation algorithm,

13:05
and they had this $1 million prize for anyone who could improve it.

13:10
And there were winners.

13:12
But Netflix discovered the improvements were only incremental.

13:17
So to really find out what was going on,

13:19
they hired an ethnographer, Grant McCracken,

13:23
to gather thick data insights.

13:24
And what he discovered was something that they hadn't seen initially

13:28
in the quantitative data.

13:31
He discovered that people loved to binge-watch.

13:33
In fact, people didn't even feel guilty about it.

13:36
They enjoyed it.

13:37
(Laughter)

13:38
So Netflix was like, "Oh. This is a new insight."

13:40
So they went to their data science team,

13:42
and they were able to scale this big data insight

13:45
in with their quantitative data.

13:47
And once they verified it and validated it,

13:51
Netflix decided to do something very simple but impactful.
13:56
They said, instead of offering the same show from different genres

14:03
or more of the different shows from similar users,

14:07
we'll just offer more of the same show.

14:09
We'll make it easier for you to binge-watch.

14:11
And they didn't stop there.

14:13
They did all these things

14:14
to redesign their entire viewer experience,

14:17
to really encourage binge-watching.

14:20
It's why people and friends disappear for whole weekends at a time,

14:23
catching up on shows like "Master of None."

14:25
By integrating big data and thick data, they not only improved their business,

14:30
but they transformed how we consume media.

14:32
And now their stocks are projected to double in the next few years.

14:38
But this isn't just about watching more videos

14:42
or selling more smartphones.

14:44
For some, integrating thick data insights into the algorithm

14:48
could mean life or death,

14:50
especially for the marginalized.
14:53
All around the country, police departments are using big data

14:57
for predictive policing,

14:59
to set bond amounts and sentencing recommendations

15:02
in ways that reinforce existing biases.

15:06
NSA's Skynet machine learning algorithm

15:08
has possibly aided in the deaths of thousands of civilians in Pakistan

15:14
from misreading cellular device metadata.

15:19
As all of our lives become more automated,

15:22
from automobiles to health insurance or to employment,

15:25
it is likely that all of us

15:28
will be impacted by the quantification bias.

15:32
Now, the good news is that we've come a long way

15:35
from huffing ethylene gas to make predictions.
15:38
We have better tools, so let's just use them better.

15:41
Let's integrate the big data with the thick data.

15:43
Let's bring our temple guides with the oracles,

15:45
and whether this work happens in companies or nonprofits

15:49
or government or even in the software,

15:51
all of it matters,

15:53
because that means we're collectively committed

15:56
to making better data,

15:58
better algorithms, better outputs

16:00
and better decisions.

16:02
This is how we'll avoid missing that something.

16:07
(Applause)
Translated by 風 韶
Reviewed by Wilde Luo
