ABOUT THE SPEAKER
Sebastian Wernicke - Data scientist
After making a splash in the field of bioinformatics, Sebastian Wernicke moved on to the corporate sphere, where he motivates and manages multidimensional projects.

Why you should listen

Dr. Sebastian Wernicke is the Chief Data Scientist of ONE LOGIC, a data science boutique that helps organizations across industries make sense of their vast data collections to improve operations and gain strategic advantages. Wernicke originally studied bioinformatics and previously led the strategy and growth of Seven Bridges Genomics, a Cambridge-based startup that builds platforms for genetic analysis.

Before his career in statistics began, Wernicke worked stints as both a paramedic and a successful short-film animator. He's also the author of the TEDPad app, an irreverent tool for creating an infinite number of "amazing and really bad" and mostly completely meaningless talks, as well as of the statistically authoritative and yet completely ridiculous guide "How to Give the Perfect TEDTalk."

Sebastian Wernicke | Speaker | TED.com
TEDxCambridge

Sebastian Wernicke: How to use data to make a hit TV show

1,628,704 views

Does collecting more data lead to better decision-making? Competitive, data-savvy companies like Amazon, Google and Netflix have learned that data analysis alone doesn't always produce optimal results. In this talk, data scientist Sebastian Wernicke breaks down what goes wrong when we make decisions based purely on data, and suggests a smarter way to use it.

00:12
Roy Price is a man that most of you have probably never heard about,
00:17
even though he may have been responsible
00:19
for 22 somewhat mediocre minutes of your life on April 19, 2013.
00:26
He may have also been responsible for 22 very entertaining minutes,
00:29
but not very many of you.
00:32
And all of that goes back to a decision
00:33
that Roy had to make about three years ago.
00:35
So you see, Roy Price is a senior executive with Amazon Studios.
00:40
That's the TV production company of Amazon.
00:43
He's 47 years old, slim, spiky hair,
00:47
describes himself on Twitter as "movies, TV, technology, tacos."
00:52
And Roy Price has a very responsible job, because it's his responsibility
00:57
to pick the shows, the original content that Amazon is going to make.
01:01
And of course that's a highly competitive space.
01:03
I mean, there are so many TV shows already out there,
01:06
that Roy can't just choose any show.
01:08
He has to find shows that are really, really great.
01:12
So in other words, he has to find shows
01:15
that are on the very right end of this curve here.
01:17
So this curve here is the rating distribution
01:20
of about 2,500 TV shows on the website IMDB,
01:25
and the rating goes from one to 10,
01:27
and the height here shows you how many shows get that rating.
01:30
So if your show gets a rating of nine points or higher, that's a winner.
01:35
Then you have a top two percent show.
01:37
That's shows like "Breaking Bad," "Game of Thrones," "The Wire,"
01:41
so all of these shows that are addictive,
01:43
where, after you've watched a season, your brain is basically like,
01:46
"Where can I get more of these episodes?"
01:49
That kind of show.
01:50
On the left side, just for clarity, here on that end,
01:53
you have a show called "Toddlers and Tiaras" --
01:56
(Laughter)
01:59
-- which should tell you enough
02:00
about what's going on on that end of the curve.
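
The curve he describes is just a histogram: rating on the horizontal axis, number of shows on the vertical. Here is a minimal Python sketch of that idea, using simulated ratings rather than IMDB's actual data; the 7.4 average and the roughly two-percent share of 9.0+ shows come from the talk, everything else is an assumption.

    import random
    from collections import Counter

    random.seed(42)
    # ~2,500 simulated ratings on a 1-10 scale, centered near the
    # curve's stated average of 7.4. Stand-ins, not IMDB's numbers.
    ratings = [min(10.0, max(1.0, random.gauss(7.4, 0.8))) for _ in range(2500)]

    # The height of the curve: how many shows land in each half-point bucket.
    histogram = Counter(round(r * 2) / 2 for r in ratings)
    for bucket in sorted(histogram):
        print(f"{bucket:4.1f} | {'#' * (histogram[bucket] // 20)}")

    # "Nine points or higher" is the bar for a hit -- roughly the top
    # two percent of shows under these assumptions.
    hits = sum(1 for r in ratings if r >= 9.0)
    print(f"shows rated 9.0+: {hits} ({100 * hits / len(ratings):.1f}%)")
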
02:03
Now, Roy Price is not worried about getting on the left end of the curve,
02:07
because I think you would have to have some serious brainpower
02:10
to undercut "Toddlers and Tiaras."
02:11
So what he's worried about is this middle bulge here,
02:15
the bulge of average TV,
02:17
you know, those shows that aren't really good or really bad,
02:20
they don't really get you excited.
02:22
So he needs to make sure that he's really on the right end of this.
02:27
So the pressure is on,
02:28
and of course it's also the first time
02:31
that Amazon is even doing something like this,
02:33
so Roy Price does not want to take any chances.
02:36
He wants to engineer success.
02:39
He needs a guaranteed success,
02:40
and so what he does is, he holds a competition.
02:43
So he takes a bunch of ideas for TV shows,
02:46
and from those ideas, through an evaluation,
02:48
they select eight candidates for TV shows,
02:53
and then he just makes the first episode of each one of these shows
02:56
and puts them online for free for everyone to watch.
02:59
And so when Amazon is giving out free stuff,
03:01
you're going to take it, right?
03:03
So millions of viewers are watching those episodes.
03:08
What they don't realize is that, while they're watching their shows,
03:11
actually, they are being watched.
03:14
They are being watched by Roy Price and his team,
03:16
who record everything.
03:17
They record when somebody presses play, when somebody presses pause,
03:21
what parts they skip, what parts they watch again.
03:23
So they collect millions of data points,
03:26
because they want to have those data points
03:28
to then decide which show they should make.
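
"Recording everything" here means ordinary event logging and aggregation. Below is a minimal sketch of that kind of pipeline in Python; the event records and the "engagement" score are invented for illustration, and nothing about Amazon's actual schema or metrics is implied.

    from collections import Counter

    # Invented playback events: (viewer_id, episode, action).
    events = [
        (1, "Pilot A", "play"),
        (1, "Pilot A", "pause"),
        (2, "Pilot A", "play"),
        (2, "Pilot A", "skip"),
        (3, "Pilot B", "play"),
        (3, "Pilot B", "rewatch"),
    ]

    # Count every action per episode -- millions of such points in practice.
    per_episode = Counter((episode, action) for _, episode, action in events)

    # One crude, illustrative signal: rewatches minus skips per episode.
    for episode in sorted({ep for _, ep, _ in events}):
        score = per_episode[(episode, "rewatch")] - per_episode[(episode, "skip")]
        print(episode, "engagement:", score)
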
03:30
And sure enough, so they collect all the data,
03:33
they do all the data crunching, and an answer emerges,
03:35
and the answer is,
03:36
"Amazon should do a sitcom about four Republican US Senators."
03:42
They did that show.
03:43
So does anyone know the name of the show?
03:46
(Audience: "Alpha House.")
03:48
Yes, "Alpha House,"
03:49
but it seems like not too many of you here remember that show, actually,
03:53
because it didn't turn out that great.
03:55
It's actually just an average show,
03:57
actually -- literally, in fact, because the average of this curve here is at 7.4,
04:02
and "Alpha House" lands at 7.5,
04:04
so a slightly above average show,
04:06
but certainly not what Roy Price and his team were aiming for.
04:10
Meanwhile, however, at about the same time,
04:13
at another company,
04:14
another executive did manage to land a top show using data analysis,
04:19
and his name is Ted,
04:20
Ted Sarandos, who is the Chief Content Officer of Netflix,
04:24
and just like Roy, he's on a constant mission
04:26
to find that great TV show,
04:27
and he uses data as well to do that,
04:29
except he does it a little bit differently.
04:31
So instead of holding a competition, what he did -- and his team of course --
04:35
was they looked at all the data they already had about Netflix viewers,
04:39
you know, the ratings they give their shows,
04:41
the viewing histories, what shows people like, and so on.
04:44
And then they use that data to discover
04:45
all of these little bits and pieces about the audience:
04:48
what kinds of shows they like,
04:50
what kind of producers, what kind of actors.
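
Those "bits and pieces" can be pictured as simple counts over viewing histories. A minimal sketch follows, with invented records; Netflix's real signals are of course far richer and were combined with much more sophisticated analysis.

    from collections import Counter

    # Invented viewing-history rows; the fields mirror the kinds of
    # attributes mentioned above (show type, leads, and so on).
    history = [
        {"viewer": 1, "genre": "political drama", "lead": "Actor X"},
        {"viewer": 1, "genre": "crime drama", "lead": "Actor Y"},
        {"viewer": 2, "genre": "political drama", "lead": "Actor X"},
        {"viewer": 3, "genre": "sitcom", "lead": "Actor Z"},
    ]

    # What kinds of shows does this audience watch? Which leads draw them?
    genres = Counter(row["genre"] for row in history)
    leads = Counter(row["lead"] for row in history)

    print("top genres:", genres.most_common(2))
    print("top leads:", leads.most_common(2))
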
04:52
And once they had all of these pieces together,
04:54
they took a leap of faith,
04:56
and they decided to license
04:58
not a sitcom about four Senators
05:01
but a drama series about a single Senator.
05:04
You guys know the show?
05:06
(Laughter)
05:07
Yes, "House of Cards," and Netflix of course, nailed it with that show,
05:11
at least for the first two seasons.
05:13
(Laughter) (Applause)
05:17
"House of Cards" gets a 9.1 rating on this curve,
05:20
so it's exactly where they wanted it to be.
05:24
Now, the question of course is, what happened here?
05:26
So you have two very competitive, data-savvy companies.
05:29
They connect all of these millions of data points,
05:32
and then it works beautifully for one of them,
05:34
and it doesn't work for the other one.
05:36
So why?
05:37
Because logic kind of tells you that this should be working all the time.
05:41
I mean, if you're collecting millions of data points
05:43
on a decision you're going to make,
05:45
then you should be able to make a pretty good decision.
05:47
You have 200 years of statistics to rely on.
05:50
You're amplifying it with very powerful computers.
05:53
The least you could expect is good TV, right?
05:57
And if data analysis does not work that way,
06:01
then it actually gets a little scary,
06:03
because we live in a time where we're turning to data more and more
06:07
to make very serious decisions that go far beyond TV.
06:12
Does anyone here know the company Multi-Health Systems?
06:17
No one. OK, that's good actually.
06:18
OK, so Multi-Health Systems is a software company,
06:22
and I hope that nobody here in this room
06:24
ever comes into contact with that software,
06:28
because if you do, it means you're in prison.
06:30
(Laughter)
06:31
If someone here in the US is in prison, and they apply for parole,
06:34
then it's very likely that data analysis software from that company
06:39
will be used in determining whether to grant that parole.
06:42
So it's the same principle as Amazon and Netflix,
06:45
but now instead of deciding whether a TV show is going to be good or bad,
06:50
you're deciding whether a person is going to be good or bad.
06:53
And mediocre TV, 22 minutes, that can be pretty bad,
06:58
but more years in prison, I guess, even worse.
07:02
And unfortunately, there is actually some evidence that this data analysis,
07:06
despite having lots of data, does not always produce optimum results.
07:10
And that's not because a company like Multi-Health Systems
07:13
doesn't know what to do with data.
07:15
Even the most data-savvy companies get it wrong.
07:17
Yes, even Google gets it wrong sometimes.
07:20
In 2009, Google announced that they were able, with data analysis,
07:25
to predict outbreaks of influenza, the nasty kind of flu,
07:29
by doing data analysis on their Google searches.
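
The system he is referring to, Google Flu Trends, can be pictured at its simplest as fitting flu incidence against how often people search for flu-related terms. Here is a one-variable least-squares sketch with invented weekly numbers; the real model screened millions of candidate queries and was considerably more elaborate.

    # Invented weekly pairs of (flu-related share of searches, flu rate).
    history = [(0.010, 1.2), (0.015, 1.8), (0.022, 2.9),
               (0.030, 4.1), (0.041, 5.6)]

    # Ordinary least squares for a single predictor, done by hand.
    n = len(history)
    mean_x = sum(x for x, _ in history) / n
    mean_y = sum(y for _, y in history) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in history)
             / sum((x - mean_x) ** 2 for x, _ in history))
    intercept = mean_y - slope * mean_x

    # "Predict" this week's flu rate from this week's search activity.
    this_week = 0.035
    print("predicted flu rate:", round(intercept + slope * this_week, 2))
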
07:33
And it worked beautifully, and it made a big splash in the news,
07:37
including the pinnacle of scientific success:
07:39
a publication in the journal "Nature."
07:41
It worked beautifully for year after year after year,
07:45
until one year it failed.
07:47
And nobody could even tell exactly why.
07:49
It just didn't work that year,
07:51
and of course that again made big news,
07:52
including now a retraction
07:54
of a publication from the journal "Nature."
07:58
So even the most data-savvy companies, Amazon and Google,
08:01
they sometimes get it wrong.
08:04
And despite all those failures,
08:06
data is moving rapidly into real-life decision-making --
08:10
into the workplace,
08:12
law enforcement,
08:14
medicine.
08:16
So we should better make sure that data is helping.
08:19
Now, personally I've seen a lot of this struggle with data myself,
08:22
because I work in computational genetics,
08:24
which is also a field where lots of very smart people
08:27
are using unimaginable amounts of data to make pretty serious decisions
08:31
like deciding on a cancer therapy or developing a drug.
08:35
And over the years, I've noticed a sort of pattern
08:37
or kind of rule, if you will, about the difference
08:40
between successful decision-making with data
08:43
and unsuccessful decision-making,
08:44
and I find this a pattern worth sharing, and it goes something like this.
08:50
So whenever you're solving a complex problem,
08:52
you're doing essentially two things.
08:54
The first one is, you take that problem apart into its bits and pieces
08:57
so that you can deeply analyze those bits and pieces,
09:00
and then of course you do the second part.
09:02
You put all of these bits and pieces back together again
09:05
to come to your conclusion.
09:06
And sometimes you have to do it over again,
09:08
but it's always those two things:
09:10
taking apart and putting back together again.
09:14
And now the crucial thing is
09:15
that data and data analysis
09:18
is only good for the first part.
09:21
Data and data analysis, no matter how powerful,
09:23
can only help you taking a problem apart and understanding its pieces.
09:28
It's not suited to put those pieces back together again
09:31
and then to come to a conclusion.
09:33
There's another tool that can do that, and we all have it,
09:36
and that tool is the brain.
09:37
If there's one thing a brain is good at,
09:39
it's taking bits and pieces back together again,
09:41
even when you have incomplete information,
09:43
and coming to a good conclusion,
09:45
especially if it's the brain of an expert.
09:48
And that's why I believe that Netflix was so successful,
09:51
because they used data and brains where they belong in the process.
09:54
They use data to first understand lots of pieces about their audience
09:58
that they otherwise wouldn't have been able to understand at that depth,
10:01
but then the decision to take all these bits and pieces
10:04
and put them back together again and make a show like "House of Cards,"
10:07
that was nowhere in the data.
10:09
Ted Sarandos and his team made that decision to license that show,
10:13
which also meant, by the way, that they were taking
10:15
a pretty big personal risk with that decision.
10:18
And Amazon, on the other hand, they did it the wrong way around.
10:21
They used data all the way to drive their decision-making,
10:24
first when they held their competition of TV ideas,
10:26
then when they selected "Alpha House" to make as a show.
10:30
Which of course was a very safe decision for them,
10:32
because they could always point at the data, saying,
10:35
"This is what the data tells us."
10:37
But it didn't lead to the exceptional results that they were hoping for.
10:42
So data is of course a massively useful tool to make better decisions,
10:47
but I believe that things go wrong
10:49
when data is starting to drive those decisions.
10:52
No matter how powerful, data is just a tool,
10:55
and to keep that in mind, I find this device here quite useful.
10:59
Many of you will ...
11:00
(Laughter)
11:01
Before there was data,
11:03
this was the decision-making device to use.
11:05
(Laughter)
11:07
Many of you will know this.
11:08
This toy here is called the Magic 8 Ball,
11:10
and it's really amazing,
11:11
because if you have a decision to make, a yes or no question,
11:14
all you have to do is you shake the ball, and then you get an answer --
11:18
"Most Likely" -- right here in this window in real time.
11:21
I'll have it out later for tech demos.
11:23
(Laughter)
11:24
Now, the thing is, of course -- so I've made some decisions in my life
11:28
where, in hindsight, I should have just listened to the ball.
11:31
But, you know, of course, if you have the data available,
11:34
you want to replace this with something much more sophisticated,
11:37
like data analysis to come to a better decision.
11:41
But that does not change the basic setup.
11:43
So the ball may get smarter and smarter and smarter,
11:47
but I believe it's still on us to make the decisions
11:49
if we want to achieve something extraordinary,
11:52
on the right end of the curve.
11:54
And I find that a very encouraging message, in fact,
11:59
that even in the face of huge amounts of data,
12:03
it still pays off to make decisions,
12:07
to be an expert in what you're doing
12:10
and take risks.
12:12
Because in the end, it's not data,
12:15
it's risks that will land you on the right end of the curve.
12:19
Thank you.
12:21
(Applause)



Data provided by TED.
