ABOUT THE SPEAKER
Jennifer Golbeck - Computer scientist
As the director of the Human-Computer Interaction Lab at the University of Maryland, Jennifer Golbeck studies how people use social media -- and thinks about ways to improve their interactions.

Why you should listen

Jennifer Golbeck is an associate professor in the College of Information Studies at the University of Maryland, where she also moonlights in the department of computer science. Her work invariably focuses on how to enhance and improve the way that people interact with their own information online. "I approach this from a computer science perspective and my general research hits social networks, trust, web science, artificial intelligence, and human-computer interaction," she writes.

Author of the 2013 book, Analyzing the Social Web, Golbeck likes nothing more than to immerse herself in the inner workings of the Internet tools so many millions of people use daily, to understand the implications of our choices and actions. Recently, she has also been working to bring human-computer interaction ideas to the world of security and privacy systems.

More profile about the speaker
Jennifer Golbeck | Speaker | TED.com
TEDxMidAtlantic 2013

Jennifer Golbeck: Your social media "likes" expose more than you think


2,366,837 views

Do you like curly fries? Have you ever liked their page on Facebook? This talk reveals what Facebook (and other sites) can learn about you from your random likes and shares. Computer scientist Jennifer Golbeck explains how this came about, how some applications of the technology are not so benign, and why she thinks we should return control of our information to its rightful owners.


00:12
If you remember that first decade of the web, it was really a static place. You could go online, you could look at pages, and they were put up either by organizations who had teams to do it or by individuals who were really tech-savvy for the time. And with the rise of social media and social networks in the early 2000s, the web was completely changed to a place where now the vast majority of content we interact with is put up by average users, either in YouTube videos or blog posts or product reviews or social media postings. And it's also become a much more interactive place, where people are interacting with others, they're commenting, they're sharing, they're not just reading.
00:54
So Facebook is not the only place you can do this, but it's the biggest, and it serves to illustrate the numbers. Facebook has 1.2 billion users per month. So half the Earth's Internet population is using Facebook.
01:06
They are a site, along with others, that has allowed people to create an online persona with very little technical skill, and people responded by putting huge amounts of personal data online. So the result is that we have behavioral, preference, demographic data for hundreds of millions of people, which is unprecedented in history.
01:26
And as a computer scientist, what this means is that I've been able to build models that can predict all sorts of hidden attributes for all of you that you don't even know you're sharing information about. As scientists, we use that to help the way people interact online, but there's less altruistic applications, and there's a problem in that users don't really understand these techniques and how they work, and even if they did, they don't have a lot of control over it.
01:52
So what I want to talk to you about today is some of these things that we're able to do, and then give us some ideas of how we might go forward to move some control back into the hands of users.
02:02
So this is Target, the company. I didn't just put that logo on this poor, pregnant woman's belly. You may have seen this anecdote that was printed in Forbes magazine where Target sent a flyer to this 15-year-old girl with advertisements and coupons for baby bottles and diapers and cribs two weeks before she told her parents that she was pregnant. Yeah, the dad was really upset. He said, "How did Target figure out that this high school girl was pregnant before she told her parents?"
02:29
It turns out that they have the purchase history for hundreds of thousands of customers and they compute what they call a pregnancy score, which is not just whether or not a woman's pregnant, but what her due date is. And they compute that not by looking at the obvious things, like, she's buying a crib or baby clothes, but things like, she bought more vitamins than she normally had, or she bought a handbag that's big enough to hold diapers. And by themselves, those purchases don't seem like they might reveal a lot, but it's a pattern of behavior that, when you take it in the context of thousands of other people, starts to actually reveal some insights.
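To make that concrete, here is a minimal sketch of how a score like this could work: a weighted sum of individually weak purchase signals pushed through a logistic function. The feature names, weights, and the pregnancy_score helper are all invented for illustration; this is not Target's actual system, whose weights would be learned from the histories of customers whose outcome later became known.

```python
# Hypothetical sketch of a "pregnancy score": a weighted sum of weak
# purchase signals, squashed to a probability-like value. All features
# and weights are invented for illustration.
import math

WEIGHTS = {
    "vitamin_increase": 1.2,   # buying more vitamins than usual
    "unscented_lotion": 0.9,
    "large_handbag":    0.7,   # big enough to hold diapers
    "bias":            -2.0,   # most shoppers are not pregnant
}

def pregnancy_score(purchases: set[str]) -> float:
    """Return a probability-like score from a set of observed signals."""
    z = WEIGHTS["bias"] + sum(WEIGHTS[s] for s in purchases if s in WEIGHTS)
    return 1 / (1 + math.exp(-z))

# One shopper: nothing obviously baby-related, but the pattern adds up.
print(pregnancy_score({"vitamin_increase", "large_handbag"}))  # ~0.48
```

The point of the sketch is exactly what the talk says: no single purchase is revealing, but the combination moves the score.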
03:06
So that's the kind of thing that we do when we're predicting stuff about you on social media. We're looking for little patterns of behavior that, when you detect them among millions of people, lets us find out all kinds of things. So in my lab and with colleagues, we've developed mechanisms where we can quite accurately predict things like your political preference, your personality score, gender, sexual orientation, religion, age, intelligence, along with things like how much you trust the people you know and how strong those relationships are.
03:38
We can do all of this really well. And again, it doesn't come from what you might think of as obvious information. So my favorite example is from this study that was published this year in the Proceedings of the National Academies. If you Google this, you'll find it. It's four pages, easy to read. And they looked at just people's Facebook likes, so just the things you like on Facebook, and used that to predict all these attributes, along with some other ones. And in their paper they listed the five likes that were most indicative of high intelligence. And among those was liking a page for curly fries. (Laughter) Curly fries are delicious, but liking them does not necessarily mean that you're smarter than the average person.
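A minimal sketch in the spirit of that study, not a reproduction of it: represent each user as a binary vector over pages, reduce the dimensionality, and fit a linear classifier on top. The page names, users, and labels below are invented.

```python
# Sketch of likes -> attribute prediction: users as binary vectors over
# pages, reduced with truncated SVD, then a logistic classifier.
# Pages, like data, and labels are invented for illustration.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

PAGES = ["curly_fries", "science_mag", "monster_trucks", "thunderstorms"]

# Rows are users; 1 means the user liked that page.
likes = np.array([
    [1, 1, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
    [1, 0, 1, 0],
])
high_iq = np.array([1, 1, 0, 1, 0, 0])  # hypothetical test scores, binarized

clf = make_pipeline(TruncatedSVD(n_components=2), LogisticRegression())
clf.fit(likes, high_iq)

new_user = np.array([[1, 1, 0, 0]])  # likes curly fries and a science page
print("P(high score):", clf.predict_proba(new_user)[0, 1])
```

Note that the model never looks at what a page is about; it only learns which likes co-occur with the attribute, which is exactly why an irrelevant page can become a strong indicator.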
04:17
So how is it that one of the strongest indicators of your intelligence is liking this page when the content is totally irrelevant to the attribute that's being predicted? And it turns out that we have to look at a whole bunch of underlying theories to see why we're able to do this. One of them is a sociological theory called homophily, which basically says people are friends with people like them. So if you're smart, you tend to be friends with smart people, and if you're young, you tend to be friends with young people, and this is well established for hundreds of years.
04:48
We also know a lot about how information spreads through networks. It turns out things like viral videos or Facebook likes or other information spreads in exactly the same way that diseases spread through social networks. So this is something we've studied for a long time. We have good models of it. And so you can put those things together and start seeing why things like this happen.
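Here is a toy simulation of those two ingredients together: a like spreading along friendship edges, infection-style, over a graph that is homophilous by construction. The graph, trait labels, and adoption probability are all invented; this is a sketch of the mechanism, not a fitted model.

```python
# Toy cascade: a like spreads along friendship edges like an infection.
# The graph is homophilous by construction ("smart" nodes cluster), so
# the set of likers tends to reflect the trait of the people it passed
# through, even though the page content is irrelevant.
import random

random.seed(7)

# Two homophilous friend clusters, loosely bridged through node 2-3.
friends = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 3],   # "smart" cluster
    3: [2, 4, 5], 4: [3, 5], 5: [3, 4],   # other cluster
}
smart = {0, 1, 2}

likers = {0}                    # a smart user starts the page
for _ in range(3):              # a few rounds of exposure
    exposed = {f for u in likers for f in friends[u]} - likers
    # Each exposed friend adopts the like with some probability.
    likers |= {u for u in exposed if random.random() < 0.6}

frac = len(likers & smart) / len(likers)
print(f"{frac:.0%} of likers are in the smart cluster")
```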
05:09
So if I were to give you a hypothesis, it would be that a smart guy started this page, or maybe one of the first people who liked it would have scored high on that test. And they liked it, and their friends saw it, and by homophily, we know that he probably had smart friends, and so it spread to them, and some of them liked it, and they had smart friends, and so it spread to them, and so it propagated through the network to a host of smart people, so that by the end, the action of liking the curly fries page is indicative of high intelligence, not because of the content, but because the actual action of liking reflects back the common attributes of other people who have done it.
05:48
So this is pretty complicated stuff, right? It's a hard thing to sit down and explain to an average user, and even if you do, what can the average user do about it? How do you know that you've liked something that indicates a trait for you that's totally irrelevant to the content of what you've liked? There's a lot of power that users don't have to control how this data is used. And I see that as a real problem going forward.
06:13
So I think there's a couple paths that we want to look at if we want to give users some control over how this data is used, because it's not always going to be used for their benefit. An example I often give is that, if I ever get bored being a professor, I'm going to go start a company that predicts all of these attributes and things like how well you work in teams and if you're a drug user, if you're an alcoholic. We know how to predict all that. And I'm going to sell reports to H.R. companies and big businesses that want to hire you. We totally can do that now. I could start that business tomorrow, and you would have absolutely no control over me using your data like that. That seems to me to be a problem.
06:50
So one of the paths we can go down is the policy and law path. And in some respects, I think that that would be most effective, but the problem is we'd actually have to do it. Observing our political process in action makes me think it's highly unlikely that we're going to get a bunch of representatives to sit down, learn about this, and then enact sweeping changes to intellectual property law in the U.S. so users control their data.
07:16
We could go the policy route, where social media companies say, you know what? You own your data. You have total control over how it's used. The problem is that the revenue models for most social media companies rely on sharing or exploiting users' data in some way. It's sometimes said of Facebook that the users aren't the customer, they're the product. And so how do you get a company to cede control of their main asset back to the users? It's possible, but I don't think it's something that we're going to see change quickly.
07:45
So I think the other path that we can go down that's going to be more effective is one of more science. It's doing science that allowed us to develop all these mechanisms for computing this personal data in the first place. And it's actually very similar research that we'd have to do if we want to develop mechanisms that can say to a user, "Here's the risk of that action you just took." By liking that Facebook page, or by sharing this piece of personal information, you've now improved my ability to predict whether or not you're using drugs or whether or not you get along well in the workplace. And that, I think, can affect whether or not people want to share something, keep it private, or just keep it offline altogether.
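One way such a warning could be computed, sketched under invented assumptions: re-run an attribute model before and after the new like and report how much the prediction moved. The model, its weights, and the risk_of_like helper are hypothetical, not a description of any deployed tool.

```python
# Sketch of a "here's the risk of that action you just took" warning:
# compare a trait model's output with and without one new like.
# Model weights and page names are invented for illustration.
import math

WEIGHTS = {"curly_fries": 0.8, "science_mag": 1.1, "bias": -1.0}

def p_trait(likes: set[str]) -> float:
    z = WEIGHTS["bias"] + sum(WEIGHTS.get(p, 0.0) for p in likes)
    return 1 / (1 + math.exp(-z))

def risk_of_like(current: set[str], new_page: str) -> float:
    """How much does this one like sharpen the inference about you?"""
    return p_trait(current | {new_page}) - p_trait(current)

mine = {"science_mag"}
delta = risk_of_like(mine, "curly_fries")
print(f"Liking that page raises the model's confidence by {delta:+.0%}")
```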
08:24
We can also look at things like allowing people to encrypt data that they upload, so it's kind of invisible and worthless to sites like Facebook or third-party services that access it, but that select users who the person who posted it want to see it have access to see it.
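A minimal sketch of that idea using symmetric encryption on the client, assuming the widely used third-party cryptography package; how the key actually reaches the chosen friends is the hard part and is not shown here.

```python
# Sketch of client-side encryption before upload: the platform stores
# only ciphertext; only friends who hold the key can read the post.
# Key distribution to the chosen friends is out of scope here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # shared out-of-band with chosen friends
box = Fernet(key)

post = box.encrypt(b"off to the doctor again...")
# upload(post)  -> the site and third parties see only opaque bytes

# A friend holding the key can recover the post.
print(Fernet(key).decrypt(post).decode())
```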
08:40
This is all super exciting research from an intellectual perspective, and so scientists are going to be willing to do it. So that gives us an advantage over the law side.
08:49
One of the problems that people bring up when I talk about this is, they say, you know, if people start keeping all this data private, all those methods that you've been developing to predict their traits are going to fail. And I say, absolutely, and for me, that's success, because as a scientist, my goal is not to infer information about users, it's to improve the way people interact online. And sometimes that involves inferring things about them, but if users don't want me to use that data, I think they should have the right to do that. I want users to be informed and consenting users of the tools that we develop. And so I think encouraging this kind of science and supporting researchers who want to cede some of that control back to users and away from the social media companies means that going forward, as these tools evolve and advance, we're going to have an educated and empowered user base, and I think all of us can agree that that's a pretty ideal way to go forward.

09:45
Thank you.

09:47
(Applause)