ABOUT THE SPEAKER
Lorrie Faith Cranor - Security researcher
At Carnegie Mellon University, Lorrie Faith Cranor studies online privacy, usable security, phishing, spam and other topics around keeping us safe online.

Why you should listen

Lorrie Faith Cranor is an Associate Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University, where she is director of the CyLab Usable Privacy and Security Laboratory (CUPS) and co-director of the MSIT-Privacy Engineering master's program. She is also a co-founder of Wombat Security Technologies, Inc. She has authored over 100 research papers on online privacy, usable security, phishing, spam, electronic voting, anonymous publishing, and other topics.

Cranor plays a key role in building the usable privacy and security research community, having co-edited the seminal book Security and Usability and founded the Symposium On Usable Privacy and Security (SOUPS). She also chaired the Platform for Privacy Preferences Project (P3P) Specification Working Group at the W3C and authored the book Web Privacy with P3P. She has served on a number of boards, including the Electronic Frontier Foundation Board of Directors, and on the editorial boards of several journals. In 2003 she was named one of the top 100 innovators 35 or younger by Technology Review.

TEDxCMU

Lorrie Faith Cranor: What’s wrong with your pa$$w0rd?


1,566,161 views

Lorrie Faith Cranor studied thousands of passwords to understand the mistakes that people, and secure websites, commonly make when setting them. You may ask: did she compromise users' security in obtaining these real passwords? That secret data is well worth knowing about, especially if your password is 123456.

00:12
I am a computer science and engineering professor here at Carnegie Mellon, and my research focuses on usable privacy and security, and so my friends like to give me examples of their frustrations with computing systems, especially frustrations related to unusable privacy and security.
00:32
So passwords are something that I hear a lot about. A lot of people are frustrated with passwords, and it's bad enough when you have to have one really good password that you can remember but nobody else is going to be able to guess. But what do you do when you have accounts on a hundred different systems and you're supposed to have a unique password for each of these systems? It's tough.
00:58
At Carnegie Mellon, they used to make it actually pretty easy for us to remember our passwords. The password requirement up through 2009 was just that you had to have a password with at least one character. Pretty easy. But then they changed things, and at the end of 2009, they announced that we were going to have a new policy, and this new policy required passwords that were at least eight characters long, with an uppercase letter, lowercase letter, a digit, a symbol, you couldn't use the same character more than three times, and it wasn't allowed to be in a dictionary.
01:30
Now, when they implemented this new policy, a lot of people, my colleagues and friends, came up to me and they said, "Wow, now that's really unusable. Why are they doing this to us, and why didn't you stop them?" And I said, "Well, you know what? They didn't ask me." But I got curious, and I decided to go talk to the people in charge of our computer systems and find out what led them to introduce this new policy, and they said that the university had joined a consortium of universities, and one of the requirements of membership was that we had to have stronger passwords that complied with some new requirements, and these requirements were that our passwords had to have a lot of entropy.

02:09
Now entropy is a complicated term, but basically it measures the strength of passwords. But the thing is, there isn't actually a standard measure of entropy. Now, the National Institute of Standards and Technology has a set of guidelines which have some rules of thumb for measuring entropy, but they don't have anything too specific, and the reason they only have rules of thumb is it turns out they don't actually have any good data on passwords. In fact, their report states, "Unfortunately, we do not have much data on the passwords users choose under particular rules. NIST would like to obtain more data on the passwords users actually choose, but system administrators are understandably reluctant to reveal password data to others."
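For the curious, that rule of thumb fits in a few lines. The sketch below assumes the per-position values commonly cited from NIST SP 800-63 Appendix A (4 bits for the first character, 2 bits each for characters two through eight, and so on, plus bonuses for composition rules and a dictionary check); it illustrates the heuristic the talk alludes to, not code from the study.

```python
def nist_entropy_estimate(password: str, passes_dictionary_check: bool = False) -> float:
    """Rule-of-thumb entropy estimate (bits), per NIST SP 800-63 Appendix A.

    Note the heuristic is based only on position and composition,
    not on data about passwords people actually pick.
    """
    bits = 0.0
    for i in range(1, len(password) + 1):
        if i == 1:
            bits += 4.0    # first character
        elif i <= 8:
            bits += 2.0    # characters 2 through 8
        elif i <= 20:
            bits += 1.5    # characters 9 through 20
        else:
            bits += 1.0    # characters 21 and up
    # Composition bonus: requiring uppercase and non-alphabetic characters.
    if any(c.isupper() for c in password) and any(not c.isalpha() for c in password):
        bits += 6.0
    # Bonus for passing an extensive dictionary check (guideline applies it
    # to passwords up to 20 characters).
    if passes_dictionary_check and len(password) <= 20:
        bits += 6.0
    return bits

print(nist_entropy_estimate("P@ssw0rd"))  # 24.0 bits by this heuristic
```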
02:53
So this is a problem, but our research group looked at it as an opportunity. We said, "Well, there's a need for good password data. Maybe we can collect some good password data and actually advance the state of the art here." So the first thing we did is, we got a bag of candy bars and we walked around campus and talked to students, faculty and staff, and asked them for information about their passwords. Now we didn't say, "Give us your password." No, we just asked them about their password. How long is it? Does it have a digit? Does it have a symbol? And were you annoyed at having to create a new one last week?
03:30
So we got results from 470 students, faculty and staff, and indeed we confirmed that the new policy was very annoying, but we also found that people said they felt more secure with these new passwords. We found that most people knew they were not supposed to write their password down, and only 13 percent of them did, but disturbingly, 80 percent of people said they were reusing their password. Now, this is actually more dangerous than writing your password down, because it makes you much more susceptible to attackers. So if you have to, write your passwords down, but don't reuse them.

04:06
We also found some interesting things about the symbols people use in passwords. So CMU allows 32 possible symbols, but as you can see, there's only a small number that most people are using, so we're not actually getting very much strength from the symbols in our passwords.
04:23
So this was a really interesting study, and now we had data from 470 people, but in the scheme of things, that's really not very much password data, and so we looked around to see where we could find additional password data. So it turns out there are a lot of people going around stealing passwords, and they often go and post these passwords on the Internet. So we were able to get access to some of these stolen password sets. This is still not really ideal for research, though, because it's not entirely clear where all of these passwords came from, or exactly what policies were in effect when people created these passwords. So we wanted to find some better source of data.
05:05
So we decided that one thing we could do is we could do a study and have people actually create passwords for our study. So we used a service called Amazon Mechanical Turk, and this is a service where you can post a small job online that takes a minute, a few minutes, an hour, and pay people, a penny, ten cents, a few dollars, to do a task for you, and then you pay them through Amazon.com. So we paid people about 50 cents to create a password following our rules and answering a survey, and then we paid them again to come back two days later and log in using their password and answering another survey.

05:40
So we did this, and we collected 5,000 passwords, and we gave people a bunch of different policies to create passwords with. So some people had a pretty easy policy, we call it Basic8, and here the only rule was that your password had to have at least eight characters. Then some people had a much harder policy, and this was very similar to the CMU policy, that it had to have eight characters including uppercase, lowercase, digit, symbol, and pass a dictionary check. And one of the other policies we tried, and there were a whole bunch more, but one of the ones we tried was called Basic16, and the only requirement here was that your password had to have at least 16 characters.
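As a rough illustration, these conditions can be written as validators. The function names and the toy word list below are assumptions for the example (the harder condition resembles what the group's published papers call "comprehensive8"); the study's actual dictionary check used a far larger cracking dictionary.

```python
import string

# Toy stand-in for a real cracking dictionary.
DICTIONARY = {"password", "monkey", "iloveyou", "12345678"}

def basic8(pw: str) -> bool:
    """Basic8 condition: the only rule is at least eight characters."""
    return len(pw) >= 8

def comprehensive8(pw: str) -> bool:
    """The CMU-like condition: eight characters including uppercase,
    lowercase, digit, and symbol, plus a dictionary check."""
    return (len(pw) >= 8
            and any(c.isupper() for c in pw)
            and any(c.islower() for c in pw)
            and any(c.isdigit() for c in pw)
            and any(c in string.punctuation for c in pw)
            and pw.lower() not in DICTIONARY)

def basic16(pw: str) -> bool:
    """Basic16 condition: the only rule is at least 16 characters."""
    return len(pw) >= 16

print(basic8("tufritvi"), comprehensive8("Tufr!tv1"), basic16("correcthorsebatterystaple"))
```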
06:20
All right, so now we had 5,000 passwords, and so we had much more detailed information. Again we see that there's only a small number of symbols that people are actually using in their passwords. We also wanted to get an idea of how strong the passwords were that people were creating, but as you may recall, there isn't a good measure of password strength. So what we decided to do was to see how long it would take to crack these passwords using the best cracking tools that the bad guys are using, or that we could find information about in the research literature.
06:54
So to give you an idea of how bad guys go about cracking passwords, they will steal a password file that will have all of the passwords in kind of a scrambled form, called a hash, and so what they'll do is they'll make a guess as to what a password is, run it through a hashing function, and see whether it matches the passwords they have on their stolen password list.

07:18
So a dumb attacker will try every password in order. They'll start with AAAAA and move on to AAAAB, and this is going to take a really long time before they get any passwords that people are really likely to actually have. A smart attacker, on the other hand, does something much more clever. They look at the passwords that are known to be popular from these stolen password sets, and they guess those first. So they're going to start by guessing "password," and then they'll guess "I love you," and "monkey," and "12345678," because these are the passwords that are most likely for people to have. In fact, some of you probably have these passwords.
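Here is a minimal sketch of that guess-and-hash loop, assuming unsalted SHA-256 hashes purely for illustration; real cracking tools are far faster and more sophisticated, and real systems should use salted, deliberately slow hash functions.

```python
import hashlib

def sha256_hex(pw: str) -> str:
    """Hash a candidate password the way the stolen file was hashed."""
    return hashlib.sha256(pw.encode()).hexdigest()

# The attacker's haul: hashes only, no plaintext (unsalted SHA-256 assumed).
stolen_hashes = {sha256_hex("monkey"), sha256_hex("12345678")}

# The smart attacker's strategy: guess known-popular passwords from
# earlier leaks before brute-forcing "AAAAA", "AAAAB", and so on.
popular_guesses = ["password", "iloveyou", "monkey", "12345678"]

for guess in popular_guesses:
    if sha256_hex(guess) in stolen_hashes:  # hash the guess, check for a match
        print(f"cracked: {guess!r}")
```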
07:57
So what we found by running all of these 5,000 passwords we collected through these tests to see how strong they were, we found that the long passwords were actually pretty strong, and the complex passwords were pretty strong too. However, when we looked at the survey data, we saw that people were really frustrated by the very complex passwords, and the long passwords were a lot more usable, and in some cases, they were actually even stronger than the complex passwords. So this suggests that, instead of telling people that they need to put all these symbols and numbers and crazy things into their passwords, we might be better off just telling people to have long passwords.
08:39
Now here's the problem, though: Some people had long passwords that actually weren't very strong. You can make long passwords that are still the sort of thing that an attacker could easily guess. So we need to do more than just say long passwords. There have to be some additional requirements, and some of our ongoing research is looking at what additional requirements we should add to make for stronger passwords that also are going to be easy for people to remember and type.
09:08
Another approach to getting people to have stronger passwords is to use a password meter. Here are some examples. You may have seen these on the Internet when you were creating passwords. We decided to do a study to find out whether these password meters actually work. Do they actually help people have stronger passwords, and if so, which ones are better? So we tested password meters that were different sizes, shapes, colors, different words next to them, and we even tested one that was a dancing bunny. As you type a better password, the bunny dances faster and faster. So this was pretty fun.
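A toy sketch of the distinction the study turned on: both meters below share the same invented scoring rule, and only the threshold for the thumbs up differs. The scoring and thresholds are made up for the example, not taken from the study.

```python
def meter_score(pw: str) -> int:
    """Toy 0-4 strength score: length plus character variety."""
    classes = [any(c.islower() for c in pw), any(c.isupper() for c in pw),
               any(c.isdigit() for c in pw), any(not c.isalnum() for c in pw)]
    return max(0, min(4, len(pw) // 4 + sum(classes) - 1))

def soft_meter(pw: str) -> str:
    # Gives the thumbs up too early, like most meters on the Internet today.
    return "good job!" if meter_score(pw) >= 1 else "keep going"

def strict_meter(pw: str) -> str:
    # Withholds positive feedback longer, which nudged people to stronger passwords.
    return "good job!" if meter_score(pw) >= 4 else "keep going"

print(soft_meter("monkey1"), "|", strict_meter("monkey1"))
```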
09:44
What we found was that password meters do work. (Laughter) Most of the password meters were actually effective, and the dancing bunny was very effective too, but the password meters that were the most effective were the ones that made you work harder before they gave you that thumbs up and said you were doing a good job, and in fact we found that most of the password meters on the Internet today are too soft. They tell you you're doing a good job too early, and if they would just wait a little bit before giving you that positive feedback, you probably would have better passwords.
10:20
Now another approach to better passwords, perhaps, is to use pass phrases instead of passwords. So this was an xkcd cartoon from a couple of years ago, and the cartoonist suggests that we should all use pass phrases, and if you look at the second row of this cartoon, you can see the cartoonist is suggesting that the pass phrase "correct horse battery staple" would be a very strong pass phrase and something really easy to remember. He says, in fact, you've already remembered it. And so we decided to do a research study to find out whether this was true or not. In fact, everybody who I talk to, who I mention I'm doing password research, they point out this cartoon. "Oh, have you seen it? That xkcd. Correct horse battery staple." So we did the research study to see what would actually happen.
11:07
So in our study, we used Mechanical Turk again, and we had the computer pick the random words in the pass phrase. Now the reason we did this is that humans are not very good at picking random words. If we asked a human to do it, they would pick things that were not very random.

11:24
So we tried a few different conditions. In one condition, the computer picked from a dictionary of the very common words in the English language, and so you'd get pass phrases like "try there three come." And we looked at that, and we said, "Well, that doesn't really seem very memorable." So then we tried picking words that came from specific parts of speech, so how about noun-verb-adjective-noun. That comes up with something that's sort of sentence-like. So you can get a pass phrase like "plan builds sure power" or "end determines red drug." And these seemed a little bit more memorable, and maybe people would like those a little bit better.

12:01
We wanted to compare them with passwords, and so we had the computer pick random passwords, and these were nice and short, but as you can see, they don't really look very memorable. And then we decided to try something called a pronounceable password. So here the computer picks random syllables and puts them together so you have something sort of pronounceable, like "tufritvi" and "vadasabi." That one kind of rolls off your tongue. So these were random passwords that were generated by our computer.
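A sketch of the three generators just described, with tiny made-up word and syllable lists standing in for the study's real dictionaries:

```python
import secrets

# Illustrative lists only; the study drew from much larger dictionaries.
COMMON_WORDS = ["try", "there", "three", "come", "plan", "power", "end", "red"]
NOUNS, VERBS, ADJS = ["plan", "end", "drug"], ["builds", "determines"], ["sure", "red"]
SYLLABLES = ["tu", "frit", "vi", "va", "da", "sa", "bi"]

def passphrase(n: int = 4) -> str:
    """Computer-picked random common words; humans are bad at picking randomly."""
    return " ".join(secrets.choice(COMMON_WORDS) for _ in range(n))

def pos_passphrase() -> str:
    """Noun-verb-adjective-noun, for something sort of sentence-like."""
    return " ".join([secrets.choice(NOUNS), secrets.choice(VERBS),
                     secrets.choice(ADJS), secrets.choice(NOUNS)])

def pronounceable(n_syllables: int = 4) -> str:
    """Random syllables glued together, like 'tufritvi' or 'vadasabi'."""
    return "".join(secrets.choice(SYLLABLES) for _ in range(n_syllables))

print(passphrase(), "|", pos_passphrase(), "|", pronounceable())
```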
12:30
So what we found in this study was that, surprisingly, pass phrases were not actually all that good. People were not really better at remembering the pass phrases than these random passwords, and because the pass phrases are longer, they took longer to type and people made more errors while typing them in. So it's not really a clear win for pass phrases. Sorry, all of you xkcd fans. On the other hand, we did find that pronounceable passwords worked surprisingly well, and so we actually are doing some more research to see if we can make that approach work even better.
13:07
So one of the problems with some of the studies that we've done is that because they're all done using Mechanical Turk, these are not people's real passwords. They're the passwords that they created or the computer created for them for our study. And we wanted to know whether people would actually behave the same way with their real passwords.

13:26
So we talked to the information security office at Carnegie Mellon and asked them if we could have everybody's real passwords. Not surprisingly, they were a little bit reluctant to share them with us, but we were actually able to work out a system with them where they put all of the real passwords for 25,000 CMU students, faculty and staff, into a locked computer in a locked room, not connected to the Internet, and they ran code on it that we wrote to analyze these passwords. They audited our code. They ran the code. And so we never actually saw anybody's password.
14:00
We got some interesting results, and those of you Tepper students in the back will be very interested in this. So we found that the passwords created by people affiliated with the school of computer science were actually 1.8 times stronger than those affiliated with the business school. We have lots of other really interesting demographic information as well. The other interesting thing that we found is that when we compared the Carnegie Mellon passwords to the Mechanical Turk-generated passwords, there were actually a lot of similarities, and so this helped validate our research method and show that actually, collecting passwords using these Mechanical Turk studies is a valid way to study passwords. So that was good news.
14:43
Okay, I want to close by talking about some insights I gained while on sabbatical last year in the Carnegie Mellon art school. One of the things that I did is I made a number of quilts, and I made this quilt here. It's called "Security Blanket." (Laughter) And this quilt has the 1,000 most frequent passwords stolen from the RockYou website. And the size of the passwords is proportional to how frequently they appeared in the stolen dataset. And what I did is I created this word cloud, and I went through all 1,000 words, and I categorized them into loose thematic categories. In some cases, it was kind of difficult to figure out what category they should be in, and then I color-coded them.
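The quilt's layout rule, size proportional to frequency in the leak, is simple to reproduce. A sketch with invented counts standing in for the RockYou data:

```python
from collections import Counter

# Made-up sample standing in for the leaked RockYou password list.
leaked = ["123456"] * 5 + ["password"] * 4 + ["iloveyou"] * 3 + ["monkey"] * 2

counts = Counter(leaked)
top = counts.most_common(1000)      # the 1,000 most frequent passwords
max_count = top[0][1]
for word, count in top:
    # Display size proportional to how often the password appeared.
    size = 10 + 40 * count / max_count
    print(f"{word}: draw at {size:.0f}pt")
```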
15:30
So here are some examples of the difficulty. So "justin." Is that the name of the user, their boyfriend, their son? Maybe they're a Justin Bieber fan. Or "princess." Is that a nickname? Are they Disney princess fans? Or maybe that's the name of their cat. "Iloveyou" appears many times in many different languages. There's a lot of love in these passwords. If you look carefully, you'll see there's also some profanity, but it was really interesting to me to see that there's a lot more love than hate in these passwords. And there are animals, a lot of animals, and "monkey" is the most common animal and the 14th most popular password overall.
16:15
And this was really curious to me, and I wondered, "Why are monkeys so popular?" And so in our last password study, any time we detected somebody creating a password with the word "monkey" in it, we asked them why they had a monkey in their password. And what we found out -- we found 17 people so far, I think, who have the word "monkey" -- we found out about a third of them said they have a pet named "monkey" or a friend whose nickname is "monkey," and about a third of them said that they just like monkeys and monkeys are really cute. And that guy is really cute.
16:50
So it seems that at the end of the day, when we make passwords, we either make something that's really easy to type, a common pattern, or things that remind us of the word password or the account that we've created the password for, or whatever. Or we think about things that make us happy, and we create our password based on things that make us happy. And while this makes typing and remembering your password more fun, it also makes it a lot easier to guess your password. So I know a lot of these TED Talks are inspirational and they make you think about nice, happy things, but when you're creating your password, try to think about something else.

17:34
Thank you. (Applause)


Data provided by TED.
