ABOUT THE SPEAKER
Lorrie Faith Cranor - Security researcher
At Carnegie Mellon University, Lorrie Faith Cranor studies online privacy, usable security, phishing, spam and other aspects of keeping us safe online.

Why you should listen

Lorrie Faith Cranor is an Associate Professor of Computer Science and of Engineering and Public Policy at Carnegie Mellon University, where she is director of the CyLab Usable Privacy and Security Laboratory (CUPS) and co-director of the MSIT-Privacy Engineering masters program. She is also a co-founder of Wombat Security Technologies, Inc. She has authored over 100 research papers on online privacy, usable security, phishing, spam, electronic voting, anonymous publishing, and other topics.

Cranor plays a key role in building the usable privacy and security research community, having co-edited the seminal book Security and Usability and founded the Symposium On Usable Privacy and Security (SOUPS). She also chaired the Platform for Privacy Preferences Project (P3P) Specification Working Group at the W3C and authored the book Web Privacy with P3P. She has served on a number of boards, including the Electronic Frontier Foundation Board of Directors, and on the editorial boards of several journals. In 2003 she was named one of the top 100 innovators 35 or younger by Technology Review.

TEDxCMU

Lorrie Faith Cranor: What’s wrong with your pa$$w0rd?


1,566,161 views

By studying thousands of passwords, Lorrie Faith Cranor has uncovered many surprising findings about the security-compromising mistakes that users and supposedly secure websites commonly make. How, you may ask, did she study thousands of passwords without compromising users' security? That's an interesting story in itself. These password secrets are worth knowing — especially if your password is 123456.


00:12
I am a computer science and engineering professor here at Carnegie Mellon, and my research focuses on usable privacy and security, and so my friends like to give me examples of their frustrations with computing systems, especially frustrations related to unusable privacy and security.

00:32
So passwords are something that I hear a lot about. A lot of people are frustrated with passwords, and it's bad enough when you have to have one really good password that you can remember but nobody else is going to be able to guess.
00:47
But what do you do when you have accounts on a hundred different systems and you're supposed to have a unique password for each of these systems? It's tough.
00:58
At Carnegie Mellon, they used to make it actually pretty easy for us to remember our passwords. The password requirement up through 2009 was just that you had to have a password with at least one character. Pretty easy. But then they changed things, and at the end of 2009, they announced that we were going to have a new policy, and this new policy required passwords that were at least eight characters long, with an uppercase letter, lowercase letter, a digit, a symbol, you couldn't use the same character more than three times, and it wasn't allowed to be in a dictionary.
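The policy described above (at least eight characters, all four character classes, no character used more than three times, no dictionary words) can be sketched as a simple validator. This is a minimal illustration, not CMU's actual implementation; the tiny `DICTIONARY` set stands in for a real wordlist.

```python
import re

# Stand-in for a real dictionary wordlist (illustrative only).
DICTIONARY = {"password", "monkey", "dragon", "sunshine"}

def meets_policy(pw: str) -> bool:
    """Check a password against the 2009-style policy described in the talk."""
    if len(pw) < 8:
        return False
    if not re.search(r"[A-Z]", pw):        # at least one uppercase letter
        return False
    if not re.search(r"[a-z]", pw):        # at least one lowercase letter
        return False
    if not re.search(r"\d", pw):           # at least one digit
        return False
    if not re.search(r"[^A-Za-z0-9]", pw): # at least one symbol
        return False
    if any(pw.count(ch) > 3 for ch in set(pw)):  # no char more than 3 times
        return False
    if pw.lower() in DICTIONARY:           # reject dictionary words
        return False
    return True

print(meets_policy("Tr0ub4dor&3"))  # True
print(meets_policy("password"))     # False
```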
01:30
Now, when they implemented this new policy, a lot of people, my colleagues and friends, came up to me and they said, "Wow, now that's really unusable. Why are they doing this to us, and why didn't you stop them?" And I said, "Well, you know what? They didn't ask me."
01:44
But I got curious, and I decided to go talk to the people in charge of our computer systems and find out what led them to introduce this new policy, and they said that the university had joined a consortium of universities, and one of the requirements of membership was that we had to have stronger passwords that complied with some new requirements, and these requirements were that our passwords had to have a lot of entropy.
02:09
Now entropy is a complicated term, but basically it measures the strength of passwords. But the thing is, there isn't actually a standard measure of entropy. Now, the National Institute of Standards and Technology has a set of guidelines which have some rules of thumb for measuring entropy, but they don't have anything too specific, and the reason they only have rules of thumb is it turns out they don't actually have any good data on passwords.
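The rules of thumb she refers to can be sketched roughly as follows. The per-character bit values follow the heuristic published in NIST SP 800-63; the flat 6-bit dictionary bonus is a simplification (NIST scales that bonus down for longer passwords).

```python
def nist_entropy_estimate(length: int,
                          composition_rule: bool = False,
                          dictionary_check: bool = False) -> float:
    """Rule-of-thumb entropy (bits) for a user-chosen password, roughly
    following NIST SP 800-63. These are rough guidance values, not a
    measurement of any individual password's strength."""
    if length <= 0:
        return 0.0
    bits = 4.0                                 # first character: 4 bits
    bits += 2.0 * max(0, min(length - 1, 7))   # characters 2-8: 2 bits each
    bits += 1.5 * max(0, min(length - 8, 12))  # characters 9-20: 1.5 bits each
    bits += 1.0 * max(0, length - 20)          # characters 21+: 1 bit each
    if composition_rule:   # policy requires uppercase and non-alphabetic
        bits += 6.0
    if dictionary_check:   # simplified flat bonus for a dictionary test
        bits += 6.0
    return bits

print(nist_entropy_estimate(8, composition_rule=True, dictionary_check=True))
# 30.0
```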
02:36
In fact, their report states, "Unfortunately, we do not have much data on the passwords users choose under particular rules. NIST would like to obtain more data on the passwords users actually choose, but system administrators are understandably reluctant to reveal password data to others."
02:53
So this is a problem, but our research group looked at it as an opportunity. We said, "Well, there's a need for good password data. Maybe we can collect some good password data and actually advance the state of the art here."
03:06
So the first thing we did is, we got a bag of candy bars and we walked around campus and talked to students, faculty and staff, and asked them for information about their passwords. Now we didn't say, "Give us your password." No, we just asked them about their password. How long is it? Does it have a digit? Does it have a symbol? And were you annoyed at having to create a new one last week?
03:30
So we got results from 470 students, faculty and staff, and indeed we confirmed that the new policy was very annoying, but we also found that people said they felt more secure with these new passwords. We found that most people knew they were not supposed to write their password down, and only 13 percent of them did, but disturbingly, 80 percent of people said they were reusing their password. Now, this is actually more dangerous than writing your password down, because it makes you much more susceptible to attackers. So if you have to, write your passwords down, but don't reuse them.
04:06
We also found some interesting things about the symbols people use in passwords. So CMU allows 32 possible symbols, but as you can see, there's only a small number that most people are using, so we're not actually getting very much strength from the symbols in our passwords.
04:23
So this was a really interesting study, and now we had data from 470 people, but in the scheme of things, that's really not very much password data, and so we looked around to see where could we find additional password data?
04:37
So it turns out there are a lot of people going around stealing passwords, and they often go and post these passwords on the Internet. So we were able to get access to some of these stolen password sets. This is still not really ideal for research, though, because it's not entirely clear where all of these passwords came from, or exactly what policies were in effect when people created these passwords. So we wanted to find some better source of data.
05:05
So we decided that one thing we could do is we could do a study and have people actually create passwords for our study. So we used a service called Amazon Mechanical Turk, and this is a service where you can post a small job online that takes a minute, a few minutes, an hour, and pay people, a penny, ten cents, a few dollars, to do a task for you, and then you pay them through Amazon.com. So we paid people about 50 cents to create a password following our rules and answering a survey, and then we paid them again to come back two days later and log in using their password and answering another survey.
05:40
So we did this, and we collected 5,000 passwords, and we gave people a bunch of different policies to create passwords with. So some people had a pretty easy policy, we call it Basic8, and here the only rule was that your password had to have at least eight characters. Then some people had a much harder policy, and this was very similar to the CMU policy, that it had to have eight characters including uppercase, lowercase, digit, symbol, and pass a dictionary check. And one of the other policies we tried, and there were a whole bunch more, but one of the ones we tried was called Basic16, and the only requirement here was that your password had to have at least 16 characters.
06:20
All right, so now we had 5,000 passwords, and so we had much more detailed information. Again we see that there's only a small number of symbols that people are actually using in their passwords. We also wanted to get an idea of how strong the passwords were that people were creating, but as you may recall, there isn't a good measure of password strength.
06:42
So what we decided to do was to see how long it would take to crack these passwords using the best cracking tools that the bad guys are using, or that we could find information about in the research literature.
06:54
So to give you an idea of how bad guys go about cracking passwords, they will steal a password file that will have all of the passwords in kind of a scrambled form, called a hash, and so what they'll do is they'll make a guess as to what a password is, run it through a hashing function, and see whether it matches the passwords they have on their stolen password list.
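The loop she describes can be sketched as a toy example: hash each guess and look it up in the stolen hash set. Unsalted SHA-256 here is only for illustration; real password files may use other (ideally salted and slow) hash functions, and real attackers use specialized GPU-accelerated tools.

```python
import hashlib

def sha256_hex(pw: str) -> str:
    """Hash a password guess the way the stolen file was hashed (assumed)."""
    return hashlib.sha256(pw.encode()).hexdigest()

# The attacker's stolen file: hashes only, no plaintext.
stolen_hashes = {sha256_hex(p) for p in ["monkey", "12345678", "letmein"]}

# A smart attacker tries the most popular passwords first.
popular_guesses = ["password", "iloveyou", "monkey", "12345678"]

# Hash each guess and keep the ones that match a stolen hash.
cracked = [g for g in popular_guesses if sha256_hex(g) in stolen_hashes]
print(cracked)  # ['monkey', '12345678']
```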
07:18
So a dumb attacker will try every password in order. They'll start with AAAAA and move on to AAAAB, and this is going to take a really long time before they get any passwords that people are really likely to actually have. A smart attacker, on the other hand, does something much more clever. They look at the passwords that are known to be popular from these stolen password sets, and they guess those first. So they're going to start by guessing "password," and then they'll guess "I love you," and "monkey," and "12345678," because these are the passwords that are most likely for people to have. In fact, some of you probably have these passwords.
07:57
So what we found by running all of these 5,000 passwords we collected through these tests to see how strong they were, we found that the long passwords were actually pretty strong, and the complex passwords were pretty strong too. However, when we looked at the survey data, we saw that people were really frustrated by the very complex passwords, and the long passwords were a lot more usable, and in some cases, they were actually even stronger than the complex passwords.
08:27
So this suggests that, instead of telling people that they need to put all these symbols and numbers and crazy things into their passwords, we might be better off just telling people to have long passwords.
08:39
Now here's the problem, though: Some people had long passwords that actually weren't very strong. You can make long passwords that are still the sort of thing that an attacker could easily guess. So we need to do more than just say long passwords. There have to be some additional requirements, and some of our ongoing research is looking at what additional requirements we should add to make for stronger passwords that also are going to be easy for people to remember and type.
09:08
Another approach to getting people to have stronger passwords is to use a password meter. Here are some examples. You may have seen these on the Internet when you were creating passwords. We decided to do a study to find out whether these password meters actually work. Do they actually help people have stronger passwords, and if so, which ones are better? So we tested password meters that were different sizes, shapes, colors, different words next to them, and we even tested one that was a dancing bunny. As you type a better password, the bunny dances faster and faster. So this was pretty fun.
09:44
What we found was that password meters do work.

09:49
(Laughter)

09:51
Most of the password meters were actually effective, and the dancing bunny was very effective too, but the password meters that were the most effective were the ones that made you work harder before they gave you that thumbs up and said you were doing a good job, and in fact we found that most of the password meters on the Internet today are too soft. They tell you you're doing a good job too early, and if they would just wait a little bit before giving you that positive feedback, you probably would have better passwords.
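The finding suggests a meter that withholds its "good job" verdict for longer. A minimal sketch, assuming a simple additive score and thresholds (real meters are more elaborate; the numbers here are illustrative only):

```python
import re

def meter(pw: str) -> str:
    """Score a password and return a verdict; a stricter meter sets
    the 'strong' threshold high so praise comes late."""
    score = 0
    score += min(len(pw), 16)                           # length dominates
    score += 4 if re.search(r"[A-Z]", pw) else 0        # uppercase bonus
    score += 4 if re.search(r"\d", pw) else 0           # digit bonus
    score += 4 if re.search(r"[^A-Za-z0-9]", pw) else 0 # symbol bonus
    # A "soft" meter might say "strong" at 12; waiting until 24
    # nudges users toward longer, stronger passwords.
    if score >= 24:
        return "strong"
    if score >= 16:
        return "fair"
    return "weak"

print(meter("pass"))                   # weak
print(meter("Correct4Horse$Battery"))  # strong
```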
10:20
Now another approach to better passwords, perhaps, is to use pass phrases instead of passwords. So this was an xkcd cartoon from a couple of years ago, and the cartoonist suggests that we should all use pass phrases, and if you look at the second row of this cartoon, you can see the cartoonist is suggesting that the pass phrase "correct horse battery staple" would be a very strong pass phrase and something really easy to remember. He says, in fact, you've already remembered it.
10:50
And so we decided to do a research study to find out whether this was true or not. In fact, everybody who I talk to, who I mention I'm doing password research, they point out this cartoon. "Oh, have you seen it? That xkcd. Correct horse battery staple." So we did the research study to see what would actually happen.
11:07
So in our study, we used Mechanical Turk again, and we had the computer pick the random words in the pass phrase. Now the reason we did this is that humans are not very good at picking random words. If we asked a human to do it, they would pick things that were not very random.
11:24
So we tried a few different conditions. In one condition, the computer picked from a dictionary of the very common words in the English language, and so you'd get pass phrases like "try there three come." And we looked at that, and we said, "Well, that doesn't really seem very memorable." So then we tried picking words that came from specific parts of speech, so how about noun-verb-adjective-noun. That comes up with something that's sort of sentence-like. So you can get a pass phrase like "plan builds sure power" or "end determines red drug."
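The noun-verb-adjective-noun condition can be sketched with a generator like this. The tiny word lists are placeholders (the study drew from much larger dictionaries), and `secrets.choice` stands in for whatever random source the study actually used.

```python
import secrets

# Placeholder word lists; a real study would use large dictionaries.
NOUNS = ["plan", "end", "power", "drug", "horse"]
VERBS = ["builds", "determines", "takes", "makes"]
ADJS = ["sure", "red", "long", "strong"]

def passphrase() -> str:
    """Generate a noun-verb-adjective-noun pass phrase at random."""
    return " ".join([
        secrets.choice(NOUNS),
        secrets.choice(VERBS),
        secrets.choice(ADJS),
        secrets.choice(NOUNS),
    ])

print(passphrase())  # e.g. "plan builds sure power"
```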
11:55
And these seemed a little bit more memorable, and maybe people would like those a little bit better. We wanted to compare them with passwords, and so we had the computer pick random passwords, and these were nice and short, but as you can see, they don't really look very memorable.
12:11
And then we decided to try something called a pronounceable password. So here the computer picks random syllables and puts them together so you have something sort of pronounceable, like "tufritvi" and "vadasabi." That one kind of rolls off your tongue. So these were random passwords that were generated by our computer.
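The pronounceable-password condition can be sketched as concatenating random consonant-vowel syllables. The syllable inventory and structure here are assumptions for illustration, not the study's actual generator (which evidently also allowed clusters like the "fr" in "tufritvi").

```python
import secrets

CONSONANTS = "bdfgklmnprstv"
VOWELS = "aeiou"

def pronounceable(n_syllables: int = 4) -> str:
    """Build a password from random consonant-vowel syllables."""
    return "".join(
        secrets.choice(CONSONANTS) + secrets.choice(VOWELS)
        for _ in range(n_syllables)
    )

print(pronounceable())  # prints a random 8-letter CV string
```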
12:30
So what we found in this study was that, surprisingly, pass phrases were not actually all that good. People were not really better at remembering the pass phrases than these random passwords, and because the pass phrases are longer, they took longer to type and people made more errors while typing them in. So it's not really a clear win for pass phrases. Sorry, all of you xkcd fans. On the other hand, we did find that pronounceable passwords worked surprisingly well, and so we actually are doing some more research to see if we can make that approach work even better.
13:07
So one of the problems with some of the studies that we've done is that because they're all done using Mechanical Turk, these are not people's real passwords. They're the passwords that they created, or the computer created for them, for our study. And we wanted to know whether people would actually behave the same way with their real passwords.
13:26
So we talked to the information security office at Carnegie Mellon and asked them if we could have everybody's real passwords. Not surprisingly, they were a little bit reluctant to share them with us, but we were actually able to work out a system with them where they put all of the real passwords for 25,000 CMU students, faculty and staff into a locked computer in a locked room, not connected to the Internet, and they ran code on it that we wrote to analyze these passwords. They audited our code. They ran the code. And so we never actually saw anybody's password.
14:00
We got some interesting results, and those of you Tepper students in the back will be very interested in this. So we found that the passwords created by people affiliated with the school of computer science were actually 1.8 times stronger than those affiliated with the business school. We have lots of other really interesting demographic information as well.
14:22
The other interesting thing that we found is that when we compared the Carnegie Mellon passwords to the Mechanical Turk-generated passwords, there were actually a lot of similarities, and so this helped validate our research method and show that actually, collecting passwords using these Mechanical Turk studies is a valid way to study passwords. So that was good news.
14:43
Okay, I want to close by talking about some insights I gained while on sabbatical last year in the Carnegie Mellon art school. One of the things that I did is I made a number of quilts, and I made this quilt here. It's called "Security Blanket."

14:57
(Laughter)
14:59
And this quilt被子 has the 1,000
391
887693
3095
这条被子由1000个
15:02
most frequent频繁 passwords密码 stolen被盗
392
890788
2328
最常被盗的密码组成
这些密码来自于RockYou网站
15:05
from the RockYouRockYou的 website网站.
393
893116
2571
15:07
And the size尺寸 of the passwords密码 is proportional成比例的
394
895687
2061
密码的大小跟
15:09
to how frequently经常 they appeared出现
395
897748
1901
他被盗的平率成正比
15:11
in the stolen被盗 dataset数据集.
396
899649
2248
在被盗密码数据库中
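As an aside, the sizing rule she describes (each password rendered at a size in direct proportion to its frequency in the leaked dataset) can be sketched in a few lines of Python. The `font_sizes` helper and the counts below are illustrative assumptions, not data from the talk:

```python
def font_sizes(counts, max_pt=72):
    """Scale each word's font size in direct proportion to its count.

    The most frequent word gets max_pt points; everything else is
    scaled linearly down from there, with a floor of 1 point.
    """
    peak = max(counts.values())
    return {word: max(1, round(max_pt * n / peak)) for word, n in counts.items()}

# Hypothetical leak counts; "123456" dominates, so it renders largest.
sizes = font_sizes({"123456": 290000, "password": 58000, "monkey": 17000})
```

With these made-up counts, "123456" gets the full 72 points while "monkey" shrinks to a few points, which is exactly the visual effect on the quilt: the common passwords dominate the cloth.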
15:13
And what I did is I created this word cloud,
15:16
and I went through all 1,000 words,
15:18
and I categorized them into
15:20
loose thematic categories.
15:22
And it was, in some cases,
15:24
it was kind of difficult to figure out
15:26
what category they should be in,
15:28
and then I color-coded them.
15:30
So here are some examples of the difficulty.
15:33
So "justin."
15:34
Is that the name of the user,
15:36
their boyfriend, their son?
15:37
Maybe they're a Justin Bieber fan.
15:40
Or "princess."
15:42
Is that a nickname?
15:44
Are they Disney princess fans?
15:45
Or maybe that's the name of their cat.
15:49
"Iloveyou" appears many times
15:51
in many different languages.
15:52
There's a lot of love in these passwords.
15:56
If you look carefully, you'll see there's also
15:58
some profanity,
16:00
but it was really interesting to me to see
16:02
that there's a lot more love than hate
16:04
in these passwords.
16:06
And there are animals,
16:08
a lot of animals,
16:09
and "monkey" is the most common animal
16:12
and the 14th most popular password overall.
16:15
And this was really curious to me,
16:17
and I wondered, "Why are monkeys so popular?"
16:20
And so in our last password study,
16:23
any time we detected somebody
16:25
creating a password with the word "monkey" in it,
16:28
we asked them why they had a monkey in their password.
16:31
And what we found out --
16:33
we found 17 people so far, I think,
16:35
who have the word "monkey" --
16:36
We found out about a third of them said
16:38
they have a pet named "monkey"
16:39
or a friend whose nickname is "monkey,"
16:42
and about a third of them said
16:43
that they just like monkeys
16:45
and monkeys are really cute.
16:47
And that guy is really cute.
16:50
So it seems that at the end of the day,
16:54
when we make passwords,
16:55
we either make something that's really easy
16:57
to type, a common pattern,
17:00
or things that remind us of the word password
17:03
or the account that we've created the password for,
17:06
or whatever.
17:09
Or we think about things that make us happy,
17:11
and we create our password
17:13
based on things that make us happy.
17:15
And while this makes typing
17:18
and remembering your password more fun,
17:21
it also makes it a lot easier
17:23
to guess your password.
17:24
So I know a lot of these TED Talks
17:26
are inspirational
17:27
and they make you think about nice, happy things,
17:30
but when you're creating your password,
17:32
try to think about something else.
17:34
Thank you.
17:35
(Applause)
Translated by FBC Global
Reviewed by Xinhui Wang

Data provided by TED.
