ABOUT THE SPEAKER
Avi Rubin - Computer security expert
Avi Rubin is a professor of computer science and director of the Health and Medical Security Lab at Johns Hopkins University. His research is focused on the security of electronic records -- including medical and voting records.

Why you should listen

Along with running the Health and Medical Security Lab, Avi Rubin is also the technical director of the JHU Information Security Institute. From 1997 to 2002, Avi was a researcher in AT&T’s Secure Systems Department, where he focused on cryptography and network security. He is also the founder of Harbor Labs, which provides expert testimony and review in legal cases related to high tech security. Avi has authored several books related to electronic security, including Brave New Ballot, published in 2006.

TEDxMidAtlantic

Avi Rubin: All your devices can be hacked

1,251,015 views

Could someone hack your pacemaker? Avi Rubin shows how hackers are compromising cars, smartphones and medical devices, and warns us about the dangers of an increasingly hack-able world.


00:12
I'm a computer science professor, and my area of expertise is computer and information security. When I was in graduate school, I had the opportunity to overhear my grandmother describing to one of her fellow senior citizens what I did for a living. Apparently, I was in charge of making sure that no one stole the computers from the university. (Laughter) And, you know, that's a perfectly reasonable thing for her to think, because I told her I was working in computer security, and it was interesting to get her perspective.
00:48
But that's not the most ridiculous thing I've ever heard anyone say about my work. The most ridiculous thing I ever heard is, I was at a dinner party, and a woman heard that I work in computer security, and she asked me if -- she said her computer had been infected by a virus, and she was very concerned that she might get sick from it, that she could get this virus. (Laughter) And I'm not a doctor, but I reassured her that it was very, very unlikely that this would happen, but if she felt more comfortable, she could feel free to use latex gloves when she was on the computer, and there would be no harm whatsoever in that.
01:25
I'm going to get back to this notion of being able to get a virus from your computer, in a serious way. What I'm going to talk to you about today are some hacks, some real world cyberattacks that people in my community, the academic research community, have performed, which I don't think most people know about, and I think they're very interesting and scary, and this talk is kind of a greatest hits of the academic security community's hacks. None of the work is my work. It's all work that my colleagues have done, and I actually asked them for their slides and incorporated them into this talk.
01:59
So the first one I'm going to talk about are implanted medical devices. Now medical devices have come a long way technologically. You can see in 1926 the first pacemaker was invented. 1960, the first internal pacemaker was implanted, hopefully a little smaller than that one that you see there, and the technology has continued to move forward. In 2006, we hit an important milestone from the perspective of computer security. And why do I say that? Because that's when implanted devices inside of people started to have networking capabilities. One thing that brings us close to home is we look at Dick Cheney's device. He had a device that pumped blood from an aorta to another part of the heart, and as you can see at the bottom there, it was controlled by a computer controller, and if you ever thought that software liability was very important, get one of these inside of you.
02:53
Now what a research team did was they got their hands on what's called an ICD. This is a defibrillator, and this is a device that goes into a person to control their heart rhythm, and these have saved many lives. Well, in order to not have to open up the person every time you want to reprogram their device or do some diagnostics on it, they made the thing be able to communicate wirelessly, and what this research team did is they reverse engineered the wireless protocol, and they built the device you see pictured here, with a little antenna, that could talk the protocol to the device, and thus control it.

03:29
In order to make their experiment real -- they were unable to find any volunteers, and so they went and they got some ground beef and some bacon and they wrapped it all up to about the size of a human being's area where the device would go, and they stuck the device inside it to perform their experiment somewhat realistically. They launched many, many successful attacks. One that I'll highlight here is changing the patient's name. I don't know why you would want to do that, but I sure wouldn't want that done to me. And they were able to change therapies, including disabling the device -- and this is with a real, commercial, off-the-shelf device -- simply by performing reverse engineering and sending wireless signals to it. There was a piece on NPR reporting that some of these ICDs could actually have their performance disrupted simply by holding a pair of headphones onto them.
04:16
Now, wireless and the Internet can improve health care greatly. There's several examples up on the screen of situations where doctors are looking to implant devices inside of people, and all of these devices now, it's standard that they communicate wirelessly, and I think this is great, but without a full understanding of trustworthy computing, and without understanding what attackers can do and the security risks from the beginning, there's a lot of danger in this.
04:42
Okay, let me shift gears and show you another target. I'm going to show you a few different targets like this, and that's my talk. So we'll look at automobiles. This is a car, and it has a lot of components, a lot of electronics in it today. In fact, it's got many, many different computers inside of it, more Pentiums than my lab did when I was in college, and they're connected by a wired network. There's also a wireless network in the car, which can be reached in many different ways. So there's Bluetooth, there's the FM and XM radio, there's actually wi-fi, there's sensors in the wheels that wirelessly communicate the tire pressure to a controller on board. The modern car is a sophisticated multi-computer device.
05:26
And what happens if somebody wanted to attack this? Well, that's what the researchers that I'm going to talk about today did. They basically stuck an attacker on the wired network and on the wireless network. Now, they have two areas they can attack. One is short-range wireless, where you can actually communicate with the device from nearby, either through Bluetooth or wi-fi, and the other is long-range, where you can communicate with the car through the cellular network, or through one of the radio stations. Think about it. When a car receives a radio signal, it's processed by software. That software has to receive and decode the radio signal, and then figure out what to do with it, even if it's just music that it needs to play on the radio, and that software that does that decoding, if it has any bugs in it, could create a vulnerability for somebody to hack the car.
06:13
The way that the researchers did this work is, they read the software in the computer chips that were in the car, and then they used sophisticated reverse engineering tools to figure out what that software did, and then they found vulnerabilities in that software, and then they built exploits to exploit those. They actually carried out their attack in real life. They bought two cars, and I guess they have better budgets than I do.
06:40
The first threat model was to see what someone could do if an attacker actually got access to the internal network on the car. Okay, so think of that as, someone gets to go to your car, they get to mess around with it, and then they leave, and now, what kind of trouble are you in? The other threat model is that they contact you in real time over one of the wireless networks like the cellular, or something like that, never having actually gotten physical access to your car.
07:06
This is what their setup looks like for the first model, where you get to have access to the car. They put a laptop, and they connected to the diagnostic unit on the in-car network, and they did all kinds of silly things, like here's a picture of the speedometer showing 140 miles an hour when the car's in park. Once you have control of the car's computers, you can do anything. Now you might say, "Okay, that's silly." Well, what if you make the car always say it's going 20 miles an hour slower than it's actually going? You might produce a lot of speeding tickets.
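The speedometer trick works because the dash cluster simply trusts whatever speed message appears on the in-car network. A minimal sketch of that idea, in pure Python with an entirely made-up frame layout (the frame ID, payload format, and scaling here are hypothetical; real vehicles use proprietary, unauthenticated CAN messages):

```python
import struct

# Hypothetical layout: an 8-byte CAN-style frame whose payload carries
# vehicle speed as a big-endian 16-bit value in units of 0.01 mph.
SPEED_FRAME_ID = 0x3E9  # made-up arbitration ID

def encode_speed_frame(mph: float) -> bytes:
    """Pack a (possibly spoofed) speed reading into an 8-byte frame."""
    raw = int(mph * 100)
    return struct.pack(">HH4x", SPEED_FRAME_ID, raw)

def decode_speed_frame(frame: bytes) -> float:
    """What a dash-cluster computer does: trust the payload blindly."""
    _frame_id, raw = struct.unpack(">HH4x", frame)
    return raw / 100

# An attacker on the wired network injects a frame claiming 140 mph
# while the car sits in park -- nothing authenticates the sender.
spoofed = encode_speed_frame(140.0)
print(decode_speed_frame(spoofed))
```

The point of the sketch is the absence of any authentication step: whoever can write a frame to the bus gets believed.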
07:34
Then they went out to an abandoned airstrip with two cars, the target victim car and the chase car, and they launched a bunch of other attacks. One of the things they were able to do from the chase car is apply the brakes on the other car, simply by hacking the computer. They were able to disable the brakes. They also were able to install malware that wouldn't kick in and wouldn't trigger until the car was doing something like going over 20 miles an hour, or something like that.
08:02
The results are astonishing, and when they gave this talk, even though they gave this talk at a conference to a bunch of computer security researchers, everybody was gasping. They were able to take over a bunch of critical computers inside the car: the brakes computer, the lighting computer, the engine, the dash, the radio, etc., and they were able to perform these attacks on real commercial cars that they purchased using the radio network. They were able to compromise every single one of the pieces of software that controlled every single one of the wireless capabilities of the car. All of these were implemented successfully.
08:36
How would you steal a car in this model? Well, you compromise the car by a buffer overflow vulnerability in the software, something like that. You use the GPS in the car to locate it. You remotely unlock the doors through the computer that controls that, start the engine, bypass anti-theft, and you've got yourself a car.
08:54
Surveillance was really interesting. The authors of the study have a video where they show themselves taking over a car and then turning on the microphone in the car, and listening in on the car while tracking it via GPS on a map, and so that's something that the drivers of the car would never know was happening. Am I scaring you yet?
09:15
I've got a few more of these interesting ones. These are ones where I went to a conference, and my mind was just blown, and I said, "I have to share this with other people." This was Fabian Monrose's lab at the University of North Carolina, and what they did was something intuitive once you see it, but kind of surprising.
09:31
They videotaped people on a bus, and then they post-processed the video. What you see here in number one is a reflection in somebody's glasses of the smartphone that they're typing on. They wrote software to stabilize -- even though they were on a bus and maybe someone's holding their phone at an angle -- to stabilize the phone, process it, and you may know on your smartphone, when you type a password, the keys pop out a little bit, and they were able to use that to reconstruct what the person was typing, and they had a language model for detecting typing.

10:05
What was interesting is, by videotaping on a bus, they were able to produce exactly what people on their smartphones were typing, and then they had a surprising result, which is that their software had not only done it for their target, but for other people who accidentally happened to be in the picture -- they were able to produce what those people had been typing, and that was kind of an accidental artifact of what their software was doing.
10:27
I'll show you two more. One is P25 radios. P25 radios are used by law enforcement and all kinds of government agencies and people in combat to communicate, and there's an encryption option on these phones. This is what the phone looks like. It's not really a phone. It's more of a two-way radio. Motorola makes the most widely used one, and you can see that they're used by the Secret Service, they're used in combat, it's a very, very common standard in the U.S. and elsewhere.
10:55
So one question the researchers asked themselves is, could you block this thing, right? Could you run a denial-of-service, because these are first responders? So, would a terrorist organization want to black out the ability of police and fire to communicate at an emergency? They found that there's this GirlTech device used for texting that happens to operate at the same exact frequency as the P25, and they built what they called My First Jammer. (Laughter)
11:22
If you look closely at this device, it's got a switch for encryption or cleartext. Let me advance the slide, and now I'll go back. You see the difference? This is plain text. This is encrypted. There's one little dot that shows up on the screen, and one little tiny turn of the switch.
11:41
And so the researchers asked themselves, "I wonder how many times very secure, important, sensitive conversations are happening on these two-way radios where they forget to encrypt and they don't notice that they didn't encrypt?" So they bought a scanner. These are perfectly legal and they run at the frequency of the P25, and what they did is they hopped around frequencies and they wrote software to listen in. If they found unencrypted communication, they stayed on that channel and they wrote down, that's a channel that these people communicate in, these law enforcement agencies, and they went to 20 metropolitan areas and listened in on conversations that were happening at those frequencies. They found that in every metropolitan area, they would capture over 20 minutes a day of cleartext communication.
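The scanning logic itself is simple to picture: hop across channels, and whenever a transmission is in the clear, log the channel and accumulate the airtime. Here is a toy simulation of that loop in pure Python; the frequencies, traffic rates, and `capture` function are all invented stand-ins, since real scanning involves demodulating actual RF:

```python
import random

random.seed(7)

# Hypothetical block of 40 channels spaced 25 kHz apart (values in Hz).
CHANNELS = list(range(851_000_000, 851_000_000 + 25_000 * 40, 25_000))

def capture(channel):
    """Pretend to listen briefly: (is_transmitting, is_encrypted, seconds)."""
    active = random.random() < 0.3
    encrypted = random.random() < 0.9   # most traffic is encrypted...
    return active, encrypted, random.uniform(1, 30)

cleartext_log = {}
for ch in CHANNELS:
    active, encrypted, seconds = capture(ch)
    if active and not encrypted:
        # ...but a forgotten switch leaks the conversation in the clear.
        cleartext_log[ch] = cleartext_log.get(ch, 0) + seconds

total = sum(cleartext_log.values())
print(f"{len(cleartext_log)} channels leaked ~{total:.0f}s of cleartext")
```

Run long enough across enough channels, and even a small cleartext rate adds up, which is essentially the researchers' 20-minutes-a-day finding.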
12:25
And what kind of things were people talking about? Well, they found the names and information about confidential informants. They found information that was being recorded in wiretaps, a bunch of crimes that were being discussed, sensitive information. It was mostly law enforcement and criminal. They went and reported this to the law enforcement agencies, after anonymizing it, and the vulnerability here is simply that the user interface wasn't good enough. If you're talking about something really secure and sensitive, it should be really clear to you that this conversation is encrypted. That one's pretty easy to fix.
12:57
The last one I thought was really, really cool, and I just had to show it to you. It's probably not something that you're going to lose sleep over like the cars or the defibrillators, but it's stealing keystrokes. Now, we've all looked at smartphones upside down. Every security expert wants to hack a smartphone, and we tend to look at the USB port, the GPS for tracking, the camera, the microphone, but no one up till this point had looked at the accelerometer. The accelerometer is the thing that determines the vertical orientation of the smartphone.
13:27
And so they had a simple setup. They put a smartphone next to a keyboard, and they had people type, and then their goal was to use the vibrations that were created by typing to measure the change in the accelerometer reading to determine what the person had been typing. Now, when they tried this on an iPhone 3GS, this is a graph of the perturbations that were created by the typing, and you can see that it's very difficult to tell when somebody was typing or what they were typing, but the iPhone 4 greatly improved the accelerometer, and so the same measurement produced this graph.
14:04
Now that gave you a lot of information while someone was typing, and what they did then is used advanced artificial intelligence techniques called machine learning to have a training phase, and so they got most likely grad students to type in a whole lot of things, and to learn, to have the system use the machine learning tools that were available to learn what it is that the people were typing and to match that up with the measurements in the accelerometer. And then there's the attack phase, where you get somebody to type something in, you don't know what it was, but you use your model that you created in the training phase to figure out what they were typing.
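The train-then-attack pipeline can be sketched in a few lines. This is a deliberately tiny stand-in, not the researchers' system: the accelerometer "readings" are synthetic, the keys are coarse keyboard regions, and the model is a simple nearest-centroid classifier rather than the richer machine learning they used.

```python
import math
import random

random.seed(0)

# Synthetic stand-in for accelerometer data: each key press produces a small
# (x, y, z) perturbation whose mean depends on where the key sits.
KEY_PROFILES = {"left": (-1.0, 0.2, 0.1), "right": (1.0, 0.2, 0.1),
                "near": (0.0, -0.8, 0.3), "far": (0.0, 0.9, 0.2)}

def sample(key):
    mx, my, mz = KEY_PROFILES[key]
    return tuple(m + random.gauss(0, 0.15) for m in (mx, my, mz))

# Training phase: someone types known keys; average the readings per key.
centroids = {}
for key in KEY_PROFILES:
    readings = [sample(key) for _ in range(50)]
    centroids[key] = tuple(sum(axis) / len(readings) for axis in zip(*readings))

def classify(reading):
    """Attack phase: label an unknown press by its nearest training centroid."""
    return min(centroids, key=lambda k: math.dist(centroids[k], reading))

# Unknown typing session -> recovered key regions.
secret = ["left", "far", "right", "near"]
recovered = [classify(sample(k)) for k in secret]
print(recovered)
```

The structure is the same as the real attack: a supervised training pass builds a model mapping sensor perturbations to keys, and the attack pass applies that model to unlabeled vibrations.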
14:40
They had pretty good success. This is an article from USA Today. They typed in, "The Illinois Supreme Court has ruled that Rahm Emanuel is eligible to run for Mayor of Chicago" -- see, I tied it in to the last talk -- "and ordered him to stay on the ballot." Now, the system is interesting, because it produced "Illinois Supreme" and then it wasn't sure. The model produced a bunch of options, and this is the beauty of some of the A.I. techniques, is that computers are good at some things, humans are good at other things, take the best of both and let the humans solve this one. Don't waste computer cycles. A human's not going to think it's the Supreme might. It's the Supreme Court, right? And so, together we're able to reproduce typing simply by measuring the accelerometer.
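That "Supreme might" vs. "Supreme Court" step is just candidate ranking with context. A toy version of the idea, using an invented bigram-count table in place of a real language model (the counts and candidate list are purely illustrative):

```python
# Rank the model's candidate next words by how often each follows the
# previous word in some reference text. Counts here are made up.
BIGRAM_COUNTS = {("supreme", "court"): 9000,
                 ("supreme", "courts"): 400,
                 ("supreme", "might"): 3}

def best_continuation(prev_word, candidates):
    """Pick the candidate most often seen after prev_word."""
    return max(candidates, key=lambda w: BIGRAM_COUNTS.get((prev_word, w), 0))

candidates = ["might", "court", "courts"]  # what the sensor model offered
print(best_continuation("supreme", candidates))  # -> "court"
```

Whether this ranking is done by software or by a human glancing at the options, the division of labor is the same: the sensor model narrows the field, and context resolves the tie.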
15:19
Why does this matter? Well, in the Android platform, for example, the developers have a manifest where every device on there, the microphone, etc., has to register if you're going to use it so that hackers can't take over it, but nobody controls the accelerometer. So what's the point? You can leave your iPhone next to someone's keyboard, and just leave the room, and then later recover what they did, even without using the microphone. If someone is able to put malware on your iPhone, they could then maybe get the typing that you do whenever you put your iPhone next to your keyboard.
15:52
There's several other notable attacks that unfortunately I don't have time to go into, but the one that I wanted to point out was a group from the University of Michigan which was able to take voting machines, the Sequoia AVC Edge DREs that were going to be used in New Jersey in the election, that were left in a hallway, and put Pac-Man on them. So they ran the Pac-Man game.
16:11
What does this all mean? Well, I think that society tends to adopt technology really quickly. I love the next coolest gadget. But it's very important, and these researchers are showing, that the developers of these things need to take security into account from the very beginning, and need to realize that they may have a threat model, but the attackers may not be nice enough to limit themselves to that threat model, and so you need to think outside of the box. What we can do is be aware that devices can be compromised, and anything that has software in it is going to be vulnerable. It's going to have bugs.

16:44
Thank you very much. (Applause)
Translated by Joseph Geni
Reviewed by Morton Bast
