ABOUT THE SPEAKER
Pranav Mistry - Director of research, Samsung Research America
As an MIT grad student, Pranav Mistry invented SixthSense, a wearable device that enables new interactions between the real world and the world of data.

Why you should listen

When Pranav Mistry was a PhD student in the Fluid Interfaces Group at MIT's Media Lab, he worked with lab director Pattie Maes to create some of the most entertaining and thought-provoking interfaces the world had ever seen. And not just computer interfaces, mind you -- these are ways to help the digital and the actual worlds interface. Imagine: intelligent sticky notes, Quickies, that can be searched and can send reminders; a pen that draws in 3D; and TaPuMa, a tangible public map that can act as a Google of the physical world. And of course the legendary SixthSense, which is now open sourced.

Before his studies at MIT, he worked with Microsoft as a UX researcher; he's a graduate of IIT. Now, as director of research at Samsung Research America, Mistry heads the Think Tank Team, an interdisciplinary group of researchers that hunts for new ways to mix digital information with real-world interactions. For example, Mistry launched the company's smartwatch, the Galaxy Gear, in 2013.

TEDIndia 2009

Pranav Mistry: The thrilling potential of SixthSense technology

18,689,186 views

At TEDIndia, Pranav Mistry demos several tools that help the physical world interact with the world of data -- including a deep look at his SixthSense device and a new, paradigm-shifting paper "laptop." In an onstage Q&A, Mistry says he'll open-source the software behind SixthSense, to open its possibilities to all.

00:15
We grew up interacting with the physical objects around us. There are an enormous number of them that we use every day. Unlike most of our computing devices, these objects are much more fun to use. When you talk about objects, one other thing automatically comes attached to that thing, and that is gestures: how we manipulate these objects, how we use these objects in everyday life. We use gestures not only to interact with these objects, but we also use them to interact with each other. A gesture of "Namaste!", maybe, to respect someone, or maybe -- in India I don't need to teach a kid that this means "four runs" in cricket. It comes as a part of our everyday learning.

00:59
So, I have been very interested, from the beginning, in how our knowledge about everyday objects and gestures, and how we use these objects, can be leveraged for our interactions with the digital world. Rather than using a keyboard and mouse, why can I not use my computer in the same way that I interact in the physical world?

01:21
So, I started this exploration around eight years back, and it literally started with a mouse on my desk. Rather than using it for my computer, I actually opened it. Most of you might be aware that, in those days, the mouse used to come with a ball inside, and there were two rollers that actually tell the computer where the ball is moving and, accordingly, where the mouse is moving. So, I was interested in these two rollers, and I actually wanted more, so I borrowed another mouse from a friend -- never returned it to him -- and I now had four rollers. Interestingly, what I did with these rollers is, basically, I took them off of these mice and then put them in one line, with some strings and pulleys and some springs. What I got was basically a gesture-interface device that actually acts as a motion-sensing device, made for two dollars.

02:14
So, here, whatever movement I do in my physical world is actually replicated inside the digital world, just using this small device that I made around eight years back, in 2000.

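The roller hack can be read as a simple mapping: each salvaged roller reports how far its string has been pulled along one axis, and those deltas drive a cursor in the digital world. A minimal sketch of that idea in Python follows; read_roller_deltas() is a hypothetical placeholder for polling the rollers, not the original device's code.

    # Illustrative sketch only: one possible way to turn the four salvaged mouse
    # rollers into a motion sensor. read_roller_deltas() is a hypothetical
    # placeholder for reading the roller encoders.

    def read_roller_deltas():
        # Stand-in values; a real build would read the encoder counts here.
        return {"x": 0.0, "y": 0.0, "z": 0.0}

    def update_cursor(cursor, sensitivity=1.5):
        """Replicate the physical hand movement as digital cursor movement."""
        deltas = read_roller_deltas()
        for axis, delta in deltas.items():
            cursor[axis] += sensitivity * delta
        return cursor

    cursor = {"x": 0.0, "y": 0.0, "z": 0.0}
    cursor = update_cursor(cursor)
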
02:25
Because I was interested in integrating these two worlds, I thought of sticky notes. I thought, "Why can I not connect the normal interface of a physical sticky note to the digital world?" A message written on a sticky note to my mom, on paper, can arrive as an SMS, or maybe a meeting reminder automatically syncs with my digital calendar -- a to-do list that automatically syncs with you. But you can also search the digital world, or maybe you can write a query, saying, "What is Dr. Smith's address?" and this small system actually prints it out -- so it actually acts like a paper input-output system, just made out of paper.

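The sticky-note system described here is essentially a small routing pipeline: recognize the handwritten text, decide whether it is a message, a reminder, or a query, and send it to SMS, the calendar, or the printer. A hedged sketch of that flow follows; every helper in it is a hypothetical stand-in, since the talk does not give the actual Quickies implementation.

    # Hypothetical sketch of the paper input-output routing described above.
    # All helpers are placeholders, not the real system's code.

    def recognize_handwriting(note_image):
        return note_image  # stand-in: pretend the "image" is already recognized text

    def send_sms(recipient, text):
        print(f"SMS to {recipient}: {text}")

    def add_calendar_event(text):
        print(f"Calendar event added: {text}")

    def answer_and_print(question):
        print(f"Printing answer for: {question}")

    def route_note(note_image):
        text = recognize_handwriting(note_image)
        if "?" in text:                      # a written query, e.g. an address lookup
            answer_and_print(text)
        elif "meeting" in text.lower():      # a meeting reminder -> digital calendar
            add_calendar_event(text)
        else:                                # a plain written message -> SMS
            send_sms("mom", text)

    route_note("Meeting with Dr. Smith at 3 pm")
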
03:05
In another exploration, I thought of making a pen that can draw in three dimensions. So, I implemented this pen that can help designers and architects not only think in three dimensions, but actually draw in them, so that it's more intuitive to use that way. Then I thought, "Why not make a Google Map, but in the physical world?" Rather than typing a keyword to find something, I put my objects on top of it. If I put a boarding pass, it will show me where the flight gate is. A coffee cup will show where you can find more coffee, or where you can trash the cup. So, these were some of the earlier explorations I did, because the goal was to connect these two worlds seamlessly.

03:44
Among all these experiments, there was one thing in common: I was trying to bring a part of the physical world to the digital world. I was taking some part of the objects, or some of the intuitiveness of real life, and bringing it to the digital world, because the goal was to make our computing interfaces more intuitive. But then I realized that we humans are not actually interested in computing. What we are interested in is information. We want to know about things. We want to know about the dynamic things going on around us. So, around the beginning of last year, I started thinking, "Why can I not take this approach in the reverse way?" Maybe, "How about I take my digital world and paint the physical world with that digital information?"

04:32
Pixels are, right now, confined to these rectangular devices that fit in our pockets. Why can I not remove that confinement and take the pixels out to my everyday objects, my everyday life, so that I don't need to learn a new language for interacting with them?

04:50
So, in order to realize this dream, I actually thought of putting a big-size projector on my head. I think that's why this is called a head-mounted projector, isn't it? I took it very literally: I took my bike helmet and put a little cut in it so that the projector fits nicely. So now, what I can do -- I can augment the world around me with this digital information. But later, I realized that I actually wanted to interact with those digital pixels, also. So I put a small camera over there that acts as a digital eye. Later, we moved to a much better, consumer-oriented pendant version of that, which many of you now know as the SixthSense device.

05:27
But the most interesting thing about this particular technology is that you can carry your digital world with you wherever you go. You can start using any surface, any wall around you, as an interface. The camera is actually tracking all your gestures. Whatever you're doing with your hands, it's understanding that gesture. And, actually, if you look, there are some color markers that we are using with it in this beginning version. You can start painting on any wall. You stop by a wall and start painting on that wall. But we are not tracking only one finger here. We are giving you the freedom of using both of your hands, so you can actually use both hands to zoom into or zoom out of a map, just by pinching. The camera is actually just getting all the images and doing the edge recognition and the color recognition, and so many other small algorithms are going on inside. So, technically, it's a little bit complex, but it gives you an output which is more intuitive to use, in some sense.

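Since the early prototype tracks colored fingertip markers and runs color and edge recognition on the camera frames, the tracking step can be sketched roughly with OpenCV. The HSV range, blob-size cutoff, and pinch-to-zoom heuristic below are illustrative guesses, not values from the actual SixthSense software.

    import cv2
    import numpy as np

    # Illustrative HSV range for one colored fingertip marker (a guess, not the real value).
    LOWER = np.array([40, 80, 80])
    UPPER = np.array([80, 255, 255])

    def find_marker_centers(frame_bgr):
        """Return centers of colored fingertip markers found in one camera frame."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        centers = []
        for contour in contours:
            if cv2.contourArea(contour) > 50:   # ignore tiny specks of noise
                x, y, w, h = cv2.boundingRect(contour)
                centers.append((x + w // 2, y + h // 2))
        return centers

    def zoom_from_pinch(centers, previous_spread, zoom):
        """Grow or shrink the zoom level as two tracked fingertips move apart or together."""
        if len(centers) >= 2:
            (x1, y1), (x2, y2) = centers[:2]
            spread = ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
            if previous_spread:
                zoom *= spread / previous_spread
            previous_spread = spread
        return previous_spread, zoom
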
06:24
But I'm more excited that you can actually take it outside. Rather than getting your camera out of your pocket, you can just make the gesture of taking a photo, and it takes a photo for you.

06:35
(Applause)

06:39
Thank you.

06:41
And later I can find a wall, anywhere, and start browsing those photos, or maybe, "OK, I want to modify this photo a little bit and send it as an email to a friend." So, we are looking at an era where computing will actually merge with the physical world. And, of course, if you don't have any surface, you can start using your palm for simple operations. Here, I'm dialing a phone number just using my hand.

07:07
The camera is actually not only understanding your hand movements but, interestingly, is also able to understand what objects you are holding in your hand. For example, in this case, the book cover is matched against so many thousands, or maybe millions, of books online, to check which book it is. Once it has that information, it finds more reviews about it, or maybe The New York Times has an audio overview of it, so you can actually hear, on a physical book, a review as sound.

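The book demo amounts to image matching: extract features from the camera's view of the cover, compare them against features of known covers, and look up reviews for the best match. A rough sketch using OpenCV's ORB features follows; the catalogue of covers and the review lookup are hypothetical stand-ins, not whatever service the real system queried.

    import cv2

    orb = cv2.ORB_create()
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

    def cover_descriptors(image_bgr):
        """Compute ORB descriptors for a book-cover image."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        _, descriptors = orb.detectAndCompute(gray, None)
        return descriptors

    def identify_book(camera_frame, catalogue):
        """catalogue maps a title to precomputed cover descriptors; returns the best match."""
        seen = cover_descriptors(camera_frame)
        best_title, best_score = None, 0
        for title, known in catalogue.items():
            if seen is None or known is None:
                continue
            matches = matcher.match(seen, known)
            score = sum(1 for m in matches if m.distance < 40)  # crude match quality
            if score > best_score:
                best_title, best_score = title, score
        return best_title

    # Once the title is known, the system could fetch a review for it and play the
    # audio on the book; that lookup is left as a hypothetical next step here.
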
07:36
("famous talk at Harvard University ...")

07:38
This was Obama's visit last week to MIT.

07:42
("... and particularly I want to thank two outstanding MIT ...")

07:46
So, I was watching the live [video] of his talk, outside, on just a newspaper. Your newspaper will show you live weather information, rather than having to be updated -- like now, when you have to check your computer in order to do that, right?

07:59
(Applause)

08:04
When I'm going back, I can just use my boarding pass to check how much my flight has been delayed, because at that particular moment, I don't feel like opening my iPhone and checking out a particular icon.

08:15
And I think this technology will not only change the way -- yes. (Laughter) It will change the way we interact with people, also, not only the physical world. The fun part is, I'm going to the Boston metro, and playing a pong game inside the train, on the floor, right?

08:32
(Laughter)

08:33
And I think imagination is the only limit of what you can think of when this kind of technology merges with real life. But many of you will argue that all of our work is not only about physical objects. We actually do lots of accounting and paper editing and all those kinds of things; what about that? And many of you are excited about the next generation of tablet computers to come out on the market. So, rather than waiting for that, I actually made my own, just using a piece of paper.

09:00
So, what I did here is remove the camera -- all webcam cameras have a microphone inside. I removed the microphone from that, and then just pinched that -- like, I just made a clip out of the microphone -- and clipped that to a piece of paper, any paper you find around. So now the sound of the touch is telling me exactly when I'm touching the paper, and the camera is actually tracking where my fingers are moving.

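The paper "laptop" fuses two signals: the microphone clipped to the page hears when the finger taps it, and the camera tracks where the fingertip is at that moment. A minimal sketch of that fusion follows, assuming audio arrives as a NumPy array and a fingertip position comes from a tracker like the marker sketch above; the tap threshold is an illustrative guess, not the project's value.

    import numpy as np

    TAP_THRESHOLD = 0.3  # illustrative amplitude threshold for a tap on the paper

    def tap_detected(audio_chunk):
        """The clipped-on microphone hears a spike when the finger touches the paper."""
        return float(np.max(np.abs(audio_chunk))) > TAP_THRESHOLD

    def handle_frame(audio_chunk, fingertip_position, on_touch):
        """Fuse sound (when the touch happens) with vision (where the finger is)."""
        if fingertip_position is not None and tap_detected(audio_chunk):
            on_touch(fingertip_position)

    # Example: a detected tap at pixel (120, 80) is forwarded to the paper UI callback.
    handle_frame(np.array([0.0, 0.5, 0.1]), (120, 80), lambda pos: print("touch at", pos))
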
09:28
You can of course watch movies.

09:31
("Good afternoon. My name is Russell ... and I am a Wilderness Explorer in Tribe 54.")

09:37
And you can of course play games.

09:40
(Car engine)

09:43
Here, the camera is actually understanding how you're holding the paper, and playing a car-racing game.

09:48
(Applause)

09:52
Many of you must already have thought, "OK, you can browse." Yeah. Of course you can browse to any website, or you can do all sorts of computing on a piece of paper wherever you need it.

10:01
So, more interestingly, I'm interested in how we can take that in a more dynamic way. When I come back to my desk, I can just pinch that information back to my desktop, so I can use my full-size computer.

10:15
(Applause)

10:17
And why only computers? We can just play with papers. The paper world is interesting to play with. Here, I'm taking a part of a document, and putting over here a second part from a second place, and I'm actually modifying the information that I have over there. Yeah. And I say, "OK, this looks nice, let me print it out, that thing." So I now have a print-out of that thing, and the workflow is more intuitive -- the way we used to do it maybe 20 years back -- rather than switching between these two worlds as we do now.

10:50
So, as a last thought, I think that integrating information into everyday objects will not only help us get rid of the digital divide, the gap between these two worlds, but will also help us, in some way, to stay human, to be more connected to our physical world. And it will actually help us not end up being machines sitting in front of other machines.

11:18
That's all. Thank you.

11:21
(Applause)

11:35
Thank you.

11:36
(Applause)

11:39
Chris Anderson: So, Pranav, first of all, you're a genius. This is incredible, really. What are you doing with this? Is there a company being planned? Or is this research forever, or what?

11:51
Pranav Mistry: So, there are lots of companies -- actually sponsor companies of the Media Lab -- interested in taking this ahead in one way or another. Companies like mobile phone operators want to take this in a different way than the NGOs in India, [who] are thinking, "Why can we only have a 'Sixth Sense'? We should have a 'Fifth Sense' for missing-sense people who cannot speak. This technology can be used for them to speak out in a different way, with maybe a speaker system."

12:12
CA: What are your own plans? Are you staying at MIT, or are you going to do something with this?

12:16
PM: I'm trying to make this more available to people, so that anyone can develop their own SixthSense device, because the hardware is actually not that hard to manufacture, or that hard to make on your own. We will provide all the open source software for them, maybe starting next month.

12:32
CA: Open source? Wow.

12:34
(Applause)

12:39
CA: Are you going to come back to India with some of this, at some point?

12:42
PM: Yeah. Yes, yes, of course.

12:44
CA: What are your plans? MIT? India? How are you going to split your time going forward?

12:48
PM: There is a lot of energy here. Lots of learning. All of this work that you have seen is all about my learning in India. And now, if you see, it's more about the cost-effectiveness: this system costs you $300, compared to the $20,000 surface tables, or anything like that. Or maybe even the $2 mouse gesture system -- at that time, that was costing around $5,000. So, we actually -- I showed that, at a conference, to President Abdul Kalam, at that time, and then he said, "OK, we should use this in the Bhabha Atomic Research Centre for some use of that." So I'm excited about how I can bring the technology to the masses rather than just keeping that technology in the lab environment.

13:26
(Applause)

13:30
CA: Based on the people we've seen at TED, I would say you're truly one of the two or three best inventors in the world right now. It's an honor to have you at TED. Thank you so much. That's fantastic.

13:41
(Applause)
