ABOUT THE SPEAKER
Jeff Han - Human-computer interface designer
After years of research on touch-driven computer displays, Jeff Han has created a simple, multi-touch, multi-user screen interface that just might herald the end of the point-and-click era.

Why you should listen

Jeff Han's intuitive "interface-free" computer displays -- controlled by the touch of fingertips -- will change forever the way you think about computers. At TED 2006, the audience whistled, clapped and gasped audibly as Han demoed (for the first time publicly) his prototype drafting table-cum-touch display, developed at NYU's Courant Institute of Mathematical Sciences. The demo included a virtual lightbox, where he moved photos by fingertip -- as if they were paper on a desk -- flicking them across the screen and zooming in and out by pinching two fingers together, as well as a Google Earth-like map that he tilted and flew over with simple moves.

When the demo hit the web, bloggers and YouTubers made him a bit of a megastar. (His video has been watched more than 600,000 times on YouTube alone; "Amazing," "Incredible" and "Freaking awesome" are the typical responses there. Also: "When can I buy one?") After this legendary demo, Han launched a startup called Perceptive Pixel -- and when he came back to TED2007, he and his team brought an entire interactive wall, where TEDsters lined up to play virtual guitars. His talent and reputation earned him a place on Time Magazine's 2008 list of the world's 100 Most Influential People. 

More profile about the speaker
Jeff Han | Speaker | TED.com
TED2006

Jeff Han: The radical promise of the multi-touch interface

4,787,802 views

Jeff Han shows off a cheap, scalable multi-touch and pressure-sensitive computer screen interface that may spell the end of point-and-click.


00:24
I'm really excited to be here today.
00:25
I'll show you some stuff that's just ready to come out of the lab,
00:29
literally, and I'm really glad that you guys
00:31
are going to be among the first to see it in person,
00:34
because I really think this is going to really change
00:36
the way we interact with machines from this point on.
00:39
Now, this is a rear-projected drafting table.
00:42
It's about 36 inches wide
00:43
and it's equipped with a multi-touch sensor.
00:45
Normal touch sensors that you see,
00:47
like on a kiosk or interactive whiteboards,
00:49
can only register one point of contact at a time.
00:52
This thing allows you to have multiple points at the same time.
00:55
I can use both my hands; I can use chording actions;
00:59
I can just go right up and use all 10 fingers if I wanted to.
01:02
You know, like that.
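The one-point-versus-many distinction Han draws is the crux of the hardware. As a rough illustration (the `TouchPoint` structure, its field names, and the `active_chord` helper are invented for this sketch, not Han's actual API), a multi-touch frame is simply a list of identified contacts rather than a single coordinate:

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    touch_id: int    # stable ID while the finger stays down
    x: float
    y: float
    pressure: float  # Han's sensor is also pressure-sensitive

# A single-touch kiosk reports at most one point per frame;
# a multi-touch frame is a list, so chords and multiple users just work.
def active_chord(frame: list[TouchPoint]) -> int:
    """Number of simultaneous contacts in this frame."""
    return len({p.touch_id for p in frame})

frame = [TouchPoint(0, 0.2, 0.4, 0.8), TouchPoint(1, 0.6, 0.5, 0.3)]
print(active_chord(frame))  # 2
```

The stable per-finger ID is what makes chording and two-handed gestures trackable across frames.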
01:03
Now, multi-touch sensing isn't completely new.
01:08
People like Bill Buxton have been playing around with it in the '80s.
01:11
However, the approach I built here is actually high-resolution,
01:15
low-cost, and probably most importantly, very scalable.
01:19
So, the technology, you know,
01:21
isn't the most exciting thing here right now,
01:23
other than probably its newfound accessibility.
01:26
What's really interesting here is what you can do with it
01:29
and the kind of interfaces you can build on top of it.
01:33
So let's see.
01:35
So, for instance, we have a lava lamp application here.
01:39
Now, you can see,
01:41
I can use both of my hands to kind of squeeze and put the blobs together.
01:44
I can inject heat into the system here,
01:47
or I can pull it apart with two of my fingers.
01:49
It's completely intuitive; there's no instruction manual.
01:52
The interface just kind of disappears.
01:54
This started out as a screensaver app
01:55
that one of the Ph.D. students in our lab, Ilya Rosenberg, made.
01:59
But I think its true identity comes out here.
02:04
Now what's great about a multi-touch sensor is that, you know,
02:07
I could be doing this with as many fingers here,
02:09
but of course multi-touch also inherently means multi-user.
02:12
Chris could be interacting with another part of Lava,
02:15
while I play around with it here.
02:17
You can imagine a new kind of sculpting tool,
02:19
where I'm kind of warming something up, making it malleable,
02:22
and then letting it cool down and solidifying in a certain state.
02:29
Google should have something like this in their lobby.
02:32
(Laughter)
02:38
I'll show you a little more of a concrete example here,
02:41
as this thing loads.
02:43
This is a photographer's light-box application.
02:45
Again, I can use both of my hands to interact and move photos around.
02:49
But what's even cooler is that if I have two fingers,
02:53
I can actually grab a photo and then stretch it out like that really easily.
02:57
I can pan, zoom and rotate it effortlessly.
03:01
I can do that grossly with both of my hands,
03:03
or I can do it just with two fingers on each of my hands together.
03:06
If I grab the canvas, I can do the same thing -- stretch it out.
03:09
I can do it simultaneously, holding this down,
03:11
and gripping on another one, stretching this out.
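The grab-and-stretch gesture he demonstrates reduces to recovering a similarity transform from two finger positions. A minimal sketch of that standard derivation, assuming normalized screen coordinates (the function name and tuple conventions are illustrative, not from Han's system):

```python
import math

def two_finger_transform(p0, p1, q0, q1):
    """Given the previous (p0, p1) and current (q0, q1) positions of two
    fingers, return the pan, zoom and rotation that carry one pair onto
    the other. Assumes the two fingers are at distinct points."""
    # Vector between the two fingers, before and after the move.
    vx, vy = p1[0] - p0[0], p1[1] - p0[1]
    wx, wy = q1[0] - q0[0], q1[1] - q0[1]
    scale = math.hypot(wx, wy) / math.hypot(vx, vy)  # pinch = zoom
    angle = math.atan2(wy, wx) - math.atan2(vy, vx)  # twist = rotate
    # Pan is the motion of the midpoint between the fingers.
    pan = ((q0[0] + q1[0] - p0[0] - p1[0]) / 2,
           (q0[1] + q1[1] - p0[1] - p1[1]) / 2)
    return pan, scale, angle

# Fingers move apart symmetrically: pure zoom, no pan, no rotation.
pan, scale, angle = two_finger_transform((0, 0), (1, 0), (-0.5, 0), (1.5, 0))
print(pan, scale, angle)  # (0.0, 0.0) 2.0 0.0
```

Applying the same math to the canvas instead of a photo gives the "grab the canvas and stretch it" behavior for free.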
03:14
Again, the interface just disappears here.
03:16
There's no manual.
03:17
This is exactly what you expect,
03:19
especially if you haven't interacted with a computer before.
03:23
Now, when you have initiatives like the $100 laptop,
03:25
I kind of cringe at the idea
03:27
of introducing a whole new generation to computing
03:29
with this standard mouse-and-windows-pointer interface.
03:32
This is something that I think is really the way
03:35
we should be interacting with machines from now on.
03:38
(Applause)
03:45
Now, of course, I can bring up a keyboard.
03:47
(Laughter)
03:52
And I can bring that around, put that up there.
03:55
Obviously, this is a standard keyboard,
03:57
but of course I can rescale it to make it work well for my hands.
04:00
That's really important, because there's no reason in this day and age
04:03
that we should be conforming to a physical device.
04:05
That leads to bad things, like RSI.
04:07
We have so much technology nowadays
04:10
that these interfaces should start conforming to us.
04:14
There's so little applied now to actually improving
04:18
the way we interact with interfaces from this point on.
04:21
This keyboard is probably actually the really wrong direction to go.
04:24
You can imagine, in the future, as we develop this kind of technology,
04:27
a keyboard that kind of automatically drifts as your hand moves away,
04:31
and really intelligently anticipates which key you're trying to stroke.
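Han is speculating here, and so is the sketch below: the simplest reading of "anticipates which key you're trying to stroke" is a nearest-key lookup over the on-screen layout. The key coordinates and the `anticipate_key` helper are entirely made up for illustration:

```python
import math

# Hypothetical key centers in normalized screen coordinates --
# purely illustrative, not any real soft-keyboard layout.
KEYS = {"a": (0.1, 0.5), "s": (0.2, 0.5), "d": (0.3, 0.5), "f": (0.4, 0.5)}

def anticipate_key(finger_x, finger_y):
    """Predict the intended key as the one nearest the approaching finger.
    A real system would also weight candidates by language-model odds."""
    return min(KEYS, key=lambda k: math.hypot(KEYS[k][0] - finger_x,
                                              KEYS[k][1] - finger_y))

print(anticipate_key(0.12, 0.48))  # a
```

Because the keyboard is just pixels, the same lookup could enlarge or drift the predicted key toward the finger, which is roughly the behavior Han imagines.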
04:35
So -- again, isn't this great?
04:39
(Laughter)
04:41
Audience: Where's your lab?
04:42
Jeff Han: I'm a research scientist at NYU in New York.
04:49
Here's an example of another kind of app. I can make these little fuzz balls.
04:53
It'll remember the strokes I'm making.
04:55
Of course I can do it with all my hands.
04:57
It's pressure-sensitive.
04:59
What's neat about that is,
05:01
I showed that two-finger gesture that zooms in really quickly.
05:04
Because you don't have to switch to a hand tool
05:06
or the magnifying glass tool,
05:08
you can just continuously make things
05:10
in real multiple scales, all at the same time.
05:13
I can create big things out here,
05:15
but I can go back and really quickly go back
05:17
to where I started, and make even smaller things here.
05:21
This is going to be really important
05:23
as we start getting to things like data visualization.
05:26
For instance, I think we all enjoyed Hans Rosling's talk,
05:28
and he really emphasized the fact I've been thinking about for a long time:
05:32
We have all this great data,
05:33
but for some reason, it's just sitting there.
05:35
We're not accessing it.
05:37
And part of the answer, I think,
05:40
will come from things like graphics and visualization and inference tools,
05:45
but I also think a big part of it
05:47
is going to be having better interfaces,
05:49
to be able to drill down into this kind of data,
05:51
while still thinking about the big picture here.
05:54
Let me show you another app here. This is called WorldWind.
05:57
It's done by NASA.
05:58
We've all seen Google Earth;
06:01
this is an open-source version of that.
06:03
There are plug-ins to be able to load in different data sets
06:07
that NASA's collected over the years.
06:08
As you can see, I can use the same two-fingered gestures
06:11
to go down and go in really seamlessly.
06:13
There's no interface, again.
06:15
It really allows anybody to kind of go in --
06:18
and it just does what you'd expect, you know?
06:21
Again, there's just no interface here. The interface just disappears.
06:27
I can switch to different data views.
06:28
That's what's neat about this app here.
06:31
NASA's really cool.
06:32
These hyper-spectral images are false-colored so you can --
06:35
it's really good for determining vegetative use.
06:39
Well, let's go back to this.
06:44
The great thing about mapping applications --
06:46
it's not really 2D, it's 3D.
06:48
So, again, with a multi-point interface, you can do a gesture like this --
06:51
so you can be able to tilt around like that --
06:55
(Surprised laughter)
06:56
It's not just simply relegated to a kind of 2D panning and motion.
07:00
This gesture is just putting two fingers down --
07:02
it's defining an axis of tilt -- and I can tilt up and down that way.
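The two-finger tilt gesture has a compact geometric reading: the fingers define an axis in the screen plane, and the map rotates about that line. A sketch under that assumption, using Rodrigues' rotation formula (the function names are mine, not WorldWind's):

```python
import math

def tilt_axis(f0, f1):
    """Two fingers down define the tilt axis: the unit vector along the
    line between them, lying in the screen plane."""
    dx, dy = f1[0] - f0[0], f1[1] - f0[1]
    n = math.hypot(dx, dy)
    return (dx / n, dy / n)

def tilt_point(p, axis, angle):
    """Rotate a 3D point about an axis in the z=0 plane (Rodrigues'
    formula), which is what 'tilting up and down' amounts to."""
    ux, uy, uz = axis[0], axis[1], 0.0
    x, y, z = p
    c, s = math.cos(angle), math.sin(angle)
    dot = ux * x + uy * y + uz * z
    cross = (uy * z - uz * y, uz * x - ux * z, ux * y - uy * x)  # u x p
    return tuple(c * v + s * w + (1 - c) * dot * u
                 for v, w, u in zip((x, y, z), cross, (ux, uy, uz)))

# Axis along x (two fingers side by side): a 90-degree tilt lifts the
# point at (0, 1, 0) out of the screen plane to (0, 0, 1).
axis = tilt_axis((0.0, 0.5), (1.0, 0.5))
tilted = tilt_point((0.0, 1.0, 0.0), axis, math.pi / 2)
print(tuple(round(v, 9) for v in tilted))  # (0.0, 0.0, 1.0)
```

The drag distance of the fingers would supply `angle`, so the same two touches both define the axis and drive the tilt.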
07:06
We just came up with that on the spot,
07:08
it's probably not the right thing to do,
07:10
but there's such interesting things you can do with this interface.
07:15
It's just so much fun playing around with it, too.
07:17
(Laughter)
07:19
And so the last thing I want to show you is --
07:22
I'm sure we can all think of a lot of entertainment apps
07:24
that you can do with this thing.
07:26
I'm more interested in the creative applications we can do with this.
07:30
Now, here's a simple application here -- I can draw out a curve.
07:35
And when I close it, it becomes a character.
07:38
But the neat thing about it is I can add control points.
07:41
And then what I can do is manipulate them with both of my fingers at the same time.
07:45
And you notice what it does.
07:48
It's kind of a puppeteering thing,
07:50
where I can use as many fingers as I have to draw and make --
08:02
Now, there's a lot of actual math going on under here
08:05
for this to control this mesh and do the right thing.
08:10
This technique of being able to manipulate a mesh here, with multiple control points,
08:15
is actually state of the art.
08:17
It was released at SIGGRAPH last year.
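The SIGGRAPH technique Han refers to does considerably more than this, but the control-point idea itself can be shown with a deliberately naive stand-in: blend each handle's displacement across the mesh with inverse-distance weights. Everything below is a toy of my own construction, not the published method:

```python
def deform(vertices, handles_before, handles_after, eps=1e-6):
    """Move mesh vertices by blending control-point (handle) displacements
    with inverse-distance weights. A naive stand-in for the rigidity-
    preserving mesh technique Han mentions: same interface, far simpler
    math, and it will shear where the real method stays rigid."""
    out = []
    for vx, vy in vertices:
        wsum = dx = dy = 0.0
        for (hx, hy), (gx, gy) in zip(handles_before, handles_after):
            w = 1.0 / ((vx - hx) ** 2 + (vy - hy) ** 2 + eps)
            wsum += w
            dx += w * (gx - hx)
            dy += w * (gy - hy)
        out.append((vx + dx / wsum, vy + dy / wsum))
    return out

# One handle dragged right by 1: a vertex sitting on the handle follows it.
print(deform([(0.0, 0.0)], [(0.0, 0.0)], [(1.0, 0.0)]))  # [(1.0, 0.0)]
```

Each finger becomes one `handles_before`/`handles_after` pair, which is why the puppeteering scales to as many fingers as you have.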
08:19
It's a great example of the kind of research I really love:
08:22
all this compute power to make things do the right things,
08:24
intuitive things, to do exactly what you expect.
08:31
So, multi-touch interaction research is a very active field right now in HCI.
08:36
I'm not the only one doing it, a lot of other people are getting into it.
08:39
This kind of technology is going to let even more people get into it,
08:42
I'm looking forward to interacting with all of you over the next few days
08:46
and seeing how it can apply to your respective fields.
08:48
Thank you.
08:50
(Applause)



Data provided by TED.
