TED2006

Jeff Han: The radical promise of the multi-touch interface

Jeff Han shows off a cheap, scalable multi-touch and pressure-sensitive computer screen interface that may spell the end of point-and-click.

I'm really excited to be here today.
00:24
I'll show you some stuff
that's just ready to come out of the lab,
00:25
literally, and I'm really glad
that you guys
00:29
are going to be among
the first to see it in person,
00:31
because I really think this is going
to change
00:34
the way we interact
with machines from this point on.
00:36
Now, this is a rear-projected
drafting table.
00:39
It's about 36 inches wide
00:42
and it's equipped
with a multi-touch sensor.
00:43
Normal touch sensors that you see,
00:45
like on kiosks
or interactive whiteboards,
00:47
can only register one point
of contact at a time.
00:49
This thing allows you to have
multiple points at the same time.
00:52
I can use both my hands;
I can use chording actions;
00:55
I can just go right up and use
all 10 fingers if I want to.
00:59
You know, like that.
01:02
Now, multi-touch sensing
isn't completely new.
01:03
People like Bill Buxton were
playing around with it back in the '80s.
01:08
However, the approach I built here
is actually high-resolution,
01:11
low-cost, and probably
most importantly, very scalable.
01:15
So, the technology, you know,
01:19
isn't the most exciting thing
here right now,
01:21
other than probably
its newfound accessibility.
01:23
What's really interesting here
is what you can do with it
01:26
and the kind of interfaces
you can build on top of it.
01:29
So let's see.
01:33
So, for instance, we have
a lava lamp application here.
01:35
Now, you can see,
01:39
I can use both of my hands to kind
of squeeze and put the blobs together.
01:41
I can inject heat into the system here,
01:44
or I can pull it apart
with two of my fingers.
01:47
It's completely intuitive;
there's no instruction manual.
01:49
The interface just kind of disappears.
01:52
This started out as a screensaver app
01:54
that one of the Ph.D. students
in our lab, Ilya Rosenberg, made.
01:55
But I think its true identity
comes out here.
01:59
Now what's great about a multi-touch
sensor is that, you know,
02:04
I could be doing this
with as many fingers as I want here,
02:07
but of course multi-touch
also inherently means multi-user.
02:09
Chris could be interacting
with another part of Lava,
02:12
while I play around with it here.
02:15
You can imagine
a new kind of sculpting tool,
02:17
where I'm kind of warming something up,
making it malleable,
02:19
and then letting it cool down
and solidifying in a certain state.
02:22
Google should have
something like this in their lobby.
02:29
(Laughter)
02:32
I'll show you a little more
concrete example here,
02:38
as this thing loads.
02:41
This is a photographer's
light-box application.
02:43
Again, I can use both of my hands
to interact and move photos around.
02:45
But what's even cooler
is that if I have two fingers,
02:49
I can actually grab a photo and then
stretch it out like that really easily.
02:53
I can pan, zoom
and rotate it effortlessly.
02:57
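For the curious, the gesture Han describes has a compact mathematical core: the two contact points before and after a move determine a similarity transform (pan, zoom, rotation). The Python sketch below is an illustrative reconstruction under that assumption, not Han's actual code; the function name and conventions are invented.

    import math

    def two_finger_transform(p1, p2, q1, q2):
        # Fingers move from (p1, p2) to (q1, q2); recover zoom, rotation, pan.
        vx, vy = p2[0] - p1[0], p2[1] - p1[1]  # finger vector before the move
        wx, wy = q2[0] - q1[0], q2[1] - q1[1]  # finger vector after the move
        scale = math.hypot(wx, wy) / math.hypot(vx, vy)  # zoom = spread ratio
        angle = math.atan2(wy, wx) - math.atan2(vy, vx)  # rotation of the pair
        # Pan is the displacement of the midpoint between the two fingers.
        dx = (q1[0] + q2[0] - p1[0] - p2[0]) / 2
        dy = (q1[1] + q2[1] - p1[1] - p2[1]) / 2
        return scale, angle, (dx, dy)

    # Spreading the fingers to twice their separation reads as a 2x zoom:
    print(two_finger_transform((0, 0), (100, 0), (-50, 0), (150, 0)))
    # -> (2.0, 0.0, (0.0, 0.0))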
I can do that grossly
with both of my hands,
03:01
or I can do it just with two fingers
on each of my hands together.
03:03
If I grab the canvas, I can do
the same thing -- stretch it out.
03:06
I can do it simultaneously,
holding this down,
03:09
and gripping on another one,
stretching this out.
03:11
Again, the interface just disappears here.
03:14
There's no manual.
03:16
This is exactly what you expect,
03:17
especially if you haven't interacted
with a computer before.
03:19
Now, when you have initiatives
like the $100 laptop,
03:23
I kind of cringe at the idea
03:25
of introducing a whole new
generation to computing
03:27
with this standard
mouse-and-windows-pointer interface.
03:29
This is something that I think
is really the way
03:32
we should be interacting
with machines from now on.
03:35
(Applause)
03:38
Now, of course, I can bring up a keyboard.
03:45
(Laughter)
03:47
And I can bring that around,
put that up there.
03:52
Obviously, this is a standard keyboard,
03:55
but of course I can rescale it
to make it work well for my hands.
03:57
That's really important, because
there's no reason in this day and age
04:00
that we should be conforming
to a physical device.
04:03
That leads to bad things, like RSI.
04:05
We have so much technology nowadays
04:07
that these interfaces
should start conforming to us.
04:10
There's so little effort applied now
to actually improving
04:14
the way we interact with interfaces
from this point on.
04:18
This keyboard is probably actually
the wrong direction to go.
04:21
You can imagine, in the future,
as we develop this kind of technology,
04:24
a keyboard that kind of automatically
drifts as your hand moves away,
04:27
and really intelligently anticipates
which key you're trying to stroke.
04:31
So -- again, isn't this great?
04:35
(Laughter)
04:39
Audience: Where's your lab?
04:41
Jeff Han: I'm a research scientist
at NYU in New York.
04:42
Here's an example of another kind of app.
I can make these little fuzz balls.
04:49
It'll remember the strokes I'm making.
04:53
Of course I can do it with all my hands.
04:55
It's pressure-sensitive.
04:57
What's neat about that is,
04:59
I showed you that two-finger gesture
that zooms in really quickly.
05:01
Because you don't have
to switch to a hand tool
05:04
or the magnifying glass tool,
05:06
you can just continuously make things
05:08
at multiple real scales,
all at the same time.
05:10
I can create big things out here,
05:13
but I can really quickly go back
05:15
to where I started,
and make even smaller things here.
05:17
This is going to be really important
05:21
as we start getting to things
like data visualization.
05:23
For instance, I think
we all enjoyed Hans Rosling's talk,
05:26
and he really emphasized something
I've been thinking about for a long time:
05:28
We have all this great data,
05:32
but for some reason,
it's just sitting there.
05:33
We're not accessing it.
05:35
And part of the answer, I think,
05:37
will come from things like graphics
and visualization and inference tools,
05:40
but I also think a big part of it
05:45
is going to be having better interfaces,
05:47
to be able to drill down
into this kind of data,
05:49
while still thinking
about the big picture here.
05:51
Let me show you another app here.
This is called WorldWind.
05:54
It's done by NASA.
05:57
We've all seen Google Earth;
05:58
this is an open-source version of that.
06:01
There are plug-ins to be able
to load in different data sets
06:03
that NASA's collected over the years.
06:07
As you can see, I can use
the same two-fingered gestures
06:08
to go down and go in really seamlessly.
06:11
There's no interface, again.
06:13
It really allows anybody
to kind of go in --
06:15
and it just does
what you'd expect, you know?
06:18
Again, there's just no interface here.
The interface just disappears.
06:21
I can switch to different data views.
06:27
That's what's neat about this app here.
06:28
NASA's really cool.
06:31
These hyper-spectral images
are false-colored so you can --
06:32
it's really good for determining
vegetative use.
06:35
Well, let's go back to this.
06:39
The great thing
about mapping applications --
06:44
it's not really 2D, it's 3D.
06:46
So, again, with a multi-point interface,
you can do a gesture like this --
06:48
and tilt around like that --
06:51
(Surprised laughter)
06:55
It's not simply relegated
to 2D panning and motion.
06:56
This gesture is just putting
two fingers down --
07:00
it's defining an axis of tilt --
and I can tilt up and down that way.
07:02
We just came up with that on the spot,
07:06
it's probably not the right thing to do,
07:08
but there's such interesting things
you can do with this interface.
07:10
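The geometry of this tilt gesture is worth spelling out: the two resting fingers define a line, and the view rotates about that line by an amount tied to the drag. Below is a minimal sketch of that rotation using Rodrigues' formula; it is a guess at the underlying math, not Han's implementation, and every name in it is hypothetical.

    import numpy as np

    def tilt_about_axis(point, a, b, angle):
        # Rotate a 3D point about the axis through touch points a and b;
        # `angle` would be driven by how far the gesture drags up or down.
        a = np.asarray(a, dtype=float)
        k = np.asarray(b, dtype=float) - a
        k /= np.linalg.norm(k)                  # unit axis between the fingers
        v = np.asarray(point, dtype=float) - a  # position relative to the axis
        # Rodrigues' rotation formula.
        rotated = (v * np.cos(angle)
                   + np.cross(k, v) * np.sin(angle)
                   + k * np.dot(k, v) * (1.0 - np.cos(angle)))
        return rotated + a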
It's just so much fun
playing around with it, too.
07:15
(Laughter)
07:17
And so the last thing
I want to show you is --
07:19
I'm sure we can all think
of a lot of entertainment apps
07:22
that you can build with this thing.
07:24
I'm more interested in the creative
applications we can do with this.
07:26
Now, here's a simple application --
I can draw out a curve.
07:30
And when I close it,
it becomes a character.
07:35
But the neat thing about it
is I can add control points.
07:38
And then what I can do is manipulate them
with both of my fingers at the same time.
07:41
And you notice what it does.
07:45
It's kind of a puppeteering thing,
07:48
where I can use as many fingers
as I have to draw and make --
07:50
Now, there's a lot of actual math
going on under here
08:02
for this to control this mesh
and do the right thing.
08:05
This technique of being able to manipulate
a mesh here, with multiple control points,
08:10
is actually state of the art.
08:15
It was released at SIGGRAPH last year.
08:17
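The SIGGRAPH work Han refers to is most likely "As-Rigid-As-Possible Shape Manipulation" (Igarashi, Moscovich and Hughes, SIGGRAPH 2005), which solves a least-squares problem so the mesh stays as rigid as possible between the control points. The sketch below is a deliberately crude stand-in, inverse-distance weighting of the handles' displacements, meant only to show the shape of such an API; it is not the published method.

    import numpy as np

    def deform_mesh(vertices, handles, moved_handles, eps=1e-6):
        # Drag each mesh vertex by a distance-weighted blend of how far
        # the control points (handles) moved. Crude stand-in for the
        # real least-squares technique.
        vertices = np.asarray(vertices, dtype=float)  # (n, 2) mesh points
        handles = np.asarray(handles, dtype=float)    # (m, 2) handle origins
        disp = np.asarray(moved_handles, dtype=float) - handles  # handle motion
        out = vertices.copy()
        for i, v in enumerate(vertices):
            w = 1.0 / (np.linalg.norm(handles - v, axis=1) ** 2 + eps)
            out[i] += (w[:, None] * disp).sum(axis=0) / w.sum()
        return out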
It's a great example
of the kind of research I really love:
08:19
all this compute power
to make things do the right things,
08:22
intuitive things,
to do exactly what you expect.
08:24
So, multi-touch interaction research
is a very active field right now in HCI.
08:31
I'm not the only one doing it;
a lot of other people are getting into it.
08:36
This kind of technology is going to let
even more people get into it.
08:39
I'm looking forward to interacting
with all of you over the next few days
08:42
and seeing how it can apply
to your respective fields.
08:46
Thank you.
08:48
(Applause)
08:50

About the Speaker:

Jeff Han - Human-computer interface designer
After years of research on touch-driven computer displays, Jeff Han has created a simple, multi-touch, multi-user screen interface that just might herald the end of the point-and-click era.

Why you should listen

Jeff Han's intuitive "interface-free" computer displays -- controlled by the touch of fingertips -- will change forever the way you think about computers. At TED 2006, the audience whistled, clapped and gasped audibly as Han demoed (for the first time publicly) his prototype drafting table-cum-touch display, developed at NYU's Courant Institute of Mathematical Sciences. The demo included a virtual lightbox, where he moved photos by fingertip -- as if they were paper on a desk -- flicking them across the screen and zooming in and out by pinching two fingers together, as well as a Google Earth-like map that he tilted and flew over with simple moves.

When the demo hit the web, bloggers and YouTubers made him a bit of a megastar. (His video has been watched more than 600,000 times on YouTube alone; "Amazing," "Incredible" and "Freaking awesome" are the typical responses there. Also: "When can I buy one?") After this legendary demo, Han launched a startup called Perceptive Pixel -- and when he came back to TED2007, he and his team brought an entire interactive wall, where TEDsters lined up to play virtual guitars. His talent and reputation earned him a place on Time Magazine's 2008 list of the world's 100 Most Influential People. 
