TED2010

John Underkoffler: Pointing to the future of UI

Filmed:

Minority Report science adviser and inventor John Underkoffler demos g-speak -- the real-life version of the film's eye-popping, tai chi-meets-cyberspace computer interface. Is this how tomorrow's computers will be controlled?


We're 25, 26 years after the advent of the Macintosh, which was an astoundingly seminal event in the history of human-machine interface and in computation in general. It fundamentally changed the way that people thought about computation, thought about computers, how they used them, and who and how many people were able to use them. It was such a radical change, in fact, that the early Macintosh development team in '82, '83, '84 had to write an entirely new operating system from the ground up.

Now, this is an interesting little message, and it's a lesson that has since, I think, been forgotten or lost or something, and that is, namely, that the OS is the interface. The interface is the OS. It's like the land and the king (i.e., Arthur): they're inseparable, they are one. And to write a new operating system was not a capricious matter. It wasn't just a matter of tuning up some graphics routines. There were no graphics routines. There were no mouse drivers. So it was a necessity.
But in the quarter-century since then, we've seen all of the fundamental supporting technologies go berserk. So memory capacity and disk capacity have been multiplied by something between 10,000 and a million. Same thing for processor speeds. Networks, we didn't have networks at all at the time of the Macintosh's introduction, and that has become the single most salient aspect of how we live with computers. And, of course, graphics: today, 84 dollars and 97 cents at Best Buy buys you more graphics power than you could have gotten for a million bucks from SGI only a decade ago. So we've got that incredible ramp-up.

Then, on the side, we've got the Web and, increasingly, the cloud, which is fantastic, but also -- in the regard in which an interface is fundamental -- kind of a distraction. So we've forgotten to invent new interfaces. Certainly we've seen in recent years a lot of change in that regard, and people are starting to wake up about that. So what happens next? Where do we go from there?
The problem, as we see it, has to do with a single, simple word: "space," or a single, simple phrase: "real world geometry." Computers and the programming languages that we talk to them in, that we teach them in, are hideously insensate when it comes to space. They don't understand real world space. It's a funny thing, because the rest of us occupy it quite frequently and quite well. They also don't understand time, but that's a matter for a separate talk. So what happens if you start to explain space to them?
One thing you might get is something like the Luminous Room. The Luminous Room is a system in which it's considered that input and output spaces are co-located. That's a strangely simple, and yet unexplored, idea, right? When you use a mouse, your hand is down here on the mouse pad. It's not even on the same plane as what you're talking about: the pixels are up on the display. So here was a room in which all the walls, floors, ceilings, pets, potted plants, whatever was in there, were capable not only of display but of sensing as well. And that means input and output are in the same space, enabling stuff like this. That's digital storage in a physical container. The contract is the same as with real world objects in real world containers: whatever you put in has to come back out.
This little design experiment, a small office here, knew a few other tricks as well. If you presented it with a chess board, it tried to figure out what you might mean by that. And if there was nothing for them to do, the chess pieces eventually got bored and hopped away.

The academics who were overseeing this work thought that that was too frivolous, so we built deadly serious applications like this optics prototyping workbench, in which a toothpaste cap on a cardboard box becomes a laser. The beam splitters and lenses are represented by physical objects, and the system projects down the laser beam path. So you've got an interface that has no interface. You operate the world as you operate the real world, which is to say, with your hands.

Similarly, a digital wind tunnel with digital wind flowing from right to left -- not that remarkable in a sense; we didn't invent the mathematics. But if you displayed that on a CRT or flat panel display, it would be meaningless to hold up an arbitrary object, a real world object, in that. Here, the real world merges with the simulation.
And finally, to pull out all the stops, this is a system called Urp, for urban planners, in which we give architects and urban planners back the models that we confiscated when we insisted that they use CAD systems. And we make the machine meet them halfway. It projects down digital shadows, as you see here. And if you introduce tools like this inverse clock, then you can control the sun's position in the sky. Those are 8 a.m. shadows. They get a little shorter at 9 a.m. There you are, swinging the sun around. Short shadows at noon and so forth.

And we built up a series of tools like this. There are inter-shadowing studies that children can operate, even though they don't know anything about urban planning: to move a building, you simply reach out your hand and you move the building. A material wand makes the building into a sort of Frank Gehry thing that reflects light in all directions. Are you blinding passersby and motorists on the freeways? A zoning tool connects distant structures, a building and a roadway. Are you going to get sued by the zoning commission? And so forth.
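To give a rough sense of the geometry behind that inverse clock -- this is a minimal illustrative sketch, not Urp's actual code, with an assumed 30-meter building and a few assumed sun elevations -- the length of a projected shadow falls directly out of the sun's elevation angle:

import math

def shadow_length(building_height_m: float, sun_elevation_deg: float) -> float:
    """Length of the shadow cast on flat ground for a given sun elevation."""
    if sun_elevation_deg <= 0:
        return float("inf")  # sun at or below the horizon: no finite shadow
    return building_height_m / math.tan(math.radians(sun_elevation_deg))

# A 30 m building: long shadows in the morning, short ones near noon.
for elevation_deg in (15, 30, 60):  # roughly 8 a.m., mid-morning, near noon
    print(f"sun at {elevation_deg:2d} deg -> shadow of {shadow_length(30, elevation_deg):.1f} m")

Sweeping the clock just sweeps that elevation (and the azimuth) through the day, which is why the projected shadows shorten toward noon in the demo.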
Now, if these ideas seem familiar, or perhaps even a little dated, that's great; they should seem familiar. This work is 15 years old. This stuff was undertaken at MIT and the Media Lab under the incredible direction of Professor Hiroshi Ishii, director of the Tangible Media Group.

But it was that work that was seen by Alex McDowell, one of the world's legendary production designers. Alex was preparing a little, sort of obscure, indie, arthouse film called "Minority Report" for Steven Spielberg, and invited us to come out from MIT and design the interfaces that would appear in that film. And the great thing about it was that Alex was so dedicated to the idea of verisimilitude, the idea that the putative 2054 that we were painting in the film be believable, that he allowed us to take on that design work as if it were an R&D effort. And the result is sort of gratifyingly perpetual. People still reference those sequences in "Minority Report" when they talk about new UI design.
So this led, full circle, in a strange way, to building these ideas into what we believe is the necessary future of human-machine interface: the Spatial Operating Environment, as we call it.

So here we have a bunch of stuff, some images. And, using a hand, we can actually exercise six degrees of freedom, six degrees of navigational control. And it's fun to fly through Mr. Beckett's eye. And you can come back out through the scary orangutan. And that's all well and good.
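For readers wondering what "six degrees of freedom" buys you, here is a hedged sketch -- generic camera geometry, not g-speak's actual API, and the names Pose6DOF and fly_forward are invented -- of a pose with three translational and three rotational parameters, and of flying it forward along its own view direction:

import math
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    """Three translations plus three rotations: six degrees of freedom."""
    x: float = 0.0      # left/right position
    y: float = 0.0      # up/down position
    z: float = 0.0      # forward/back position
    roll: float = 0.0   # rotation about the view axis, in radians
    pitch: float = 0.0  # rotation about the left/right axis
    yaw: float = 0.0    # rotation about the vertical axis

def fly_forward(cam: Pose6DOF, distance: float) -> Pose6DOF:
    """Move the camera along its current view direction (roll doesn't affect it)."""
    return Pose6DOF(
        x=cam.x + distance * math.cos(cam.pitch) * math.sin(cam.yaw),
        y=cam.y + distance * math.sin(cam.pitch),
        z=cam.z + distance * math.cos(cam.pitch) * math.cos(cam.yaw),
        roll=cam.roll, pitch=cam.pitch, yaw=cam.yaw,
    )

# Yaw 30 degrees toward an image, then push the hand forward to fly "through" it.
cam = Pose6DOF(yaw=math.radians(30))
print(fly_forward(cam, distance=2.0))

A tracked hand supplies all six numbers at once, which is what makes flying into and back out of an image feel like a single gesture rather than a sequence of separate controls.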
Let's do something a little more difficult. Here, we have a whole bunch of disparate images. We can fly around them. So navigation is a fundamental issue. You have to be able to navigate in 3D. Much of what we want computers to help us with in the first place is inherently spatial. And the part that isn't spatial can often be spatialized to allow our wetware to make greater sense of it. Now we can distribute this stuff in many different ways. So we can throw it out like that. Let's reset it. We can organize it this way. And, of course, it's not just about navigation, but about manipulation as well. So if we don't like stuff, or we're intensely curious about Ernst Haeckel's scientific falsifications, we can pull them out like that. And then if it's time for analysis, we can pull back a little bit and ask for a different distribution. Let's just come down a bit and fly around. So that's a different way to look at stuff.
If you're of a more analytical nature, then you might want, actually, to look at this as a color histogram. So now we've got the stuff color-sorted; angle maps onto color.
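A hedged sketch of one way "angle maps onto color" can be made literal -- this is not g-speak's implementation, and the thumbnail data below is invented -- is to reduce each image to its mean hue, which is an angle on the color wheel, and lay the images out by that angle:

import colorsys
import math

def mean_hue_angle(pixels_rgb):
    """Average hue of an image, as an angle in radians on the color wheel."""
    # Average on the unit circle so hues near 0 and near 2*pi don't cancel out.
    sx = sy = 0.0
    for r, g, b in pixels_rgb:
        h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        sx += math.cos(2 * math.pi * h)
        sy += math.sin(2 * math.pi * h)
    return math.atan2(sy, sx) % (2 * math.pi)

# Invented thumbnails, each reduced to a few representative pixels.
images = {
    "tapir_on_grass": [(34, 139, 34), (50, 160, 60)],
    "sunset":         [(250, 120, 30), (255, 90, 20)],
    "ocean":          [(20, 80, 200), (30, 90, 210)],
}

# Sort by hue angle; in the demo, that angle maps onto a position on screen.
for name in sorted(images, key=lambda n: mean_hue_angle(images[n])):
    print(f"{name}: hue angle of about {math.degrees(mean_hue_angle(images[name])):.0f} degrees")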
And now, if we want to select stuff -- 3D, space -- the idea that we're tracking hands in real space becomes really important, because we can reach in, not in 2D, not in fake 2D, but in actual 3D. Here are some selection planes. And we'll perform this Boolean operation, because we really love yellow and tapirs on green grass.
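Here is a small sketch of what plane-based 3D selection with a Boolean combination can look like -- generic geometry rather than the system's actual selection code, with invented Plane and select helpers: each hand-placed plane keeps the points on its positive side, and the Boolean operation intersects (or unions) those half-spaces:

from dataclasses import dataclass

@dataclass
class Plane:
    """A selection plane: a point x is kept when dot(normal, x) >= offset."""
    normal: tuple
    offset: float

    def keeps(self, point) -> bool:
        return sum(n * p for n, p in zip(self.normal, point)) >= self.offset

def select(points, planes, mode="and"):
    """Boolean-combine the half-spaces defined by hand-placed selection planes."""
    combine = all if mode == "and" else any
    return [p for p in points if combine(plane.keeps(p) for plane in planes)]

# Two planes carve a corner out of a small 3 x 3 x 3 point cloud.
points = [(x, y, z) for x in range(3) for y in range(3) for z in range(3)]
planes = [Plane(normal=(1, 0, 0), offset=1.0),   # keep points with x >= 1
          Plane(normal=(0, 0, 1), offset=2.0)]   # keep points with z >= 2
print(select(points, planes))                    # intersection: x >= 1 AND z >= 2
print(len(select(points, planes, mode="or")))    # union of the two half-spaces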
So, from there to the world of real work. Here's a logistics system, a small piece of one that we're currently building. There are a lot of elements. And one thing that's very important is to combine traditional tabular data with three-dimensional and geospatial information. So here's a familiar place. And we'll bring this back here for a second. Maybe select a little bit of that. And bring out this graph. And we should, now, be able to fly in here and have a closer look. These are logistics elements that are scattered across the United States.
One thing that three-dimensional interactions and the general idea of imbuing computation with space affords you is a final destruction of that unfortunate one-to-one pairing between human beings and computers. That's the old way, that's the old mantra: one machine, one human, one mouse, one screen. Well, that doesn't really cut it anymore.

In the real world, we have people who collaborate; we have people who have to work together; and we have many different displays. And we might want to look at these various images. We might want to ask for some help. The author of this new pointing device is sitting over there, so I can pull this from there to there. These are unrelated machines, right? So the computation is space soluble and network soluble.
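One hedged way to picture "space soluble and network soluble" -- the display layout, host names, and Display class below are invented for illustration -- is to describe every screen in a shared room coordinate frame, so that an object defined in room space is simply drawn by whichever machine's display it currently overlaps:

from dataclasses import dataclass

@dataclass
class Display:
    """A screen described in shared room coordinates (meters) plus its pixel size."""
    host: str        # name of the machine driving this screen (illustrative)
    origin: tuple    # room-space position of the screen's lower-left corner
    width_m: float
    height_m: float
    px_w: int
    px_h: int

    def to_pixels(self, room_point):
        """Map a room-space point to this display's pixels, or None if off-screen."""
        u = (room_point[0] - self.origin[0]) / self.width_m
        v = (room_point[1] - self.origin[1]) / self.height_m
        if 0 <= u <= 1 and 0 <= v <= 1:
            return self.host, int(u * self.px_w), int((1 - v) * self.px_h)
        return None

# Two screens on the same wall, driven by two unrelated machines.
displays = [
    Display("laptop-a", origin=(0.0, 1.0), width_m=0.6, height_m=0.35, px_w=1920, px_h=1080),
    Display("wall-pc",  origin=(1.0, 0.8), width_m=2.0, height_m=1.2,  px_w=3840, px_h=2160),
]

# Dragging an object across machines is just moving its room-space coordinate;
# which machine owns the pixels changes automatically as the point crosses over.
for point in [(0.3, 1.2), (1.5, 1.3)]:
    hits = (d.to_pixels(point) for d in displays)
    print(point, "->", next(h for h in hits if h is not None))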
So I'm going to leave that over there, because I have a question for Paul. Paul is the designer of this wand, and maybe it's easiest for him to come over here and tell me in person what's going on. So let me get some of these out of the way. Let's pull this apart: I'll go ahead and explode it. Kevin, can you help? Let me see if I can help us find the circuit board. Mind you, it's a sort of gratuitous field-stripping exercise, but we do it in the lab all the time. All right.

So collaborative work, whether it's immediately co-located or distant and distinct, is always important. And again, that stuff needs to be undertaken in the context of space.
And finally, I'd like to leave you with a glimpse that takes us back to the world of imagery. This is a system called TAMPER, which is a slightly whimsical look at what the future of editing and media manipulation systems might be. We at Oblong believe that media should be accessible in much more fine-grained form. So we have a large number of movies stuck inside here. And let's just pick out a few elements. We can zip through them as a possibility. We can grab elements off the front, whereupon they reanimate, come to life, and drag them down onto the table here. We'll go over to Jacques Tati here and grab our blue friend and put him down on the table as well. We may need more than one. And we probably need, well, we probably need a cowboy, to be quite honest.

(Laughter)

Yeah, let's take that one.

(Laughter)

You see, cowboys and French farce people don't go well together, and the system knows that.
Let me leave you with one final thought, and that is that one of the greatest English-language writers of the last three decades suggested that great art is always a gift. And he wasn't talking about whether the novel costs 24.95 [dollars], or whether you have to spring 70 million bucks to buy the stolen Vermeer; he was talking about the circumstances of its creation and of its existence. And I think that it's time that we asked for the same from technology. Technology is capable of expressing and being imbued with a certain generosity, and we need to demand that, in fact. For some of this kind of technology, ground center is a combination of design, which is crucially important (we can't have advances in technology any longer unless design is integrated from the very start), and, as well, of efficacy, of agency. We're, as human beings, the creatures that create, and we should make sure that our machines aid us in that task and are built in that same image. So I will leave you with that. Thank you.

(Applause)
Chris Anderson: So to ask the obvious question -- actually this is from Bill Gates -- when? (John Underkoffler: When?)

CA: When real? When for us, not just in a lab and on a stage? Can it be for every man, or is this just for corporations and movie producers?

JU: No, it has to be for every human being. That's our goal entirely. We won't have succeeded unless we take that next big step. I mean, it's been 25 years. Can there really be only one interface? There can't.

CA: But does that mean that, at your desk or in your home, you need projectors, cameras? You know, how can it work?

JU: No, this stuff will be built into the bezel of every display. It'll be built into architecture. The gloves go away in a matter of months or years. So this is the inevitability about it.

CA: So, in your mind, in five years' time, someone can buy this as part of a standard computer interface?

JU: I think in five years' time, when you buy a computer, you'll get this.

CA: Well, that's cool.

(Applause)

CA: The world has a habit of surprising us as to how these things are actually used. What do you think, what in your mind is the first killer app for this?

JU: That's a good question, and we ask ourselves that every day. At the moment, our early-adopter customers -- and these systems are deployed out in the real world -- do all the big data-intensive, data-heavy problems with it. So, whether it's logistics and supply chain management, or natural gas and resource extraction, financial services, pharmaceuticals, bioinformatics: those are the topics right now, but that's not a killer app. And I understand what you're asking.

CA: C'mon, c'mon. Martial arts, games. C'mon.

(Laughter)

CA: John, thank you for making science fiction real.

JU: It's been a great pleasure. Thank you to you all.

(Applause)


About the Speaker:

John Underkoffler - Interface designer
Remember the data interface from Minority Report? Well, it's real, John Underkoffler invented it -- as a point-and-touch interface called g-speak -- and it's about to change the way we interact with data.

Why you should listen

When Tom Cruise put on his data glove and started whooshing through video clips of future crimes, how many of us felt the stirrings of geek lust? This iconic scene in Minority Report marked a change in popular thinking about interfaces -- showing how sexy it could be to use natural gestures, without keyboard, mouse or command line.
 
John Underkoffler led the team that came up with this interface, called the g-speak Spatial Operating Environment. His company, Oblong Industries, was founded to move g-speak into the real world. Oblong is building apps for aerospace, bioinformatics, video editing and more. But the big vision is ubiquity: g-speak on every laptop, every desktop, every microwave oven, TV, dashboard. "It has to be like this," he says. "We all of us every day feel that. We build starting there. We want to change it all."
 
Before founding Oblong, Underkoffler spent 15 years at MIT's Media Laboratory, working in holography, animation and visualization techniques, and building the I/O Bulb and Luminous Room Systems.
