ABOUT THE SPEAKER
Ed Ulbrich - Visual storyteller
Ed Ulbrich works at the leading edge of computer-generated visuals. On a recent project, filmmakers, artists, and technologists have been working at a breakthrough point where reality and digitally created worlds collide.

Why you should listen

Ed Ulbrich spoke at TED2009 representing a team of filmmakers, artists and technologists who've been working on a significant breakthrough in visual storytelling -- a startling blurring of the line between digital creation and actor.  

Ulbrich was the longtime executive VP of production at Digital Domain, where he executive-produced Academy Award-winning visual effects for Titanic, What Dreams May Come, Fight Club, Zodiac, Adaptation and other features, as well as music videos and more than 500 commercials. He recently left that position but remains a creative consultant to the company. In 2007, he was named to the Creativity 50 -- top innovators in advertising and design.

More profile about the speaker
Ed Ulbrich | Speaker | TED.com
TED2009

Ed Ulbrich: How Benjamin Button got his face

1,080,448 views

Ed Ulbrich, the digital-effects guru from Digital Domain, explains the Oscar-winning technology that allowed his team to digitally create the older versions of Brad Pitt's face for "The Curious Case of Benjamin Button."


00:18
I'm here today representing a team of artists and technologists and filmmakers that worked together on a remarkable film project for the last four years. And along the way they created a breakthrough in computer visualization. So I want to show you a clip of the film now. Hopefully it won't stutter. And if we did our jobs well, you won't know that we were even involved.

00:39
Voice (Video): I don't know how it's possible ... but you seem to have more hair.

00:45
Brad Pitt: What if I told you that I wasn't getting older ... but I was getting younger than everybody else? I was born with some form of disease.

Voice: What kind of disease?

BP: I was born old.

Man: I'm sorry.

BP: No need to be. There's nothing wrong with old age.

Girl: Are you sick?

BP: I heard momma and Tizzy whisper, and they said I was gonna die soon. But ... maybe not.

Girl: You're different than anybody I've ever met.

BP: There were many changes ... some you could see, some you couldn't. Hair started growing in all sorts of places, along with other things. I felt pretty good, considering.
01:38
Ed Ulbrich: That was a clip from "The Curious Case of Benjamin Button." Many of you, maybe you've seen it or you've heard of the story, but what you might not know is that for nearly the first hour of the film, the main character, Benjamin Button, who's played by Brad Pitt, is completely computer-generated from the neck up. Now, there's no use of prosthetic makeup or photography of Brad superimposed over another actor's body. We've created a completely digital human head.

02:05
So I'd like to start with a little bit of history on the project. This is based on an F. Scott Fitzgerald short story. It's about a man who's born old and lives his life in reverse. Now, this movie has floated around Hollywood for well over half a century, and we first got involved with the project in the early '90s, with Ron Howard as the director. We took a lot of meetings and we seriously considered it. But at the time we had to throw in the towel. It was deemed impossible. It was beyond the technology of the day to depict a man aging backwards. The human form, in particular the human head, has been considered the Holy Grail of our industry.

02:39
The project came back to us about a decade later, and this time with a director named David Fincher. Now, Fincher is an interesting guy. David is fearless of technology, and he is absolutely tenacious. And David won't take "no." And David believed, like we do in the visual effects industry, that anything is possible as long as you have enough time, resources and, of course, money. And so David had an interesting take on the film, and he threw a challenge at us. He wanted the main character of the film to be played from the cradle to the grave by one actor. It happened to be this guy.

03:17
We went through a process of elimination and a process of discovery with David, and we ruled out, of course, swapping actors. That was one idea: that we would have different actors, and we would hand off from actor to actor. We even ruled out the idea of using makeup. We realized that prosthetic makeup just wouldn't hold up, particularly in close-up. And makeup is an additive process. You have to build the face up. And David wanted to carve deeply into Brad's face to bring the aging to this character. He needed to be a very sympathetic character. So we decided to cast a series of little people that would play the different bodies of Benjamin at the different increments of his life and that we would in fact create a computer-generated version of Brad's head, aged to appear as Benjamin, and attach that to the body of the real actor. Sounded great.

04:03
Of course, this was the Holy Grail of our industry, and the fact that this guy is a global icon didn't help either, because I'm sure if any of you ever stand in line at the grocery store, you know -- we see his face constantly. So there really was no tolerable margin of error. There were two studios involved: Warner Brothers and Paramount. And they both believed this would make an amazing film, of course, but it was a very high-risk proposition. There was lots of money and reputations at stake. But we believed that we had a very solid methodology that might work ...

04:35
But despite our verbal assurances, they wanted some proof. And so, in 2004, they commissioned us to do a screen test of Benjamin. And we did it in about five weeks. But we used lots of cheats and shortcuts. We basically put something together to get through the meeting. I'll roll that for you now. This was the first test for Benjamin Button. And in here, you can see, that's a computer-generated head -- pretty good -- attached to the body of an actor. And it worked. And it gave the studio great relief.

05:04
After many years of starts and stops on this project, and making that tough decision, they finally decided to greenlight the movie. And I can remember, actually, when I got the phone call to congratulate us, to say the movie was a go, I actually threw up. (Laughter) You know, this is some tough stuff.

05:24
So we started to have early team meetings, and we got everybody together, and it was really more like therapy in the beginning, convincing each other and reassuring each other that we could actually undertake this. We had to hold up an hour of a movie with a character. And it's not a special effects film; it has to be a man. We really felt like we were in a -- kind of a 12-step program. And of course, the first step is: admit you've got a problem. (Laughter) So we had a big problem: we didn't know how we were going to do this. But we did know one thing. Being from the visual effects industry, we, with David, believed that we now had enough time, enough resources, and, God, we hoped we had enough money. And we had enough passion to will the processes and technology into existence.

06:09
So, when you're faced with something like that, of course you've got to break it down. You take the big problem and you break it down into smaller pieces and you start to attack that. So we had three main areas that we had to focus on. We needed to make Brad look a lot older -- needed to age him 45 years or so. And we also needed to make sure that we could take Brad's idiosyncrasies, his little tics, the little subtleties that make him who he is and have that translate through our process so that it appears in Benjamin on the screen. And we also needed to create a character that could hold up under, really, all conditions. He needed to be able to walk in broad daylight, at nighttime, under candlelight, he had to hold an extreme close-up, he had to deliver dialogue, he had to be able to run, he had to be able to sweat, he had to be able to take a bath, to cry, he even had to throw up. Not all at the same time -- but he had to, you know, do all of those things. And the work had to hold up for almost the first hour of the movie. We did about 325 shots. So we needed a system that would allow Benjamin to do everything a human being can do. And we realized that there was a giant chasm between the state of the art of technology in 2004 and where we needed it to be.

07:15
So we focused on motion capture. I'm sure many of you have seen motion capture. The state of the art at the time was something called marker-based motion capture. I'll give you an example here. It's basically the idea of, you wear a leotard, and they put some reflective markers on your body, and instead of using cameras, there're infrared sensors around a volume, and those infrared sensors track the three-dimensional position of those markers in real time. And then animators can take the data of the motion of those markers and apply them to a computer-generated character. You can see the computer characters on the right are having the same complex motion as the dancers.
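The marker pipeline described here -- triangulated 3D marker positions driving a computer-generated character -- can be sketched in a few lines. This is a toy illustration, not production mocap software; the marker and joint names are invented, and real solvers fit full skeletons rather than centroids.

```python
# Minimal sketch of marker-based motion capture retargeting:
# each captured frame maps marker name -> (x, y, z) position, and
# each CG joint is driven by the centroid of the markers attached
# near it. All names and coordinates are hypothetical.

def joint_position(frame, attached_markers):
    """Drive one joint from the centroid of its attached markers."""
    pts = [frame[m] for m in attached_markers]
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(3))

def retarget(frames, rig):
    """rig maps joint name -> marker names; returns per-frame joint positions."""
    return [{joint: joint_position(f, ms) for joint, ms in rig.items()}
            for f in frames]

# One captured frame with two wrist markers:
frames = [{"wrist_l_a": (0.0, 1.0, 0.0), "wrist_l_b": (0.2, 1.0, 0.0)}]
rig = {"wrist_l": ["wrist_l_a", "wrist_l_b"]}
print(retarget(frames, rig)[0]["wrist_l"])  # (0.1, 1.0, 0.0)
```

The limitation Ulbrich goes on to describe follows directly from this structure: everything between the markers is lost.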
07:48
But we also looked at numbers of other films at the time that were using facial marker tracking, and that's the idea of putting markers on the human face and doing the same process. And as you can see, it gives you a pretty crappy performance. That's not terribly compelling. And what we realized was that what we needed was the information that was going on between the markers. We needed the subtleties of the skin. We needed to see skin moving over muscle moving over bone. We needed creases and dimples and wrinkles and all of those things. Our first revelation was to completely abort and walk away from the technology of the day, the status quo, the state of the art. So we aborted using motion capture. And we were now well out of our comfort zone, and in uncharted territory.

08:29
So we were left with this idea that we ended up calling "technology stew." We started to look out in other fields. The idea was that we were going to find nuggets or gems of technology that come from other industries like medical imaging, the video game space, and re-appropriate them. And we had to create kind of a sauce. And the sauce was code in software that we'd written to allow these disparate pieces of technology to come together and work as one.

08:58
Initially, we came across some remarkable research done by a gentleman named Dr. Paul Ekman in the early '70s. He believed that he could, in fact, catalog the human face. And he came up with this idea of the Facial Action Coding System, or FACS. He believed that there were 70 basic poses or shapes of the human face, and that those basic poses or shapes of the face can be combined to create infinite possibilities of everything the human face is capable of doing. And of course, these transcend age, race, culture, gender. So this became the foundation of our research as we went forward.
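The FACS idea -- a small basis of facial shapes whose weighted combinations span every expression -- is essentially what graphics calls blendshapes: a neutral mesh plus weighted per-vertex offsets. A toy sketch, assuming a three-vertex "face" and invented shape names:

```python
# Toy blendshape combination in the spirit of FACS: a neutral mesh
# plus weighted deltas for each basic pose. Shape names, weights,
# and vertex data are invented for illustration.

def blend(neutral, shapes, weights):
    """Add each shape's per-vertex delta, scaled by its weight."""
    out = [list(v) for v in neutral]
    for name, delta in shapes.items():
        w = weights.get(name, 0.0)
        for i, d in enumerate(delta):
            for axis in range(3):
                out[i][axis] += w * d[axis]
    return [tuple(v) for v in out]

neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.5, 1.0, 0.0)]
shapes = {
    "brow_raise": [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0), (0.0, 0.2, 0.0)],
    "smile":      [(-0.1, 0.1, 0.0), (0.1, 0.1, 0.0), (0.0, 0.0, 0.0)],
}
print(blend(neutral, shapes, {"smile": 0.5}))  # a half-strength smile
```

With 70 basis shapes instead of two, varying the weight vector traces out the "infinite possibilities" Ekman's catalog promises.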
09:32
And then we came across some remarkable technology called Contour. And here you can see a subject having phosphorescent makeup stippled on her face. And now what we're looking at is really creating a surface capture as opposed to a marker capture. The subject stands in front of a computer array of cameras, and those cameras can, frame-by-frame, reconstruct the geometry of exactly what the subject's doing at the moment. So, effectively, you get 3D data in real time of the subject. And if you look in a comparison, on the left, we see what volumetric data gives us and on the right you see what markers give us. So, clearly, we were in a substantially better place for this. But these were the early days of this technology, and it wasn't really proven yet. We measure complexity and fidelity of data in terms of polygonal count. And so, on the left, we were seeing 100,000 polygons. We could go up into the millions of polygons. It seemed to be infinite.

10:24
This was when we had our "Aha!" This was the breakthrough. This is when we're like, "OK, we're going to be OK. This is actually going to work." And the "Aha!" was, what if we could take Brad Pitt, and we could put Brad in this device, and use this Contour process, and we could stipple on this phosphorescent makeup and put him under the black lights, and we could, in fact, scan him in real time performing Ekman's FACS poses. Right? So, effectively, we ended up with a 3D database of everything Brad Pitt's face is capable of doing. (Laughter) From there, we actually carved up those faces into smaller pieces and components of his face. So we ended up with literally thousands and thousands and thousands of shapes, a complete database of all possibilities that his face is capable of doing.

11:11
Now, that's great, except we had him at age 44. We need to put another 40 years on him at this point. We brought in Rick Baker, and Rick is one of the great makeup and special effects gurus of our industry. And we also brought in a gentleman named Kazu Tsuji, and Kazu Tsuji is one of the great photorealist sculptors of our time. And we commissioned them to make a maquette, or a bust, of Benjamin. So, in the spirit of "The Great Unveiling" -- I had to do this -- I had to unveil something. So this is Ben 80. We created three of these: there's Ben 80, there's Ben 70, there's Ben 60. And this really became the template for moving forward. Now, this was made from a life cast of Brad. So, in fact, anatomically, it is correct. The eyes, the jaw, the teeth: everything is in perfect alignment with what the real guy has. We had these maquettes scanned into the computer at very high resolution -- enormous polygonal count. And so now we had three age increments of Benjamin in the computer.

12:10
But we needed to get a database of him doing more than that. We went through this process, then, called retargeting. This is Brad doing one of the Ekman FACS poses. And here's the resulting data that comes from that, the model that comes from that. Retargeting is the process of transposing that data onto another model. And because the life cast, or the bust -- the maquette -- of Benjamin was made from Brad, we could transpose the data of Brad at 44 onto Brad at 87. So now, we had a 3D database of everything Brad Pitt's face can do at age 87, in his 70s and in his 60s.
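Because the aged maquettes were cast from Brad and share his mesh topology, the displacement each FACS pose produces on the age-44 scans can be replayed vertex-by-vertex on the aged heads. A minimal sketch of that transfer, with invented two-vertex data (real retargeting also compensates for proportion and volume changes):

```python
# Minimal sketch of retargeting: measure the per-vertex displacement
# a captured pose produces relative to the age-44 neutral, then add
# that displacement onto the age-87 neutral, which has identical
# topology. All vertex data below is hypothetical.

def pose_delta(neutral, posed):
    """Per-vertex displacement of a captured pose from the neutral scan."""
    return [tuple(p[i] - n[i] for i in range(3))
            for n, p in zip(neutral, posed)]

def apply_delta(target_neutral, delta):
    """Replay that displacement on another model with the same topology."""
    return [tuple(t[i] + d[i] for i in range(3))
            for t, d in zip(target_neutral, delta)]

brad_44_neutral = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
brad_44_smile   = [(0.0, 0.1, 0.0), (1.0, 0.1, 0.0)]
ben_87_neutral  = [(0.0, -0.2, 0.0), (1.1, -0.2, 0.0)]  # sagged, wider

delta = pose_delta(brad_44_neutral, brad_44_smile)
print(apply_delta(ben_87_neutral, delta))  # the smile on the aged head
```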
12:45
Next we had to go into the shooting process. So while all that's going on, we're down in New Orleans and locations around the world. And we shot our body actors, and we shot them wearing blue hoods. So these are the gentlemen who played Benjamin. And the blue hoods helped us with two things: one, we could easily erase their heads; and two, we could put tracking markers on their heads so we could recreate the camera motion and the lens optics from the set.

13:07
But now we needed to get Brad's performance to drive our virtual Benjamin. And so we edited the footage that was shot on location with the rest of the cast and the body actors, and about six months later we brought Brad onto a sound stage in Los Angeles and he watched on the screen. His job, then, was to become Benjamin. And so we looped the scenes. He watched again and again. We encouraged him to improvise. And he took Benjamin into interesting and unusual places that we didn't think he was going to go. We shot him with four HD cameras so we'd get multiple views of him, and then David would choose the take of Brad being Benjamin that he thought best matched the footage with the rest of the cast.

13:44
From there we went into a process called image analysis. And so here, you can see again, the chosen take. And you are seeing, now, that data being transposed onto Ben 87. And so, what's interesting about this is we used something called image analysis, which is taking timings from different components of Benjamin's face. And so we could choose, say, his left eyebrow. And the software would tell us that, well, in frame 14 the left eyebrow begins to move from here to here, and it concludes moving in frame 32. And so we could choose numbers of positions on the face to pull that data from. And then, the sauce I talked about with our technology stew -- that secret sauce was, effectively, software that allowed us to match the performance footage of Brad in live action with our database of aged Benjamin, the FACS shapes that we had. On a frame-by-frame basis, we could actually reconstruct a 3D head that exactly matched the performance of Brad.
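The timing report Ulbrich describes -- "in frame 14 the left eyebrow begins to move, and it concludes moving in frame 32" -- amounts to thresholding a tracked feature's frame-to-frame change. A toy version with an invented eyebrow-height track:

```python
# Toy version of the image-analysis timing step: given a per-frame
# track of one facial feature (e.g. left-eyebrow height), report the
# frames where it starts and stops moving. Track values are invented.

def motion_interval(track, eps=1e-6):
    """Return (start_frame, end_frame) of the motion, or None if static."""
    moving = [i for i in range(1, len(track))
              if abs(track[i] - track[i - 1]) > eps]
    if not moving:
        return None
    return moving[0] - 1, moving[-1]  # motion begins one frame earlier

# Eyebrow height: flat, a raise between frames 3 and 6, then flat again.
track = [0.0, 0.0, 0.0, 0.0, 0.3, 0.6, 0.9, 0.9, 0.9]
print(motion_interval(track))  # (3, 6)
```

Running this per feature, per region of the face, gives the timing curves that drive the corresponding FACS shapes in the database.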
14:37
So this was how the finished shot appeared in the film. And here you can see the body actor. And then this is what we called the "dead head," no reference to Jerry Garcia. And then here's the reconstructed performance, now with the timings of the performance. And then, again, the final shot. It was a long process. (Applause)

15:07
The next section here, I'm going to just blast through this, because we could do a whole TEDTalk on the next several slides. We had to create a lighting system. So really, a big part of our process was creating a lighting environment for every single location that Benjamin had to appear in, so that we could put Ben's head into any scene and it would exactly match the lighting that's on the other actors in the real world.
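Matching a CG head to the plate means reproducing the set's lights in the renderer. One way to think about it, reduced to a sketch: approximate the captured set environment as a few directional lights and shade the head's surface with them. This is a generic Lambertian illustration with invented light values, not Digital Domain's actual lighting pipeline.

```python
# Sketch of matching CG lighting to a captured set environment: the
# on-set rig is reduced to directional lights (direction, RGB color),
# and a surface normal on Ben's head is shaded by summing their
# Lambertian contributions. All values are hypothetical.

def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v)

def shade(normal, lights):
    """Lambertian shading: sum of max(0, N.L) * color over all lights."""
    normal = normalize(normal)
    rgb = [0.0, 0.0, 0.0]
    for direction, color in lights:
        d = normalize(direction)
        ndotl = max(0.0, sum(n * l for n, l in zip(normal, d)))
        for i in range(3):
            rgb[i] += ndotl * color[i]
    return tuple(rgb)

# Key light from above, dim warm fill from the left:
lights = [((0.0, 1.0, 0.0), (0.8, 0.8, 0.8)),
          ((-1.0, 0.0, 0.0), (0.2, 0.1, 0.1))]
print(shade((0.0, 1.0, 0.0), lights))  # an upward-facing point: key only
```

Capture a light environment like this per location, and the same head model picks up daylight, candlelight, or night lighting to match the surrounding actors.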
15:28
We also had to create an eye system. We found the old adage, you know, "The eyes are the window to the soul," absolutely true. So the key here was to keep everybody looking in Ben's eyes. And if you could feel the warmth, and feel the humanity, and feel his intent coming through the eyes, then we would succeed. So we had one person focused on the eye system for almost two full years.

15:49
We also had to create a mouth system. We worked from dental molds of Brad. We had to age the teeth over time. We also had to create an articulating tongue that allowed him to enunciate his words. There was a whole system written in software to articulate the tongue. We had one person devoted to the tongue for about nine months. He was very popular.

16:04
Skin displacement: another big deal. The skin had to be absolutely accurate. He's also in an old age home, he's in a nursing home around other old people, so he had to look exactly the same as the others. So, lots of work on skin deformation, and you can see in some of these cases it works, in some cases it looks bad. This is a very, very, very early test in our process.

16:19
So, effectively we created a digital puppet that Brad Pitt could operate with his own face. There were no animators necessary to come in and interpret behavior or enhance his performance. There was something that we encountered, though, that we ended up calling "the digital Botox effect." So, as things went through this process, Fincher would always say, "It sandblasts the edges off of the performance." And the one thing our process and the technology couldn't do was understand intent, the intent of the actor. So it sees a smile as a smile. It doesn't recognize an ironic smile, or a happy smile, or a frustrated smile. So it did take humans to kind of push it one way or another. But we ended up calling the entire process and all the technology "emotion capture," as opposed to just motion capture. Take another look.

17:11
Brad Pitt: Well, I heard momma and Tizzy whisper, and they said I was gonna die soon, but ... maybe not.

17:37
EU: That's how to create a digital human in 18 minutes. (Applause) A couple of quick factoids: it really took 155 people over two years, and we didn't even talk about 60 hairstyles and an all-digital haircut. But, that is Benjamin. Thank you.



Data provided by TED.
