TEDxBeaconStreet

Michael Rubinstein: See invisible motion, hear silent sounds

November 13, 2014

Meet the “motion microscope,” a video-processing tool that plays up tiny changes in motion and color impossible to see with the naked eye. Video researcher Michael Rubinstein plays us clip after jaw-dropping clip showing how this tech can track an individual’s pulse and heartbeat simply from a piece of footage. Watch him re-create a conversation by amplifying the movements from sound waves bouncing off a bag of chips. The wow-inspiring and sinister applications of this tech you have to see to believe.

Michael Rubinstein - Research scientist, Google
Computer scientist Michael Rubinstein and his team have developed a "motion microscope" that can show video footage of barely perceivable movements, like breaths and heartbeats. Full bio

So over the past few centuries,
microscopes have revolutionized our world.
00:12
They revealed to us a tiny world
of objects, life and structures
00:20
that are too small for us
to see with our naked eyes.
00:26
They are a tremendous contribution
to science and technology.
00:28
Today I'd like to introduce you
to a new type of microscope,
00:31
a microscope for changes.
00:35
It doesn't use optics
like a regular microscope
00:37
to make small objects bigger,
00:40
but instead it uses a video camera
and image processing
00:42
to reveal to us the tiniest motions
and color changes in objects and people,
00:47
changes that are impossible
for us to see with our naked eyes.
00:52
And it lets us look at our world
in a completely new way.
00:56
So what do I mean by color changes?
01:00
Our skin, for example,
changes its color very slightly
01:02
when the blood flows under it.
01:05
That change is incredibly subtle,
01:07
which is why, when you
look at other people,
01:09
when you look at the person
sitting next to you,
01:11
you don't see their skin
or their face changing color.
01:13
When we look at this video of Steve here,
it appears to us like a static picture,
01:17
but once we look at this video
through our new, special microscope,
01:21
suddenly we see
a completely different image.
01:25
What you see here are small changes
in the color of Steve's skin,
01:28
magnified 100 times
so that they become visible.
01:32
We can actually see a human pulse.
01:36
We can see how fast
Steve's heart is beating,
01:39
but we can also see the actual way
that the blood flows in his face.
01:43
And we can do that not just
to visualize the pulse,
01:48
but also to actually
recover our heart rates,
01:50
and measure our heart rates.
01:54
And we can do it with regular cameras
and without touching the patients.
01:56
So here you see the pulse and heart rate
we extracted from a neonatal baby
02:00
from a video we took
with a regular DSLR camera,
02:06
and the heart rate measurement we get
02:09
is as accurate as the one you'd get
with a standard monitor in a hospital.
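The heart-rate measurement he describes can be sketched in simplified form like this. This is a hypothetical illustration, not the team's released code: average the green channel over the frame in each video frame (blood absorbs green light most strongly), then take the dominant frequency of that time series in the human pulse range.

```python
import numpy as np

def estimate_heart_rate(frames, fps):
    """Estimate pulse (beats per minute) from a stack of video frames.

    frames: array of shape (n_frames, height, width, 3), RGB.
    Averages the green channel per frame, removes the DC component,
    and picks the strongest frequency in the plausible pulse band.
    """
    signal = frames[:, :, :, 1].mean(axis=(1, 2))  # mean green value per frame
    signal = signal - signal.mean()                # drop the constant offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs > 0.7) & (freqs < 4.0)           # roughly 42-240 bpm
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                             # Hz -> beats per minute

# Synthetic check: a 1.2 Hz (72 bpm) flicker buried in a static image.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
frames = np.ones((len(t), 8, 8, 3)) * 128.0
frames[:, :, :, 1] += 0.5 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(round(estimate_heart_rate(frames, fps)))  # → 72
```

The real system additionally has to handle noise, motion, and lighting changes, which is where the careful filtering he mentions later comes in.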
02:11
And it doesn't even have to be
a video we recorded.
02:15
We can do it essentially
with other videos as well.
02:18
So I just took a short clip
from "Batman Begins" here
02:21
just to show Christian Bale's pulse.
02:25
(Laughter)
02:27
And you know, presumably
he's wearing makeup,
02:29
the lighting here is kind of challenging,
02:31
but still, just from the video,
we're able to extract his pulse
02:33
and show it quite well.
02:36
So how do we do all that?
02:38
We basically analyze the changes
in the light that are recorded
02:40
at every pixel in the video over time,
02:44
and then we crank up those changes.
02:46
We make them bigger
so that we can see them.
02:48
The tricky part is that those signals,
02:50
those changes that we're after,
are extremely subtle,
02:52
so we have to be very careful
when we try to separate them
02:55
from noise that always exists in videos.
02:58
So we use some clever
image processing techniques
03:02
to get a very accurate measurement
of the color at each pixel in the video,
03:05
and then the way the color
changes over time,
03:09
and then we amplify those changes.
03:11
We make them bigger to create those types
of enhanced videos, or magnified videos,
03:14
that actually show us those changes.
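The pipeline he just described — measure each pixel's value over time, keep only the temporal frequencies of interest, and add an amplified copy back into the video — can be sketched roughly as follows. This is a bare-bones illustration, not the group's released code, which also uses spatial pyramids to control noise:

```python
import numpy as np

def magnify_changes(frames, fps, low_hz, high_hz, alpha):
    """Eulerian-style magnification sketch.

    frames: float array (n_frames, height, width[, channels]).
    Band-pass filters every pixel's time series via FFT, keeping only
    frequencies in [low_hz, high_hz], then adds the filtered signal
    back amplified by `alpha`.
    """
    spectrum = np.fft.fft(frames, axis=0)
    freqs = np.fft.fftfreq(frames.shape[0], d=1.0 / fps)
    keep = (np.abs(freqs) >= low_hz) & (np.abs(freqs) <= high_hz)
    spectrum[~keep] = 0                     # zero out-of-band frequencies
    filtered = np.fft.ifft(spectrum, axis=0).real
    return frames + alpha * filtered        # amplify and add back

# A pixel flickering at 1 Hz by ±1 gray level, magnified 50 times.
fps = 30
t = np.arange(90) / fps
frames = 100.0 + np.sin(2 * np.pi * 1.0 * t)[:, None, None] * np.ones((90, 4, 4))
out = magnify_changes(frames, fps, 0.5, 2.0, alpha=50)
# The flicker's peak-to-peak amplitude grows from ~2 to ~100 gray levels.
```

Choosing the band [low_hz, high_hz] around the phenomenon of interest (a pulse, a breath, an engine vibration) is what keeps the amplification from blowing up unrelated noise.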
03:18
But it turns out we can do that
not just to show tiny changes in color,
03:20
but also tiny motions,
03:25
and that's because the light
that gets recorded in our cameras
03:27
will change not only if the color
of the object changes,
03:30
but also if the object moves.
03:33
So this is my daughter
when she was about two months old.
03:36
It's a video I recorded
about three years ago.
03:39
And as new parents, we all want
to make sure our babies are healthy,
03:42
that they're breathing,
that they're alive, of course.
03:45
So I too got one of those baby monitors
03:48
so that I could see my daughter
when she was asleep.
03:50
And this is pretty much what you'll see
with a standard baby monitor.
03:53
You can see the baby's sleeping, but
there's not too much information there.
03:56
There's not too much we can see.
04:00
Wouldn't it be better,
or more informative, or more useful,
04:01
if instead we could look
at the view like this?
04:04
So here I took the motions
and I magnified them 30 times,
04:07
and then I could clearly see that my
daughter was indeed alive and breathing.
04:14
(Laughter)
04:17
Here is a side-by-side comparison.
04:20
So again, in the source video,
in the original video,
04:22
there's not too much we can see,
04:24
but once we magnify the motions,
the breathing becomes much more visible.
04:26
And it turns out, there's
a lot of phenomena
04:29
we can reveal and magnify
with our new motion microscope.
04:31
We can see how our veins and arteries
are pulsing in our bodies.
04:35
We can see that our eyes
are constantly moving
04:40
in this wobbly motion.
04:42
And that's actually my eye,
04:44
and again this video was taken
right after my daughter was born,
04:46
so you can see I wasn't getting
too much sleep. (Laughter)
04:49
Even when a person is sitting still,
04:53
there's a lot of information
we can extract
04:56
about their breathing patterns,
small facial expressions.
04:58
Maybe we could use those motions
05:01
to tell us something about
our thoughts or our emotions.
05:03
We can also magnify small
mechanical movements,
05:06
like vibrations in engines,
05:09
that can help engineers detect
and diagnose machinery problems,
05:11
or see how our buildings and structures
sway in the wind and react to forces.
05:15
Those are all things that our society
knows how to measure in various ways,
05:19
but measuring those motions is one thing,
05:24
and actually seeing those
motions as they happen
05:26
is a whole different thing.
05:29
And ever since we discovered
this new technology,
05:31
we made our code available online so that
others could use and experiment with it.
05:34
It's very simple to use.
05:38
It can work on your own videos.
05:40
Our collaborators at Quanta Research
even created this nice website
05:42
where you can upload your videos
and process them online,
05:45
so even if you don't have any experience
in computer science or programming,
05:48
you can still very easily experiment
with this new microscope.
05:52
And I'd like to show you
just a couple of examples
05:55
of what others have done with it.
05:57
So this video was made by
a YouTube user called Tamez85.
06:00
I don't know who that user is,
06:05
but he, or she, used our code
06:07
to magnify small belly
movements during pregnancy.
06:09
It's kind of creepy.
06:13
(Laughter)
06:14
People have used it to magnify
pulsing veins in their hands.
06:16
And you know it's not real science
unless you use guinea pigs,
06:21
and apparently this guinea pig
is called Tiffany,
06:25
and this YouTube user claims
it is the first rodent on Earth
06:28
that was motion-magnified.
06:31
You can also do some art with it.
06:34
So this video was sent to me
by a design student at Yale.
06:36
She wanted to see
if there's any difference
06:39
in the way her classmates move.
06:41
She made them all stand still,
and then magnified their motions.
06:42
It's like seeing
still pictures come to life.
06:47
And the nice thing with
all those examples
06:50
is that we had nothing to do with them.
06:53
We just provided this new tool,
a new way to look at the world,
06:55
and then people find other interesting,
new and creative ways of using it.
06:59
But we didn't stop there.
07:04
This tool not only allows us
to look at the world in a new way,
07:06
it also redefines what we can do
07:09
and pushes the limits of what
we can do with our cameras.
07:11
So as scientists, we started wondering,
07:14
what other types of physical phenomena
produce tiny motions
07:17
that we could now use
our cameras to measure?
07:20
And one such phenomenon
that we focused on recently is sound.
07:23
Sound, as we all know,
is basically changes
07:27
in air pressure that
travel through the air.
07:29
Those pressure waves hit objects
and they create small vibrations in them,
07:32
which is how we hear
and how we record sound.
07:35
But it turns out that sound
also produces visual motions.
07:38
Those are motions
that are not visible to us
07:41
but are visible to a camera
with the right processing.
07:44
So here are two examples.
07:47
This is me demonstrating
my great singing skills.
07:49
(Singing)
07:52
(Laughter)
07:54
And I took a high-speed video
of my throat while I was humming.
07:55
Again, if you stare at that video,
07:58
there's not too much
you'll be able to see,
08:00
but once we magnify the motions 100 times,
we can see all the motions and ripples
08:02
in the neck that are involved
in producing the sound.
08:07
That signal is there in that video.
08:10
We also know that singers
can break a wine glass
08:13
if they hit the correct note.
08:15
So here, we're going to play a note
08:17
that's in the resonance
frequency of that glass
08:19
through a loudspeaker that's next to it.
08:21
Once we play that note
and magnify the motions 250 times,
08:23
we can very clearly see
how the glass vibrates
08:28
and resonates in response to the sound.
08:30
It's not something you're used
to seeing every day.
08:33
But this made us think.
It gave us this crazy idea.
08:36
Can we actually invert this process
and recover sound from video
08:39
by analyzing the tiny vibrations
that sound waves create in objects,
08:45
and essentially convert those
back into the sounds that produced them?
08:49
In this way, we can turn
everyday objects into microphones.
08:54
So that's exactly what we did.
08:58
So here's an empty bag of chips
that was lying on a table,
09:00
and we're going to turn that
bag of chips into a microphone
09:03
by filming it with a video camera
09:06
and analyzing the tiny motions
that sound waves create in it.
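Conceptually, the inversion works like this — a toy sketch under strong simplifying assumptions (the actual method measures local motions at many scales and orientations; here a single per-frame position of a bright blob stands in for that): estimate how far the object has moved in each frame, and read that displacement sequence as the audio waveform.

```python
import numpy as np

def recover_audio(frames):
    """Toy visual microphone: one displacement value per frame.

    frames: array (n_frames, width) of 1-D intensity profiles of a
    vibrating object. The intensity centroid of each frame tracks the
    object's position; its deviation from the mean is the recovered
    (unscaled) audio waveform.
    """
    x = np.arange(frames.shape[1])
    centroids = (frames * x).sum(axis=1) / frames.sum(axis=1)
    return centroids - centroids.mean()

# Simulate a bright blob vibrating with a 440 Hz tone, filmed at 2000 fps.
fps, n = 2000, 400
t = np.arange(n) / fps
tone = 0.3 * np.sin(2 * np.pi * 440 * t)          # sub-pixel vibration
x = np.arange(64)
frames = np.exp(-0.5 * ((x[None, :] - 32 - tone[:, None]) / 3) ** 2)
audio = recover_audio(frames)
# The dominant recovered frequency matches the tone that shook the object.
freqs = np.fft.rfftfreq(n, d=1.0 / fps)
print(freqs[np.argmax(np.abs(np.fft.rfft(audio)))])  # → 440.0
```

Note that the displacements being recovered are far smaller than a pixel; it is the aggregate effect on many pixel intensities that makes them measurable.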
09:08
So here's the sound
that we played in the room.
09:11
(Music: "Mary Had a Little Lamb")
09:14
And this is a high-speed video
we recorded of that bag of chips.
09:21
Again it's playing.
09:24
There's no chance you'll be able
to see anything going on in that video
09:26
just by looking at it,
09:29
but here's the sound we were able
to recover just by analyzing
09:30
the tiny motions in that video.
09:33
(Music: "Mary Had a Little Lamb")
09:35
I call it -- Thank you.
09:52
(Applause)
09:54
I call it the visual microphone.
10:01
We actually extract audio signals
from video signals.
10:04
And just to give you a sense
of the scale of the motions here,
10:07
a pretty loud sound will cause that bag
of chips to move less than a micrometer.
10:10
That's one thousandth of a millimeter.
10:15
That's how tiny the motions are
that we are now able to pull out
10:18
just by observing how light
bounces off objects
10:22
and gets recorded by our cameras.
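One constraint worth spelling out — this is standard sampling theory rather than something stated in the talk: a camera can only recover sound frequencies up to half its frame rate (the Nyquist limit), which is why a high-speed camera was used for the chip-bag recording. A quick demonstration of what goes wrong below that rate:

```python
import numpy as np

def dominant_freq(signal, fps):
    """Return the strongest frequency (Hz) in a sampled signal."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

tone = 440.0                                  # Hz, the note being played
for fps in (2000, 600):                       # high-speed vs. ordinary camera
    t = np.arange(fps) / fps                  # one second of frames
    samples = np.sin(2 * np.pi * tone * t)
    print(fps, dominant_freq(samples, fps))
# 2000 fps recovers 440 Hz; at 600 fps the tone aliases down to 160 Hz.
```

(In follow-up work the group also showed that the rolling shutter of ordinary cameras can be exploited to push past this limit, but the basic recordings here rely on high frame rates.)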
10:25
We can recover sounds
from other objects, like plants.
10:27
(Music: "Mary Had a Little Lamb")
10:30
And we can recover speech as well.
10:39
So here's a person speaking in a room.
10:41
Voice: Mary had a little lamb
whose fleece was white as snow,
10:43
and everywhere that Mary went,
that lamb was sure to go.
10:47
Michael Rubinstein: And here's
that speech again recovered
10:52
just from this video
of that same bag of chips.
10:54
Voice: Mary had a little lamb
whose fleece was white as snow,
10:58
and everywhere that Mary went,
that lamb was sure to go.
11:02
MR: We used "Mary Had a Little Lamb"
11:07
because those are said to be
the first words
11:10
that Thomas Edison spoke
into his phonograph in 1877.
11:12
It was one of the first sound
recording devices in history.
11:16
It basically directed the sounds
onto a diaphragm
11:19
that vibrated a needle that essentially
engraved the sound on tinfoil
11:22
that was wrapped around the cylinder.
11:27
Here's a demonstration of recording and
replaying sound with Edison's phonograph.
11:29
(Video) Voice: Testing,
testing, one two three.
11:35
Mary had a little lamb
whose fleece was white as snow,
11:38
and everywhere that Mary went,
the lamb was sure to go.
11:41
Testing, testing, one two three.
11:45
Mary had a little lamb
whose fleece was white as snow,
11:48
and everywhere that Mary went,
the lamb was sure to go.
11:52
MR: And now, 137 years later,
11:57
we're able to get sound
of pretty much the same quality
12:01
but by just watching objects
vibrate to sound with cameras,
12:05
and we can even do that when the camera
12:09
is 15 feet away from the object,
behind soundproof glass.
12:11
So this is the sound that we were
able to recover in that case.
12:15
Voice: Mary had a little lamb
whose fleece was white as snow,
12:19
and everywhere that Mary went,
the lamb was sure to go.
12:24
MR: And of course, surveillance is
the first application that comes to mind.
12:29
(Laughter)
12:32
But it might actually be useful
for other things as well.
12:35
Maybe in the future, we'll be able
to use it, for example,
12:39
to recover sound across space,
12:42
because sound can't travel
in space, but light can.
12:44
We've only just begun exploring
12:48
other possible uses
for this new technology.
12:50
It lets us see physical processes
that we know are there
12:53
but that we've never been able
to see with our own eyes until now.
12:56
This is our team.
13:00
Everything I showed you today
is a result of a collaboration
13:01
with this great group
of people you see here,
13:04
and I encourage you and welcome you
to check out our website,
13:06
try it out yourself,
13:09
and join us in exploring
this world of tiny motions.
13:11
Thank you.
13:14
(Applause)
13:15


Why you should listen

Michael Rubinstein zooms in on what we can't see and magnifies it by thirty or a hundred times. His "motion microscope," developed at MIT with Microsoft and Quanta Research, picks up on subtle motion and color changes in videos and blows them up for the naked eye to see. The result: fun, cool, creepy videos.

Rubinstein is a research scientist at a new Cambridge-based Google lab for computer vision research. He has a PhD in computer science and electrical engineering from MIT.



Data provided by TED.
