
TEDxGöteborg 2010

Anders Ynnerman: Visualizing the medical data explosion


Medical scans can produce thousands of images for a single patient in seconds, but how do doctors know what's useful? Scientific visualization expert Anders Ynnerman shows us sophisticated new tools -- like virtual autopsies -- for analyzing our data, and hints at the sci-fi-sounding medical technologies coming up next. This talk contains some graphic medical imagery.

Anders Ynnerman - Scientific visualization expert
Anders Ynnerman studies the fundamental aspects of computer graphics and visualization, in particular large-scale and complex data sets, with a focus on volume rendering and multi-modal interaction.

I will start by posing a little bit of a challenge:
00:15
the challenge of dealing with data,
00:19
data that we have to deal with
00:22
in medical situations.
00:24
It's really a huge challenge for us.
00:26
And this is our beast of burden --
00:28
this is a computed tomography machine,
00:30
a CT machine.
00:32
It's a fantastic device.
00:34
It uses X-rays, X-ray beams,
00:36
that are rotating very fast around the human body.
00:38
It takes about 30 seconds to go through the whole machine
00:41
and is generating enormous amounts of information
00:43
that comes out of the machine.
00:45
So this is a fantastic machine
00:47
that we can use
00:49
for improving health care,
00:51
but as I said, it's also a challenge for us.
00:53
And the challenge is really found in this picture here.
00:55
It's the medical data explosion
00:58
that we're having right now.
01:00
We're facing this problem.
01:02
And let me step back in time.
01:04
Let's go back a few years in time and see what happened back then.
01:06
These machines that came out --
01:09
they started coming in the 1970s --
01:11
they would scan human bodies,
01:13
and they would generate about 100 images
01:15
of the human body.
01:17
And I've taken the liberty, just for clarity,
01:19
to translate that to data slices.
01:21
That would correspond to about 50 megabytes of data,
01:24
which is small
01:26
when you think about the data we can handle today
01:28
just on normal mobile devices.
01:31
If you translate that to phone books,
01:33
it's about one meter of phone books in the pile.
01:35
Looking at what we're doing today
01:38
with these machines that we have,
01:40
we can, just in a few seconds,
01:42
get 24,000 images out of a body,
01:44
and that would correspond to about 20 gigabytes of data,
01:46
or 800 phone books,
01:49
and the pile would then be 200 meters of phone books.
01:51
What's about to happen --
01:53
and we're seeing this; it's beginning --
01:55
a technology trend that's happening right now
01:57
is that we're starting to look at time-resolved situations as well.
01:59
So we're getting the dynamics out of the body as well.
02:02
And just assume
02:05
that we will be collecting data during five seconds,
02:07
and that would correspond to one terabyte of data --
02:10
that's 800,000 books
02:12
and 16 kilometers of phone books.
02:14
That's one patient, one data set.
02:16
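The phone-book arithmetic above can be sanity-checked in a few lines. A minimal sketch, assuming a typical 512 x 512 slice at 2 bytes per voxel — the talk only gives the rounded totals, so the per-slice resolution and bit depth here are my assumptions:

```python
# Back-of-envelope sizes for the scans described in the talk.
# Assumption: 512 x 512 slices at 2 bytes per voxel (not stated in the talk).
SLICE_BYTES = 512 * 512 * 2  # one slice is roughly half a megabyte

def scan_bytes(n_slices: int) -> int:
    """Raw, uncompressed size of a scan made of n_slices slices."""
    return n_slices * SLICE_BYTES

print(f"1970s scan, 100 slices:     {scan_bytes(100) / 1e6:.0f} MB")
print(f"modern scan, 24,000 slices: {scan_bytes(24_000) / 1e9:.1f} GB")
```

Under these assumptions the 100-slice scan lands right at the ~50 MB quoted, and the 24,000-slice scan comes out near 13 GB — the same order of magnitude as the quoted 20 GB, with the exact figure depending on per-slice resolution.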
And this is what we have to deal with.
02:18
So this is really the enormous challenge that we have.
02:20
And already today -- this is 25,000 images.
02:23
Imagine the days
02:26
when we had radiologists doing this.
02:28
They would put up 25,000 images,
02:30
they would go like this, "25,000, okay, okay.
02:32
There is the problem."
02:35
They can't do that anymore. That's impossible.
02:37
So we have to do something that's a little bit more intelligent than doing this.
02:39
So what we do is that we put all these slices together.
02:43
Imagine that you slice your body in all these directions,
02:45
and then you try to put the slices back together again
02:48
into a pile of data, into a block of data.
02:51
So this is really what we're doing.
02:53
So this gigabyte or terabyte of data, we're putting it into this block.
02:55
But of course, the block of data
02:58
just contains the amount of X-ray
03:00
that's been absorbed in each point in the human body.
03:02
So what we need to do is to figure out a way
03:04
of looking at the things we do want to look at
03:06
and make things transparent that we don't want to look at.
03:09
So transforming the data set
03:12
into something that looks like this.
03:14
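The step from raw absorption values to an image like this is done with a transfer function: a mapping from each voxel's measured value to an opacity (and color), so the tissue you don't want to see becomes transparent. A minimal sketch of the opacity half, using illustrative thresholds rather than real clinical values:

```python
import numpy as np

def opacity(values, lo=300.0, hi=1000.0):
    """Linear ramp: fully transparent below lo, fully opaque above hi.

    lo and hi are illustrative thresholds, not clinical values;
    raising lo hides soft tissue and leaves only dense bone visible.
    """
    return np.clip((values - lo) / (hi - lo), 0.0, 1.0)

# rough CT-style values: air, water-like soft tissue, denser tissue, bone
voxels = np.array([-1000.0, 0.0, 400.0, 1500.0])
print(opacity(voxels))  # air and soft tissue vanish; bone stays opaque
```

Changing the `lo`/`hi` thresholds is, in essence, what "changing the functions" in the demos below refers to: the data stays fixed while the mapping decides what is visible.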
And this is a challenge.
03:16
This is a huge challenge for us to do that.
03:18
Using computers, even though they're getting faster and better all the time,
03:21
it's a challenge to deal with gigabytes of data,
03:24
terabytes of data
03:26
and extracting the relevant information.
03:28
I want to look at the heart.
03:30
I want to look at the blood vessels. I want to look at the liver.
03:32
Maybe even find a tumor,
03:34
in some cases.
03:36
So this is where this little dear comes into play.
03:39
This is my daughter.
03:41
This is as of 9 a.m. this morning.
03:43
She's playing a computer game.
03:45
She's only two years old,
03:47
and she's having a blast.
03:49
So she's really the driving force
03:51
behind the development of graphics-processing units.
03:54
As long as kids are playing computer games,
03:58
graphics is getting better and better and better.
04:00
So please go back home, tell your kids to play more games,
04:02
because that's what I need.
04:04
So what's inside of this machine
04:06
is what enables me to do the things that I'm doing
04:08
with the medical data.
04:10
So really what I'm doing is using these fantastic little devices.
04:12
And you know, going back
04:15
maybe 10 years in time
04:17
when I got the funding
04:19
to buy my first graphics computer --
04:21
it was a huge machine.
04:23
It was cabinets of processors and storage and everything.
04:25
I paid about one million dollars for that machine.
04:28
That machine is, today, about as fast as my iPhone.
04:32
So every month there are new graphics cards coming out,
04:37
and here are a few of the latest ones from the vendors --
04:39
NVIDIA, ATI, Intel is out there as well.
04:42
And you know, for a few hundred bucks
04:45
you can get these things and put them into your computer,
04:47
and you can do fantastic things with these graphics cards.
04:49
So this is really what's enabling us
04:52
to deal with the explosion of data in medicine,
04:54
together with some really nifty work
04:57
in terms of algorithms --
04:59
compressing data,
05:01
extracting the relevant information that people are doing research on.
05:03
So I'm going to show you a few examples of what we can do.
05:06
This is a data set that was captured using a CT scanner.
05:09
You can see that this is a full data set.
05:12
It's a woman. You can see the hair.
05:15
You can see the individual structures of the woman.
05:18
You can see that there is a scattering of X-rays
05:21
on the teeth, the metal in the teeth.
05:24
That's where those artifacts are coming from.
05:26
But fully interactively
05:29
on standard graphics cards on a normal computer,
05:31
I can just put in a clip plane.
05:34
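A clip plane is conceptually just a half-space test on voxel positions: everything on one side of the plane is discarded before compositing. A small sketch of that test — real renderers evaluate it per sample on the GPU, and the example points and plane here are invented:

```python
import numpy as np

def clip_mask(positions, plane_point, plane_normal):
    """True for points kept, i.e. on the positive side of the plane."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)  # normalize so the sign test is well defined
    return (np.asarray(positions, dtype=float) - plane_point) @ n >= 0.0

pts = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 2.0]])
# plane z = 1 with normal +z: keep everything at z >= 1
print(clip_mask(pts, plane_point=[0, 0, 1], plane_normal=[0, 0, 1]))
```

Dragging the plane interactively just means updating `plane_point` and `plane_normal` each frame and re-rendering the surviving half of the volume.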
And of course all the data is inside,
05:36
so I can start rotating, I can look at it from different angles,
05:38
and I can see that this woman had a problem.
05:41
She had a bleed in the brain,
05:44
and that's been fixed with a little stent,
05:46
a metal clamp that's tightening up the vessel.
05:48
And just by changing the functions,
05:50
then I can decide what's going to be transparent
05:52
and what's going to be visible.
05:55
I can look at the skull structure,
05:57
and I can see that, okay, this is where they opened up the skull on this woman,
05:59
and that's where they went in.
06:02
So these are fantastic images.
06:04
They're really high resolution,
06:06
and they're really showing us what we can do
06:08
with standard graphics cards today.
06:10
Now we have really made use of this,
06:13
and we have tried to squeeze a lot of data
06:15
into the system.
06:18
And one of the applications that we've been working on --
06:20
and this has gotten a little bit of traction worldwide --
06:22
is the application of virtual autopsies.
06:25
So again, looking at very, very large data sets,
06:27
and you saw those full-body scans that we can do.
06:29
We're just pushing the body through the whole CT scanner,
06:32
and just in a few seconds we can get a full-body data set.
06:35
So this is from a virtual autopsy.
06:38
And you can see how I'm gradually peeling off.
06:40
First you saw the body bag that the body came in,
06:42
then I'm peeling off the skin -- you can see the muscles --
06:45
and eventually you can see the bone structure of this woman.
06:48
Now at this point, I would also like to emphasize
06:51
that, with the greatest respect
06:54
for the people that I'm now going to show --
06:56
I'm going to show you a few cases of virtual autopsies --
06:58
so it's with great respect for the people
07:00
that have died under violent circumstances
07:02
that I'm showing these pictures to you.
07:04
In the forensic case --
07:08
and this is something
07:10
that ... there have been approximately 400 cases so far
07:12
just in the part of Sweden that I come from
07:14
that have undergone virtual autopsies
07:16
in the past four years.
07:18
So this will be the typical workflow situation.
07:20
The police will decide --
07:23
in the evening, when there's a case coming in --
07:25
they will decide, okay, is this a case where we need to do an autopsy?
07:27
So in the morning, between six and seven,
07:30
the body is then transported inside of the body bag
07:33
to our center
07:35
and is scanned through one of the CT scanners.
07:37
And then the radiologist, together with the pathologist
07:39
and sometimes the forensic scientist,
07:41
looks at the data that's coming out,
07:43
and they have a joint session.
07:45
And then they decide what to do in the real physical autopsy after that.
07:47
Now looking at a few cases,
07:52
here's one of the first cases that we had.
07:54
You can really see the details of the data set.
07:56
It's very high-resolution,
07:59
and it's our algorithms that allow us
08:01
to zoom in on all the details.
08:03
And again, it's fully interactive,
08:05
so you can rotate and you can look at things in real time
08:07
on these systems here.
08:09
Without saying too much about this case,
08:11
this is a traffic accident,
08:13
a drunk driver hit a woman.
08:15
And it's very, very easy to see the damages on the bone structure.
08:17
And the cause of death is the broken neck.
08:20
And this woman also ended up under the car,
08:23
so she's quite badly beaten up
08:25
by this injury.
08:27
Here's another case, a knifing.
08:29
And this is also again showing us what we can do.
08:32
It's very easy to look at metal artifacts
08:34
that we can show inside of the body.
08:36
You can also see some of the artifacts from the teeth --
08:39
that's actually the filling of the teeth --
08:42
because I've set the functions to show me metal
08:44
and make everything else transparent.
08:47
Here's another violent case. This really didn't kill the person.
08:49
The person was killed by stabs in the heart,
08:52
but they just deposited the knife
08:54
by putting it through one of the eyeballs.
08:56
Here's another case.
08:58
It's very interesting for us
09:00
to be able to look at things like knife stabbings.
09:02
Here you can see that the knife went through the heart.
09:04
It's very easy to see how air has been leaking
09:07
from one part to another part,
09:09
which is difficult to do in a normal, standard, physical autopsy.
09:11
So it really, really helps
09:14
the criminal investigation
09:16
to establish the cause of death,
09:18
and in some cases also directing the investigation in the right direction
09:20
to find out who the killer really was.
09:23
Here's another case that I think is interesting.
09:25
Here you can see a bullet
09:27
that has lodged just next to the spine on this person.
09:29
And what we've done is that we've turned the bullet into a light source,
09:32
so that bullet is actually shining,
09:35
and it makes it really easy to find these fragments.
09:37
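Making the bullet "shine" corresponds to giving bullet-classified voxels an emissive term in the front-to-back compositing loop, so they contribute light no matter how they are lit. A hedged sketch of one ray's accumulation — the sample values, the emission boost, and the bullet flag are all invented for illustration:

```python
def composite(samples):
    """Front-to-back alpha compositing along one ray.

    Each sample is (color, alpha, emissive). Emissive samples (e.g. voxels
    segmented as bullet metal) add light on top of their shaded color.
    """
    color, trans = 0.0, 1.0  # accumulated intensity, remaining transmittance
    for c, a, emissive in samples:
        src = c + (0.5 if emissive else 0.0)  # illustrative emission boost
        color += trans * a * src
        trans *= 1.0 - a
        if trans < 1e-3:  # early ray termination: nothing more gets through
            break
    return color

# three samples along a ray; the middle one is a "glowing" bullet voxel
ray = [(0.2, 0.1, False), (0.4, 0.9, True), (0.8, 1.0, False)]
print(composite(ray))
```

Because the emissive contribution does not depend on any light direction, fragments buried deep in the body still stand out, which is what makes the scattered pieces easy to spot.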
During a physical autopsy,
09:40
if you actually have to dig through the body to find these fragments,
09:42
that's actually quite hard to do.
09:44
One of the things that I'm really, really happy
09:48
to be able to show you here today
09:50
is our virtual autopsy table.
09:53
It's a touch device that we have developed
09:55
based on these algorithms, using standard GPUs.
09:57
It actually looks like this,
10:00
just to give you a feeling for what it looks like.
10:02
It really just works like a huge iPhone.
10:05
So we've implemented
10:08
all the gestures you can do on the table,
10:10
and you can think of it as an enormous touch interface.
10:13
So if you were thinking of buying an iPad,
10:17
forget about it. This is what you want instead.
10:19
Steve, I hope you're listening to this, all right.
10:22
So it's a very nice little device.
10:26
So if you have the opportunity, please try it out.
10:28
It's really a hands-on experience.
10:30
So it gained some traction, and we're trying to roll this out
10:33
and trying to use it for educational purposes,
10:36
but also, perhaps in the future,
10:38
in a more clinical situation.
10:40
There's a YouTube video that you can download and look at,
10:43
if you want to convey the information to other people
10:45
about virtual autopsies.
10:47
Okay, now that we're talking about touch,
10:50
let me move on to really "touching" data.
10:52
And this is a bit of science fiction now,
10:54
so we're moving into really the future.
10:56
This is not really what the medical doctors are using right now,
10:59
but I hope they will in the future.
11:02
So what you're seeing on the left is a touch device.
11:04
It's a little mechanical pen
11:07
that has very, very fast step motors inside of the pen.
11:09
And so I can generate a force feedback.
11:12
So when I virtually touch data,
11:14
it will generate forces in the pen, so I get a feedback.
11:16
So in this particular situation,
11:19
it's a scan of a living person.
11:21
I have this pen, and I look at the data,
11:23
and I move the pen towards the head,
11:26
and all of a sudden I feel resistance.
11:28
So I can feel the skin.
11:30
If I push a little bit harder, I'll go through the skin,
11:32
and I can feel the bone structure inside.
11:34
If I push even harder, I'll go through the bone structure,
11:37
especially close to the ear where the bone is very soft.
11:39
And then I can feel the brain inside, and it will be slushy, like this.
11:42
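The resistance the pen produces is typically computed as a penalty force: push back in proportion to how far the pen tip has penetrated the surface, with a different stiffness per tissue type, so skin feels firm and brain feels slushy. A minimal sketch; the stiffness numbers are invented for illustration, not taken from the talk:

```python
def feedback_force(penetration_mm: float, tissue: str) -> float:
    """Spring-like penalty force (arbitrary units) for a haptic pen."""
    stiffness = {"skin": 0.8, "bone": 5.0, "brain": 0.1}  # illustrative values
    if penetration_mm <= 0.0:
        return 0.0  # pen tip is outside the tissue: no resistance
    return stiffness[tissue] * penetration_mm

print(feedback_force(2.0, "bone"))   # hard tissue: strong push-back
print(feedback_force(2.0, "brain"))  # slushy tissue: barely any resistance
```

The force loop has to run at a much higher rate than the graphics (commonly around 1 kHz) for the contact to feel solid, which is why the pen uses very fast motors.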
So this is really nice.
11:45
And to take that even further, this is a heart.
11:47
And this is also due to these fantastic new scanners,
11:50
that just in 0.3 seconds,
11:53
I can scan the whole heart,
11:55
and I can do that with time resolution.
11:57
So just looking at this heart,
11:59
I can play back a video here.
12:01
And this is Karljohan, one of my graduate students
12:03
who's been working on this project.
12:05
And he's sitting there in front of the haptic device, the force feedback system,
12:07
and he's moving his pen towards the heart,
12:10
and the heart is now beating in front of him,
12:13
so he can see how the heart is beating.
12:15
He's taken the pen, and he's moving it towards the heart,
12:17
and he's putting it on the heart,
12:19
and then he feels the heartbeats from the real living patient.
12:21
Then he can examine how the heart is moving.
12:24
He can go inside, push inside of the heart,
12:26
and really feel how the valves are moving.
12:28
And this, I think, is really the future for heart surgeons.
12:31
I mean it's probably the wet dream for a heart surgeon
12:34
to be able to go inside of the patient's heart
12:37
before you actually do surgery,
12:40
and do that with high-quality resolution data.
12:42
So this is really neat.
12:44
Now we're going even further into science fiction.
12:47
And we heard a little bit about functional MRI.
12:50
Now this is really an interesting project.
12:53
MRI is using magnetic fields
12:56
and radio frequencies
12:58
to scan the brain, or any part of the body.
13:00
So what we're really getting out of this
13:03
is information of the structure of the brain,
13:05
but we can also measure the difference
13:07
in magnetic properties of blood that's oxygenated
13:09
and blood that's depleted of oxygen.
13:12
That means that it's possible
13:15
to map out the activity of the brain.
13:17
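In the simplest analyses, mapping activity from this oxygenation difference (the BOLD signal) amounts to correlating each voxel's time series with the task's on/off pattern: voxels that track the task "light up". A toy sketch on synthetic data — real pipelines add hemodynamic modeling and proper statistics:

```python
import numpy as np

rng = np.random.default_rng(0)
task = np.tile([0.0] * 5 + [1.0] * 5, 3)  # block design: task off/on/off/on...

# two synthetic voxel time series: one follows the task, one is pure noise
active = 2.0 * task + rng.normal(0.0, 0.3, task.size)
idle = rng.normal(0.0, 0.3, task.size)

def activation(ts, design):
    """Correlation of a voxel's time series with the task design."""
    return np.corrcoef(ts, design)[0, 1]

print(f"task-following voxel: r = {activation(active, task):.2f}")
print(f"idle voxel:           r = {activation(idle, task):.2f}")
```

Thresholding this per-voxel score and overlaying it on the structural scan gives exactly the kind of "lit up" volume shown in the demo.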
So this is something that we've been working on.
13:19
And you just saw Motts the research engineer, there,
13:21
going into the MRI system,
13:24
and he was wearing goggles.
13:26
So he could actually see things in the goggles.
13:28
So I could present things to him while he's in the scanner.
13:30
And this is a little bit freaky,
13:33
because what Motts is seeing is actually this.
13:35
He's seeing his own brain.
13:37
So Motts is doing something here,
13:40
and probably he is going like this with his right hand,
13:42
because the left side is activated
13:44
on the motor cortex.
13:46
And then he can see that at the same time.
13:48
These visualizations are brand new.
13:50
And this is something that we've been researching for a little while.
13:52
This is another sequence of Motts' brain.
13:55
And here we asked Motts to calculate backwards from 100.
13:58
So he's going "100, 97, 94."
14:01
And then he's going backwards.
14:03
And you can see how the little math processor is working up here in his brain
14:05
and is lighting up the whole brain.
14:08
Well this is fantastic. We can do this in real time.
14:10
We can investigate things. We can tell him to do things.
14:12
You can also see that his visual cortex
14:14
is activated in the back of the head,
14:16
because that's where he's seeing, he's seeing his own brain.
14:18
And he's also hearing our instructions
14:20
when we tell him to do things.
14:22
The signal is really deep inside of the brain as well,
14:24
and it's shining through,
14:26
because all of the data is inside this volume.
14:28
And in just a second here you will see --
14:30
okay, here. Motts, now move your left foot.
14:32
So he's going like this.
14:34
For 20 seconds he's going like that,
14:36
and all of a sudden it lights up up here.
14:38
So we've got motor cortex activation up there.
14:40
So this is really, really nice,
14:42
and I think this is a great tool.
14:44
And connecting also with the previous talk here,
14:46
this is something that we could use as a tool
14:48
to really understand
14:50
how the neurons are working, how the brain is working,
14:52
and we can do this with very, very high visual quality
14:54
and very fast resolution.
14:57
Now we're also having a bit of fun at the center.
15:00
So this is a CAT scan -- computed axial tomography.
15:02
So this is a lion from the local zoo
15:06
in Kolmården, outside of Norrköping -- Elsa.
15:08
So she came to the center,
15:11
and they sedated her
15:13
and then put her straight into the scanner.
15:15
And then, of course, I get the whole data set from the lion.
15:17
And I can do very nice images like this.
15:20
I can peel off the layer of the lion.
15:22
I can look inside of it.
15:24
And we've been experimenting with this.
15:26
And I think this is a great application
15:28
for the future of this technology,
15:30
because very little is known about animal anatomy.
15:32
What's known out there for veterinarians is kind of basic information.
15:35
We can scan all sorts of things,
15:38
all sorts of animals.
15:40
The only problem is to fit it into the machine.
15:42
So here's a bear.
15:45
It was kind of hard to get it in.
15:47
And the bear is a cuddly, friendly animal.
15:49
And here it is. Here is the nose of the bear.
15:52
And you might want to cuddle this one,
15:55
until you change the functions and look at this.
15:58
So beware of the bear.
16:01
So with that,
16:03
I'd like to thank all the people
16:05
who have helped me to generate these images.
16:07
It's a huge effort that goes into doing this,
16:09
gathering the data and developing the algorithms,
16:11
writing all the software.
16:14
So, some very talented people.
16:16
My motto is always, I only hire people that are smarter than I am
16:19
and most of them are smarter than I am.
16:22
So thank you very much.
16:24
(Applause)
16:26


About the speaker:

Anders Ynnerman - Scientific visualization expert

Why you should listen

Professor Anders Ynnerman received a Ph.D. in physics from Gothenburg University. During the early 1990s he did research at Oxford University and Vanderbilt University. In 1996 he started the Swedish National Graduate School in Scientific Computing, which he directed until 1999. From 1997 to 2002 he directed the Swedish National Supercomputer Centre, and from 2002 to 2006 he directed the Swedish National Infrastructure for Computing (SNIC).

Since 1999 he has held a chair in scientific visualization at Linköping University, and in 2000 he founded the Norrköping Visualization and Interaction Studio (NVIS). NVIS is currently one of the main focal points for research and education in computer graphics and visualization in the Nordic region. Ynnerman is currently heading the build-up of a large-scale center for visualization in Norrköping.
