David Eagleman: Can we create new senses for humans?
David Eagleman - Neuroscientist
David Eagleman decodes the mysteries of the tangled web of neurons and electricity that make our minds tick -- and also make us human.
a very large cosmos,
very good at understanding reality
the world at that scale.
very thin slice of perception
that slice of reality that we call home,
of the action that's going on.
radiation that bounces off objects
in the back of our eyes.
all the waves out there.
of what's out there.
passing through your body right now
the proper biological receptors
of cell phone conversations
are inherently unseeable.
in their reality,
in their view of the world,
in the dashboards of our cars
in the radio frequency range,
to pick up on the X-ray range.
any of those by yourself,
with the proper sensors.
our experience of reality
the common sense notion
and our fingertips
the objective reality that's out there.
just a little bit of the world.
on different parts of reality.
and deaf world of the tick,
are temperature and butyric acid;
by electrical fields;
out of air compression waves.
that they can pick up on,
for the surrounding world.
objective reality out there,
what we can sense.
is we accept reality
200 million scent receptors in it,
that attract and trap scent molecules,
so you can take big nosefuls of air.
with a revelation.
and you think,
impoverished nose of a human?
a feeble little noseful of air?
a cat 100 yards away,
this very spot six hours ago?"
that world of smell,
into our umwelt.
do we have to be stuck there?
in the way that technology
the experience of being human.
our technology to our biology,
of people walking around
and artificial vision.
a microphone and you digitize the signal,
directly into the inner ear.
you take a camera
and then you plug an electrode grid
these technologies wouldn't work.
speak the language of Silicon Valley,
as our natural biological sense organs.
how to use the signals just fine.
or seeing any of this.
and darkness inside your skull.
and nothing more.
at taking in these signals
and assigning meaning,
and puts together a story
and it doesn't care,
it just figures out what to do with it.
kind of machine.
what it's going to do with it,
sorts of input channels.
model of evolution,
too technical here,
that all these sensors
and our ears and our fingertips,
with the data that comes in.
the animal kingdom,
with which to detect infrared,
a 3D model of the world,
so they can orient
nature doesn't have to continually
of brain operation established,
is designing new peripherals.
really special or fundamental
come to the table with.
information into the brain
what to do with it.
published in the journal Nature in 1969.
in a modified dental chair,
in front of the camera,
with a grid of solenoids.
in front of the camera,
got pretty good
what was in front of the camera
in the small of their back.
modern incarnations of this.
right in front of you
and get closer and farther,
start getting pretty good
through the ears:
on the forehead,
you're feeling it on your forehead.
using it for much else.
is called the BrainPort,
that sits on your tongue,
these little electrotactile signals,
that they can throw a ball into a basket,
complex obstacle courses.
coursing around in your brain.
where the signals come from.
is sensory substitution for the deaf,
in my lab, Scott Novich,
sound from the world gets converted
can understand what is being said.
and ubiquity of portable computing,
would run on cell phones and tablets,
to make this a wearable,
under your clothing.
is getting captured by the tablet,
that's covered in vibratory motors,
to a pattern of vibration on the vest.
and I'm wearing the vest right now.
into dynamic patterns of vibration.
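The conversion described here -- sound captured by a tablet, turned into a pattern of vibration across motors on a vest -- can be sketched in a few lines. This is an illustration only, not the lab's actual algorithm: it takes one frame of audio, splits its magnitude spectrum into as many frequency bands as there are motors, and maps each band's energy to a 0..1 motor intensity.

```python
import numpy as np

def sound_to_vibration(samples, n_motors=8):
    """Map one frame of audio to per-motor vibration intensities.

    Illustrative sketch: magnitude spectrum of the frame, pooled
    into n_motors frequency bands, each band's energy normalized
    to a 0..1 intensity for one motor on the vest.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    bands = np.array_split(spectrum, n_motors)
    energy = np.array([band.sum() for band in bands])
    peak = energy.max()
    if peak == 0:
        return np.zeros(n_motors)
    return energy / peak  # 0..1 intensity per motor

# Example: a 440 Hz tone concentrates its energy in the lowest band.
t = np.arange(1024) / 44100.0
frame = np.sin(2 * np.pi * 440 * t)
intensities = sound_to_vibration(frame)
```

In a real wearable this would run continuously, frame by frame, so the vest displays a dynamic pattern that tracks the sound.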
with deaf people now,
just a little bit of time,
they can start understanding
He has a master's degree.
of his umwelt that's unavailable to him.
for four days, two hours a day,
Jonathan feels it on the vest,
this complicated pattern of vibrations
of what's being said.
because the patterns are too complicated,
the pattern that allows it to figure out
after wearing this for about three months,
perceptual experience of hearing
passes a finger over braille,
without any conscious intervention at all.
to be a game-changer,
for deafness is a cochlear implant,
than a cochlear implant,
even for the poorest countries.
by our results with sensory substitution,
is sensory addition.
to add a completely new kind of sense,
real-time data from the Internet
we're doing in the lab.
streaming feed from the Net of data
and he has to make a choice.
and he gets feedback after one second.
what all the patterns mean,
at figuring out which button to press.
whether he did the right thing or not.
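The experiment described here has the shape of a simple trial-and-feedback loop. A hypothetical sketch (the real data stream and vest interface are not shown; the hidden market movement and the subject's button press are stand-ins, with the subject modeled as a random guesser):

```python
import random

def run_trials(n_trials=100, seed=0):
    """Simulate the buy/sell experiment as a feedback loop.

    Each trial: a vibration pattern encodes a hidden market
    movement, the subject presses one of two buttons, and one
    second later learns whether the choice was right. With a
    random guesser, accuracy should hover near 50%; the question
    is whether a human wearer can learn to beat that.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        market_moved_up = rng.random() < 0.5  # hidden state in the pattern
        guess_up = rng.random() < 0.5         # subject presses a button
        if guess_up == market_moved_up:       # feedback after one second
            correct += 1
    return correct / n_trials
```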
can we expand the human umwelt
after several weeks,
of the economic movements of the planet.
to see how well this goes.
we've been automatically scraping Twitter
an automated sentiment analysis,
words or negative words or neutral?
to the aggregate emotion
because now I can know
and how much you're loving this.
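Aggregate sentiment of the kind described -- scoring scraped tweets as positive, negative, or neutral and pooling the result -- can be sketched with a simple word-list scorer. The word lists here are tiny illustrative stand-ins; the lab's actual analysis is not specified in the talk.

```python
# Hypothetical word lists for illustration only.
POSITIVE = {"love", "great", "amazing", "wonderful", "excited"}
NEGATIVE = {"hate", "boring", "awful", "terrible", "confusing"}

def aggregate_sentiment(tweets):
    """Score a batch of tweets: +1 per positive word, -1 per negative.

    Returns the mean score per tweet, a crude proxy for the crowd's
    aggregate emotion, which could then drive a vibration pattern.
    """
    total = 0
    for tweet in tweets:
        for word in tweet.lower().split():
            word = word.strip(".,!?#@")
            if word in POSITIVE:
                total += 1
            elif word in NEGATIVE:
                total -= 1
    return total / len(tweets) if tweets else 0.0

tweets = ["I love this talk, amazing!", "So boring...", "wonderful idea"]
score = aggregate_sentiment(tweets)
```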
than a human can normally have.
nine different measures
and orientation and heading,
this pilot's ability to fly it.
his skin up there, far away.
a modern cockpit full of gauges
to read the whole thing, you feel it.
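Feeding flight measures through the skin rather than through gauges amounts to normalizing each reading into a motor intensity. A minimal sketch, assuming each measure (pitch, yaw, roll, and so on) drives its own motor or motor group; the measure names and ranges here are hypothetical:

```python
def telemetry_to_vest(telemetry, limits):
    """Map flight measures to 0..1 motor intensities on the vest.

    Hypothetical sketch: each reading is clipped to its expected
    (lo, hi) range and linearly rescaled, so the wearer feels the
    whole state of the aircraft at once instead of reading gauges.
    """
    intensities = {}
    for name, value in telemetry.items():
        lo, hi = limits[name]
        clipped = min(max(value, lo), hi)
        intensities[name] = (clipped - lo) / (hi - lo)
    return intensities

reading = {"pitch": 10.0, "yaw": -45.0, "roll": 0.0}
limits = {"pitch": (-90, 90), "yaw": (-180, 180), "roll": (-90, 90)}
vest = telemetry_to_vest(reading, limits)
```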
between accessing big data
to the possibilities
being able to feel
of the International Space Station,
the invisible states of your own health,
and the state of your microbiome,
or seeing in infrared or ultraviolet.
As we move into the future,
to choose our own peripheral devices.
for Mother Nature's sensory gifts
she's given us the tools that we need
and experience your universe?
I felt applause on the vest.
Twitter's going mad.
that secures its funding forevermore,
have to write to NIH anymore.
skeptical for a minute,
but isn't most of the evidence so far
that sensory addition works?
blind person can see through their tongue
ready to process,
We actually have no idea
kind of data the brain can take in.
is that it's extraordinarily flexible.
what we used to call their visual cortex
by touch, by hearing, by vocabulary.
the cortex is kind of a one-trick pony.
of computations on things.
at things like braille, for example,
through bumps on their fingers.
to think there's a theoretical limit
you're going to be deluged.
possible applications for this.
excited about, the direction it might go?
a lot of applications here.
the things I started mentioning
they spend a lot of their time
just get what's going on,
is multidimensional data.
are good at detecting blobs and edges,
at what our world has become,
with lots and lots of data.
with our attentional systems.
feeling the state of something,
of your body as you're standing around.
feeling the state of a factory,
it'll go right away.
mind-blowing talk. Thank you very much.
Why you should listen
As the creator of stacks of compelling research, books and now the 6-part PBS series The Brain, grey matter expert David Eagleman is our most visible evangelist for neuroscience. He has helmed ground-breaking studies on time perception, brain plasticity and neurolaw. His latest research explores technology that bypasses sensory impairment -- such as a smartphone-controlled vest that translates sound into patterns of vibration for the deaf.
Eagleman is also the author of Sum, an internationally bestselling short story collection speculating on life, death and what it means to be human. Translated into 28 languages, Sum has been turned into two separate operas at the Sydney Opera House and the Royal Opera House in London.