ABOUT THE SPEAKERS
Steve Ramirez - Neuroscientist
When Steve Ramirez published his latest study in Science, it caused a media frenzy. Why? Because the paper was on implanting false memories in the brains of mice.

Why you should listen

Steve is a graduate student at MIT’s Brain and Cognitive Sciences department pursuing a Ph.D. in neuroscience. His work focuses on finding where single memories are located throughout the brain, genetically tricking the brain cells that house these memories into responding to brief pulses of light, and then using these same flickers of light to reactivate, erase and implant memories. The goals of his research are twofold: to figure out how the brain gives rise to the seemingly ephemeral process of memory, and to predict what happens when specific brain pieces break down to impair cognition. His work has been published in Science and covered by New Scientist, Discover, Scientific American, and Gizmodo.

Ramirez aims to be a professor who runs a lab that plucks questions from the tree of science fiction to ground them in experimental reality. He believes that a team-oriented approach to science makes research and teaching far more exciting. When he’s not tinkering with memories in the lab, Ramirez also enjoys running and cheering on every sports team in the city of Boston.

Xu Liu - Neuroscientist
In his groundbreaking work, Xu Liu investigated how to activate and deactivate specific memories in mice.

Why you should listen

During his PhD, Xu Liu studied the mechanisms of learning and memory, using fruit flies as a model system. By changing the expression of certain genes in the fly brain, he generated smart flies that can learn many times faster than their peers. Using live imaging, he also detected learning-induced changes in the brain cells and observed memory formation inside the brain with light.

After graduation, he moved to MIT and joined Dr. Susumu Tonegawa's lab as a postdoctoral associate. He continued his pursuit of memory with light there. Instead of just watching memory formation, he developed a system in mice where one can not only identify and label cells in the brain for a particular memory, but also turn these cells on and off with light to activate this memory at will. This work was published in Science and has been covered by the media worldwide. Liu passed away in February 2015.

TEDxBoston

Steve Ramirez and Xu Liu: A mouse. A laser beam. A manipulated memory.

1,127,889 views

Can we edit the content of our memories? It's a sci-fi-tinged question that Steve Ramirez and Xu Liu are asking in their lab at MIT. Essentially, the pair shoot a laser beam into the brain of a living mouse to activate and manipulate its memory. In this unexpectedly amusing talk they share not only how, but -- more important -- why they do this.

00:12
Steve Ramirez: My first year of grad school, I found myself in my bedroom, eating lots of Ben & Jerry's, watching some trashy TV, and maybe, maybe listening to Taylor Swift. I had just gone through a breakup. (Laughter) So for the longest time, all I would do is recall the memory of this person over and over again, wishing that I could get rid of that gut-wrenching, visceral "blah" feeling. Now, as it turns out, I'm a neuroscientist, so I knew that the memory of that person and the awful, emotional undertones that color in that memory are largely mediated by separate brain systems. And so I thought, what if we could go into the brain and edit out that nauseating feeling but while keeping the memory of that person intact? Then I realized, maybe that's a little bit lofty for now. So what if we could start off by going into the brain and just finding a single memory to begin with? Could we jump-start that memory back to life, maybe even play with the contents of that memory? All that said, there is one person in the entire world right now that I really hope is not watching this talk. (Laughter) So there is a catch. There is a catch. These ideas probably remind you of "Total Recall," "Eternal Sunshine of the Spotless Mind," or of "Inception." But the movie stars that we work with are the celebrities of the lab.

01:30
Xu Liu: Test mice. (Laughter) As neuroscientists, we work in the lab with mice, trying to understand how memory works. And today, we hope to convince you that now we are actually able to activate a memory in the brain at the speed of light. To do this, there's only two simple steps to follow. First, you find and label a memory in the brain, and then you activate it with a switch. As simple as that. (Laughter)

02:01
SR: Are you convinced? So, turns out finding a memory in the brain isn't all that easy.

02:07
XL: Indeed. This is way more difficult than, let's say, finding a needle in a haystack, because at least, you know, the needle is still something you can physically put your fingers on. But memory is not. And also, there's way more cells in your brain than the number of straws in a typical haystack. So yeah, this task does seem to be daunting. But luckily, we got help from the brain itself. It turned out that all we need to do is basically to let the brain form a memory, and then the brain will tell us which cells are involved in that particular memory.

02:44
SR: So what was going on in my brain while I was recalling the memory of an ex? If you were to just completely ignore human ethics for a second and slice up my brain right now, you would see that there was an amazing number of brain regions that were active while recalling that memory. Now one brain region that would be robustly active in particular is called the hippocampus, which for decades has been implicated in processing the kinds of memories that we hold near and dear, which also makes it an ideal target to go into and to try and find and maybe reactivate a memory.

03:13
XL: When you zoom in into the hippocampus, of course you will see lots of cells, but we are able to find which cells are involved in a particular memory, because whenever a cell is active, like when it's forming a memory, it will also leave a footprint that will later allow us to know these cells are recently active.

03:32
SR: So the same way that building lights at night let you know that somebody's probably working there at any given moment, in a very real sense, there are biological sensors within a cell that are turned on only when that cell was just working. They're sort of biological windows that light up to let us know that that cell was just active.

03:49
XL: So we clipped part of this sensor, and attached that to a switch to control the cells, and we packed this switch into an engineered virus and injected that into the brain of the mice. So whenever a memory is being formed, any active cells for that memory will also have this switch installed.

04:09
SR: So here is what the hippocampus looks like after forming a fear memory, for example. The sea of blue that you see here are densely packed brain cells, but the green brain cells, the green brain cells are the ones that are holding on to a specific fear memory. So you are looking at the crystallization of the fleeting formation of fear. You're actually looking at the cross-section of a memory right now.

04:31
XL: Now, for the switch we have been talking about, ideally, the switch has to act really fast. It shouldn't take minutes or hours to work. It should act at the speed of the brain, in milliseconds.

04:43
SR: So what do you think, Xu? Could we use, let's say, pharmacological drugs to activate or inactivate brain cells?

04:49
XL: Nah. Drugs are pretty messy. They spread everywhere. And also it takes them forever to act on cells. So it will not allow us to control a memory in real time. So Steve, how about let's zap the brain with electricity?

05:04
SR: So electricity is pretty fast, but we probably wouldn't be able to target it to just the specific cells that hold onto a memory, and we'd probably fry the brain.

05:12
XL: Oh. That's true. So it looks like, hmm, indeed we need to find a better way to impact the brain at the speed of light.

05:21
SR: So it just so happens that light travels at the speed of light. So maybe we could activate or inactivate memories by just using light --

05:31
XL: That's pretty fast.

05:33
SR: -- and because normally brain cells don't respond to pulses of light, those that would respond to pulses of light are those that contain a light-sensitive switch. Now to do that, first we need to trick brain cells to respond to laser beams.

05:44
XL: Yep. You heard it right. We are trying to shoot lasers into the brain. (Laughter)

05:49
SR: And the technique that lets us do that is optogenetics. Optogenetics gave us this light switch that we can use to turn brain cells on or off, and the name of that switch is channelrhodopsin, seen here as these green dots attached to this brain cell. You can think of channelrhodopsin as a sort of light-sensitive switch that can be artificially installed in brain cells so that now we can use that switch to activate or inactivate the brain cell simply by clicking it, and in this case we click it on with pulses of light.

06:15
XL: So we attach this light-sensitive switch of channelrhodopsin to the sensor we've been talking about and inject this into the brain. So whenever a memory is being formed, any active cell for that particular memory will also have this light-sensitive switch installed in it, so that we can control these cells by the flipping of a laser just like this one you see.

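To make the two-step logic concrete, here is a minimal Python sketch of the scheme the speakers describe: cells that are active while a memory forms get the light-sensitive switch, and a later laser pulse drives exactly those cells. Every name and number here (the cell count, the 2% active fraction, the function names) is an illustrative assumption, not the authors' actual pipeline, which relies on activity-dependent gene expression, an engineered virus, and channelrhodopsin.

```python
import random

NUM_CELLS = 10_000  # stand-in for a population of hippocampal cells

def form_memory(fraction_active=0.02):
    """Step 1: as a memory forms, the cells active during encoding
    'install' the light-sensitive switch. Returns the tagged set."""
    return {cell for cell in range(NUM_CELLS) if random.random() < fraction_active}

def pulse_light(all_cells, tagged_cells):
    """Step 2: shine the laser. Only cells carrying the switch respond,
    so the light reactivates exactly the labeled ensemble."""
    return {cell for cell in all_cells if cell in tagged_cells}

engram = form_memory()                            # mouse forms a fear memory
recalled = pulse_light(range(NUM_CELLS), engram)  # next day: laser on
assert recalled == engram                         # the same ensemble fires again
print(f"{len(engram)} of {NUM_CELLS} cells tagged and reactivated")
```

The point of the toy model is the selectivity: light reaches every cell, but only the tagged subset answers, which is what makes the memory addressable.
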
06:39
SR: So let's put all of this to the test now. What we can do is we can take our mice and then we can put them in a box that looks exactly like this box here, and then we can give them a very mild foot shock so that they form a fear memory of this box. They learn that something bad happened here. Now with our system, the cells that are active in the hippocampus in the making of this memory, only those cells will now contain channelrhodopsin.

07:02
XL: When you are as small as a mouse, it feels as if the whole world is trying to get you. So your best response of defense is trying to be undetected. Whenever a mouse is in fear, it will show this very typical behavior by staying at one corner of the box, trying to not move any part of its body, and this posture is called freezing. So if a mouse remembers that something bad happened in this box, when we put it back into the same box, it will basically show freezing because it doesn't want to be detected by any potential threats in this box.

07:38
SR: So you can think of freezing as, you're walking down the street minding your own business, and then out of nowhere you almost run into an ex-girlfriend or ex-boyfriend, and now those terrifying two seconds where you start thinking, "What do I do? Do I say hi? Do I shake their hand? Do I turn around and run away? Do I sit here and pretend like I don't exist?" Those kinds of fleeting thoughts that physically incapacitate you, that temporarily give you that deer-in-headlights look.

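Freezing matters experimentally because it can be turned into a number. Below is a minimal sketch, assuming video tracking of the animal's position, of how a freezing score is commonly computed: the fraction of time movement stays below a threshold. The frame rate and speed threshold are made-up illustrative values, not figures from this study.

```python
import numpy as np

def freezing_score(positions, fps=30, speed_threshold=0.5):
    """positions: array of shape (n_frames, 2) with x, y in cm.
    Returns the fraction of frames classified as freezing."""
    speeds = np.linalg.norm(np.diff(positions, axis=0), axis=1) * fps  # cm/s
    return float(np.mean(speeds < speed_threshold))

still = np.zeros((300, 2))                            # ~10 s of holding still
roaming = np.cumsum(np.random.randn(300, 2), axis=0)  # ~10 s of exploring
print(freezing_score(still), freezing_score(roaming)) # ~1.0 vs ~0.0
```

A fearful mouse scores near 1.0; a curious, exploring mouse scores near 0.0, which is the contrast the next experiment turns on.
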
07:59
XL: However, if you put the mouse in a completely different new box, like the next one, it will not be afraid of this box, because there's no reason that it will be afraid of this new environment. But what if we put the mouse in this new box but at the same time, we activate the fear memory using lasers just like we did before? Are we going to bring back the fear memory for the first box into this completely new environment?

08:29
SR: All right, and here's the million-dollar experiment. Now to bring back to life the memory of that day, I remember that the Red Sox had just won, it was a green spring day, perfect for going up and down the river and then maybe going to the North End to get some cannolis, #justsaying. Now Xu and I, on the other hand, were in a completely windowless black room not making any ocular movement that even remotely resembles an eye blink, because our eyes were fixed onto a computer screen. We were looking at this mouse here trying to activate a memory for the first time using our technique.

09:01
XL: And this is what we saw. When we first put the mouse into this box, it's exploring, sniffing around, walking around, minding its own business, because actually by nature, mice are pretty curious animals. They want to know, what's going on in this new box? It's interesting. But the moment we turned on the laser, like you see now, all of a sudden the mouse entered this freezing mode. It stayed here and tried not to move any part of its body. Clearly it's freezing. So indeed, it looks like we are able to bring back the fear memory for the first box in this completely new environment. While watching this, Steve and I are as shocked as the mouse itself. (Laughter) So after the experiment, the two of us just left the room without saying anything. After a kind of long, awkward period of time, Steve broke the silence.

09:55
SR: "Did that just work?"

09:58
XL: "Yes," I said. "Indeed it worked!" We're really excited about this. And then we published our findings in the journal Nature. Ever since the publication of our work, we've been receiving numerous comments from all over the Internet. Maybe we can take a look at some of those.

10:18
["OMGGGGG FINALLY... so much more to come, virtual reality, neural manipulation, visual dream emulation... neural coding, 'writing and re-writing of memories', mental illnesses. Ahhh the future is awesome"]

10:20
SR: So the first thing that you'll notice is that people have really strong opinions about this kind of work. Now I happen to completely agree with the optimism of this first quote, because on a scale of zero to Morgan Freeman's voice, it happens to be one of the most evocative accolades that I've heard come our way. (Laughter) But as you'll see, it's not the only opinion that's out there.

10:39
["This scares the hell out of me... What if they could do that easily in humans in a couple of years?! OH MY GOD WE'RE DOOMED"]

10:41
XL: Indeed, if we take a look at the second one, I think we can all agree that it's, meh, probably not as positive. But this also reminds us that, although we are still working with mice, it's probably a good idea to start thinking about and discussing the possible ethical ramifications of memory control.

11:00
SR: Now, in the spirit of the third quote, we want to tell you about a recent project that we've been working on in lab that we've called Project Inception.

11:07
["They should make a movie about this. Where they plant ideas into peoples minds, so they can control them for their own personal gain. We'll call it: Inception."]

11:10
SR: So we reasoned that now that we can reactivate a memory, what if we do so but then begin to tinker with that memory? Could we possibly even turn it into a false memory?

11:20
XL: So all memory is sophisticated and dynamic, but if just for simplicity, let's imagine memory as a movie clip. So far what we've told you is basically we can control this "play" button of the clip so that we can play this video clip any time, anywhere. But is there a possibility that we can actually get inside the brain and edit this movie clip so that we can make it different from the original? Yes, we can. It turned out that all we need to do is basically reactivate a memory using lasers just like we did before, but at the same time, if we present new information and allow this new information to incorporate into this old memory, this will change the memory. It's sort of like making a remix tape.

12:08
SR: So how do we do this? Rather than finding a fear memory in the brain, we can start by taking our animals, and let's say we put them in a blue box like this blue box here, and we find the brain cells that represent that blue box, and we trick them to respond to pulses of light exactly like we had said before. Now the next day, we can take our animals and place them in a red box that they've never experienced before. We can shoot light into the brain to reactivate the memory of the blue box. So what would happen here if, while the animal is recalling the memory of the blue box, we gave it a couple of mild foot shocks? So here we're trying to artificially make an association between the memory of the blue box and the foot shocks themselves. We're just trying to connect the two. So to test if we had done so, we can take our animals once again and place them back in the blue box. Again, we had just reactivated the memory of the blue box while the animal got a couple of mild foot shocks, and now the animal suddenly freezes. It's as though it's recalling being mildly shocked in this environment even though that never actually happened. So it formed a false memory, because it's falsely fearing an environment where, technically speaking, nothing bad actually happened to it.

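Written out as a toy program, the protocol just described has three stages: tag the blue-box memory, reactivate it with light in the red box while delivering mild shocks, then test for freezing back in the blue box. The class and method names below are hypothetical stand-ins for days of real surgical, viral, and behavioral work; the "association" step is reduced to a single set operation.

```python
from dataclasses import dataclass, field
from typing import Optional, Set

@dataclass
class ToyMouse:
    tagged_context: Optional[str] = None            # which memory carries the switch
    feared_contexts: Set[str] = field(default_factory=set)

    def explore(self, context: str) -> None:
        """Day 1: the mouse explores a safe box; the cells encoding it
        are tagged with the light-sensitive switch."""
        self.tagged_context = context

    def shock_during_reactivation(self) -> None:
        """Day 2: in a different box, the laser reactivates the tagged
        memory while mild foot shocks are delivered, linking the two."""
        if self.tagged_context is not None:
            self.feared_contexts.add(self.tagged_context)

    def freezes_in(self, context: str) -> bool:
        """Test day: freezing indicates a fear memory of this context."""
        return context in self.feared_contexts

mouse = ToyMouse()
mouse.explore("blue box")            # nothing bad ever happens here
mouse.shock_during_reactivation()    # shocked in the red box while recalling blue
print(mouse.freezes_in("blue box"))  # True: a false fear memory of the blue box
```

In this simplification the shock attaches to whatever memory is being recalled rather than to the place the animal is actually standing, which is exactly the dissociation the experiment exploits.
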
13:13
XL: So, so far we are only talking about this light-controlled "on" switch. In fact, we also have a light-controlled "off" switch, and it's very easy to imagine that by installing this light-controlled "off" switch, we can also turn off a memory, any time, anywhere. So everything we've been talking about today is based on this philosophically charged principle of neuroscience that the mind, with its seemingly mysterious properties, is actually made of physical stuff that we can tinker with.

13:46
SR: And for me personally, I see a world where we can reactivate any kind of memory that we'd like. I also see a world where we can erase unwanted memories. Now, I even see a world where editing memories is something of a reality, because we're living in a time where it's possible to pluck questions from the tree of science fiction and to ground them in experimental reality.

14:04
XL: Nowadays, people in the lab and people in other groups all over the world are using similar methods to activate or edit memories, whether that's old or new, positive or negative, all sorts of memories so that we can understand how memory works.

14:20
SR: For example, one group in our lab was able to find the brain cells that make up a fear memory and converted them into a pleasurable memory, just like that. That's exactly what I mean about editing these kinds of processes. Now one dude in lab was even able to reactivate memories of female mice in male mice, which rumor has it is a pleasurable experience.

14:38
XL: Indeed, we are living in a very exciting moment where science doesn't have any arbitrary speed limits but is only bound by our own imagination.

14:49
SR: And finally, what do we make of all this? How do we push this technology forward? These are the questions that should not remain just inside the lab, and so one goal of today's talk was to bring everybody up to speed with the kind of stuff that's possible in modern neuroscience, but now, just as importantly, to actively engage everybody in this conversation. So let's think together as a team about what this all means and where we can and should go from here, because Xu and I think we all have some really big decisions ahead of us. Thank you.

15:17
XL: Thank you.

15:18
(Applause)
