ABOUT THE SPEAKER
Greg Gage - Neuroscientist
TED Fellow Greg Gage helps kids investigate the neuroscience in their own backyards.

Why you should listen

As half of Backyard Brains, neuroscientist and engineer Greg Gage builds the SpikerBox -- a small rig that helps kids understand the electrical impulses that control the nervous system. He's passionate about helping students understand (viscerally) how our brains and our neurons work, because, as he said onstage at TED2012, we still know very little about how the brain works -- and we need to start inspiring kids early to want to know more.

Before becoming a neuroscientist, Gage worked as an electrical engineer making touchscreens. As he told the Huffington Post: "Scientific equipment in general is pretty expensive, but it's silly because before [getting my PhD in neuroscience] I was an electrical engineer, and you could see that you could make it yourself. So we started as a way to have fun, to show off to our colleagues, but we were also going into classrooms around that time and we thought, wouldn't it be cool if you could bring these gadgets with us so the stuff we were doing in advanced Ph.D. programs in neuroscience, you could also do in fifth grade?" His latest pieces of gear: the Roboroach, a cockroach fitted with an electric backpack that makes it turn on command, and BYB SmartScope, a smartphone-powered microscope.

More profile about the speaker
Greg Gage | Speaker | TED.com
DIY Neuroscience

Greg Gage: This computer is learning to read your mind

Filmed:
339,691 views

Modern technology lets neuroscientists peer into the human brain, but can it also read minds? Armed with the device known as an electroencephalogram, or EEG, and some computing wizardry, our intrepid neuroscientists attempt to peer into a subject's thoughts.


00:12
Greg Gage: Mind-reading. You've seen this in sci-fi movies:
00:15
machines that can read our thoughts.
00:16
However, there are devices today
00:18
that can read the electrical activity from our brains.
00:21
We call this the EEG.
00:23
Is there information contained in these brainwaves?
00:26
And if so, could we train a computer to read our thoughts?
00:29
My buddy Nathan has been working to hack the EEG
00:32
to build a mind-reading machine.
00:34
[DIY Neuroscience]
00:36
So this is how the EEG works.
00:38
Inside your head is a brain,
00:40
and that brain is made out of billions of neurons.
00:42
Each of those neurons sends an electrical message to each other.
00:46
These small messages can combine to make an electrical wave
00:48
that we can detect on a monitor.
00:50
Now traditionally, the EEG can tell us large-scale things,
00:53
for example if you're asleep or if you're alert.
00:55
But can it tell us anything else?
00:57
Can it actually read our thoughts?
00:58
We're going to test this,
01:00
and we're not going to start with some complex thoughts.
01:02
We're going to do something very simple.
01:04
Can we interpret what someone is seeing using only their brainwaves?
01:08
Nathan's going to begin by placing electrodes on Christy's head.
01:11
Nathan: My life is tangled.
01:12
(Laughter)
01:14
GG: And then he's going to show her a bunch of pictures
01:16
from four different categories.
01:18
Nathan: Face, house, scenery and weird pictures.
01:20
GG: As we show Christy hundreds of these images,
01:23
we are also capturing the electrical waves onto Nathan's computer.
01:27
We want to see if we can detect any visual information about the photos
01:30
contained in the brainwaves,
01:31
so when we're done, we're going to see if the EEG
01:34
can tell us what kind of picture Christy is looking at,
01:36
and if it does, each category should trigger a different brain signal.
01:40
OK, so we collected all the raw EEG data,
01:43
and this is what we got.
01:45
It all looks pretty messy, so let's arrange them by picture.
01:48
Now, still a bit too noisy to see any differences,
01:51
but if we average the EEG across all image types
01:54
by aligning them to when the image first appeared,
01:57
we can remove this noise,
01:58
and pretty soon, we can see some dominant patterns
02:01
emerge for each category.
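The trial-averaging idea described above can be sketched in a few lines. This is an entirely simulated toy example, not the team's actual pipeline: the sampling rate, trial count, noise level, and the Gaussian "evoked response" are all invented for illustration. The point it demonstrates is real, though: noise that is random from trial to trial cancels under averaging, while a response time-locked to image onset survives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 300 trials of one image category, each epoch
# sampled at 1 kHz for 400 ms, aligned so sample 0 is image onset.
fs, n_trials, n_samples = 1000, 300, 400
t = np.arange(n_samples) / fs

# A toy evoked response: a positive bump peaking ~100 ms after onset,
# standing in for the "P100" described in the talk.
erp = 2.0 * np.exp(-((t - 0.100) ** 2) / (2 * 0.015 ** 2))

# Each recorded trial is the same evoked response buried in noise
# several times larger than the signal itself.
trials = erp + rng.normal(0.0, 5.0, size=(n_trials, n_samples))

# Averaging across onset-aligned trials shrinks the noise by roughly
# sqrt(n_trials) while leaving the time-locked response intact.
average = trials.mean(axis=0)

print("single-trial noise std:", round(trials[0].std(), 2))
print("residual noise after averaging:", round((average - erp).std(), 2))
print("averaged peak lands near sample:", int(np.argmax(average)))
```

With 300 trials the residual noise drops by a factor of about seventeen (the square root of 300), which is why the dominant patterns only "emerge" after averaging.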
02:02
Now the signals all still look pretty similar.
02:04
Let's take a closer look.
02:06
About a hundred milliseconds after the image comes on,
02:08
we see a positive bump in all four cases,
02:11
and we call this the P100, and what we think that is
02:14
is what happens in your brain when you recognize an object.
02:17
But damn, look at that signal for the face.
02:19
It looks different than the others.
02:20
There's a negative dip about 170 milliseconds
02:23
after the image comes on.
02:25
What could be going on here?
02:27
Research shows that our brain has a lot of neurons that are dedicated
02:30
to recognizing human faces,
02:31
so this N170 spike could be all those neurons
02:34
firing at once in the same location,
02:36
and we can detect that in the EEG.
02:39
So there are two takeaways here.
02:40
One, our eyes can't really detect the differences in patterns
02:44
without averaging out the noise,
02:45
and two, even after removing the noise,
02:47
our eyes can only pick up the signals associated with faces.
02:50
So this is where we turn to machine learning.
02:53
Now, our eyes are not very good at picking up patterns in noisy data,
02:57
but machine learning algorithms are designed to do just that,
03:00
so could we take a lot of pictures and a lot of data
03:03
and feed it in and train a computer
03:05
to be able to interpret what Christy is looking at in real time?
03:09
We're trying to code the information that's coming out of her EEG
03:13
in real time
03:14
and predict what it is that her eyes are looking at.
03:16
And if it works, what we should see
03:18
is every time that she gets a picture of scenery,
03:21
it should say scenery, scenery, scenery, scenery.
03:23
A face -- face, face, face, face,
03:25
but it's not quite working that way, is what we're discovering.
03:33
(Laughter)
03:36
OK.
03:38
Director: So what's going on here?
GG: We need a new career, I think.
03:41
(Laughter)
03:42
OK, so that was a massive failure.
03:45
But we're still curious: How far could we push this technology?
03:48
And we looked back at what we did.
03:50
We noticed that the data was coming into our computer very quickly,
03:53
without any timing of when the images came on,
03:55
and that's the equivalent of reading a very long sentence
03:58
without spaces between the words.
03:59
It would be hard to read,
04:01
but once we add the spaces, individual words appear
04:05
and it becomes a lot more understandable.
04:07
But what if we cheat a little bit?
04:09
By using a sensor, we can tell the computer when the image first appears.
04:12
That way, the brainwave stops being a continuous stream of information,
04:16
and instead becomes individual packets of meaning.
04:19
Also, we're going to cheat a little bit more,
04:21
by limiting the categories to two.
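The "adding spaces to the sentence" fix is what EEG people call epoching: once a sensor supplies the onset times, the continuous stream can be cut into onset-aligned windows. A minimal sketch, again with fully simulated data and made-up onset samples (nothing here comes from the actual recording):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical continuous recording: 10 s of noisy EEG at 1 kHz,
# with images appearing at known onset samples (what the onset
# sensor in the talk provides).
fs = 1000
continuous = rng.normal(0.0, 5.0, size=10 * fs)
onsets = np.array([1000, 2500, 4200, 6100, 8000])  # illustrative only

def epoch(signal, onsets, fs, tmax=0.4):
    """Cut the continuous stream into onset-aligned windows: one long
    'sentence without spaces' becomes individual 'words'."""
    n = int(tmax * fs)
    return np.stack([signal[o:o + n] for o in onsets])

epochs = epoch(continuous, onsets, fs)
print(epochs.shape)  # one 400 ms window per image
```

Each row of `epochs` is now a self-contained packet whose sample 0 is an image onset, which is exactly the alignment the earlier averaging step depended on.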
04:23
Let's see if we can do some real-time mind-reading.
04:25
In this new experiment,
04:26
we're going to constrict it a little bit more
04:29
so that we know the onset of the image
04:31
and we're going to limit the categories to "face" or "scenery."
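One simple way a two-category decoder like this can work is nearest-template matching: average many labeled epochs per category to get a clean template, then label each new onset-aligned epoch by whichever template it is closest to. The talk doesn't say which algorithm Nathan used, so this is a speculative stand-in with simulated signals; the face template's negative dip mimics the N170 described earlier.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy evoked responses for the two categories (all simulated):
# both share a positive P100-like bump, but "face" adds a negative
# dip around 170 ms, like the N170 in the talk.
fs, n_samples = 1000, 400
t = np.arange(n_samples) / fs
bump = 2.0 * np.exp(-((t - 0.100) ** 2) / (2 * 0.015 ** 2))
dip = 3.0 * np.exp(-((t - 0.170) ** 2) / (2 * 0.015 ** 2))
truth = {"face": bump - dip, "scenery": bump}

def make_trials(template, n):
    # Single trials: template buried in heavy per-sample noise.
    return template + rng.normal(0.0, 5.0, size=(n, n_samples))

# "Training": average 300 labeled epochs per category into a template.
train = {k: make_trials(v, 300).mean(axis=0) for k, v in truth.items()}

def classify(epoch_sig):
    # Nearest-template decoding on a single onset-aligned epoch.
    return min(train, key=lambda k: np.linalg.norm(epoch_sig - train[k]))

# Simulated "real-time" test on fresh single trials.
hits = sum(classify(e) == "face" for e in make_trials(truth["face"], 50))
hits += sum(classify(e) == "scenery" for e in make_trials(truth["scenery"], 50))
print(f"accuracy: {hits}/100")
```

Even this crude decoder gets well above chance on the simulated data, which matches the spirit of the result below: once epochs are aligned to onset, there is enough information in the signal to separate faces from scenery.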
04:35
Nathan: Face. Correct.
04:37
Scenery. Correct.
04:40
GG: So right now, every time the image comes on,
04:42
we're taking a picture of the onset of the image
04:44
and decoding the EEG.
04:46
It's getting correct.
04:47
Nathan: Yes. Face. Correct.
04:49
GG: So there is information in the EEG signal, which is cool.
04:52
We just had to align it to the onset of the image.
04:55
Nathan: Scenery. Correct.
04:59
Face. Yeah.
05:00
GG: This means there is some information there,
05:02
so if we know at what time the picture came on,
05:05
we can tell what type of picture it was,
05:07
possibly, at least on average, by looking at these evoked potentials.
05:12
Nathan: Exactly.
05:14
GG: If you had told me at the beginning of this project this was possible,
05:17
I would have said no way.
05:19
I literally did not think we could do this.
05:21
Did our mind-reading experiment really work?
05:23
Yes, but we had to do a lot of cheating.
05:25
It turns out you can find some interesting things in the EEG,
05:28
for example if you're looking at someone's face,
05:30
but it does have a lot of limitations.
05:32
Perhaps advances in machine learning will make huge strides,
05:35
and one day we will be able to decode what's going on in our thoughts.
05:39
But for now, the next time a company says that they can harness your brainwaves
05:43
to be able to control devices,
05:44
it is your right, it is your duty to be skeptical.
Translated by Joseph Geni
Reviewed by Krystian Aparta
