ABOUT THE SPEAKER
Ray Kurzweil - Inventor, futurist
Ray Kurzweil is an engineer who has radically advanced the fields of speech, text and audio technology. He's revered for his dizzying -- yet convincing -- writing on the advance of technology, the limits of biology and the future of the human species.

Why you should listen

Inventor, entrepreneur, visionary, Ray Kurzweil's accomplishments read as a startling series of firsts -- a litany of technological breakthroughs we've come to take for granted. Kurzweil invented the first optical character recognition (OCR) software for transforming the written word into data, the first print-to-speech software for the blind, the first text-to-speech synthesizer, the first music synthesizer capable of recreating the grand piano and other orchestral instruments, and the first commercially marketed large-vocabulary speech recognition software.

Yet his impact as a futurist and philosopher is no less significant. In his best-selling books, which include How to Create a Mind, The Age of Spiritual Machines and The Singularity Is Near: When Humans Transcend Biology, Kurzweil depicts in detail a portrait of the human condition over the next few decades, as accelerating technologies forever blur the line between human and machine.

In 2009, he unveiled Singularity University, an institution that aims to "assemble, educate and inspire leaders who strive to understand and facilitate the development of exponentially advancing technologies." He is a Director of Engineering at Google, where he heads up a team developing machine intelligence and natural language comprehension.

More profile about the speaker
Ray Kurzweil | Speaker | TED.com
TED2014

Ray Kurzweil: Get ready for hybrid thinking

3,548,296 views

Two hundred million years ago, our mammalian ancestors developed a new brain feature: the neocortex. This stamp-sized piece of tissue (wrapped around a brain the size of a walnut) was key to the development of humankind. Now, futurist Ray Kurzweil suggests we get ready for the next great leap in brain power, as we tap into the computing power in the cloud.

00:12
Let me tell you a story.
00:15
It goes back 200 million years.
00:17
It's a story of the neocortex,
00:19
which means "new rind."
00:21
So in these early mammals,
00:23
because only mammals have a neocortex,
00:25
rodent-like creatures.
00:27
It was the size of a postage stamp and just as thin,
00:30
and was a thin covering around
00:32
their walnut-sized brain,
00:34
but it was capable of a new type of thinking.
00:38
Rather than the fixed behaviors
00:39
that non-mammalian animals have,
00:41
it could invent new behaviors.
00:44
So a mouse is escaping a predator,
00:46
its path is blocked,
00:48
it'll try to invent a new solution.
00:50
That may work, it may not,
00:51
but if it does, it will remember that
00:53
and have a new behavior,
00:55
and that can actually spread virally
00:56
through the rest of the community.
00:58
Another mouse watching this could say,
01:00
"Hey, that was pretty clever, going around that rock,"
01:03
and it could adopt a new behavior as well.
01:06
Non-mammalian animals
01:08
couldn't do any of those things.
01:10
They had fixed behaviors.
01:11
Now they could learn a new behavior
01:12
but not in the course of one lifetime.
01:15
In the course of maybe a thousand lifetimes,
01:17
it could evolve a new fixed behavior.
01:20
That was perfectly okay 200 million years ago.
01:23
The environment changed very slowly.
01:25
It could take 10,000 years for there to be
01:27
a significant environmental change,
01:29
and during that period of time
01:30
it would evolve a new behavior.
01:33
Now that went along fine,
01:35
but then something happened.
01:37
Sixty-five million years ago,
01:39
there was a sudden, violent change to the environment.
01:41
We call it the Cretaceous extinction event.
01:45
That's when the dinosaurs went extinct,
01:47
that's when 75 percent of the
01:51
animal and plant species went extinct,
01:53
and that's when mammals
01:55
overtook their ecological niche,
01:57
and to anthropomorphize, biological evolution said,
02:01
"Hmm, this neocortex is pretty good stuff,"
02:03
and it began to grow it.
02:05
And mammals got bigger,
02:06
their brains got bigger at an even faster pace,
02:09
and the neocortex got bigger even faster than that
02:13
and developed these distinctive ridges and folds
02:16
basically to increase its surface area.
02:19
If you took the human neocortex
02:20
and stretched it out,
02:22
it's about the size of a table napkin,
02:23
and it's still a thin structure.
02:25
It's about the thickness of a table napkin.
02:27
But it has so many convolutions and ridges
02:29
it's now 80 percent of our brain,
02:32
and that's where we do our thinking,
02:35
and it's the great sublimator.
02:37
We still have that old brain
02:38
that provides our basic drives and motivations,
02:40
but I may have a drive for conquest,
02:43
and that'll be sublimated by the neocortex
02:46
into writing a poem or inventing an app
02:49
or giving a TED Talk,
02:50
and it's really the neocortex that's where
02:54
the action is.
02:56
Fifty years ago, I wrote a paper
02:58
describing how I thought the brain worked,
02:59
and I described it as a series of modules.
03:03
Each module could do things with a pattern.
03:05
It could learn a pattern. It could remember a pattern.
03:08
It could implement a pattern.
03:09
And these modules were organized in hierarchies,
03:12
and we created that hierarchy with our own thinking.
03:15
And there was actually very little to go on
03:18
50 years ago.
03:19
It led me to meet President Johnson.
03:22
I've been thinking about this for 50 years,
03:24
and a year and a half ago I came out with the book
03:27
"How To Create A Mind,"
03:28
which has the same thesis,
03:29
but now there's a plethora of evidence.
03:32
The amount of data we're getting about the brain
03:34
from neuroscience is doubling every year.
03:36
Spatial resolution of brain scanning of all types
03:39
is doubling every year.
03:41
We can now see inside a living brain
03:43
and see individual interneural connections
03:46
connecting in real time, firing in real time.
03:49
We can see your brain create your thoughts.
03:51
We can see your thoughts create your brain,
03:53
which is really key to how it works.
03:55
So let me describe briefly how it works.
03:57
I've actually counted these modules.
03:59
We have about 300 million of them,
04:01
and we create them in these hierarchies.
04:03
I'll give you a simple example.
04:05
I've got a bunch of modules
04:08
that can recognize the crossbar to a capital A,
04:12
and that's all they care about.
04:14
A beautiful song can play,
04:15
a pretty girl could walk by,
04:17
they don't care, but they see a crossbar to a capital A,
04:19
they get very excited and they say "crossbar,"
04:22
and they put out a high probability
04:24
on their output axon.
04:26
That goes to the next level,
04:27
and these layers are organized in conceptual levels.
04:30
Each is more abstract than the next one,
04:32
so the next one might say "capital A."
04:34
That goes up to a higher level that might say "Apple."
04:37
Information flows down also.
04:40
If the apple recognizer has seen A-P-P-L,
04:42
it'll think to itself, "Hmm, I think an E is probably likely,"
04:46
and it'll send a signal down to all the E recognizers
04:48
saying, "Be on the lookout for an E,
04:50
I think one might be coming."
04:51
The E recognizers will lower their threshold
04:54
and they see some sloppy thing, could be an E.
04:56
Ordinarily you wouldn't think so,
04:58
but we're expecting an E, it's good enough,
05:00
and yeah, I've seen an E, and then apple says,
05:02
"Yeah, I've seen an Apple."
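The crossbar-to-Apple walkthrough above can be sketched as a toy program. This is only an illustration of the idea, not Kurzweil's actual module design; the class names, thresholds and probabilities are all made up for the example. A letter recognizer fires when its input confidence clears a threshold, and a word recognizer that has seen "A-P-P-L" sends a prediction down that lowers the "E" recognizer's threshold:

```python
# Toy sketch of the two-way hierarchy described above: a word-level
# recognizer primes a letter-level recognizer by lowering its threshold.
# All names and numbers here are illustrative, not from the talk.

class LetterRecognizer:
    def __init__(self, letter, threshold=0.9):
        self.letter = letter
        self.threshold = threshold  # confidence needed to "fire"

    def expect(self):
        # Top-down signal: "be on the lookout" -> accept sloppier input
        self.threshold = 0.5

    def see(self, evidence):
        # evidence: 0..1 confidence that the input looks like this letter
        return evidence >= self.threshold

class WordRecognizer:
    def __init__(self, word, letters):
        self.word = word
        self.letters = letters  # dict: letter -> LetterRecognizer

    def observe(self, seen_so_far):
        # Having seen "APPL", predict the next letter and prime its recognizer
        if self.word.startswith(seen_so_far) and len(seen_so_far) < len(self.word):
            nxt = self.word[len(seen_so_far)]
            self.letters[nxt].expect()

letters = {c: LetterRecognizer(c) for c in "APPLE"}
apple = WordRecognizer("APPLE", letters)

sloppy_e = 0.6                           # some sloppy thing, could be an E
assert not letters["E"].see(sloppy_e)    # ordinarily, not good enough
apple.observe("APPL")                    # prediction flows down the hierarchy
assert letters["E"].see(sloppy_e)        # primed recognizer now accepts it
print("Yeah, I've seen an Apple.")
```

The point of the sketch is the direction of information flow: evidence goes up, expectations come down, and the same noisy input is read differently depending on what the level above predicts.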
05:03
Go up another five levels,
05:05
and you're now at a pretty high level
05:06
of this hierarchy,
05:08
and stretch down into the different senses,
05:10
and you may have a module that sees a certain fabric,
05:13
hears a certain voice quality, smells a certain perfume,
05:16
and will say, "My wife has entered the room."
05:18
Go up another 10 levels, and now you're at
05:20
a very high level.
05:21
You're probably in the frontal cortex,
05:23
and you'll have modules that say, "That was ironic.
05:27
That's funny. She's pretty."
05:29
You might think that those are more sophisticated,
05:32
but actually what's more complicated
05:33
is the hierarchy beneath them.
05:36
There was a 16-year-old girl, she had brain surgery,
05:38
and she was conscious because the surgeons
05:40
wanted to talk to her.
05:42
You can do that because there's no pain receptors
05:44
in the brain.
05:45
And whenever they stimulated particular,
05:47
very small points on her neocortex,
05:49
shown here in red, she would laugh.
05:52
So at first they thought they were triggering
05:53
some kind of laugh reflex,
05:55
but no, they quickly realized they had found
05:57
the points in her neocortex that detect humor,
06:00
and she just found everything hilarious
06:02
whenever they stimulated these points.
06:05
"You guys are so funny just standing around,"
06:07
was the typical comment,
06:08
and they weren't funny,
06:11
not while doing surgery.
06:14
So how are we doing today?
06:19
Well, computers are actually beginning to master
06:22
human language with techniques
06:24
that are similar to the neocortex.
06:27
I actually described the algorithm,
06:28
which is similar to something called
06:30
a hierarchical hidden Markov model,
06:33
something I've worked on since the '90s.
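For readers unfamiliar with the model named above: a hidden Markov model scores a sequence of observations by summing over paths through hidden states. The sketch below is a flat HMM forward pass, not the hierarchical variant the talk refers to, and the states, symbols and probabilities are invented for the example:

```python
# Minimal (flat, not hierarchical) hidden Markov model forward pass,
# only to illustrate the kind of model named above. The two states,
# the alphabet and all probabilities are made up for this example.

def forward(obs, states, start_p, trans_p, emit_p):
    """Total probability of an observation sequence under the HMM."""
    # alpha[s]: probability of the observations so far, ending in state s
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emit_p[s][o] * sum(alpha[r] * trans_p[r][s] for r in states)
                 for s in states}
    return sum(alpha.values())

states = ("vowel", "consonant")
start_p = {"vowel": 0.4, "consonant": 0.6}
trans_p = {"vowel": {"vowel": 0.2, "consonant": 0.8},
           "consonant": {"vowel": 0.7, "consonant": 0.3}}
emit_p = {"vowel": {"a": 0.8, "t": 0.2},
          "consonant": {"a": 0.1, "t": 0.9}}

p = forward(("a", "t"), states, start_p, trans_p, emit_p)
print(p)  # 0.2678
```

A hierarchical HMM stacks such models: each state at one level expands into a sub-HMM at the level below, which matches the talk's picture of recognizers nested in conceptual levels.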
06:36
"Jeopardy" is a very broad natural language game,
06:39
and Watson got a higher score
06:41
than the best two players combined.
06:43
It got this query correct:
06:45
"A long, tiresome speech
06:48
delivered by a frothy pie topping,"
06:50
and it quickly responded, "What is a meringue harangue?"
06:53
And Jennings and the other guy didn't get that.
06:55
It's a pretty sophisticated example of
06:57
computers actually understanding human language,
06:59
and it actually got its knowledge by reading
07:01
Wikipedia and several other encyclopedias.
07:04
Five to 10 years from now,
07:07
search engines will actually be based on
07:09
not just looking for combinations of words and links
07:12
but actually understanding,
07:13
reading for understanding the billions of pages
07:16
on the web and in books.
07:19
So you'll be walking along, and Google will pop up
07:21
and say, "You know, Mary, you expressed concern
07:24
to me a month ago that your glutathione supplement
07:27
wasn't getting past the blood-brain barrier.
07:30
Well, new research just came out 13 seconds ago
07:32
that shows a whole new approach to that
07:34
and a new way to take glutathione.
07:36
Let me summarize it for you."
07:38
Twenty years from now, we'll have nanobots,
07:42
because another exponential trend
07:44
is the shrinking of technology.
07:45
They'll go into our brain
07:48
through the capillaries
07:49
and basically connect our neocortex
07:52
to a synthetic neocortex in the cloud,
07:55
providing an extension of our neocortex.
07:59
Now today, I mean,
08:00
you have a computer in your phone,
08:02
but if you need 10,000 computers for a few seconds
08:05
to do a complex search,
08:06
you can access that for a second or two in the cloud.
08:09
In the 2030s, if you need some extra neocortex,
08:12
you'll be able to connect to that in the cloud
08:15
directly from your brain.
08:16
So I'm walking along and I say,
08:18
"Oh, there's Chris Anderson.
08:19
He's coming my way.
08:21
I'd better think of something clever to say.
08:23
I've got three seconds.
08:25
My 300 million modules in my neocortex
08:28
isn't going to cut it.
08:29
I need a billion more."
08:30
I'll be able to access that in the cloud.
08:34
And our thinking, then, will be a hybrid
08:36
of biological and non-biological thinking,
08:40
but the non-biological portion
08:42
is subject to my law of accelerating returns.
08:45
It will grow exponentially.
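The exponential claims in the talk (data and scan resolution doubling every year, a non-biological portion that "will grow exponentially") compound faster than intuition suggests. A two-line illustration, not from the talk itself:

```python
# Steady doubling, as in the trends cited in the talk, compounds fast:
# a quantity that doubles every year grows roughly a thousandfold per decade.
def growth_after(years, doubling_time=1.0):
    """Multiplicative growth factor after `years` of steady doubling."""
    return 2 ** (years / doubling_time)

print(growth_after(10))  # 1024.0 -- about 1000x in ten years
print(growth_after(20))  # 1048576.0 -- about a millionfold in twenty
```

This is why a capability that looks modest today can dominate within a couple of decades under such a trend, which is the arithmetic behind the hybrid-thinking prediction.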
08:47
And remember what happened
08:49
the last time we expanded our neocortex?
08:51
That was two million years ago
08:53
when we became humanoids
08:54
and developed these large foreheads.
08:56
Other primates have a slanted brow.
08:58
They don't have the frontal cortex.
09:00
But the frontal cortex is not really qualitatively different.
09:04
It's a quantitative expansion of neocortex,
09:06
but that additional quantity of thinking
09:09
was the enabling factor for us to take
09:11
a qualitative leap and invent language
09:14
and art and science and technology
09:16
and TED conferences.
09:18
No other species has done that.
09:20
And so, over the next few decades,
09:22
we're going to do it again.
09:24
We're going to again expand our neocortex,
09:26
only this time we won't be limited
09:28
by a fixed architecture of enclosure.
09:32
It'll be expanded without limit.
09:35
That additional quantity will again
09:38
be the enabling factor for another qualitative leap
09:41
in culture and technology.
09:42
Thank you very much.
09:44
(Applause)
Croatian translation by Marina Maras
Reviewed by Ivan Stamenkovic



Data provided by TED.
