Nick Bostrom: How civilization could destroy itself -- and 4 ways we could prevent it
so many crazy ideas out there.
all be living in a simulation,
of how artificial general intelligence
the vulnerable world hypothesis.
give the illustrated guide to that.
of the current human condition.
possible technologies.
of human creativity
and pulling out one ball after another,
has been hugely beneficial, right?
pulled out some very good white balls,
mixed blessings.
pulled out the black ball --
the civilization that discovers it.
about what could such a black ball be.
bring about civilizational destruction.
the semi-anarchic default condition.
you find this theory plausible,
we've actually got lucky,
we would have pulled out that death ball
it's just meant to illustrate
where they will take us.
at pulling out balls,
to put the ball back into the urn, right.
but we can't un-invent.
no black ball in the urn.
and you can't put it back in,
of these examples.
types of vulnerability.
that just makes it very easy
source of that kind of black ball,
we could also do --
really great, right?
to get too easy either,
and his grandmother
alter the earth's climate.
killer bot swarms.
artificial general intelligence.
that when we discovered
accessible to anyone.
go back to the 1930s
some breakthroughs in nuclear physics,
to create a nuclear chain reaction
that this could lead to the bomb.
to make a nuclear bomb
you need very high,
massive amounts of energy.
to unlock the energy of the atom.
in the microwave oven
a nuclear detonation.
physically impossible.
how it would turn out?
massive nuclear reactions relatively easy,
that is easy to do on purpose
on top of one another,
like, a stack of 10 blocks.
is perhaps the quickest route
in our near future to get us here.
about what that would have meant
in their kitchen for an afternoon
modern civilization as we know it
of a million people,
who would, for whatever reason,
destroy it, or worse,
of vulnerability.
kind of obvious types of black balls
to blow up a lot of things,
by creating bad incentives
that incentivizes great powers
to create destruction.
very close to this, right?
over 10 trillion dollars
during the Cold War
this would be a great idea,
to blow ourselves up,
that we were finding ourselves --
a safe first strike.
a very complicated situation could arise,
all their nuclear missiles.
that the other side might do it.
more unstable than it was.
other properties of technology.
it could have been harder to have arms treaties,
or something less distinctive.
for powerful actors,
for all of us, in Type-2b here.
take the case of global warming.
have no significant effect, right?
could have been a lot worse than it is.
sensitivity parameter, right.
how much warmer does it get
of greenhouse gases.
of greenhouse gases we emitted,
15 degrees or 20 degrees.
in a very bad situation.
had just been a lot harder to do.
more fossil fuels in the ground.
that if in that case of --
in the time period that we could see,
off its ass and done something about it.
maybe that stupid.
to switch to renewables and stuff, right,
with slightly different physics,
it would be much more expensive to do these things.
these possibilities together,
well be various black balls in the urn,
protect against black balls.
they will come out.
philosophical critique of this idea
that the future is essentially settled.
is that ball there or it's not.
that I want to believe.
that the future is undetermined,
we pull out of that urn.
that if we just keep on innovating,
pull out all the balls.
of weak form of technological determinism
to encounter a society
and jet planes together.
of a technology as a set of affordances.
that enables us to do various things
of course depends on human choice.
three types of vulnerability,
about how we would choose to use them.
this massive, destructive power,
of millions of people
to use it destructively.
disturbing argument
some kind of view into the urn
very likely that we're doomed.
in accelerating power,
that make us more powerful,
can take us all down,
to use those powers,
that kind of help us control
let's talk about the response.
about all the possibilities
it's things like cyberwarfare,
serious doom in our future.
four possible responses as well.
doesn't seem promising,
to technological progress.
even if we could do it.
slower technological progress.
faster progress in bioweapons,
makes it easier.
fully on board with that.
push back on that for a minute.
of the last couple of decades,
push forward at full speed,
and the rapid acceleration of that,
"move fast and break things"
for synthetic biology,
move forward rapidly
a DNA printer in every home
the first part, the not feasible.
desirable to stop it,
if one nation kind of --
if one nation does,
the nuclear threat,
the painful process of negotiating.
that we, as a matter of global priority,
really strict rules
how far synthetic biology may go,
that you want to democratize, no?
has their own device,
four or five places in the world
and the DNA comes back, right?
like it was necessary,
a finite set of choke points.
for kind of special opportunities,
in just holding back.
North Korea, you know --
and discover this knowledge,
under current conditions.
new change in the world
another possible response.
has only limited potential.
of people who are incentivized
access and the means,
that you're asking us to do
flying around the world
showing signs of sociopathic behavior,
it's a mixed picture.
like, incarcerate or kill,
to a better view of the world.
extremely successful in this,
of such individuals by half.
you do this,
all other powerful forces
as education systems do.
would be reduced by half.
humanity's future on response two.
to try to deter and persuade people,
as our only safeguard.
the ability to stabilize the world
of possible vulnerabilities.
this dangerous thing,
in real time, and stop them.
ubiquitous surveillance,
essentially, a form of.
you'd have artificial intelligence,
that were reviewing this, etc., etc.
is not a very popular term right now?
that you would have to wear at all times
or something like that.
such a mind-blowing conversation.
a whole big conversation on this
with that, right?
another governance gap.
governance gap at the microlevel,
from ever doing something highly illegal.
governance gap
of global coordination failures,
the Type-2a vulnerabilities.
of fashion right now,
that throughout history,
of technological power increase,
and sort of centralized the power.
when a roving band of criminals
well, you have a nation-state
a police force or an army,
a single person or a single group
we're going to have to go this route,
of political organization has increased
and so on and so forth.
and to global governance.
that if we are lucky,
that these would be the only ways
we can't have it all.
that many of us had
going to be a force for good,
go as fast as you can
to some of the consequences,
very uncomfortable things with it,
arms race with ourselves
you better limit it,
it's in a sense the easiest option
vulnerable to extracting a black ball.
macrogovernance problem,
all the balls from the urn
in a simulation, does it matter?
how likely is it that we're doomed?
when you ask that question.
just with the time line,
and all kinds of things, right?
so that you can attach a probability,
probably you'll die of natural causes,
you might have a 100-year --
on who you ask.
as civilizational devastation?
an existential catastrophe
you say the threshold is,
put me down as a frightened optimist.
a large number of other frightened ...
the living daylights out of us.
ABOUT THE SPEAKERS
Nick Bostrom - Philosopher
Nick Bostrom asks big questions: What should we do, as individuals and as a species, to optimize our long-term prospects? Will humanity’s technological advancements ultimately destroy us?
Why you should listen
Philosopher Nick Bostrom envisioned a future full of human enhancement, nanotechnology and machine intelligence long before they became mainstream concerns. From his famous simulation argument -- which identified some striking implications of rejecting the Matrix-like idea that humans are living in a computer simulation -- to his work on existential risk, Bostrom approaches both the inevitable and the speculative using the tools of philosophy, probability theory, and scientific analysis.
Since 2005, Bostrom has led the Future of Humanity Institute, a research group of mathematicians, philosophers and scientists at Oxford University tasked with investigating the big picture for the human condition and its future. He has been referred to as one of the most important thinkers of our age.
Nick was honored as one of Foreign Policy's 2015 Global Thinkers.
His recent book Superintelligence advances the ominous idea that “the first ultraintelligent machine is the last invention that man need ever make.”
Nick Bostrom | Speaker | TED.com
Chris Anderson - TED Curator
After a long career in journalism and publishing, Chris Anderson became the curator of the TED Conference in 2002 and has developed it as a platform for identifying and disseminating ideas worth spreading.
Why you should listen
Chris Anderson is the Curator of TED, a nonprofit devoted to sharing valuable ideas, primarily through the medium of 'TED Talks' -- short talks that are offered free online to a global audience.
Chris was born in a remote village in Pakistan in 1957. He spent his early years in India, Pakistan and Afghanistan, where his parents worked as medical missionaries, and he attended an American school in the Himalayas for his early education. After boarding school in Bath, England, he went on to Oxford University, graduating in 1978 with a degree in philosophy, politics and economics.
Chris then trained as a journalist, working in newspapers and radio, including two years producing a world news service in the Seychelles Islands.
Back in the UK in 1984, Chris was captivated by the personal computer revolution and became an editor at one of the UK's early computer magazines. A year later he founded Future Publishing with a $25,000 bank loan. The new company initially focused on specialist computer publications but eventually expanded into other areas such as cycling, music, video games, technology and design, doubling in size every year for seven years. In 1994, Chris moved to the United States where he built Imagine Media, publisher of Business 2.0 magazine and creator of the popular video game users website IGN. Chris eventually merged Imagine and Future, taking the combined entity public in London in 1999, under the Future name. At its peak, it published 150 magazines and websites and employed 2,000 people.
This success allowed Chris to create a private nonprofit organization, the Sapling Foundation, with the hope of finding new ways to tackle tough global issues through media, technology, entrepreneurship and, most of all, ideas. In 2001, the foundation acquired the TED Conference, then an annual meeting of luminaries in the fields of Technology, Entertainment and Design held in Monterey, California, and Chris left Future to work full time on TED.
He expanded the conference's remit to cover all topics, including science, business and key global issues, while adding a Fellows program, which now has some 300 alumni, and the TED Prize, which grants its recipients "one wish to change the world." The TED stage has become a place for thinkers and doers from all fields to share their ideas and their work, capturing imaginations, sparking conversation and encouraging discovery along the way.
In 2006, TED experimented with posting some of its talks on the Internet. Their viral success encouraged Chris to begin positioning the organization as a global media initiative devoted to 'ideas worth spreading,' part of a new era of information dissemination using the power of online video. In June 2015, the organization posted its 2,000th talk online. The talks are free to view, and they have been translated into more than 100 languages with the help of volunteers from around the world. Viewership has grown to approximately one billion views per year.
Continuing a strategy of 'radical openness,' in 2009 Chris introduced the TEDx initiative, allowing free licenses to local organizers who wished to organize their own TED-like events. More than 8,000 such events have been held, generating an archive of 60,000 TEDx talks. And three years later, the TED-Ed program was launched, offering free educational videos and tools to students and teachers.
Chris Anderson | Speaker | TED.com