ABOUT THE SPEAKER
Tristan Harris - Design thinker
Tristan Harris helps the technology industry more consciously and ethically shape the human spirit and human potential.

Why you should listen

Tristan Harris has been called "the closest thing Silicon Valley has to a conscience" by The Atlantic magazine. Prior to founding the new Center for Humane Technology, he was Google's Design Ethicist, developing a framework for how technology should "ethically" steer the thoughts and actions of billions of people from screens.  

Harris has spent a decade understanding the invisible influences that hijack human thinking and action. Drawing on literature from addiction, performative magic, social engineering, persuasive design and behavioral economics, he is currently developing a framework for ethical persuasion, especially as it relates to the moral responsibility of technology companies.

Rolling Stone magazine named Harris one of "25 People Shaping the World" in 2017. His work has been featured on TED, "60 Minutes," HBO's "Real Time with Bill Maher," "PBS NewsHour," Recode, The Atlantic, WIRED, the New York Times, Der Spiegel, The Economist and many more. Harris has briefed heads of state, technology company CEOs and members of the US Congress about the attention economy.

TED2017

Tristan Harris: How a handful of tech companies control billions of minds every day


2,591,029 views

A handful of people at a handful of technology companies steer what billions of minds think about every day, says design thinker Tristan Harris. From Facebook notifications to Snapstreaks to YouTube autoplay, they are all competing for one thing: your attention. Harris shares how these companies exploit our psychology for their own profit, and calls for a design renaissance in which technology instead helps us live the lives we want.

00:12
I want you to imagine walking into a room, a control room with a bunch of people, a hundred people, hunched over a desk with little dials, and that that control room will shape the thoughts and feelings of a billion people.

00:32
This might sound like science fiction, but this actually exists right now, today.

00:40
I know because I used to be in one of those control rooms. I was a design ethicist at Google, where I studied how do you ethically steer people's thoughts? Because what we don't talk about is how the handful of people working at a handful of technology companies through their choices will steer what a billion people are thinking today.

01:02
Because when you pull out your phone and they design how this works or what's on the feed, it's scheduling little blocks of time in our minds. If you see a notification, it schedules you to have thoughts that maybe you didn't intend to have. If you swipe over that notification, it schedules you into spending a little bit of time getting sucked into something that maybe you didn't intend to get sucked into.

01:27
When we talk about technology, we tend to talk about it as this blue sky opportunity. It could go any direction. And I want to get serious for a moment and tell you why it's going in a very specific direction. Because it's not evolving randomly. There's a hidden goal driving the direction of all of the technology we make, and that goal is the race for our attention.

01:52
Because every new site -- TED, elections, politicians, games, even meditation apps -- have to compete for one thing, which is our attention, and there's only so much of it.

02:08
And the best way to get people's attention is to know how someone's mind works. And there's a whole bunch of persuasive techniques that I learned in college at a lab called the Persuasive Technology Lab to get people's attention.

02:21
A simple example is YouTube. YouTube wants to maximize how much time you spend. And so what do they do? They autoplay the next video. And let's say that works really well. They're getting a little bit more of people's time. Well, if you're Netflix, you look at that and say, well, that's shrinking my market share, so I'm going to autoplay the next episode. But then if you're Facebook, you say, that's shrinking all of my market share, so now I have to autoplay all the videos in the newsfeed before waiting for you to click play.
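To make the incentive concrete, here is a minimal sketch of the objective Harris is describing, assuming nothing about any company's real code: a recommender whose only criterion is predicted time on screen. The class, field, and function names are invented for illustration.

```python
# A toy "race for attention" recommender: its only objective is
# predicted time on screen. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # output of some engagement model

def pick_next_video(candidates: list[Video]) -> Video:
    # Nothing in this objective asks whether the video serves the
    # viewer's goals; it simply maximizes expected watch time.
    return max(candidates, key=lambda v: v.predicted_watch_minutes)

queue = [Video("calm tutorial", 4.2), Video("dramatic compilation", 11.7)]
print(pick_next_video(queue).title)  # -> "dramatic compilation"
```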
02:52
So the internet is not evolving at random. The reason it feels like it's sucking us in the way it is, is because of this race for attention. We know where this is going. Technology is not neutral, and it becomes this race to the bottom of the brain stem of who can go lower to get it.

03:13
Let me give you an example of Snapchat. If you didn't know, Snapchat is the number one way that teenagers in the United States communicate. So if you're like me, and you use text messages to communicate, Snapchat is that for teenagers, and there's, like, a hundred million of them that use it. And they invented a feature called Snapstreaks, which shows the number of days in a row that two people have communicated with each other. In other words, what they just did is they gave two people something they don't want to lose.

03:44
Because if you're a teenager, and you have 150 days in a row, you don't want that to go away. And so think of the little blocks of time that that schedules in kids' minds. This isn't theoretical: when kids go on vacation, it's been shown they give their passwords to up to five other friends to keep their Snapstreaks going, even when they can't do it. And they have, like, 30 of these things, and so they have to get through taking photos of just pictures or walls or ceilings just to get through their day.
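The mechanic Harris describes is easy to state in code. Below is a minimal sketch, assuming only what is said here (not Snapchat's actual implementation): the count takes months to build and a single missed day erases it, which is what makes it something "they don't want to lose."

```python
# A toy streak counter: months to build, one missed day to lose.
from datetime import date, timedelta

class Snapstreak:
    def __init__(self) -> None:
        self.days_in_a_row = 0
        self.last_snap: date | None = None

    def record_snap(self, today: date) -> None:
        if self.last_snap == today:
            return  # already counted today
        if self.last_snap is None or today - self.last_snap > timedelta(days=1):
            self.days_in_a_row = 0  # the loss: 150 days gone in one skipped day
        self.days_in_a_row += 1
        self.last_snap = today

s = Snapstreak()
s.record_snap(date(2017, 4, 24))
s.record_snap(date(2017, 4, 25))
s.record_snap(date(2017, 4, 27))  # skipped a day: streak restarts
print(s.days_in_a_row)  # -> 1
```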
04:13
So it's not even like they're having real conversations. We have a temptation to think about this as, oh, they're just using Snapchat the way we used to gossip on the telephone. It's probably OK. Well, what this misses is that in the 1970s, when you were just gossiping on the telephone, there wasn't a hundred engineers on the other side of the screen who knew exactly how your psychology worked and orchestrated you into a double bind with each other.

04:38
Now, if this is making you feel a little bit of outrage, notice that that thought just comes over you. Outrage is a really good way also of getting your attention, because we don't choose outrage. It happens to us. And if you're the Facebook newsfeed, whether you'd want to or not, you actually benefit when there's outrage. Because outrage doesn't just schedule a reaction in emotional time, space, for you. We want to share that outrage with other people. So we want to hit share and say, "Can you believe the thing that they said?"

05:12
And so outrage works really well at getting attention, such that if Facebook had a choice between showing you the outrage feed and a calm newsfeed, they would want to show you the outrage feed, not because someone consciously chose that, but because that worked better at getting your attention.
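No conscious choice is needed for that outcome: a ranking rule scored purely on predicted engagement produces it on its own. A minimal sketch, with invented post data, of how outrage "wins" without anyone choosing it:

```python
# Toy feed ranking: order posts purely by a predicted-engagement score.
# No one "chooses outrage"; outrage-bait simply tends to score higher.
posts = [
    {"text": "Local library extends hours", "predicted_engagement": 0.11},
    {"text": "Can you BELIEVE what they said?!", "predicted_engagement": 0.63},
]
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)
print([p["text"] for p in feed])  # the outrage post ranks first
```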
05:31
And the newsfeed control room is not accountable to us. It's only accountable to maximizing attention. It's also accountable, because of the business model of advertising, for anybody who can pay the most to actually walk into the control room and say, "That group over there, I want to schedule these thoughts into their minds." So you can target, you can precisely target a lie directly to the people who are most susceptible. And because this is profitable, it's only going to get worse.

06:05
So I'm here today because the costs are so obvious. I don't know a more urgent problem than this, because this problem is underneath all other problems. It's not just taking away our agency to spend our attention and live the lives that we want, it's changing the way that we have our conversations, it's changing our democracy, and it's changing our ability to have the conversations and relationships we want with each other. And it affects everyone, because a billion people have one of these in their pocket.

06:45
So how do we fix this? We need to make three radical changes to technology and to our society.

06:55
The first is we need to acknowledge that we are persuadable. Once you start understanding that your mind can be scheduled into having little thoughts or little blocks of time that you didn't choose, wouldn't we want to use that understanding and protect against the way that that happens? I think we need to see ourselves fundamentally in a new way. It's almost like a new period of human history, like the Enlightenment, but almost a kind of self-aware Enlightenment, that we can be persuaded, and there might be something we want to protect.

07:27
The second is we need new models and accountability systems so that as the world gets better and more and more persuasive over time -- because it's only going to get more persuasive -- that the people in those control rooms are accountable and transparent to what we want. The only form of ethical persuasion that exists is when the goals of the persuader are aligned with the goals of the persuadee. And that involves questioning big things, like the business model of advertising.

07:54
Lastly, we need a design renaissance, because once you have this view of human nature, that you can steer the timelines of a billion people -- just imagine, there's people who have some desire about what they want to do and what they want to be thinking and what they want to be feeling and how they want to be informed, and we're all just tugged into these other directions. And you have a billion people just tugged into all these different directions. Well, imagine an entire design renaissance that tried to orchestrate the exact and most empowering, time-well-spent way for those timelines to happen. And that would involve two things: one would be protecting against the timelines that we don't want to be experiencing, the thoughts that we wouldn't want to be happening, so that when that ding happens, not having the ding that sends us away; and the second would be empowering us to live out the timeline that we want.

08:43
So let me give you a concrete example. Today, let's say your friend cancels dinner on you, and you are feeling a little bit lonely. And so what do you do in that moment? You open up Facebook. And in that moment, the designers in the control room want to schedule exactly one thing, which is to maximize how much time you spend on the screen.

09:06
Now, instead, imagine if those designers created a different timeline that was the easiest way, using all of their data, to actually help you get out with the people that you care about? Just think, alleviating all loneliness in society, if that was the timeline that Facebook wanted to make possible for people. Or imagine a different conversation. Let's say you wanted to post something supercontroversial on Facebook, which is a really important thing to be able to do, to talk about controversial topics. And right now, when there's that big comment box, it's almost asking you, what key do you want to type? In other words, it's scheduling a little timeline of things you're going to continue to do on the screen. And imagine instead that there was another button there saying, what would be most time well spent for you? And you click "host a dinner." And right there underneath the item it said, "Who wants to RSVP for the dinner?" And so you'd still have a conversation about something controversial, but you'd be having it in the most empowering place on your timeline, which would be at home that night with a bunch of friends over to talk about it.

10:09
So imagine we're running, like, a find and replace on all of the timelines that are currently steering us towards more and more screen time persuasively and replacing all of those timelines with what we want in our lives.
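The "find and replace" Harris imagines is a change of objective, not of data or machinery. A minimal sketch under that assumption, with invented post and goal names: the same ranking loop as before, but scored against the user's own stated goals instead of predicted screen time.

```python
# The same toy ranker with its objective "found and replaced":
# score items by the user's stated goals, not predicted screen time.
def rank_feed(posts: list[dict], user_goals: set[str]) -> list[dict]:
    def alignment(post: dict) -> int:
        # count how many of the user's goals this item supports
        return len(user_goals & post["supports"])
    return sorted(posts, key=alignment, reverse=True)

posts = [
    {"text": "Autoplay: next video", "supports": set()},
    {"text": "Host a dinner -- 3 friends can RSVP",
     "supports": {"see friends in person"}},
]
print(rank_feed(posts, {"see friends in person"})[0]["text"])  # the dinner wins
```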
10:26
It doesn't have to be this way. Instead of handicapping our attention, imagine if we used all of this data and all of this power and this new view of human nature to give us a superhuman ability to focus and a superhuman ability to put our attention to what we cared about and a superhuman ability to have the conversations that we need to have for democracy.

10:51
The most complex challenges in the world require not just us to use our attention individually. They require us to use our attention and coordinate it together. Climate change is going to require that a lot of people are being able to coordinate their attention in the most empowering way together. And imagine creating a superhuman ability to do that.

11:19
Sometimes the world's most pressing and important problems are not these hypothetical future things that we could create in the future. Sometimes the most pressing problems are the ones that are right underneath our noses, the things that are already directing a billion people's thoughts. And maybe instead of getting excited about the new augmented reality and virtual reality and these cool things that could happen, which are going to be susceptible to the same race for attention, if we could fix the race for attention on the thing that's already in a billion people's pockets. Maybe instead of getting excited about the most exciting new cool fancy education apps, we could fix the way kids' minds are getting manipulated into sending empty messages back and forth.

12:04
(Applause)

12:08
Maybe instead of worrying about hypothetical future runaway artificial intelligences that are maximizing for one goal, we could solve the runaway artificial intelligence that already exists right now, which are these newsfeeds maximizing for one thing. It's almost like instead of running away to colonize new planets, we could fix the one that we're already on.

12:32
(Applause)

12:40
Solving this problem is critical infrastructure for solving every other problem. There's nothing in your life or in our collective problems that does not require our ability to put our attention where we care about.

12:55
At the end of our lives, all we have is our attention and our time. What will be time well spent for ours?

13:03
Thank you.

13:04
(Applause)

13:17
Chris Anderson: Tristan, thank you. Hey, stay up here a sec. First of all, thank you. I know we asked you to do this talk on pretty short notice, and you've had quite a stressful week getting this thing together, so thank you.

13:30
Some people listening might say, what you complain about is addiction, and all these people doing this stuff, for them it's actually interesting. All these design decisions have built user content that is fantastically interesting. The world's more interesting than it ever has been. What's wrong with that?

13:46
Tristan Harris: I think it's really interesting. One way to see this is if you're just YouTube, for example, you want to always show the more interesting next video. You want to get better and better at suggesting that next video, but even if you could propose the perfect next video that everyone would want to watch, it would just be better and better at keeping you hooked on the screen. So what's missing in that equation is figuring out what our boundaries would be. You would want YouTube to know something about, say, falling asleep. The CEO of Netflix recently said, "our biggest competitors are Facebook, YouTube and sleep." And so what we need to recognize is that the human architecture is limited and that we have certain boundaries or dimensions of our lives that we want to be honored and respected, and technology could help do that.

14:28
(Applause)

14:31
CA: I mean, could you make the case that part of the problem here is that we've got a naïve model of human nature? So much of this is justified in terms of human preference, where we've got these algorithms that do an amazing job of optimizing for human preference, but which preference? There's the preferences of things that we really care about when we think about them versus the preferences of what we just instinctively click on. If we could implant that more nuanced view of human nature in every design, would that be a step forward?

15:01
TH: Absolutely. I mean, I think right now it's as if all of our technology is basically only asking our lizard brain what's the best way to just impulsively get you to do the next tiniest thing with your time, instead of asking you in your life what would be most time well spent for you? What would be the perfect timeline that might include something later -- would be time well spent for you here at TED in your last day here?

15:22
CA: So if Facebook and Google and everyone said to us first up, "Hey, would you like us to optimize for your reflective brain or your lizard brain? You choose."

15:29
TH: Right. That would be one way. Yes.
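Anderson's hypothetical is, in code terms, just a user-selectable objective. A minimal sketch of that idea, with invented item fields standing in for the two kinds of preference he contrasts:

```python
# Sketch of the "you choose" idea: the user picks which objective
# the ranker optimizes. Field names are hypothetical.
OBJECTIVES = {
    "lizard": lambda item: item["predicted_clicks"],          # impulsive preference
    "reflective": lambda item: item["matches_stated_goals"],  # considered preference
}

def rank(items: list[dict], mode: str) -> list[dict]:
    return sorted(items, key=OBJECTIVES[mode], reverse=True)

items = [
    {"name": "rage clip", "predicted_clicks": 0.9, "matches_stated_goals": 0},
    {"name": "event with friends", "predicted_clicks": 0.2, "matches_stated_goals": 1},
]
print(rank(items, "lizard")[0]["name"])      # -> rage clip
print(rank(items, "reflective")[0]["name"])  # -> event with friends
```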
15:34
CA: You said persuadability, that's an interesting word to me because to me there's two different types of persuadability. There's the persuadability that we're trying right now of reason and thinking and making an argument, but I think you're almost talking about a different kind, a more visceral type of persuadability, of being persuaded without even knowing that you're thinking.

15:52
TH: Exactly. The reason I care about this problem so much is I studied at a lab called the Persuasive Technology Lab at Stanford that taught people exactly these techniques. There's conferences and workshops that teach people all these covert ways of getting people's attention and orchestrating people's lives. And it's because most people don't know that that exists that this conversation is so important.

16:11
CA: Tristan, you and I, we both know so many people from all these companies. There are actually many here in the room, and I don't know about you, but my experience of them is that there is no shortage of good intent. People want a better world. They are actually -- they really want it. And I don't think anything you're saying is that these are evil people. It's a system where there's these unintended consequences that have really got out of control --

16:38
TH: Of this race for attention. It's the classic race to the bottom when you have to get attention, and it's so tense. The only way to get more is to go lower on the brain stem, to go lower into outrage, to go lower into emotion, to go lower into the lizard brain.

16:51
CA: Well, thank you so much for helping us all get a little bit wiser about this. Tristan Harris, thank you.

TH: Thank you very much.

16:57
(Applause)
