ABOUT THE SPEAKER
Rachel Botsman - Trust researcher
Rachel Botsman is a recognized expert on how collaboration and trust enabled by digital technologies will change the way we live, work, bank and consume.

Why you should listen

Rachel Botsman is an author and a visiting academic at the University of Oxford, Saïd Business School. Her work focuses on how technology is enabling trust in ways that are changing the way we live, work, bank and consume. She defined the theory of "collaborative consumption" in her first book, What's Mine Is Yours, which she co-authored with Roo Rogers. The concept was subsequently named by TIME as one of the "10 Ideas that Will Change the World" and by Thinkers50 as the 2015 Breakthrough Idea.

Named a "Young Global Leader" by the World Economic Forum, Botsman examines the growth and challenges of start-ups such as Airbnb, TaskRabbit and Uber. She is regular writer and commentator in leading international publications including the New York Times, The Wall Street Journal, Harvard Business Review, The Economist, WIRED and more. She is currently writing a new book that explores why the real disruption happening isn’t technology; it’s a profound shift in trust.

TEDSummit

Rachel Botsman: We've stopped trusting institutions and started trusting strangers

1,808,816 views

Something profound is changing our concept of trust, says Rachel Botsman. While we used to place our trust in institutions like governments and banks, today we increasingly rely on others, often strangers, on platforms like Airbnb and Uber and through technologies like the blockchain. This new era of trust could bring with it a more transparent, inclusive and accountable society -- if we get it right. Who do you trust?


00:12
Let's talk about trust. We all know trust is fundamental, but when it comes to trusting people, something profound is happening.

00:25
Please raise your hand if you have ever been a host or a guest on Airbnb. Wow. That's a lot of you. Who owns Bitcoin? Still a lot of you. OK. And please raise your hand if you've ever used Tinder to help you find a mate.

00:44
(Laughter)

00:46
This one's really hard to count because you're kind of going like this.

00:49
(Laughter)

00:51
These are all examples of how technology is creating new mechanisms that are enabling us to trust unknown people, companies and ideas. And yet at the same time, trust in institutions -- banks, governments and even churches -- is collapsing. So what's happening here, and who do you trust?
01:14
Let's start in France with a platform -- with a company, I should say -- with a rather funny-sounding name, BlaBlaCar. It's a platform that matches drivers and passengers who want to share long-distance journeys together. The average ride taken is 320 kilometers, so it's a good idea to choose your fellow travelers wisely.

01:39
Social profiles and reviews help people make a choice. You can see if someone's a smoker, you can see what kind of music they like, you can see if they're going to bring their dog along for the ride. But it turns out that the key social identifier is how much you're going to talk in the car.

01:58
(Laughter)

02:00
Bla, not a lot; bla bla, you want a nice bit of chitchat; and bla bla bla, you're not going to stop talking the entire way from London to Paris.

02:09
(Laughter)

02:12
It's remarkable, right, that this idea works at all, because it's counter to the lesson most of us were taught as a child: never get in a car with a stranger. And yet, BlaBlaCar transports more than four million people every single month. To put that in context, that's more passengers than the Eurostar or JetBlue airlines carry.

02:35
BlaBlaCar is a beautiful illustration of how technology is enabling millions of people across the world to take a trust leap. A trust leap happens when we take the risk to do something new or different to the way that we've always done it.

02:52
Let's try to visualize this together. OK. I want you to close your eyes. There is a man staring at me with his eyes wide open. I'm on this big red circle. I can see. So close your eyes.

03:06
(Laughter) (Applause)

03:09
I'll do it with you. And I want you to imagine there exists a gap between you and something unknown. That unknown can be someone you've just met. It can be a place you've never been to. It can be something you've never tried before. You got it? OK. You can open your eyes now.

03:28
For you to leap from a place of certainty, to take a chance on that someone or something unknown, you need a force to pull you over the gap, and that remarkable force is trust.
03:42
Trust is an elusive concept, and yet we depend on it for our lives to function. I trust my children when they say they're going to turn the lights out at night. I trusted the pilot who flew me here to keep me safe. It's a word we use a lot, without always thinking about what it really means and how it works in different contexts of our lives. There are, in fact, hundreds of definitions of trust, and most can be reduced to some kind of risk assessment of how likely it is that things will go right.

04:18
But I don't like this definition of trust, because it makes trust sound rational and predictable, and it doesn't really get to the human essence of what it enables us to do and how it empowers us to connect with other people. So I define trust a little differently. I define trust as a confident relationship to the unknown. Now, when you view trust through this lens, it starts to explain why it has the unique capacity to enable us to cope with uncertainty, to place our faith in strangers, to keep moving forward.

04:56
Human beings are remarkable at taking trust leaps. Do you remember the first time you put your credit card details into a website? That's a trust leap. I distinctly remember telling my dad that I wanted to buy a navy blue secondhand Peugeot on eBay, and he rightfully pointed out that the seller's name was "Invisible Wizard" and that this probably was not such a good idea.

05:22
(Laughter)

05:24
So my work, my research focuses on how technology is transforming the social glue of society, trust between people, and it's a fascinating area to study, because there's still so much we do not know. For instance, do men and women trust differently in digital environments? Does the way we build trust face-to-face translate online? Does trust transfer? So if you trust finding a mate on Tinder, are you more likely to trust finding a ride on BlaBlaCar?
05:56
But from studying hundreds of networks and marketplaces, there is a common pattern that people follow, and I call it "climbing the trust stack." Let me use BlaBlaCar as an example to bring it to life.

06:09
On the first level, you have to trust the idea. So you have to trust that the idea of ride-sharing is safe and worth trying. The second level is about having confidence in the platform, that BlaBlaCar will help you if something goes wrong. And the third level is about using little bits of information to decide whether the other person is trustworthy.

06:34
Now, the first time we climb the trust stack, it feels weird, even risky, but we get to a point where these ideas seem totally normal. Our behaviors transform, often relatively quickly. In other words, trust enables change and innovation.

06:55
So an idea that intrigued me, and I'd like you to consider, is whether we can better understand major waves of disruption and change in individuals and in society through the lens of trust.

07:07
Well, it turns out that trust has only evolved in three significant chapters throughout the course of human history: local, institutional and what we're now entering, distributed.

07:20
So for a long time, until the mid-1800s, trust was built around tight-knit relationships. So say I lived in a village with the first five rows of this audience, and we all knew one another, and say I wanted to borrow money. The man who had his eyes wide open, he might lend it to me, and if I didn't pay him back, you'd all know I was dodgy. I would get a bad reputation, and you would refuse to do business with me in the future. Trust was mostly local and accountability-based.

07:53
In the mid-19th century, society went through a tremendous amount of change. People moved to fast-growing cities such as London and San Francisco, and a local banker here was replaced by large corporations that didn't know us as individuals. We started to place our trust into black box systems of authority, things like legal contracts and regulation and insurance, and less trust directly in other people. Trust became institutional and commission-based.
08:27
It's widely talked about how trust in institutions and many corporate brands has been steadily declining and continues to do so. I am constantly stunned by major breaches of trust: the News Corp phone hacking, the Volkswagen emissions scandal, the widespread abuse in the Catholic Church, the fact that only one measly banker went to jail after the great financial crisis, or more recently the Panama Papers, which revealed how the rich can exploit offshore tax regimes.

09:04
And the thing that really surprises me is why do leaders find it so hard to apologize, I mean sincerely apologize, when our trust is broken?

09:17
It would be easy to conclude that institutional trust isn't working because we are fed up with the sheer audacity of dishonest elites, but what's happening now runs deeper than the rampant questioning of the size and structure of institutions. We're starting to realize that institutional trust wasn't designed for the digital age. Conventions of how trust is built, managed, lost and repaired -- in brands, leaders and entire systems -- are being turned upside down.

09:51
Now, this is exciting, but it's frightening, because it forces many of us to have to rethink how trust is built and destroyed with our customers, with our employees, even our loved ones.

10:05
The other day, I was talking to the CEO of a leading international hotel brand, and as is often the case, we got onto the topic of Airbnb. And he admitted to me that he was perplexed by their success. He was perplexed at how a company that depends on the willingness of strangers to trust one another could work so well across 191 countries. So I said to him that I had a confession to make, and he looked at me a bit strangely, and I said -- and I'm sure many of you do this as well -- I don't always bother to hang my towels up when I'm finished in the hotel, but I would never do this as a guest on Airbnb. And the reason why I would never do this as a guest on Airbnb is because guests know that they'll be rated by hosts, and that those ratings are likely to impact their ability to transact in the future.

11:02
It's a simple illustration of how online trust will change our behaviors in the real world, make us more accountable in ways we cannot yet even imagine.
11:14
I am not saying we do not need hotels or traditional forms of authority. But what we cannot deny is that the way trust flows through society is changing, and it's creating this big shift away from the 20th century that was defined by institutional trust towards the 21st century that will be fueled by distributed trust. Trust is no longer top-down. It's being unbundled and inverted. It's no longer opaque and linear. A new recipe for trust is emerging that once again is distributed amongst people and is accountability-based.

11:58
And this shift is only going to accelerate with the emergence of the blockchain, the innovative ledger technology underpinning Bitcoin. Now let's be honest, getting our heads around the way blockchain works is mind-blowing. And one of the reasons why is it involves processing some pretty complicated concepts with terrible names. I mean, cryptographic algorithms and hash functions, and people called miners, who verify transactions -- all that was created by this mysterious person or persons called Satoshi Nakamoto. Now, that is a massive trust leap that hasn't happened yet.

12:44
(Applause)

12:47
But let's try to imagine this. So "The Economist" eloquently described the blockchain as the great chain of being sure about things. The easiest way I can describe it is to imagine the blocks as spreadsheets, and they are filled with assets. So that could be a property title. It could be a stock trade. It could be a creative asset, such as the rights to a song. Every time something moves from one place on the register to somewhere else, that asset transfer is time-stamped and publicly recorded on the blockchain. It's that simple. Right.
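The ledger idea she describes -- every asset transfer time-stamped and publicly recorded, each entry chained to the one before -- can be sketched in a few lines of Python. This is a minimal toy under stated assumptions, not Bitcoin's actual protocol (no mining, no network, no consensus); all function and asset names here are hypothetical.

```python
import hashlib
import json
import time

def make_block(asset, owner_from, owner_to, prev_hash):
    """Record one asset transfer: time-stamped and chained to the previous entry."""
    record = {
        "asset": asset,          # e.g. a property title or the rights to a song
        "from": owner_from,
        "to": owner_to,
        "timestamp": time.time(),
        "prev_hash": prev_hash,  # links this block to the one before it
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(payload).hexdigest()
    return record

def verify_chain(chain):
    """Anyone reading the public record can re-check that each block points at its predecessor."""
    return all(curr["prev_hash"] == prev["hash"]
               for prev, curr in zip(chain, chain[1:]))

# A tiny ledger: the song rights move from author to label, then label to buyer.
genesis = make_block("song rights", "author", "label", prev_hash="0" * 64)
second = make_block("song rights", "label", "buyer", prev_hash=genesis["hash"])
chain = [genesis, second]

print(verify_chain(chain))     # an intact chain verifies
second["prev_hash"] = "tampered"
print(verify_chain(chain))     # rewriting history breaks the links and is detectable
```

Because each block commits to the hash of the one before it, altering any past transfer invalidates every later link -- which is what lets strangers rely on the record without trusting each other.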
13:28
So the real implication of the blockchain is that it removes the need for any kind of third party, such as a lawyer, or a trusted intermediary, or maybe not a government intermediary, to facilitate the exchange. So if we go back to the trust stack, you still have to trust the idea, you have to trust the platform, but you don't have to trust the other person in the traditional sense.

13:54
The implications are huge. In the same way the internet blew open the doors to an age of information available to everyone, the blockchain will revolutionize trust on a global scale.

14:08
Now, I've waited to the end intentionally to mention Uber, because I recognize that it is a contentious and widely overused example, but in the context of a new era of trust, it's a great case study. Now, we will see cases of abuse of distributed trust. We've already seen this, and it can go horribly wrong.

14:30
I am not surprised that we are seeing protests from taxi associations all around the world trying to get governments to ban Uber based on claims that it is unsafe. I happened to be in London the day that these protests took place, and I happened to notice a tweet from Matt Hancock, who is a British minister for business. And he wrote, "Does anyone have details of this #Uber app everyone's talking about?

14:57
(Laughter)

14:59
I'd never heard of it until today."

15:03
Now, the taxi associations, they legitimized the first layer of the trust stack. They legitimized the idea that they were trying to eliminate, and sign-ups increased by 850 percent in 24 hours. Now, this is a really strong illustration of how once a trust shift has happened around a behavior or an entire sector, you cannot reverse the story.

15:31
Every day, five million people will take a trust leap and ride with Uber. In China, on Didi, the ride-sharing platform, 11 million rides are taken every day. That's 127 rides per second, showing that this is a cross-cultural phenomenon.

15:49
And the fascinating thing is that both drivers and passengers report that seeing a name and seeing someone's photo and their rating makes them feel safer, and as you may have experienced, even behave a little more nicely in the taxi cab.

16:07
Uber and Didi are early but powerful examples of how technology is creating trust between people in ways and on a scale never possible before.

16:19
Today, many of us are comfortable getting into cars driven by strangers. We meet up with someone we swiped right to be matched with. We share our homes with people we do not know. This is just the beginning, because the real disruption happening isn't technological. It's the trust shift it creates, and for my part, I want to help people understand this new era of trust so that we can get it right and we can embrace the opportunities to redesign systems that are more transparent, inclusive and accountable.

16:58
Thank you very much.

17:00
(Applause)

17:02
Thank you.

17:03
(Applause)



Data provided by TED.
