ABOUT THE SPEAKER
Sebastian Deterding - Designer
Sebastian Deterding is an interface designer who thinks deeply about persuasive and gameful design.

Why you should listen

Sebastian Deterding is a designer and researcher working on user experience, video games, persuasive technology and gameful design. He is interested in how code shapes conduct -- and how to put that knowledge into practice. He is a PhD researcher in Communication at the Graduate School of the Research Center for Media and Communication, Hamburg University. He is also an affiliated researcher at the Hans Bredow Institute for Media Research in Hamburg, and works as an independent user experience designer.

TEDxHogeschoolUtrecht

Sebastian Deterding: What your designs say about you

667,868 views

What does your chair say about what you value? Designer Sebastian Deterding shows how our visions of morality and "the good life" are reflected in the design of objects around us.

00:16
We are today talking about moral persuasion: What is moral and immoral in trying to change people's behaviors by using technology and using design? And I don't know what you expect, but when I was thinking about that issue, I early on realized that what I'm not able to give you are answers.

00:33
I'm not able to tell you what is moral or immoral, because we're living in a pluralist society. My values can be radically different from your values, which means that what I consider moral or immoral based on that might not necessarily be what you consider moral or immoral. But I also realized that there is one thing that I could give you, and that is what this guy behind me gave the world -- Socrates. It is questions. What I can do and what I would like to do with you is give you, like that initial question, a set of questions to figure out for yourself, layer by layer, like peeling an onion, getting at the core of what you believe is moral or immoral persuasion.

01:16
And I'd like to do that with a couple of examples of technologies where people have used game elements to get people to do things. So here's a first, very simple, very obvious question I would like to give you: What are your intentions if you are designing something?

01:33
And obviously intentions are not the only thing, so here is another example for one of these applications. There are a couple of these kinds of eco-dashboards right now -- dashboards built into cars which try to motivate you to drive more fuel efficiently. This here is Nissan's MyLeaf, where your driving behavior is compared with the driving behavior of other people, so you can compete for who drives around the most fuel efficiently. And these things are very effective, it turns out -- so effective that they motivate people to engage in unsafe driving behaviors, like not stopping at a red light, because that way you have to stop and restart the engine, and that would use quite some fuel, wouldn't it? So despite this being a very well-intended application, obviously there was a side effect of that.
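
To make that incentive concrete: here is a hypothetical mini-leaderboard in the spirit of such an eco-dashboard -- a sketch with invented names, not Nissan's actual system. Drivers are ranked on a single proxy metric, fuel economy, so anything that raises that number counts as winning, including rolling through red lights:

```python
# A hypothetical eco-dashboard leaderboard (a sketch, not Nissan's
# implementation): drivers are scored on one proxy metric, fuel economy,
# so whatever raises that number "wins" -- including unsafe tricks like
# never letting the engine stop and restart at a red light.
from dataclasses import dataclass

@dataclass
class Trip:
    driver: str
    km_driven: float
    liters_used: float

    @property
    def km_per_liter(self) -> float:
        return self.km_driven / self.liters_used

def leaderboard(trips: list[Trip]) -> list[tuple[str, float]]:
    """Rank drivers by fuel economy alone -- safety never enters the score."""
    return sorted(((t.driver, round(t.km_per_liter, 1)) for t in trips),
                  key=lambda pair: pair[1], reverse=True)

trips = [
    Trip("cautious driver", km_driven=100, liters_used=6.0),   # stops at red lights
    Trip("red-light roller", km_driven=100, liters_used=5.4),  # keeps momentum, unsafely
]
print(leaderboard(trips))  # the unsafe driver tops the list
```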

02:17
And here's another example for one of these side effects. Commendable: a site that allows parents to give their kids little badges for doing the things that parents want their kids to do -- like tying their shoes. And at first that sounds very nice, very benign, well intended. But it turns out, if you look into research on people's mindset, that caring about outcomes, caring about public recognition, caring about these kinds of public tokens of recognition is not necessarily very helpful for your long-term psychological well-being. It's better if you care about learning something. It's better when you care about yourself than about how you appear in front of other people.
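
The mechanism behind a site like Commendable is tiny, which is exactly why its values are worth noticing. In a minimal sketch (invented names, not Commendable's real API), the only thing the system models is the public token of recognition -- nothing about what the child actually learned:

```python
# A hypothetical badge-granting rule in the style of Commendable (names
# invented for illustration): the system stores only the public token of
# recognition, not anything about the child's own experience of learning.
badges: dict[str, list[str]] = {}  # child -> publicly visible badges

def award_badge(child: str, task_done: str) -> None:
    """A parent rewards an outcome with a public token of recognition."""
    badges.setdefault(child, []).append(f"badge: {task_done}")

award_badge("Alex", "tied shoes")
print(badges)  # {'Alex': ['badge: tied shoes']} -- status, not learning
```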

02:56
So that kind of motivational tool that is used actually in and of itself has a long-term side effect, in that every time we use a technology that uses something like public recognition or status, we're actually positively endorsing this as a good and a normal thing to care about -- that way, possibly having a detrimental effect on the long-term psychological well-being of ourselves as a culture. So that's a second, very obvious question: What are the effects of what you're doing? The effects that you're having with the device, like less fuel, as well as the effects of the actual tools you're using to get people to do things -- public recognition.

03:36
Now is that all -- intention, effect? Well, there are some technologies which obviously combine both: good long-term and short-term effects and a positive intention, like Fred Stutzman's Freedom. The whole point of that application is -- well, we're usually so bombarded with calls and requests by other people -- that with this device you can shut off the Internet connectivity of your PC of choice for a preset amount of time, to actually get some work done. And I think most of us will agree that's something well intended that also has good consequences.
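
The commitment device behind an application like Freedom is simple enough to sketch. Here is a minimal toy version of the same idea, assuming a hosts-file trick and a made-up site list -- Freedom itself works differently, and rewriting the hosts file requires administrator rights:

```python
# A toy Freedom-style blocker (a sketch of the idea, not how Freedom
# actually works): make distracting sites unreachable for a preset,
# self-imposed window by pointing them at localhost in the hosts file.
# Requires administrator rights to modify the hosts file.
import time

HOSTS = "/etc/hosts"  # on Windows: r"C:\Windows\System32\drivers\etc\hosts"
MARKER = "# focus-block"
SITES = ["twitter.com", "news.ycombinator.com"]  # whatever distracts you

def block_sites() -> None:
    with open(HOSTS, "a") as f:
        for site in SITES:
            f.write(f"127.0.0.1 {site} {MARKER}\n")

def unblock_sites() -> None:
    with open(HOSTS) as f:
        keep = [line for line in f if MARKER not in line]
    with open(HOSTS, "w") as f:
        f.writelines(keep)

def focus(minutes: int) -> None:
    block_sites()
    try:
        time.sleep(minutes * 60)  # the preset amount of time, non-negotiable
    finally:
        unblock_sites()  # connectivity returns only when the window is over

if __name__ == "__main__":
    focus(25)  # get some work done
```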

04:07
In the words of Michel Foucault, "It is a technology of the self." It is a technology that empowers the individual to determine their own life course, to shape themselves. But the problem is, as Foucault points out, that every technology of the self has a technology of domination as its flip side. As you see in today's modern liberal democracies, the society, the state, not only allows us to determine our self, to shape our self, it also demands it of us. It demands that we optimize ourselves, that we control ourselves, that we self-manage continuously, because that's the only way in which such a liberal society works. These technologies want us to stay in the game that society has devised for us. They want us to fit in even better. They want us to optimize ourselves to fit in.

05:01
Now, I don't say that is necessarily a bad thing. I just think that this example points us to a general realization, and that is: no matter what technology or design you look at, even something we consider as well intended and as good in its effects -- like Stutzman's Freedom -- comes with certain values embedded in it. And we can question these values. We can question: Is it a good thing that all of us continuously self-optimize to fit better into that society? Or, to give you another example, what about a piece of persuasive technology that convinces Muslim women to wear their headscarves? Is that a good or a bad technology in its intentions or in its effects? Well, that basically depends on the kind of values that you bring to bear to make these kinds of judgments. So that's a third question: What values do you use to judge?

05:53
And speaking of values: I've noticed that in the discussion about moral persuasion online, and when I'm talking with people, more often than not there is a weird bias. And that bias is that we're asking: Is this or that "still" ethical? Is it "still" permissible? We're asking things like: Is this Oxfam donation form -- where the regular monthly donation is the preset default, and people, maybe without intending it, are that way encouraged or nudged into giving a regular donation instead of a one-time donation -- is that still permissible? Is it still ethical? We're fishing at the low end.
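
The nudge in question is nothing more than a pre-selected option. A hypothetical rendering of such a donation form (not Oxfam's actual markup) makes the question concrete: a single pre-checked choice steers the undecided donor toward the recurring gift:

```python
# A hypothetical donation form in the spirit of the Oxfam example (not
# their real markup): the entire nudge is which option ships pre-selected.
def donation_form(default: str = "monthly") -> str:
    rows = []
    for option in ["one-time", "monthly"]:
        checked = " checked" if option == default else ""  # the nudge lives here
        rows.append(f'<label><input type="radio" name="frequency" '
                    f'value="{option}"{checked}> {option} donation</label>')
    return "\n".join(rows)

print(donation_form())            # monthly pre-selected: the nudged path
print(donation_form("one-time"))  # same form, different default, different ethics
```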

06:30
But in fact, that question, "Is it still ethical?", is just one way of looking at ethics. Because if you look at the beginning of ethics in Western culture, you see a very different idea of what ethics also could be. For Aristotle, ethics was not about the question: Is that still good, or is it bad? Ethics was about the question of how to live life well. And he put that in the word "arete," which we, from the [Ancient Greek], translate as "virtue." But really it means excellence. It means living up to your own full potential as a human being.

07:07
And that is an idea that, I think, Richard Buchanan nicely put in a recent essay where he said, "Products are vivid arguments about how we should live our lives." Our designs are not ethical or unethical in that they're using ethical or unethical means of persuading us. They have a moral component just in the kind of vision and the aspiration of the good life that they present to us.

07:32
And if you look into the designed environment around us with that kind of lens, asking, "What is the vision of the good life that our products, our design, present to us?", then you often get the shivers, because of how little we expect of each other, of how little we actually seem to expect of our life and of what the good life looks like. So that's the fourth question I'd like to leave you with: What vision of the good life do your designs convey?

08:01
And speaking of design, you notice that I already broadened the discussion, because it's not just persuasive technology that we're talking about here, it's any piece of design that we put out here in the world. I don't know whether you know the great communication researcher Paul Watzlawick who, back in the '60s, made the argument that we cannot not communicate. Even if we choose to be silent, we chose to be silent -- we're communicating something by choosing to be silent. And in the same way that we cannot not communicate, we cannot not persuade. Whatever we do or refrain from doing, whatever we put out there as a piece of design into the world, has a persuasive component. It tries to affect people. It puts a certain vision of the good life out there in front of us.

08:47
Which is what Peter-Paul Verbeek, the Dutch philosopher of technology, says: no matter whether we as designers intend it or not, we materialize morality. We make certain things harder and easier to do. We organize the existence of people. We put a certain vision of what good or bad or normal or usual is in front of people by everything we put out there in the world.

09:11
Even something as innocuous as a set of school chairs is a persuasive technology, because it presents and materializes a certain vision of the good life -- a good life in which teaching and learning and listening is about one person teaching, the others listening; in which learning is done while sitting; in which you learn for yourself; in which you're not supposed to change these rules, because the chairs are fixed to the ground.

09:39
And even something as innocuous as a single design chair -- like this one by Arne Jacobsen -- is a persuasive technology, because, again, it communicates an idea of the good life. A good life -- a life that you as a designer consent to by saying, "In the good life, goods are produced as sustainably or unsustainably as this chair. Workers are treated as well or as badly as the workers were treated who built that chair."

10:03
The good life is a life where design is important, because somebody obviously took the time and spent the money for that kind of well-designed chair; where tradition is important, because this is a traditional classic and someone cared about this; and where there is such a thing as conspicuous consumption, where it is okay and normal to spend a humongous amount of money on such a chair to signal to other people what your social status is.

10:25
So these are the kinds of layers, the kinds of questions I wanted to lead you through today -- the questions of: What are the intentions that you bring to bear when you're designing something? What are the effects, intended and unintended, that you're having? What are the values you're using to judge those? What are the virtues, the aspirations that you're actually expressing in that? And how does that apply, not just to persuasive technology, but to everything you design?

10:52
Do we stop there? I don't think so. I think that all of these things are eventually informed by the core of all of this -- and this is nothing but life itself. Why, when the question of what the good life is informs everything that we design, should we stop at design and not ask ourselves: How does it apply to our own life? "Why should the lamp or the house be an art object, but not our life?", as Michel Foucault puts it.

11:21
Just to give you a practical example: Buster Benson. This is Buster setting up a pull-up machine at the office of his new startup, Habit Labs, where they're trying to build other applications like Health Month for people. And why is he building a thing like this? Well, here is the set of axioms that Habit Labs, Buster's startup, put up for themselves on how they wanted to work together as a team when they're building these applications -- a set of moral principles they set themselves for working together -- one of them being, "We take care of our own health and manage our own burnout."

11:54
Because ultimately, how can you ask yourselves -- and how can you find an answer -- about what vision of the good life you want to convey and create with your designs, without asking the question: What vision of the good life do you yourself want to live? And with that, I thank you.

12:15
(Applause)
Translated by Timothy Covell
Reviewed by Morton Bast
