ABOUT THE SPEAKER
Margaret Gould Stewart - User experience master
At Facebook (and previously at YouTube), Margaret Gould Stewart designs experiences that touch the lives of a large percentage of the world's population.

Why you should listen

Margaret Gould Stewart has spent her career asking, “How do we design user experiences that change the world in fundamental ways?” It's a powerful question that has led her to manage user experiences for six of the ten most visited websites in the world, including Facebook, where she serves as Director of Product Design.

Before joining Facebook, Margaret managed the User Experience Team for YouTube, where she oversaw the largest redesign in the company's history, including the YouTube player page. She came to YouTube after two years leading Search and Consumer Products UX at Google. She approaches her work with a combined appreciation for timeless great design and transient digital technologies, and always with the end goal of improving people's lives. As she says: "Design is creativity in service of others."

TED2014

Margaret Gould Stewart: How giant websites design for you (and a billion others, too)

1,744,107 views

Facebook's "like" and "share" buttons are seen 22 billion times a day, making them some of the most-viewed design elements ever created. Margaret Gould Stewart, Facebook's director of product design, outlines three rules for design at such a massive scale—one so big that the tiniest of tweaks can cause global outrage, but also so large that the subtlest of improvements can positively impact the lives of many.


00:12
What do you think of when I say the word "design"? You probably think of things like this: finely crafted objects that you can hold in your hand, or maybe logos and posters and maps that visually explain things, classic icons of timeless design. But I'm not here to talk about that kind of design. I want to talk about the kind that you probably use every day and may not give much thought to, designs that change all the time and that live inside your pocket. I'm talking about the design of digital experiences, and specifically the design of systems that are so big that their scale can be hard to comprehend.

00:53
Consider the fact that Google processes over one billion search queries every day, and that every minute, over 100 hours of footage are uploaded to YouTube. That's more in a single day than all three major U.S. networks broadcast in the last five years combined. And Facebook transmits the photos, messages and stories of over 1.23 billion people. That's almost half of the Internet population, and a sixth of humanity.
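As a quick back-of-the-envelope check on that comparison, assuming the three networks each broadcast around the clock (an assumption made here for the arithmetic, not a figure from the talk):

```latex
% Footage uploaded to YouTube in one day:
100\ \tfrac{\text{hours}}{\text{minute}} \times 60 \times 24 = 144{,}000\ \text{hours}
% Footage broadcast by three networks over five years, 24/7:
3 \times 5 \times 365 \times 24 = 131{,}400\ \text{hours}
```

So one day of uploads (144,000 hours) does indeed exceed five years of combined round-the-clock broadcasting (about 131,400 hours).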
01:23
These are some of the products that I've helped design over the course of my career, and their scale is so massive that they've produced unprecedented design challenges. But what is really hard about designing at scale is this: it's hard in part because it requires a combination of two things, audacity and humility. Audacity to believe that the thing that you're making is something that the entire world wants and needs, and humility to understand that as a designer, it's not about you or your portfolio, it's about the people that you're designing for, and how your work just might help them live better lives.

02:04
Now, unfortunately, there's no school that offers the course Designing for Humanity 101. I and the other designers who work on these kinds of products have had to invent it as we go along, and we are teaching ourselves the emerging best practices of designing at scale, and today I'd like to share some of the things that we've learned over the years.

02:27
Now, the first thing that you need to know about designing at scale is that the little things really matter. Here's a really good example of how a very tiny design element can make a big impact. The team at Facebook that manages the Facebook "Like" button decided that it needed to be redesigned. The button had kind of gotten out of sync with the evolution of our brand and it needed to be modernized. Now you might think, well, it's a tiny little button, it's probably a pretty straightforward, easy design assignment, but it wasn't. It turns out there were all kinds of constraints on the design of this button. You had to work within specific height and width parameters. You had to be careful to make it work in a bunch of different languages, and be careful about using fancy gradients or borders, because it has to degrade gracefully in old web browsers. The truth is, designing this tiny little button was a huge pain in the butt.

03:21
Now, this is the new version of the button, and the designer who led this project estimates that he spent over 280 hours redesigning this button over the course of months. Now, why would we spend so much time on something so small? It's because when you're designing at scale, there's no such thing as a small detail. This innocent little button is seen on average 22 billion times a day and on over 7.5 million websites. It's one of the single most viewed design elements ever created. Now that's a lot of pressure for a little button and the designer behind it, but with these kinds of products, you need to get even the tiny things right.
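To make that rate concrete, simple division of the figures she cites (rough arithmetic, not an official statistic) gives:

```latex
\frac{22 \times 10^{9}\ \text{views/day}}{86{,}400\ \text{s/day}} \approx 2.5 \times 10^{5}\ \text{views per second},
\qquad
\frac{22 \times 10^{9}\ \text{views/day}}{7.5 \times 10^{6}\ \text{sites}} \approx 2{,}900\ \text{views per site per day}
```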
04:04
Now, the next thing that you need to understand is how to design with data. Now, when you're working on products like this, you have incredible amounts of information about how people are using your product that you can then use to influence your design decisions, but it's not just as simple as following the numbers. Let me give you an example so that you can understand what I mean.

04:25
Facebook has had a tool for a long time that allowed people to report photos that may be in violation of our community standards, things like spam and abuse. And there were a ton of photos reported, but as it turns out, only a small percentage were actually in violation of those community standards. Most of them were just your typical party photo. Now, to give you a specific hypothetical example, let's say my friend Laura hypothetically uploads a picture of me from a drunken night of karaoke. This is purely hypothetical, I can assure you. (Laughter) Now, incidentally, you know how some people are kind of worried that their boss or employee is going to discover embarrassing photos of them on Facebook? Do you know how hard that is to avoid when you actually work at Facebook?

05:14
So anyway, there are lots of these photos being erroneously reported as spam and abuse, and one of the engineers on the team had a hunch. He really thought there was something else going on, and he was right, because when he looked through a bunch of the cases, he found that most of them were from people who were requesting the takedown of a photo of themselves. Now this was a scenario that the team never even took into account before. So they added a new feature that allowed people to message their friend to ask them to take the photo down. But it didn't work. Only 20 percent of people sent the message to their friend.

05:49
So the team went back at it. They consulted with experts in conflict resolution. They even studied the universal principles of polite language, which I didn't even actually know existed until this research happened. And they found something really interesting. They had to go beyond just helping people ask their friend to take the photo down. They had to help people express to their friend how the photo made them feel.

06:14
Here's how the experience works today. So I find this hypothetical photo of myself, and it's not spam, it's not abuse, but I really wish it weren't on the site. So I report it and I say, "I'm in this photo and I don't like it," and then we dig deeper: why don't you like this photo of yourself? And I select "It's embarrassing." And then I'm encouraged to message my friend, but here's the critical difference: I'm provided specific suggested language that helps me communicate to Laura how the photo makes me feel.
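As a minimal sketch of the flow she describes, something like the following captures the shape of it. The type names, reason codes, and message templates here are hypothetical illustrations, not Facebook's actual implementation.

```typescript
// Hypothetical model of the photo-reporting flow described above.
// Reason codes and message templates are illustrative only.

type ReportReason = "spam" | "abuse" | "im_in_this_photo_and_i_dont_like_it";
type Feeling = "embarrassing" | "bad_photo_of_me";

// Suggested language keyed to how the photo makes the reporter feel,
// so people don't have to draft a difficult message from scratch.
const suggestedMessage: Record<Feeling, (friend: string) => string> = {
  embarrassing: (friend) =>
    `Hi ${friend}, I find this photo embarrassing. Would you please take it down?`,
  bad_photo_of_me: (friend) =>
    `Hi ${friend}, I don't like how I look in this photo. Would you mind removing it?`,
};

function handleReport(
  reason: ReportReason,
  feeling?: Feeling,
  friend?: string
): string {
  // Spam and abuse still go to community-standards review.
  if (reason !== "im_in_this_photo_and_i_dont_like_it") {
    return "Routed to community-standards review.";
  }
  // Dig deeper: why don't you like this photo of yourself?
  if (feeling === undefined || friend === undefined) {
    return "Ask a follow-up: why don't you like this photo of yourself?";
  }
  // The critical difference: prefill specific suggested language
  // instead of leaving the sender to compose the message alone.
  return suggestedMessage[feeling](friend);
}

console.log(
  handleReport("im_in_this_photo_and_i_dont_like_it", "embarrassing", "Laura")
);
```

The design point is that the prefilled language, not the messaging channel itself, is what mattered; as she notes next, that change alone tripled the share of people who actually sent the message.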
06:52
Now the team found that this relatively small change had a huge impact. Before, only 20 percent of people were sending the message, and now 60 percent were, and surveys showed that people on both sides of the conversation felt better as a result. That same survey showed that 90 percent of your friends want to know if they've done something to upset you. Now I don't know who the other 10 percent are, but maybe that's where our "Unfriend" feature can come in handy.
07:20
So as you can see, these decisions are highly nuanced. Of course we use a lot of data to inform our decisions, but we also rely very heavily on iteration, research, testing, intuition, human empathy. It's both art and science. Now, sometimes the designers who work on these products are called "data-driven," which is a term that totally drives us bonkers. The fact is, it would be irresponsible of us not to rigorously test our designs when so many people are counting on us to get it right, but data analytics will never be a substitute for design intuition. Data can help you make a good design great, but it will never make a bad design good.
08:05
The next thing that you need to understand as a principle is that when you introduce change, you need to do it extraordinarily carefully. Now I often have joked that I spend almost as much time designing the introduction of change as I do the change itself, and I'm sure that we can all relate to that when something that we use a lot changes and then we have to adjust. The fact is, people can become very efficient at using bad design, and so even if the change is good for them in the long run, it's still incredibly frustrating when it happens, and this is particularly true with user-generated content platforms, because people can rightfully claim a sense of ownership. It is, after all, their content.

08:49
Now, years ago, when I was working at YouTube, we were looking for ways to encourage more people to rate videos, and it was interesting, because when we looked into the data, we found that almost everyone was exclusively using the highest five-star rating, a handful of people were using the lowest one-star, and virtually no one was using two, three or four stars. So we decided to simplify into a binary, up-down voting model, which would be much easier for people to engage with.
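A small sketch of the kind of analysis that motivated that call is below; the sample ratings are invented for illustration, and the real analysis of course ran over YouTube's own logs.

```typescript
// Hypothetical sketch of the rating-distribution analysis described above.
// The sample data is made up for illustration.

const ratings: number[] = [5, 5, 1, 5, 5, 5, 5, 1, 5, 5, 3, 5, 5, 5, 1, 5];

// Count how many times each star value (1-5) was used.
const histogram = new Map<number, number>();
for (const r of ratings) {
  histogram.set(r, (histogram.get(r) ?? 0) + 1);
}

const total = ratings.length;
const extremes = (histogram.get(1) ?? 0) + (histogram.get(5) ?? 0);

console.log([...histogram.entries()].sort((a, b) => a[0] - b[0]));
console.log(`1- and 5-star share: ${((100 * extremes) / total).toFixed(1)}%`);

// If nearly all ratings land on 1 or 5, a binary up/down control
// preserves most of the signal with a much simpler interaction.
```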
09:19
But people were very attached to the five-star rating system. Video creators really loved their ratings. Millions and millions of people were accustomed to the old design. So in order to help people prepare themselves for change and acclimate to the new design more quickly, we actually published the data graph, sharing with the community the rationale for what we were going to do, and it even engaged the larger industry in a conversation, which resulted in my favorite TechCrunch headline of all time: "YouTube Comes to a 5-Star Realization: Its Ratings Are Useless."

09:55
Now, it's impossible to completely avoid change aversion when you're making changes to products that so many people use. Even though we tried to do all the right things, we still received our customary flood of video protests and angry emails, and even a package that had to be scanned by security. But we have to remember that people care intensely about this stuff, and it's because these products, this work, really, really matters to them.
10:23
Now, we know that we have to be careful about paying attention to the details, we have to be cognizant about how we use data in our design process, and we have to introduce change very, very carefully. Now, these things are all really useful. They're good best practices for designing at scale. But they don't mean anything if you don't understand something much more fundamental. You have to understand who you are designing for.

10:51
Now, when you set a goal to design for the entire human race, and you start to engage in that goal in earnest, at some point you run into the walls of the bubble that you're living in. Now, in San Francisco, we get a little miffed when we hit a dead cell zone, because we can't use our phones to navigate to the new hipster coffee shop. But what if you had to drive four hours to charge your phone because you had no reliable source of electricity? What if you had no access to public libraries? What if your country had no free press? What would these products start to mean to you?

11:28
This is what Google, YouTube and Facebook look like to most of the world, and it's what they'll look like to most of the next five billion people to come online. Designing for low-end cell phones is not glamorous design work, but if you want to design for the whole world, you have to design for where people are, and not where you are. So how do we keep this big, big picture in mind? We try to travel outside of our bubble to see, hear and understand the people we're designing for. We use our products in non-English languages to make sure that they work just as well. And we try to use one of these phones from time to time to keep in touch with their reality.

12:07
So what does it mean to design at a global scale? It means difficult and sometimes exasperating work to try to improve and evolve products. Finding the audacity and the humility to do right by them can be pretty exhausting, and the humility part, it's a little tough on the design ego. Because these products are always changing, everything that I've designed in my career is pretty much gone, and everything that I will design will fade away. But here's what remains: the never-ending thrill of being a part of something that is so big you can hardly get your head around it, and the promise that it just might change the world.

12:49
Thank you.

(Applause)
