ABOUT THE SPEAKERS
Jack Dorsey - Entrepreneur, programmer
Jack Dorsey is the CEO of Twitter, CEO & Chairman of Square, and a cofounder of both.

Chris Anderson - TED Curator
After a long career in journalism and publishing, Chris Anderson became the curator of the TED Conference in 2002 and has developed it as a platform for identifying and disseminating ideas worth spreading.

Why you should listen

Chris Anderson is the Curator of TED, a nonprofit devoted to sharing valuable ideas, primarily through the medium of 'TED Talks' -- short talks that are offered free online to a global audience.

Chris was born in a remote village in Pakistan in 1957. He spent his early years in India, Pakistan and Afghanistan, where his parents worked as medical missionaries, and he attended an American school in the Himalayas for his early education. After boarding school in Bath, England, he went on to Oxford University, graduating in 1978 with a degree in philosophy, politics and economics.

Chris then trained as a journalist, working in newspapers and radio, including two years producing a world news service in the Seychelles Islands.

Back in the UK in 1984, Chris was captivated by the personal computer revolution and became an editor at one of the UK's early computer magazines. A year later he founded Future Publishing with a $25,000 bank loan. The new company initially focused on specialist computer publications but eventually expanded into other areas such as cycling, music, video games, technology and design, doubling in size every year for seven years. In 1994, Chris moved to the United States where he built Imagine Media, publisher of Business 2.0 magazine and creator of the popular video game users website IGN. Chris eventually merged Imagine and Future, taking the combined entity public in London in 1999, under the Future name. At its peak, it published 150 magazines and websites and employed 2,000 people.

This success allowed Chris to create a private nonprofit organization, the Sapling Foundation, with the hope of finding new ways to tackle tough global issues through media, technology, entrepreneurship and, most of all, ideas. In 2001, the foundation acquired the TED Conference, then an annual meeting of luminaries in the fields of Technology, Entertainment and Design held in Monterey, California, and Chris left Future to work full time on TED.

He expanded the conference's remit to cover all topics, including science, business and key global issues, while adding a Fellows program, which now has some 300 alumni, and the TED Prize, which grants its recipients "one wish to change the world." The TED stage has become a place for thinkers and doers from all fields to share their ideas and their work, capturing imaginations, sparking conversation and encouraging discovery along the way.

In 2006, TED experimented with posting some of its talks on the Internet. Their viral success encouraged Chris to begin positioning the organization as a global media initiative devoted to 'ideas worth spreading,' part of a new era of information dissemination using the power of online video. In June 2015, the organization posted its 2,000th talk online. The talks are free to view, and they have been translated into more than 100 languages with the help of volunteers from around the world. Viewership has grown to approximately one billion views per year.

Continuing a strategy of 'radical openness,' in 2009 Chris introduced the TEDx initiative, allowing free licenses to local organizers who wished to organize their own TED-like events. More than 8,000 such events have been held, generating an archive of 60,000 TEDx talks. And three years later, the TED-Ed program was launched, offering free educational videos and tools to students and teachers.

Whitney Pennington Rodgers - TED Current Affairs Curator
Whitney Pennington Rodgers is an award-winning journalist and media professional.

Why you should listen

Prior to joining TED as current affairs curator, Whitney Pennington Rodgers produced for NBC's primetime news magazine Dateline NBC. She earned a duPont-Columbia Award and a News & Documentary Emmy for her contributions to the Dateline NBC hour "The Cosby Accusers Speak" -- an extensive group interview with 27 of the women who accused entertainer Bill Cosby of sexual misconduct.

Pennington Rodgers has worked at NBC's in-house production company Peacock Productions, The Today Show, Nightly News, Rock Center with Brian Williams and New Jersey-centric public affairs shows Caucus: New Jersey and One-on-One with Steve Adubato. Prior to beginning her career in media, she had a short stint as a fourth-grade teacher through the Teach for America program.

Pennington Rodgers received her bachelor's degree in journalism and media studies from Rutgers University. She completed her master's in journalism at the University of California, Berkeley, where she produced a documentary about the recruitment of nonblack students at historically black colleges and universities.

TED2019

Jack Dorsey: How Twitter needs to change


2,089,470 views

Can Twitter be saved? In a wide-ranging conversation with TED's Chris Anderson and Whitney Pennington Rodgers, Twitter CEO Jack Dorsey discusses the future of the platform, acknowledging problems with harassment and moderation and proposing some fundamental changes that he hopes will encourage healthy, respectful conversations. "Are we actually delivering, every day, something that people value?" Dorsey asks.


00:13
Chris Anderson: What worries you right now? You've been very open about lots of issues on Twitter. What would be your top worry about where things are right now?

00:23
Jack Dorsey: Right now, the health of the conversation. So, our purpose is to serve the public conversation, and we have seen a number of attacks on it. We've seen abuse, we've seen harassment, we've seen manipulation, automation, human coordination, misinformation. So these are all dynamics that we were not expecting 13 years ago when we were starting the company. But we do now see them at scale, and what worries me most is just our ability to address it in a systemic way that is scalable, that has a rigorous understanding of how we're taking action, a transparent understanding of how we're taking action and a rigorous appeals process for when we're wrong, because we will be wrong.

01:20
Whitney Pennington Rodgers: I'm really glad to hear that that's something that concerns you, because I think there's been a lot written about people who feel they've been abused and harassed on Twitter, and I think no one more so than women and women of color and black women. And there's been data that's come out -- Amnesty International put out a report a few months ago where they showed that a subset of active black female Twitter users were receiving, on average, one in 10 of their tweets were some form of harassment. And so when you think about health for the community on Twitter, I'm interested to hear, "health for everyone," but specifically: How are you looking to make Twitter a safe space for that subset, for women, for women of color and black women?

02:05
Jack Dorsey: Yeah. So it's a pretty terrible situation when you're coming to a service that, ideally, you want to learn something about the world, and you spend the majority of your time reporting abuse, receiving abuse, receiving harassment. So what we're looking most deeply at is just the incentives that the platform naturally provides and the service provides. Right now, the dynamic of the system makes it super-easy to harass and to abuse others through the service, and unfortunately, the majority of our system in the past worked entirely based on people reporting harassment and abuse. So about midway last year, we decided that we were going to apply a lot more machine learning, a lot more deep learning to the problem, and try to be a lot more proactive around where abuse is happening, so that we can take the burden off the victim completely. And we've made some progress recently. About 38 percent of abusive tweets are now proactively identified by machine learning algorithms so that people don't actually have to report them. But those that are identified are still reviewed by humans, so we do not take down content or accounts without a human actually reviewing it. But that was from zero percent just a year ago. So that meant, at that zero percent, every single person who received abuse had to actually report it, which was a lot of work for them, a lot of work for us and just ultimately unfair.

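The pipeline he is describing -- a machine-learned classifier proactively flags likely-abusive tweets, and a human makes the final call before anything is removed -- can be sketched roughly as follows. This is a toy illustration, not Twitter's actual system: the classifier, threshold, and all names here are invented for the example.

    from dataclasses import dataclass

    @dataclass
    class Tweet:
        tweet_id: int
        text: str

    def abuse_score(tweet: Tweet) -> float:
        """Stand-in for a trained abuse classifier returning P(abusive)."""
        hostile_words = {"idiot", "trash", "disgusting"}  # toy feature set
        hits = sum(word in tweet.text.lower() for word in hostile_words)
        return min(1.0, hits / 2)

    FLAG_THRESHOLD = 0.5  # hypothetical operating point

    def proactively_flag(tweets: list[Tweet]) -> list[Tweet]:
        """Surface likely abuse so victims don't have to report it themselves."""
        return [t for t in tweets if abuse_score(t) >= FLAG_THRESHOLD]

    def should_remove(candidate: Tweet, reviewer_confirms: bool) -> bool:
        """Nothing comes down on the model's score alone; a human
        reviewer's confirmation is required, per the policy above."""
        return reviewer_confirms

The division of labor is the point of the structure: the model decides what gets surfaced, and humans alone decide what gets removed.
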
03:46
The other thing that we're doing is making sure that we, as a company, have representation of all the communities that we're trying to serve. We can't build a business that is successful unless we have a diversity of perspective inside of our walls that actually feel these issues every single day. And that's not just with the team that's doing the work, it's also within our leadership as well. So we need to continue to build empathy for what people are experiencing and give them better tools to act on it and also give our customers a much better and easier approach to handle some of the things that they're seeing. So a lot of what we're doing is around technology, but we're also looking at the incentives on the service: What does Twitter incentivize you to do when you first open it up? And in the past, it's incented a lot of outrage, it's incented a lot of mob behavior, it's incented a lot of group harassment. And we have to look a lot deeper at some of the fundamentals of what the service is doing to make the bigger shifts. We can make a bunch of small shifts around technology, as I just described, but ultimately, we have to look deeply at the dynamics in the network itself, and that's what we're doing.

05:05
CA: But what's your sense -- what is the kind of thing that you might be able to change that would actually fundamentally shift behavior?

05:15
JD: Well, one of the things -- we started the service with this concept of following an account, as an example, and I don't believe that's why people actually come to Twitter. I believe Twitter is best as an interest-based network. People come with a particular interest. They have to do a ton of work to find and follow the related accounts around those interests. What we could do instead is allow you to follow an interest, follow a hashtag, follow a trend, follow a community, which gives us the opportunity to show all of the accounts, all the topics, all the moments, all the hashtags that are associated with that particular topic and interest, which really opens up the perspective that you see. But that is a huge fundamental shift to bias the entire network away from just an account bias towards a topics and interest bias.

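As a rough illustration of that shift, here is a toy data model (hypothetical names, not Twitter's schema) in which one topic follow fans out to every associated account, so the curation work Dorsey mentions disappears:

    from collections import defaultdict

    class InterestGraph:
        """Toy sketch: follow topics/hashtags/communities, not just accounts."""

        def __init__(self):
            self.account_followers = defaultdict(set)  # account -> followers
            self.topic_followers = defaultdict(set)    # topic -> followers

        def follow_account(self, user: str, account: str) -> None:
            self.account_followers[account].add(user)

        def follow_topic(self, user: str, topic: str) -> None:
            # One follow covers every account, moment, and hashtag
            # associated with the topic -- no manual list-building.
            self.topic_followers[topic].add(user)

        def audience_for(self, account: str, topics: list[str]) -> set:
            """Who sees a tweet: direct followers plus topic followers."""
            audience = set(self.account_followers[account])
            for topic in topics:
                audience |= self.topic_followers[topic]
            return audience

Routing on the union of account followers and topic followers is what would "open up the perspective that you see": a tweet reaches readers who never chose its author, only its subject.
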
06:15
CA: Because isn't it the case that one reason why you have so much content on there is a result of putting millions of people around the world in this kind of gladiatorial contest with each other for followers, for attention? Like, from the point of view of people who just read Twitter, that's not an issue, but for the people who actually create it, everyone's out there saying, "You know, I wish I had a few more 'likes,' followers, retweets." And so they're constantly experimenting, trying to find the path to do that. And what we've all discovered is that the number one path to do that is to be some form of provocative, obnoxious, eloquently obnoxious -- like, eloquent insults are a dream on Twitter, where you rapidly pile up -- and it becomes this self-fueling process of driving outrage. How do you defuse that?

07:12
JD: Yeah, I mean, I think you're spot on, but that goes back to the incentives. Like, one of the choices we made in the early days was we had this number that showed how many people follow you. We decided that number should be big and bold, and anything that's on the page that's big and bold has importance, and those are the things that you want to drive. Was that the right decision at the time? Probably not. If I had to start the service again, I would not emphasize the follower count as much. I would not emphasize the "like" count as much. I don't think I would even create "like" in the first place, because it doesn't actually push what we believe now to be the most important thing, which is healthy contribution back to the network and conversation to the network, participation within conversation, learning something from the conversation. Those are not things that we thought of 13 years ago, and we believe are extremely important right now. So we have to look at how we display the follower count, how we display retweet count, how we display "likes," and just ask the deep question: Is this really the number that we want people to drive up? Is this the thing that, when you open Twitter, you see, "That's the thing I need to increase?" And I don't believe that's the case right now.

(Applause)

08:30
WPR: I think we should look at some of the tweets that are coming in from the audience as well.

08:35
CA: Let's see what you guys are asking. I mean, this is -- generally, one of the amazing things about Twitter is how you can use it for crowd wisdom, you know, that more knowledge, more questions, more points of view than you can imagine, and sometimes, many of them are really healthy.

08:53
WPR: I think one I saw that passed already quickly down here, "What's Twitter's plan to combat foreign meddling in the 2020 US election?" I think that's something that's an issue we're seeing on the internet in general, that we have a lot of malicious automated activity happening. And on Twitter, for example, in fact, we have some work that's come from our friends at Zignal Labs, and maybe we can even see that to give us an example of what exactly I'm talking about, where you have these bots, if you will, or coordinated automated malicious account activity, that is being used to influence things like elections. And in this example we have from Zignal which they've shared with us using the data that they have from Twitter, you actually see that in this case, white represents the humans -- human accounts, each dot is an account. The pinker it is, the more automated the activity is. And you can see how you have a few humans interacting with bots. In this case, it's related to the election in Israel and spreading misinformation about Benny Gantz, and as we know, in the end, that was an election that Netanyahu won by a slim margin, and that may have been in some case influenced by this. And when you think about that happening on Twitter, what are the things that you're doing, specifically, to ensure you don't have misinformation like this spreading in this way, influencing people in ways that could affect democracy?

10:23
JD: Just to back up a bit, we asked ourselves a question: Can we actually measure the health of a conversation, and what does that mean? And in the same way that you have indicators and we have indicators as humans in terms of are we healthy or not, such as temperature, the flushness of your face, we believe that we could find the indicators of conversational health. And we worked with a lab called Cortico at MIT to propose four starter indicators that we believe we could ultimately measure on the system. And the first one is what we're calling shared attention. It's a measure of how much of the conversation is attentive on the same topic versus disparate. The second one is called shared reality, and this is what percentage of the conversation shares the same facts -- not whether those facts are truthful or not, but are we sharing the same facts as we converse? The third is receptivity: How much of the conversation is receptive or civil or the inverse, toxic? And then the fourth is variety of perspective. So, are we seeing filter bubbles or echo chambers, or are we actually getting a variety of opinions within the conversation? And implicit in all four of these is the understanding that, as they increase, the conversation gets healthier and healthier. So our first step is to see if we can measure these online, which we believe we can. We have the most momentum around receptivity. We have a toxicity score, a toxicity model, on our system that can actually measure whether you are likely to walk away from a conversation that you're having on Twitter because you feel it's toxic, with some pretty high degree. We're working to measure the rest, and the next step is, as we build up solutions, to watch how these measurements trend over time and continue to experiment. And our goal is to make sure that these are balanced, because if you increase one, you might decrease another. If you increase variety of perspective, you might actually decrease shared reality.

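For concreteness, the four indicators could be operationalized on toy data along these lines. This is a back-of-the-envelope sketch only; the actual Cortico/MIT measures are research models, and every function here is illustrative:

    from collections import Counter

    def shared_attention(topics: list[str]) -> float:
        """Share of messages on the single most-discussed topic."""
        top_count = Counter(topics).most_common(1)[0][1]
        return top_count / len(topics)

    def shared_reality(fact_sets: list[set]) -> float:
        """Overlap of the facts participants are each working from."""
        common = set.intersection(*fact_sets)
        union = set.union(*fact_sets)
        return len(common) / len(union)

    def receptivity(toxicity_scores: list[float]) -> float:
        """Inverse of average per-message toxicity (scores in [0, 1])."""
        return 1 - sum(toxicity_scores) / len(toxicity_scores)

    def variety_of_perspective(stances: list[str]) -> float:
        """Distinct stances relative to participants; low = echo chamber."""
        return len(set(stances)) / len(stances)

The trade-off Dorsey ends on falls straight out of definitions like these: adding participants with new stances raises variety_of_perspective while typically shrinking the intersection that shared_reality measures.
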
12:51
CA: Just picking up on some of the questions flooding in here.

12:56
JD: Constant questioning.

12:58
CA: A lot of people are puzzled why, like, how hard is it to get rid of Nazis from Twitter?

13:08
JD: (Laughs) So we have policies around violent extremist groups, and the majority of our work and our terms of service works on conduct, not content. So we're actually looking for conduct. Conduct being using the service to repeatedly or episodically harass someone, using hateful imagery that might be associated with the KKK or the American Nazi Party. Those are all things that we act on immediately. We're in a situation right now where that term is used fairly loosely, and we just cannot take any one mention of that word accusing someone else as a factual indication that they should be removed from the platform. So a lot of our models are based around, number one: Is this account associated with a violent extremist group? And if so, we can take action. And we have done so on the KKK and the American Nazi Party and others. And number two: Are they using imagery or conduct that would associate them as such as well?

14:22
CA: How many people do you have working on content moderation to look at this?

14:26
JD: It varies. We want to be flexible on this, because we want to make sure that we're, number one, building algorithms instead of just hiring massive amounts of people, because we need to make sure that this is scalable, and there are no amount of people that can actually scale this. So this is why we've done so much work around proactive detection of abuse that humans can then review. We want to have a situation where algorithms are constantly scouring every single tweet and bringing the most interesting ones to the top so that humans can bring their judgment to whether we should take action or not, based on our terms of service.

15:05
WPR: But there's not an amount of people that are scalable, but how many people do you currently have monitoring these accounts, and how do you figure out what's enough?

15:14
JD: They're completely flexible. Sometimes we associate folks with spam. Sometimes we associate folks with abuse and harassment. We're going to make sure that we have flexibility in our people so that we can direct them at what is most needed. Sometimes, the elections. We've had a string of elections in Mexico, one coming up in India, obviously, the election last year, the midterm election, so we just want to be flexible with our resources. So when people -- just as an example, if you go to our current terms of service and you bring the page up, and you're wondering about abuse and harassment that you just received and whether it was against our terms of service to report it, the first thing you see when you open that page is around intellectual property protection. You scroll down and you get to abuse, harassment and everything else that you might be experiencing. So I don't know how that happened over the company's history, but we put that above the thing that people want the most information on and to actually act on. And just our ordering shows the world what we believed was important. So we're changing all that. We're ordering it the right way, but we're also simplifying the rules so that they're human-readable so that people can actually understand themselves when something is against our terms and when something is not. And then we're making -- again, our big focus is on removing the burden of work from the victims. So that means push more towards technology, rather than humans doing the work -- that means the humans receiving the abuse and also the humans having to review that work. So we want to make sure that we're not just encouraging more work around something that's super, super negative, and we want to have a good balance between the technology and where humans can actually be creative, which is the judgment of the rules, and not just all the mechanical stuff of finding and reporting them. So that's how we think about it.

17:27
CA: I'm curious to dig in more about what you said. I mean, I love that you said you are looking for ways to re-tweak the fundamental design of the system to discourage some of the reactive behavior, and perhaps -- to use Tristan Harris-type language -- engage people's more reflective thinking. How far advanced is that? What would alternatives to that "like" button be?

17:55
JD: Well, first and foremost, my personal goal with the service is that I believe fundamentally that public conversation is critical. There are existential problems facing the world that are facing the entire world, not any one particular nation-state, that global public conversation benefits. And that is one of the unique dynamics of Twitter, that it is completely open, it is completely public, it is completely fluid, and anyone can see any other conversation and participate in it. So there are conversations like climate change. There are conversations like the displacement in the work through artificial intelligence. There are conversations like economic disparity. No matter what any one nation-state does, they will not be able to solve the problem alone. It takes coordination around the world, and that's where I think Twitter can play a part. The second thing is that Twitter, right now, when you go to it, you don't necessarily walk away feeling like you learned something. Some people do. Some people have a very, very rich network, a very rich community that they learn from every single day. But it takes a lot of work and a lot of time to build up to that. So we want to get people to those topics and those interests much, much faster and make sure that they're finding something that, no matter how much time they spend on Twitter -- and I don't want to maximize the time on Twitter, I want to maximize what they actually take away from it and what they learn from it, and --

19:29
CA: Well, do you, though? Because that's the core question that a lot of people want to know. Surely, Jack, you're constrained, to a huge extent, by the fact that you're a public company, you've got investors pressing on you, the number one way you make your money is from advertising -- that depends on user engagement. Are you willing to sacrifice user time, if need be, to go for a more reflective conversation?

19:56
JD: Yeah; more relevance means less time on the service, and that's perfectly fine, because we want to make sure that, like, you're coming to Twitter, and you see something immediately that you learn from and that you push. We can still serve an ad against that. That doesn't mean you need to spend any more time to see more. The second thing we're looking at --

20:17
CA: But just -- on that goal, daily active usage, if you're measuring that, that doesn't necessarily mean things that people value every day. It may well mean things that people are drawn to like a moth to the flame, every day. We are addicted, because we see something that pisses us off, so we go in and add fuel to the fire, and the daily active usage goes up, and there's more ad revenue there, but we all get angrier with each other. How do you define ... "Daily active usage" seems like a really dangerous term to be optimizing.

(Applause)

20:54
JD: Taken alone, it is, but you didn't let me finish the other metric, which is, we're watching for conversations and conversation chains. So we want to incentivize healthy contribution back to the network, and what we believe that is is actually participating in conversation that is healthy, as defined by those four indicators I articulated earlier. So you can't just optimize around one metric. You have to balance and look constantly at what is actually going to create a healthy contribution to the network and a healthy experience for people. Ultimately, we want to get to a metric where people can tell us, "Hey, I learned something from Twitter, and I'm walking away with something valuable." That is our goal ultimately over time, but that's going to take some time.

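One hypothetical way to act on "you can't just optimize around one metric" is to score any product change against all four indicators at once, so a gain in one cannot hide a loss in another. The weights below are placeholder assumptions, not anything Twitter has published:

    HEALTH_WEIGHTS = {
        "shared_attention": 0.25,
        "shared_reality": 0.25,
        "receptivity": 0.25,
        "variety_of_perspective": 0.25,
    }

    def health_score(indicators: dict) -> float:
        """Aggregate the four conversational-health indicators."""
        return sum(HEALTH_WEIGHTS[k] * indicators[k] for k in HEALTH_WEIGHTS)

    # A change that boosts variety of perspective but erodes shared reality:
    before = {"shared_attention": 0.6, "shared_reality": 0.7,
              "receptivity": 0.8, "variety_of_perspective": 0.4}
    after = {"shared_attention": 0.6, "shared_reality": 0.5,
             "receptivity": 0.8, "variety_of_perspective": 0.7}

    print(health_score(before), health_score(after))  # 0.625 vs. 0.65

Under these toy weights the change still nets out positive, but the eroded shared_reality term stays visible in the breakdown instead of hiding behind a single engagement number like daily active usage.
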
21:43
CA: You come over to many, I think to me, as this enigma. This is possibly unfair, but I woke up the other night with this picture of how I found I was thinking about you and the situation, that we're on this great voyage with you on this ship called the "Twittanic" --

(Laughter)

and there are people on board in steerage who are expressing discomfort, and you, unlike many other captains, are saying, "Well, tell me, talk to me, listen to me, I want to hear." And they talk to you, and they say, "We're worried about the iceberg ahead." And you go, "You know, that is a powerful point, and our ship, frankly, hasn't been built properly for steering as well as it might." And we say, "Please do something." And you go to the bridge, and we're waiting, and we look, and then you're showing this extraordinary calm, but we're all standing outside, saying, "Jack, turn the fucking wheel!" You know?

(Laughter)

(Applause)

I mean --

(Applause)

It's democracy at stake. It's our culture at stake. It's our world at stake. And Twitter is amazing and shapes so much. It's not as big as some of the other platforms, but the people of influence use it to set the agenda, and it's just hard to imagine a more important role in the world than to ... I mean, you're doing a brilliant job of listening, Jack, and hearing people, but to actually dial up the urgency and move on this stuff -- will you do that?

23:24
JD: Yes, and we have been
moving substantially.
435
1392750
3815
JD: Sim, e estamos
nos movendo significativamente.
23:28
I mean, there's been
a few dynamics in Twitter's history.
436
1396589
3225
Tem havido algumas dinâmicas
na história do Twitter.
23:31
One, when I came back to the company,
437
1399838
2083
Uma, quando voltei para a empresa,
23:35
we were in a pretty dire state
in terms of our future,
438
1403477
6256
estávamos num estado terrível
em termos do nosso futuro,
23:41
and not just from how people
were using the platform,
439
1409757
4634
não apenas como as pessoas
estavam usando a plataforma,
23:46
but from a corporate narrative as well.
440
1414415
2047
mas nossa narrativa corporativa também.
23:48
So we had to fix
a bunch of the foundation,
441
1416486
3204
Tivemos de consertar muito da fundação,
23:51
turn the company around,
442
1419714
1969
revirar a empresa,
23:53
go through two crazy layoffs,
443
1421707
3111
passar por duas demissões
em massa malucas,
23:56
because we just got too big
for what we were doing,
444
1424842
3793
porque ficamos grandes demais
para o que estávamos fazendo,
24:00
and we focused all of our energy
445
1428659
2060
e concentramos toda nossa energia
24:02
on this concept of serving
the public conversation.
446
1430743
3508
no conceito de servir à conversa pública.
24:06
And that took some work.
447
1434275
1451
E isso exigiu muito trabalho.
24:07
And as we dived into that,
448
1435750
2608
E, quando mergulhamos nisso,
24:10
we realized some of the issues
with the fundamentals.
449
1438382
2992
percebemos alguns dos problemas
com os fundamentos.
24:14
We could do a bunch of superficial things
to address what you're talking about,
450
1442120
4656
Poderíamos fazer um monte
de coisas superficiais
para abordar o que você está falando,
mas precisamos de mudanças duradouras,
24:18
but we need the changes to last,
451
1446800
1790
24:20
and that means going really, really deep
452
1448614
2459
e isso significa ir mais fundo
24:23
and paying attention
to what we started 13 years ago
453
1451097
4350
e prestar atenção
ao que começamos há 13 anos,
24:27
and really questioning
454
1455471
2261
questionando de fato
24:29
how the system works
and how the framework works
455
1457756
2566
como o sistema e a estrutura funcionam
24:32
and what is needed for the world today,
456
1460346
3833
e o que é necessário para o mundo hoje,
24:36
given how quickly everything is moving
and how people are using it.
457
1464203
4024
dada a rapidez como tudo está se movendo
e as pessoas o estão usando.
24:40
So we are working as quickly as we can,
but quickness will not get the job done.
458
1468251
6544
Trabalhamos o mais rápido possível,
mas rapidez não vai resolver.
24:46
It's focus, it's prioritization,
459
1474819
2611
É o foco, é a priorização,
24:49
it's understanding
the fundamentals of the network
460
1477454
2946
é entender os fundamentos da rede
24:52
and building a framework that scales
461
1480424
2842
e construir uma estrutura que se ajuste,
24:55
and that is resilient to change,
462
1483290
2351
que seja resiliente à mudança,
24:57
and being open about where we are and being transparent about where we are

25:03
so that we can continue to earn trust.

25:06
So I'm proud of all the frameworks that we've put in place.

25:09
I'm proud of our direction.

25:12
We obviously can move faster,

25:15
but that required just stopping a bunch of stupid stuff we were doing in the past.

25:21
CA: All right.

25:22
Well, I suspect there are many people here who, if given the chance,

25:26
would love to help you on this change-making agenda you're on,

25:30
and I don't know if Whitney --

25:31
Jack, thank you for coming here and speaking so openly.

25:34
It took courage.

25:36
I really appreciate what you said, and good luck with your mission.

25:39
JD: Thank you so much. Thanks for having me.

25:41
(Applause)

25:45
Thank you.

Translated by Raissa Mendes
Reviewed by Maricene Crus


ABOUT THE SPEAKERS
Whitney Pennington Rodgers - TED Current Affairs Curator
Whitney Pennington Rodgers is an award-winning journalist and media professional.

Why you should listen

Prior to joining TED as current affairs curator, Whitney Pennington Rodgers produced for NBC's primetime news magazine Dateline NBC. She earned a duPont-Columbia award and a News & Documentary Emmy for her contributions to the Dateline NBC hour "The Cosby Accusers Speak" -- an extensive group interview with 27 of the women who accused entertainer Bill Cosby of sexual misconduct.

Pennington Rodgers has worked at NBC's in-house production company Peacock Productions, The Today Show, Nightly News, Rock Center with Brian Williams and New Jersey-centric public affairs shows Caucus: New Jersey and One-on-One with Steve Adubato. Prior to beginning her career in media, she had a short stint as a fourth-grade teacher through the Teach for America program.

Pennington Rodgers received her Bachelor's in journalism and media studies from Rutgers University. She completed her Master's in Journalism at the University of California at Berkeley, where she produced a documentary about recruitment of nonblack students at historically black colleges and universities.

More profile about the speaker
Whitney Pennington Rodgers | Speaker | TED.com

Data provided by TED.
