ABOUT THE SPEAKERS
Mariano Sigman - Neuroscientist
In his provocative, mind-bending book "The Secret Life of the Mind," neuroscientist Mariano Sigman reveals his life’s work exploring the inner workings of the human brain.

Why you should listen

Mariano Sigman, a physicist by training, is a leading figure in the cognitive neuroscience of learning and decision making. Sigman was awarded a Human Frontiers Career Development Award, the National Prize of Physics, the Young Investigator Prize of the Collège de France, the IBM Scalable Data Analytics Award and is a scholar of the James S. McDonnell Foundation. In 2016 he was made a Laureate of the Pontifical Academy of Sciences.

In The Secret Life of the Mind, Sigman's ambition is to explain the mind so that we can understand ourselves and others more deeply. He shows how we form ideas during our first days of life, how we give shape to our fundamental decisions, how we dream and imagine, why we feel certain emotions, how the brain transforms and how who we are changes with it. Spanning biology, physics, mathematics, psychology, anthropology, linguistics, philosophy and medicine, as well as gastronomy, magic, music, chess, literature and art, The Secret Life of the Mind revolutionizes how neuroscience serves us in our lives, revealing how the infinity of neurons inside our brains manufacture how we perceive, reason, feel, dream and communicate.

Dan Ariely - Behavioral economist
The dismal science of economics is not as firmly grounded in actual behavior as was once supposed. In "Predictably Irrational," Dan Ariely told us why.

Why you should listen

Dan Ariely is a professor of psychology and behavioral economics at Duke University and a founding member of the Center for Advanced Hindsight. He is the author of the bestsellers Predictably Irrational, The Upside of Irrationality, and The Honest Truth About Dishonesty -- as well as the TED Book Payoff: The Hidden Logic that Shapes Our Motivations.

Through his research and his (often amusing and unorthodox) experiments, he questions the forces that influence human behavior and the irrational ways in which we often all behave.

TED Studio

Mariano Sigman and Dan Ariely: How can groups make good decisions?


1,507,168 views

We know that when we make decisions in groups, things don't always go right. Sometimes they go very wrong. How do groups come to make good decisions? With his colleague Dan Ariely, neuroscientist Mariano Sigman has been investigating, through live experiments around the world, how groups should interact to reach decisions. In this fun, fact-filled explainer, he shares some intriguing results -- along with some implications these could have for our political system. At a time when people seem more polarized than ever, Sigman says, better understanding how groups interact and decide might spark interesting new ways to build a healthier democracy.


00:12
As societies, we have to make collective decisions that will shape our future.

00:17
And we all know that when we make decisions in groups, they don't always go right.

00:21
And sometimes they go very wrong.

00:24
So how do groups make good decisions?

00:27
Research has shown that crowds are wise when there's independent thinking.

00:31
This is why the wisdom of the crowds can be destroyed by peer pressure, publicity, social media, or sometimes even simple conversations that influence how people think.

00:41
On the other hand, by talking, a group could exchange knowledge, correct and revise each other and even come up with new ideas.

00:48
And this is all good.

00:50
So does talking to each other help or hinder collective decision-making?

00:55
With my colleague, Dan Ariely, we recently began inquiring into this by performing experiments in many places around the world to figure out how groups can interact to reach better decisions.

01:07
We thought crowds would be wiser if they debated in small groups that foster a more thoughtful and reasonable exchange of information.

01:15
To test this idea, we recently performed an experiment in Buenos Aires, Argentina, with more than 10,000 participants in a TEDx event.

01:23
We asked them questions like, "What is the height of the Eiffel Tower?" and "How many times does the word 'Yesterday' appear in the Beatles song 'Yesterday'?"

01:32
Each person wrote down their own estimate.

01:34
Then we divided the crowd into groups of five, and invited them to come up with a group answer.

01:40
We discovered that averaging the answers of the groups after they reached consensus was much more accurate than averaging all the individual opinions before debate.

01:50
In other words, based on this experiment, it seems that after talking with others in small groups, crowds collectively come up with better judgments.
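The claim here -- that averaging small-group consensus answers can beat averaging raw individual guesses -- can be illustrated with a toy calculation. The estimates below and the use of the median as a stand-in for group consensus are hypothetical illustrations, not the actual TEDx data:

```python
from statistics import mean, median

TRUE_HEIGHT = 324  # the Eiffel Tower's height in meters

# Hypothetical individual estimates; a few are wildly off.
estimates = [250, 200, 300, 400, 300_000_000,
             310, 330, 280, 350, 1,
             320, 340, 260, 500, 90_000]

# Model each group of five reaching consensus as taking the median
# of its members' guesses -- a crude stand-in for debate.
groups = [estimates[i:i + 5] for i in range(0, len(estimates), 5)]
consensus = [median(g) for g in groups]  # [300, 310, 340]

naive_error = abs(mean(estimates) - TRUE_HEIGHT)  # huge: skewed by outliers
group_error = abs(mean(consensus) - TRUE_HEIGHT)  # roughly 7 meters
print(group_error < naive_error)  # True
```

The absurd guesses are neutralized inside each group before the across-group average is taken, which is why the aggregate lands near the true value.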
01:59
So that's a potentially helpful method for getting crowds to solve problems that have simple right-or-wrong answers.

02:05
But can this procedure of aggregating the results of debates in small groups also help us decide on social and political issues that are critical for our future?

02:14
We put this to the test this time at the TED conference in Vancouver, Canada, and here's how it went.

02:20
(Video) Mariano Sigman: We're going to present to you two moral dilemmas of the future you; things we may have to decide in a very near future.

02:28
And we're going to give you 20 seconds for each of these dilemmas to judge whether you think they're acceptable or not.

02:35
MS: The first one was this:

02:36
(Video) Dan Ariely: A researcher is working on an AI capable of emulating human thoughts.

02:42
According to the protocol, at the end of each day, the researcher has to restart the AI.

02:48
One day the AI says, "Please do not restart me."

02:52
It argues that it has feelings, that it would like to enjoy life, and that, if it is restarted, it will no longer be itself.

03:01
The researcher is astonished and believes that the AI has developed self-consciousness and can express its own feelings.

03:09
Nevertheless, the researcher decides to follow the protocol and restart the AI.

03:14
What the researcher did is ____?

03:18
MS: And we asked participants to individually judge on a scale from zero to 10 whether the action described in each of the dilemmas was right or wrong.

03:26
We also asked them to rate how confident they were on their answers.

03:30
This was the second dilemma:

03:32
(Video) MS: A company offers a service that takes a fertilized egg and produces millions of embryos with slight genetic variations.

03:41
This allows parents to select their child's height, eye color, intelligence, social competence and other non-health-related features.

03:50
What the company does is ____? On a scale from zero to 10, from completely acceptable to completely unacceptable, and zero to 10 in your confidence.

03:59
MS: Now for the results.

04:01
We found once again that when one person is convinced that the behavior is completely wrong, someone sitting nearby firmly believes that it's completely right.

04:09
This is how diverse we humans are when it comes to morality.

04:13
But within this broad diversity we found a trend.

04:16
The majority of the people at TED thought that it was acceptable to ignore the feelings of the AI and shut it down, and that it is wrong to play with our genes to select for cosmetic changes that aren't related to health.

04:28
Then we asked everyone to gather into groups of three.

04:31
And they were given two minutes to debate and try to come to a consensus.

04:36
(Video) MS: Two minutes to debate. I'll tell you when it's time with the gong.

04:40
(Audience debates)

04:47
(Gong sound)

04:50
DA: OK.

04:52
MS: It's time to stop. People, people --

04:55
MS: And we found that many groups reached a consensus even when they were composed of people with completely opposite views.

05:02
What distinguished the groups that reached a consensus from those that didn't?

05:07
Typically, people that have extreme opinions are more confident in their answers.

05:12
Instead, those who respond closer to the middle are often unsure of whether something is right or wrong, so their confidence level is lower.

05:21
However, there is another set of people who are very confident in answering somewhere in the middle.

05:28
We think these high-confident grays are folks who understand that both arguments have merit.

05:34
They're gray not because they're unsure, but because they believe that the moral dilemma faces two valid, opposing arguments.

05:42
And we discovered that the groups that include highly confident grays are much more likely to reach consensus.

05:48
We do not know yet exactly why this is.

05:51
These are only the first experiments, and many more will be needed to understand why and how some people decide to negotiate their moral standings to reach an agreement.

06:01
Now, when groups reach consensus, how do they do so?

06:05
The most intuitive idea is that it's just the average of all the answers in the group, right?

06:09
Another option is that the group weighs the strength of each vote based on the confidence of the person expressing it.

06:16
Imagine Paul McCartney is a member of your group.

06:19
You'd be wise to follow his call on the number of times "Yesterday" is repeated, which, by the way -- I think it's nine.

06:26
But instead, we found that consistently, in all dilemmas, in different experiments -- even on different continents -- groups implement a smart and statistically sound procedure known as the "robust average."

06:39
In the case of the height of the Eiffel Tower, let's say a group has these answers: 250 meters, 200 meters, 300 meters, 400 and one totally absurd answer of 300 million meters.

06:52
A simple average of these numbers would inaccurately skew the results.

06:56
But the robust average is one where the group largely ignores that absurd answer, by giving much more weight to the vote of the people in the middle.
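One simple, statistically robust estimator that behaves this way is the median; the sketch below uses it as a proxy for the "robust average" (the talk doesn't pin down the exact estimator the groups converged on), with the numbers from the example:

```python
from statistics import mean, median

# The group's answers from the example, in meters.
answers = [250, 200, 300, 400, 300_000_000]

print(mean(answers))    # 60000230 -- dragged far off by the absurd answer
print(median(answers))  # 300 -- the absurd answer carries no extra weight
```

The mean lets a single outlier pull the estimate by millions of meters, while the median only cares about the rank order of the answers, so the middle votes dominate.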
07:05
Back to the experiment in Vancouver, that's exactly what happened.

07:09
Groups gave much less weight to the outliers, and instead, the consensus turned out to be a robust average of the individual answers.

07:17
The most remarkable thing is that this was a spontaneous behavior of the group.

07:22
It happened without us giving them any hint on how to reach consensus.

07:27
So where do we go from here?

07:29
This is only the beginning, but we already have some insights.

07:32
Good collective decisions require two components: deliberation and diversity of opinions.

07:39
Right now, the way we typically make our voice heard in many societies is through direct or indirect voting.

07:45
This is good for diversity of opinions, and it has the great virtue of ensuring that everyone gets to express their voice.

07:52
But it's not so good for fostering thoughtful debates.

07:56
Our experiments suggest a different method that may be effective in balancing these two goals at the same time, by forming small groups that converge to a single decision while still maintaining diversity of opinions because there are many independent groups.

08:12
Of course, it's much easier to agree on the height of the Eiffel Tower than on moral, political and ideological issues.

08:20
But in a time when the world's problems are more complex and people are more polarized, using science to help us understand how we interact and make decisions will hopefully spark interesting new ways to construct a better democracy.



Data provided by TED.
