ABOUT THE SPEAKER
Peter Donnelly - Mathematician; statistician
Peter Donnelly is an expert in probability theory who applies statistical methods to genetic data -- spurring advances in disease treatment and insight into our evolution. He's also an expert on DNA analysis, and an advocate for sensible statistical analysis in the courtroom.

Why you should listen

Peter Donnelly applies statistical methods to real-world problems, ranging from DNA analysis (for criminal trials), to the treatment of genetic disorders. A mathematician who collaborates with biologists, he specializes in applying probability and statistics to the field of genetics, in hopes of shedding light on evolutionary history and the structure of the human genome.

The Australian-born, Oxford-based mathematician is best known for his work in molecular evolution (tracing the roots of human existence to their earliest origins using the mutation rates of mitochondrial DNA). He studies genetic distributions in living populations to trace human evolutionary history -- an approach that informs research in evolutionary biology, as well as medical treatment for genetic disorders. Donnelly is a key player in the International HapMap Project, an ongoing international effort to model human genetic variation and pinpoint the genes responsible for specific aspects of health and disease; its implications for disease prevention and treatment are vast.

He's also a leading expert on DNA analysis and the use of forensic science in criminal trials; he's an outspoken advocate for bringing sensible statistical analysis into the courtroom. Donnelly leads Oxford University's Mathematical Genetics Group, which conducts research in genetic modeling, human evolutionary history, and forensic DNA profiling. He also serves as Director of the Wellcome Trust Centre for Human Genetics at Oxford University, which explores genetic links to disease and illness.

TEDGlobal 2005

Peter Donnelly: How juries are fooled by statistics

1,279,860 views

Oxford mathematician Peter Donnelly reveals the common mistakes humans make in interpreting statistics -- and the devastating impact these errors can have on the outcome of criminal trials.


00:25
As other speakers have said, it's a rather daunting experience -- a particularly daunting experience -- to be speaking in front of this audience. But unlike the other speakers, I'm not going to tell you about the mysteries of the universe, or the wonders of evolution, or the really clever, innovative ways people are attacking the major inequalities in our world. Or even the challenges of nation-states in the modern global economy. My brief, as you've just heard, is to tell you about statistics -- and, to be more precise, to tell you some exciting things about statistics. And that's --

00:54
(Laughter)

00:55
-- that's rather more challenging than all the speakers before me and all the ones coming after me.

00:59
(Laughter)

01:01
One of my senior colleagues told me, when I was a youngster in this profession, rather proudly, that statisticians were people who liked figures but didn't have the personality skills to become accountants.

01:13
(Laughter)

01:15
And there's another in-joke among statisticians, and that's, "How do you tell the introverted statistician from the extroverted statistician?" To which the answer is, "The extroverted statistician's the one who looks at the other person's shoes."

01:28
(Laughter)
01:31
But I want to tell you something useful -- and here it is, so concentrate now. This evening, there's a reception in the University's Museum of Natural History. And it's a wonderful setting, as I hope you'll find, and a great icon to the best of the Victorian tradition. It's very unlikely -- in this special setting, and this collection of people -- but you might just find yourself talking to someone you'd rather wish that you weren't. So here's what you do. When they say to you, "What do you do?" -- you say, "I'm a statistician."

02:00
(Laughter)

02:01
Well, except they've been pre-warned now, and they'll know you're making it up. And then one of two things will happen. They'll either discover their long-lost cousin in the other corner of the room and run over and talk to them. Or they'll suddenly become parched and/or hungry -- and often both -- and sprint off for a drink and some food. And you'll be left in peace to talk to the person you really want to talk to.

02:20
It's one of the challenges in our profession to try and explain what we do. We're not top on people's lists for dinner party guests and conversations and so on. And it's something I've never really found a good way of doing. But my wife -- who was then my girlfriend -- managed it much better than I've ever been able to. Many years ago, when we first started going out, she was working for the BBC in Britain, and I was, at that stage, working in America. I was coming back to visit her. She told this to one of her colleagues, who said, "Well, what does your boyfriend do?" Sarah thought quite hard about the things I'd explained -- and she concentrated, in those days, on listening.

02:55
(Laughter)
02:58
Don't tell her I said that. And she was thinking about the work I did developing mathematical models for understanding evolution and modern genetics. So when her colleague said, "What does he do?" she paused and said, "He models things."

03:14
(Laughter)

03:15
Well, her colleague suddenly got much more interested than I had any right to expect and went on and said, "What does he model?" Well, Sarah thought a little bit more about my work and said, "Genes."

03:25
(Laughter)

03:29
"He models genes." That is my first love, and that's what I'll tell you a little bit about. What I want to do more generally is to get you thinking about the place of uncertainty and randomness and chance in our world, and how we react to that, and how well we do or don't think about it. So you've had a pretty easy time up till now -- a few laughs, and all that kind of thing -- in the talks to date. You've got to think, and I'm going to ask you some questions.

03:54
So here's the scene for the first question I'm going to ask you. Can you imagine tossing a coin successively? And for some reason -- which shall remain rather vague -- we're interested in a particular pattern. Here's one -- a head, followed by a tail, followed by a tail. So suppose we toss a coin repeatedly. Then the pattern, head-tail-tail, that we've suddenly become fixated with, happens here. And you can count: one, two, three, four, five, six, seven, eight, nine, 10 -- it happens after the 10th toss. So you might think there are more interesting things to do, but humor me for the moment.
04:24
Imagine this half of the audience each get out coins, and they toss them until they first see the pattern head-tail-tail. The first time they do it, maybe it happens after the 10th toss, as here. The second time, maybe it's after the fourth toss. The next time, after the 15th toss. So you do that lots and lots of times, and you average those numbers. That's what I want this side to think about. The other half of the audience doesn't like head-tail-tail -- they think, for deep cultural reasons, that's boring -- and they're much more interested in a different pattern -- head-tail-head. So, on this side, you get out your coins, and you toss and toss and toss. And you count the number of times until the pattern head-tail-head appears and you average them. OK?

05:00
So on this side, you've got a number -- you've done it lots of times, so you get it accurately -- which is the average number of tosses until head-tail-tail. On this side, you've got a number -- the average number of tosses until head-tail-head. So here's a deep mathematical fact -- if you've got two numbers, one of three things must be true. Either they're the same, or this one's bigger than this one, or this one's bigger than that one. So what's going on here?

05:23
So you've all got to think about this, and you've all got to vote -- and we're not moving on. And I don't want to end up in the two-minute silence to give you more time to think about it, until everyone's expressed a view. OK. So what you want to do is compare the average number of tosses until we first see head-tail-head with the average number of tosses until we first see head-tail-tail. Who thinks that A is true -- that, on average, it'll take longer to see head-tail-head than head-tail-tail? Who thinks that B is true -- that, on average, they're the same? Who thinks that C is true -- that, on average, it'll take less time to see head-tail-head than head-tail-tail? OK, who hasn't voted yet? Because that's really naughty -- I said you had to.

06:00
(Laughter)

06:02
OK. So most people think B is true. And you might be relieved to know even rather distinguished mathematicians think that. It's not. A is true here. It takes longer, on average. In fact, the average number of tosses till head-tail-head is 10 and the average number of tosses until head-tail-tail is eight.
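Those averages are easy to check empirically. A minimal simulation sketch in plain Python (the function name and trial count are illustrative, not from the talk):

```python
import random

def average_wait(pattern, trials=100_000, rng=random.Random(0)):
    """Average number of fair-coin tosses until `pattern` first appears."""
    total = 0
    for _ in range(trials):
        recent = ""
        tosses = 0
        while not recent.endswith(pattern):
            recent += rng.choice("HT")  # one toss: head or tail
            tosses += 1
        total += tosses
    return total / trials

# Head-tail-head takes about 10 tosses on average; head-tail-tail
# takes about 8, because HTH can overlap with itself and HTT cannot.
print(average_wait("HTH"))
print(average_wait("HTT"))
```

The exact expectations are 10 and 8: for each way a pattern of length k overlaps its own prefix, including the full overlap, you add 2^k, so HTH gives 2^3 + 2^1 = 10 while HTT gives just 2^3 = 8.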
06:21
How could that be? Anything different about the two patterns? There is. Head-tail-head overlaps itself. If you went head-tail-head-tail-head, you can cunningly get two occurrences of the pattern in only five tosses. You can't do that with head-tail-tail. That turns out to be important. There are two ways of thinking about this. I'll give you one of them.

06:50
So imagine -- let's suppose we're doing it. On this side -- remember, you're excited about head-tail-tail; you're excited about head-tail-head. We start tossing a coin, and we get a head -- and you start sitting on the edge of your seat because something great and wonderful, or awesome, might be about to happen. The next toss is a tail -- you get really excited. The champagne's on ice just next to you; you've got the glasses chilled to celebrate. You're waiting with bated breath for the final toss. And if it comes down a head, that's great. You're done, and you celebrate. If it's a tail -- well, rather disappointedly, you put the glasses away and put the champagne back. And you keep tossing, to wait for the next head, to get excited.
07:25
On this side, there's a different experience. It's the same for the first two parts of the sequence. You're a little bit excited with the first head -- you get rather more excited with the next tail. Then you toss the coin. If it's a tail, you crack open the champagne. If it's a head you're disappointed, but you're still a third of the way to your pattern again. And that's an informal way of presenting it -- that's why there's a difference.

07:48
Another way of thinking about it -- if we tossed a coin eight million times, then we'd expect a million head-tail-heads and a million head-tail-tails -- but the head-tail-heads could occur in clumps. So if you want to put a million things down amongst eight million positions and you can have some of them overlapping, the clumps will be further apart. It's another way of getting the intuition.

08:10
What's the point I want to make? It's a very, very simple example, an easily stated question in probability, which every -- you're in good company -- everybody gets wrong.

08:19
This is my little diversion into my real passion, which is genetics. There's a connection between head-tail-heads and head-tail-tails in genetics, and it's the following. When you toss a coin, you get a sequence of heads and tails. When you look at DNA, there's a sequence of not two things -- heads and tails -- but four letters -- As, Gs, Cs and Ts. And there are little chemical scissors, called restriction enzymes, which cut DNA whenever they see particular patterns. And they're an enormously useful tool in modern molecular biology. And instead of asking the question, "How long until I see a head-tail-head?" -- you can ask, "How big will the chunks be when I use a restriction enzyme which cuts whenever it sees G-A-A-G, for example? How long will those chunks be?"
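The chunk-size question can be sketched the same way as the coin tosses: generate a random sequence over the four letters and measure the spacing between occurrences of the recognition pattern. This assumes the letters are uniform and independent, which real genomes are not, so it is only an illustration of the probability question:

```python
import random

def average_chunk_length(pattern="GAAG", length=1_000_000, seed=1):
    """Average distance between successive occurrences of `pattern`
    in a uniform random sequence over the letters A, C, G, T."""
    rng = random.Random(seed)
    seq = "".join(rng.choice("ACGT") for _ in range(length))
    # Record every cut site (start position of the pattern).
    cuts = []
    pos = seq.find(pattern)
    while pos != -1:
        cuts.append(pos)
        pos = seq.find(pattern, pos + 1)
    spacings = [b - a for a, b in zip(cuts, cuts[1:])]
    return sum(spacings) / len(spacings)

# A specific 4-letter pattern occurs about once every 4**4 = 256 letters,
# so the chunks between cut sites average roughly 256 letters long.
print(average_chunk_length())
```

As with the coin patterns, self-overlap changes the waiting time until the *first* occurrence, but the long-run average spacing between cuts is set by the pattern's probability, about one in 256 here.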
09:00
That's a rather trivial connection between probability and genetics. There's a much deeper connection, which I don't have time to go into, and that is that modern genetics is a really exciting area of science. And we'll hear some talks later in the conference specifically about that. But it turns out that unlocking the secrets in the information generated by modern experimental technologies, a key part of that has to do with fairly sophisticated -- you'll be relieved to know that I do something useful in my day job, rather more sophisticated than the head-tail-head story -- but quite sophisticated computer modelings and mathematical modelings and modern statistical techniques. And I will give you two little snippets -- two examples -- of projects we're involved in in my group in Oxford, both of which I think are rather exciting.

09:43
You know about the Human Genome Project. That was a project which aimed to read one copy of the human genome. The natural thing to do after you've done that -- and that's what this project, the International HapMap Project, which is a collaboration between labs in five or six different countries. Think of the Human Genome Project as learning what we've got in common, and the HapMap Project is trying to understand where there are differences between different people. Why do we care about that? Well, there are lots of reasons. The most pressing one is that we want to understand how some differences make some people susceptible to one disease -- type-2 diabetes, for example -- and other differences make people more susceptible to heart disease, or stroke, or autism and so on. That's one big project.
10:29
There's a second big project, recently funded by the Wellcome Trust in this country, involving very large studies -- thousands of individuals, with each of eight different diseases, common diseases like type-1 and type-2 diabetes, and coronary heart disease, bipolar disease and so on -- to try and understand the genetics. To try and understand what it is about genetic differences that causes the diseases. Why do we want to do that? Because we understand very little about most human diseases. We don't know what causes them. And if we can get in at the bottom and understand the genetics, we'll have a window on the way the disease works, and a whole new way of thinking about disease therapies and preventative treatment and so on.

11:06
So that's, as I said, the little diversion on my main love. Back to some of the more mundane issues of thinking about uncertainty.

11:14
Here's another quiz for you -- now suppose we've got a test for a disease which isn't infallible, but it's pretty good. It gets it right 99 percent of the time. And I take one of you, or I take someone off the street, and I test them for the disease in question. Let's suppose there's a test for HIV -- the virus that causes AIDS -- and the test says the person has the disease. What's the chance that they do? The test gets it right 99 percent of the time. So a natural answer is 99 percent. Who likes that answer? Come on -- everyone's got to get involved. Don't think you don't trust me anymore.

11:49
(Laughter)
11:50
Well, you're right to be a bit skeptical, because that's not the answer.
234
685000
3000
Ongi dago pixka bat eszeptiko egotea, hori ez baita erantzun zuzena.
11:53
That's what you might think.
235
688000
2000
Hori pentsa dezakezue.
11:55
It's not the answer, and it's not because it's only part of the story.
236
690000
3000
Baina ez da erantzun zuzena, historiaren zati bakarra baita.
11:58
It actually depends on how common or how rare the disease is.
237
693000
3000
Berez, gaixotasunaren hedapenaren araberakoa izango da probabilitatea.
12:01
So let me try and illustrate that.
238
696000
2000
Utzi iezaidazue erakusten.
12:03
Here's a little caricature of a million individuals.
239
698000
4000
Milioi bat pertsonaz osatutako lagina dugu.
12:07
So let's think about a disease that affects --
240
702000
3000
Pentsa dezagun gaixotasun bitxi batean,
12:10
it's pretty rare, it affects one person in 10,000.
241
705000
2000
10.000tik bati eragiten dion batean.
12:12
Amongst these million individuals, most of them are healthy
242
707000
3000
Milioi horretan, gehienak osasuntsu daude,
12:15
and some of them will have the disease.
243
710000
2000
eta batzuk gaixotasun hori izango dute.
12:17
And in fact, if this is the prevalence of the disease,
244
712000
3000
Berez, hori bada gaixotasunaren maiztasuna,
12:20
about 100 will have the disease and the rest won't.
245
715000
3000
gutxi gora behera 100 gaixo izango genituzke.
12:23
So now suppose we test them all.
246
718000
2000
Demagun proba guztiei egiten diegula.
12:25
What happens?
247
720000
2000
Zer gertatzen da?
12:27
Well, amongst the 100 who do have the disease,
248
722000
2000
Gaixotasuna duten 100 pertsonetan,
12:29
the test will get it right 99 percent of the time, and 99 will test positive.
249
724000
5000
Frogak %99tan asmatuko du, eta 99 gaixo detektatuko ditu.
12:34
Amongst all these other people who don't have the disease,
250
729000
2000
Gaixotasuna ez duten pertsonetan,
12:36
the test will get it right 99 percent of the time.
251
731000
3000
frogak %99tan asmatuko du.
12:39
It'll only get it wrong one percent of the time.
252
734000
2000
Kasuen %1ean erratuko da.
12:41
But there are so many of them that there'll be an enormous number of false positives.
253
736000
4000
Baina hainbeste osasuntsu daude, positibo faltsu asko egongo direla.
12:45
Put that another way --
254
740000
2000
Beste era batera esanda,
12:47
of all of them who test positive -- so here they are, the individuals involved --
255
742000
5000
frogak gaixo dagoela esaten duen horietan,
12:52
less than one in 100 actually have the disease.
256
747000
5000
ehunetik batek baino gutxiagok izango du gaixotasuna benetan.
12:57
So even though we think the test is accurate, the important part of the story is
257
752000
4000
Beraz froga zehatza dela uste badugu ere, garrantzitsua zera da,
13:01
there's another bit of information we need.
258
756000
3000
beharrezko den beste informazio bat falta dela.
13:04
Here's the key intuition.
259
759000
2000
Hau da ideia garrantzitsua.
13:07
What we have to do, once we know the test is positive,
260
762000
3000
Froga positiboa dela jakin ostean egin behar duguna zera da,
13:10
is to weigh up the plausibility, or the likelihood, of two competing explanations.
261
765000
6000
lehian dauden bi azalpenen probabilitatea aztertu.
13:16
Each of those explanations has a likely bit and an unlikely bit.
262
771000
3000
Azalpen bakoitzak zati probable eta zati inprobable bat ditu.
13:19
One explanation is that the person doesn't have the disease --
263
774000
3000
Azalpen bat pertsona gaixo ez egotea da,
13:22
that's overwhelmingly likely, if you pick someone at random --
264
777000
3000
hau oso probablea da, norbait zoriz hautatzen baduzu,
13:25
but the test gets it wrong, which is unlikely.
265
780000
3000
baina froga erratu egiten da, eta hau inprobablea da.
13:29
The other explanation is that the person does have the disease -- that's unlikely --
266
784000
3000
Beste azalpena, pertsona gaixo egotea da, inprobablea dena,
13:32
but the test gets it right, which is likely.
267
787000
3000
eta froga zuzen egotea, probablea dena.
13:35
And the number we end up with --
268
790000
2000
Eta topatu nahi dugun zenbaki horrek,
13:37
that number which is a little bit less than one in 100 --
269
792000
3000
ehuneko bat baino txikiagoa den horrek,
13:40
is to do with how likely one of those explanations is relative to the other.
270
795000
6000
azalpen batek bestearekiko duen probabilitatearekin du zerikusia.
13:46
Each of them taken together is unlikely.
271
801000
2000
Multzo bakoitza banaka inprobablea da.
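The arithmetic behind the million-person caricature can be checked directly. Here is a minimal Python sketch (not part of the talk; it just reuses the numbers quoted above -- 99 percent accuracy, a one-in-10,000 disease, a million people):

```python
# Reproduce the talk's million-person caricature.
population = 1_000_000
prevalence = 1 / 10_000   # the disease affects 1 person in 10,000
accuracy = 0.99           # the test gets it right 99% of the time

sick = population * prevalence              # about 100 people have the disease
healthy = population - sick                 # about 999,900 people don't

true_positives = sick * accuracy            # 99 sick people test positive
false_positives = healthy * (1 - accuracy)  # ~9,999 healthy people also test positive

# Of everyone who tests positive, what fraction actually has the disease?
p_sick_given_positive = true_positives / (true_positives + false_positives)
print(round(p_sick_given_positive, 4))      # about 0.0098 -- just under 1 in 100
```

The result is the talk's "a little bit less than one in 100": the 9,999 false positives from the huge healthy group swamp the 99 true positives.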
13:49
Here's a more topical example of exactly the same thing. Those of you in Britain will know about what's become rather a celebrated case of a woman called Sally Clark, who had two babies who died suddenly. And initially, it was thought that they died of what's known informally as "cot death," and more formally as "Sudden Infant Death Syndrome." For various reasons, she was later charged with murder.

14:10
And at the trial, her trial, a very distinguished pediatrician gave evidence that the chance of two cot deaths, innocent deaths, in a family like hers -- which was professional and non-smoking -- was one in 73 million.

14:26
To cut a long story short, she was convicted at the time. Later, and fairly recently, acquitted on appeal -- in fact, on the second appeal.

14:34
And just to set it in context, you can imagine how awful it is for someone to have lost one child, and then two, if they're innocent, to be convicted of murdering them. To be put through the stress of the trial, convicted of murdering them -- and to spend time in a women's prison, where all the other prisoners think you killed your children -- is a really awful thing to happen to someone.

14:53
And it happened in large part here because the expert got the statistics horribly wrong, in two different ways.

15:01
So where did he get the one in 73 million number? He looked at some research, which said the chance of one cot death in a family like Sally Clark's is about one in 8,500. So he said, "I'll assume that if you have one cot death in a family, the chance of a second child dying from cot death isn't changed."

15:21
So that's what statisticians would call an assumption of independence. It's like saying, "If you toss a coin and get a head the first time, that won't affect the chance of getting a head the second time." So if you toss a coin twice, the chance of getting a head twice is a half -- that's the chance the first time -- times a half -- the chance a second time. So he said, "Here, I'll assume that these events are independent. When you multiply 8,500 together twice, you get about 73 million."
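The expert's calculation -- and nothing more -- takes two lines of Python. This sketch only shows where "about 73 million" comes from; it says nothing about whether the independence assumption behind the multiplication is valid (the talk argues it isn't):

```python
# The 1-in-8,500 figure is the one the expert took from research
# on families like Sally Clark's (professional, non-smoking).
p_one_cot_death = 1 / 8_500

# Under the (flawed) independence assumption, just multiply the two events:
p_two_cot_deaths = p_one_cot_death * p_one_cot_death

print(round(1 / p_two_cot_deaths))  # 72250000 -- "about 73 million" to one
```

Note that 8,500 x 8,500 is actually 72.25 million; "one in 73 million" is the rounded figure quoted in court.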
15:47
And none of this was stated to the court as an assumption or presented to the jury that way.

15:52
Unfortunately here -- and, really, regrettably -- first of all, in a situation like this you'd have to verify it empirically. And secondly, it's palpably false.

16:02
There are lots and lots of things that we don't know about sudden infant deaths. It might well be that there are environmental factors that we're not aware of, and it's pretty likely to be the case that there are genetic factors we're not aware of. So if a family suffers from one cot death, you'd put them in a high-risk group. They've probably got these environmental risk factors and/or genetic risk factors we don't know about. And to argue, then, that the chance of a second death is as if you didn't know that information is really silly.

16:28
It's worse than silly -- it's really bad science. Nonetheless, that's how it was presented, and at trial nobody even argued it. That's the first problem.

16:39
The second problem is, what does the number of one in 73 million mean? So after Sally Clark was convicted -- you can imagine, it made rather a splash in the press -- one of the journalists from one of Britain's more reputable newspapers wrote that what the expert had said was, "The chance that she was innocent was one in 73 million."

17:03
Now, that's a logical error. It's exactly the same logical error as the logical error of thinking that after the disease test, which is 99 percent accurate, the chance of having the disease is 99 percent.

17:14
In the disease example, we had to bear in mind two things, one of which was the possibility that the test got it right or not. And the other one was the chance, a priori, that the person had the disease or not. It's exactly the same in this context. There are two things involved -- two parts to the explanation. We want to know how likely, or relatively how likely, two different explanations are.

17:37
One of them is that Sally Clark was innocent -- which is, a priori, overwhelmingly likely -- most mothers don't kill their children. And the second part of the explanation is that she suffered an incredibly unlikely event. Not as unlikely as one in 73 million, but nonetheless rather unlikely.

17:54
The other explanation is that she was guilty. Now, we probably think a priori that's unlikely. And we certainly should think in the context of a criminal trial that that's unlikely, because of the presumption of innocence. And then if she were trying to kill the children, she succeeded.

18:08
So the chance that she's innocent isn't one in 73 million. We don't know what it is. It has to do with weighing up the strength of the other evidence against her and the statistical evidence. We know the children died. What matters is how likely or unlikely, relative to each other, the two explanations are. And they're both implausible.
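The reasoning just described -- weighing two implausible explanations against each other -- is a likelihood-ratio comparison. The sketch below shows only the shape of that calculation; the talk gives no figures for either probability, so both numbers are invented placeholders, not estimates from the case:

```python
# Two competing explanations for the two observed deaths.
# BOTH numbers below are hypothetical, for illustration only.

# Double cot death: rarer than 1 in 8,500, but far more common than
# 1 in 73 million, because the two deaths are not independent
# (shared genetic and environmental risk factors).
p_double_cot_death = 1 / 300_000

# Hypothetical a-priori chance that a mother murders two children.
p_double_murder = 1 / 1_000_000

# The statistical evidence alone favors innocence over guilt by this ratio:
odds_innocent = p_double_cot_death / p_double_murder
print(round(odds_innocent, 1))
```

With these placeholder inputs the statistics alone slightly favor innocence -- nothing remotely like 73-million-to-one against it. The point is the structure: the posterior depends on the ratio of the two explanations' probabilities, not on one of them in isolation.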
18:31
There's a situation where errors in statistics had really profound and really unfortunate consequences.

18:38
In fact, there are two other women who were convicted on the basis of the evidence of this pediatrician, who have subsequently been released on appeal. Many cases were reviewed. And it's particularly topical because he's currently facing a disrepute charge at Britain's General Medical Council.

18:53
So just to conclude -- what are the take-home messages from this? Well, we know that randomness and uncertainty and chance are very much a part of our everyday life.

19:04
It's also true that -- although you, as a collective, are very special in many ways -- you're completely typical in not getting the examples I gave right.

19:13
It's very well documented that people get things wrong. They make errors of logic in reasoning with uncertainty.

19:20
We can cope with the subtleties of language brilliantly -- and there are interesting evolutionary questions about how we got here. We are not good at reasoning with uncertainty. That's an issue in our everyday lives.

19:30
As you've heard from many of the talks, statistics underpins an enormous amount of research in science -- in social science, in medicine -- and indeed, quite a lot of industry. All of quality control, which has had a major impact on industrial processing, is underpinned by statistics. It's something we're bad at doing. At the very least, we should recognize that, and we tend not to.

19:49
To go back to the legal context, at the Sally Clark trial all of the lawyers just accepted what the expert said. So if a pediatrician had come out and said to a jury, "I know how to build bridges. I've built one down the road. Please drive your car home over it," they would have said, "Well, pediatricians don't know how to build bridges. That's what engineers do." On the other hand, he came out and effectively said, or implied, "I know how to reason with uncertainty. I know how to do statistics." And everyone said, "Well, that's fine. He's an expert."

20:17
So we need to understand where our competence is and isn't. Exactly the same kinds of issues arose in the early days of DNA profiling, when scientists, and lawyers and in some cases judges, routinely misrepresented evidence. Usually -- one hopes -- innocently, but misrepresented evidence. Forensic scientists said, "The chance that this guy's innocent is one in three million." Even if you believe the number, just like the 73 million to one, that's not what it meant. And there have been celebrated appeal cases in Britain and elsewhere because of that.

20:48
And just to finish in the context of the legal system: it's all very well to say, "Let's do our best to present the evidence." But more and more, in cases of DNA profiling -- this is another one -- we expect juries, who are ordinary people -- and it's documented they're very bad at this -- we expect juries to be able to cope with the sorts of reasoning that goes on.

21:07
In other spheres of life, if people argued -- well, except possibly for politics -- but in other spheres of life, if people argued illogically, we'd say that's not a good thing. We sort of expect it of politicians and don't hope for much more. In the case of uncertainty, we get it wrong all the time -- and at the very least, we should be aware of that, and ideally, we might try and do something about it.

21:27
Thanks very much.
