Welcome back to the Oxford Mathematics Public Lectures, Home Edition. This is our fifth event. My name is Alain Goriely, and I'm in charge of external relations for the Mathematical Institute. Special thanks to our sponsor, XTX Markets, a leading quantitative-driven electronic market maker with offices in London, Singapore and New York. Their ongoing support is crucial in providing you with quality content.

The current pandemic crisis has underlined the importance of mathematics and numbers. Not a day goes by without news about the reproduction number, the number of cases and, sadly, the number of deaths. All these numbers, collected, evaluated and presented, have led to much confusion, manipulation and endless argument. The problem goes straight to the heart of what statistics is, and how it is used, and sometimes abused, in society. More than ever, there is a need for clear explanation of numbers and how they are used in decision-making processes.

I'm particularly happy that Tim Harford has accepted the invitation to give a lecture about these issues tonight. Many of you know Tim through his work in the Financial Times and on the BBC, and through his books; his show More or Less on Radio 4 has been particularly influential in bringing light into the recent darkness. Tonight, Tim will talk about his recent book, How to Make the World Add Up. Thank you very much, Tim, for doing this. Please start now.
Hi, Tim Harford here, and thanks so much to the Oxford Mathematics Department for hosting this talk about my book. Before I actually start talking about the book and answering the questions that you've sent in on Twitter, one moment. If you're watching this after the 14th of October, you can skip ahead a minute.

I wanted to talk about Blackwell's books, the amazing bookshop on Broad Street in the heart of Oxford. When my book came out, it sold out absolutely everywhere except Blackwell's, who had actually gone to the trouble of having enough confidence in me to order enough books that they didn't instantly sell out. So I wanted to say thank you to them, and I've teamed up with them to supply personalised dedications of the book, which isn't very easy in this time of coronavirus. If you're anywhere in the UK, you can get Blackwell's to send you the book, and if you're local, you can order the book and pick it up. The book looks like this: How to Make the World Add Up. And it's very kind of them to agree to do this.

If you email oxford@blackwell.co.uk before the 14th of October, and you include your name, your contact details, the person you would like me to sign the book for, and, if you've got one, a short message (like Happy Birthday, or Merry Christmas; it's not too early for Christmas presents), then I will come into Blackwell's, I will sign a whole bunch of books, and I will personalise them for you.
And then, after I've done that, you can get in contact with Blackwell's: you can go in and pick up your book, or you can ask them to send it to you. So please do that before the 14th of October 2020; the book should be available after the 16th of October 2020 to collect, or to arrange delivery. OK, personalised Blackwell's book message over.

Let's talk about storks and babies: where babies come from. You may have heard the old saying that the stork delivers your baby brother or your baby sister. That's where babies come from. And you may, alternatively, have heard a number of rather wild, implausible and somewhat disgusting theories offering alternative explanations for where babies come from. You can put those out of your mind, because the truth is, babies do come from storks. Storks do deliver babies. And if you doubt that for a moment, be reassured that I can prove it with statistics.

It's very simple. What you need to do is gather together a list of the number of babies born per year in each country in, say, Europe, and the best estimate of the breeding population of storks in each country. Then you can just plot them on a simple scatterplot, babies against storks, and you will see, just eyeballing the graph, that there appears to be a relationship. And when you do the maths properly, you will find a very strong correlation, far too strong to have been produced by random chance.
And in case you think, well, why would I trust that guy Tim Harford, what's he ever done for the world of statistics? I can assure you that this has survived peer review. There is, in fact, a published paper entitled "Storks Deliver Babies (p = 0.008)". And without getting into all the technical details, all the zeros after the decimal point mean this relationship is real; it's not a coincidence. There is a strong correlation between storks and babies.

Now, by this point you may be wondering how exactly the trick is done, or perhaps you've figured out the trick. It's pretty simple. Think about a place like Monaco or Luxembourg: there's not a lot of room for babies, and there's not a lot of room for storks. Alternatively, take a country such as Germany or Poland: loads of space for storks, and plenty of space for a large population, which means plenty of space for babies too. So what this relationship is measuring is perfectly real, but what it's basically measuring is big countries versus small countries. It's land area that's really being tracked.
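To see the trick in action, here is a minimal sketch in Python with invented numbers (not the data from the actual paper): stork counts and birth counts are both driven by land area, so they correlate strongly with each other even though neither has any effect on the other, and the correlation largely disappears once the confounder is divided out.

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(0)
# 17 imaginary European countries; land area in thousands of km^2.
areas = [random.uniform(2, 550) for _ in range(17)]
# Storks and births both scale with land area, plus independent noise;
# neither causes the other.
storks = [a * random.uniform(0.8, 1.2) for a in areas]
births = [a * random.uniform(0.8, 1.2) * 120 for a in areas]

# Eyeball-the-scatterplot equivalent: a strong raw correlation...
print("storks vs births:", round(pearson(storks, births), 2))

# ...that largely disappears once the confounder (area) is divided out.
stork_density = [s / a for s, a in zip(storks, areas)]
birth_density = [b / a for b, a in zip(births, areas)]
print("densities:", round(pearson(stork_density, birth_density), 2))
```

The raw correlation is real in exactly the sense the p-value reports: it is far too strong to be a fluke of sampling. What a p-value cannot tell you is why the two series move together; here, as with real storks and babies, it is land area doing the work.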
Congratulations if you figured out exactly how the trick was done. But if you want to understand this sort of statistical conjuring, there's no better book to read than How to Lie with Statistics, an elegant little work first published in 1954 and written by a journalist named Darrell Huff. It immediately got rave reviews in the New York Times and elsewhere, and went on to become, so they say, the bestselling book about statistics ever published.

And I can see why the book has been so successful. I read it when I was a teenager, and I learnt a lot. It's great: it's funny, it's got cartoons, the examples are really good, they're really incisive, they're written with real wit. There are dodgy graphs, there are dodgy maps, and the whole thing will really help you avoid being fooled by phoney statistics, by people who are trying to deceive you. And of course, there are plenty of people out there who try to deceive you. Darrell Huff lays out his stall very clearly. He says the crooks already know these tricks; it's the honest folk who have to learn them in self-defence.

So there's nothing wrong with that. There's nothing wrong with a witty little book about how to lie with statistics. Except that, over the last ten years or so, I've become increasingly uneasy about that book, about what it represents, and about what it says about statistics and about our attitude to statistics.
The most successful book ever written about statistics is, from cover to cover, a warning about misinformation. Once you start treading down that path, it becomes very easy to imagine that lying with statistics is all anybody ever does. And I think that's a problem.

Let me contrast How to Lie with Statistics, published in 1954, with something else that happened in 1954: two British epidemiologists, Richard Doll and Austin Bradford Hill, published some of the first compelling evidence that smoking cigarettes dramatically increases your risk of lung cancer. Now, this is a really important result, and there were other scientists working on the question around the same time. But it matters to discover that this very popular product, one that people use all over the world, is killing people in large numbers, that it's extremely dangerous. That's a little piece of information, and that piece of information has contributed, of course, to substantial declines in rates of smoking in developed countries, to longer life expectancy, to reduced heart disease and reduced lung cancer and all kinds of good things. This is a life-saving discovery. This is tens of millions of people having their lives saved by a discovery that was made using statistics. Because when you think about it, you realise you couldn't have figured this out in any other way.
There was one doctor in Argentina who treated a vast number of patients with throat cancer and lung cancer, and he noticed that cancer rates were on the increase. But actually, even just to notice that would be an unusual thing; it's much easier to notice that lung cancer is dramatically increasing through statistics. And by statistics I don't mean anything terribly sophisticated; I just mean somebody's got to count cases of lung cancer and write them down somewhere. So even to notice it, even to see that there's a problem, requires a statistical perspective, even quite a simple one. And to diagnose what is causing this huge increase in lung cancer, well, then you really need to ask the questions in a statistical way, to ask them very carefully.

And bear in mind that Richard Doll didn't think that cigarettes were to blame. Bradford Hill and Doll, by the way, were both smokers, at least at the beginning of that research; I think the research cured them of that habit. But Richard Doll thought that it was probably something to do with the tarmac on the roads. And another theory, quite a plausible theory, was that it was pollution from cars. Because think about this huge increase in lung cancer taking place through the late 1930s, through the 1940s, into the 1950s. Well, what else is going on? A lot more cars. So it makes sense. But to really examine it, and to figure out whether it's the cigarettes or the cars, requires the statistical lens.
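The kind of comparison that settles this can be sketched with invented numbers (nothing like Doll and Hill's actual data): follow a cohort, record who smokes and who lives with heavy traffic, then compare lung-cancer death rates within each traffic group. If the excess risk tracks smoking inside both strata, exhaust fumes can't be the explanation.

```python
# Hypothetical cohort, one stratum per row:
# (smoker, lives_near_heavy_traffic, people followed, lung-cancer deaths)
cohort = [
    (True,  True,  1000, 28),
    (True,  False, 1000, 25),
    (False, True,  1000,  3),
    (False, False, 1000,  2),
]

def death_rate(smoker, traffic):
    """Lung-cancer death rate in the stratum matching both attributes."""
    for s, t, n, deaths in cohort:
        if s == smoker and t == traffic:
            return deaths / n
    raise ValueError("no such stratum")

# Relative risk of smoking, holding traffic exposure fixed:
for traffic in (True, False):
    rr = death_rate(True, traffic) / death_rate(False, traffic)
    print(f"traffic exposure {traffic}: smokers die at {rr:.1f}x the rate")
```

Holding the rival explanation fixed and asking whether the association survives is the essence of the statistical lens; the same logic, applied carefully and at scale, is what the doctors study delivered.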
So what Doll and Bradford Hill were doing was not incredibly complicated, but it did require care and it required effort. At one stage they sent what they called a questionnaire to about 60,000 doctors. They thought: if we contact all the doctors, that's a great group of people to contact. For a start, 80 per cent of them smoke. But also, we don't want to lose track of people, and doctors are all on the medical register, so when a doctor dies, we'll know what he or she died of. So they're a great group to examine.

Well, this of course made them extremely unpopular with doctors, because nobody wants to be told that this thing you're smoking is killing you. And one doctor buttonholed Austin Bradford Hill at a dinner party and said, oh, so you're the chap who wants us all to quit smoking, are you? And Bradford Hill responded: no, not at all. I just want to see how you die. If you quit smoking, I want to see how you die. And if you don't quit smoking, I want to see how you die. So do what you wish: quit, or keep going. I will chalk up your death anyway, and it will be very useful to me. I should mention that Austin Bradford Hill originally trained as an economist; that's where he learnt his charm.

So this is important work. This is vital work. But now let's think about these two visions of statistics that emerged in the same year, 1954. You've got Darrell Huff, who says it's a trick.
I'm going to show you where the magician hides the rabbit before pulling it out of the hat. And it's great fun and it's clever and we can appreciate it, but don't ever take it seriously. Don't ever think of it as more than a game. And in the very same year, Richard Doll and Austin Bradford Hill, using it not as a trick but as a tool, a tool to understand the world. And they realised that this is not a game, or if it is a game, the stakes are incredibly high.

And my aim in writing How to Make the World Add Up is, above everything else, to urge you not to be seduced by this vision of Darrell Huff's, where statistics are a trick, and instead to embrace the perspective of Richard Doll and Austin Bradford Hill: to use statistics as a tool, like an astronomer uses a telescope, or a radiologist uses an X-ray machine, or air traffic controllers use radar. There are certain things about the world that we can't perceive in any other way, and we need statistics to show us what's going on.

And I think the virus, the pandemic, has underlined that point. I mean, it's a heck of a way to be proved right about what you were trying to say. But to understand where the virus is, and who's got it, and how quickly it's spreading; whether it's spreading or in retreat; how dangerous it is, and who it's dangerous to; how we can fight it; which treatments are most effective: these are all, one way or another, statistical questions.
And I think we learnt back in February and March, when we were scrambling to make incredibly high-stakes decisions with basically very little information, that having the data matters, having the statistics matters, and when they're not available, or when they're patchy and incomplete, we suffer as a result. So it makes me angry to hear people going, oh yeah, lies, damned lies and statistics, you can prove anything with statistics. Because this is a life-saving tool. A life-saving tool. And we should not take it for granted.

So one of the things that I'm trying to do in the book is to give people a little bit more confidence to use that tool for themselves: to teach you certain habits of mind that I've picked up in 13 years presenting More or Less, questions to ask, things to be curious about, that will help you evaluate the claims that you see and think more clearly about them.

One friend of mine, Matt Parker, the stand-up mathematician, said, oh, so Tim, you've written this book about statistics. And I said, well, no, I haven't. I haven't written a book about statistics; I've written a book about how to think about the world. For me, statistics are just an important tool for thinking about the world, and I really believe that it's not as complicated as we often make it seem. The questions we need to ask don't require a great deal of technical expertise. Now.
I've spent some time talking about what happened in 1954, and there is a reason why I'm so obsessed with this period of history, the early 1950s. It's because that was also the time at which the tobacco companies started realising that the scientific evidence was coming in, showing that their product was dangerous, and they had to figure out their own response. And the way they responded, I think, was quite brilliant, in a dark way.

I mean, think about the problem you face. Your product is probably killing the people who consume it. Think about how anxious people get about even small things; we worry about things like gluten these days. And this is a product that is killing people in large numbers; thousands of people every week are dying. What on earth is your response to that?

And what the tobacco industry realised was that people who smoke would like to believe that the product is safe. They would like to believe that this thing that they enjoy, that they find quite social, that they would find difficult to quit, they'd like to believe that it is not killing them. So they want reasons to believe. And you don't need to prove that cigarettes are safe; all you need to do is give people reason to doubt that they're dangerous. Scepticism, doubt, is the weapon. I think this was a brilliantly simple realisation, and when you look at the evidence, it does seem that doubt has a particular kind of power.
There's lots of evidence in the book on motivated reasoning, on people reaching conclusions that they want to reach. But one of the very first studies that I discuss, by two psychologists called Kari Edwards and Edward Smith, shows that negative arguments, doubting arguments, flow with a particular fluency. Sure, people are pretty good at coming up with reasons to believe what they want to believe, but they're really good at coming up with reasons to disbelieve what they want to disbelieve, to doubt what they want to doubt. And that's why slogans about lies, damned lies and statistics, or fake news, are so powerful. You get to this point where we've all had enough of experts, because the experts are telling us things we don't want to see, things we don't want to believe.

Now, we've seen these sorts of doubt-seeding strategies, this sort of approach, used in all kinds of politics, in the debate over climate change and so on. But it started with cigarettes. It started with the cigarette companies sounding very reasonable, saying: well, look, the world's a very uncertain place. That's true. Experts disagree. That's true. More research is needed. That's true. It sounds very reasonable; it sounds very scientific. But in the end, you very quickly get from the Royal Society's motto, Nullius in verba, take nobody's word for it, to: nobody knows anything. And if nobody knows anything, you can believe whatever you want.
In the 1960s, 1965 if I remember rightly, the US Congress held a hearing to decide whether cigarette packets should have health warnings on them. It was a Senate hearing, and they called in all kinds of expert witnesses, epidemiologists and others, to discuss the risks of smoking and the pros and cons of putting these health warnings on. And one of the witnesses who was called sat down in front of the senators and started to explain that there is a correlation between storks and babies. And this correlation is pretty robust, but of course that's not because storks actually deliver babies; it's because larger places have plenty of room for babies and plenty of room for storks.

The expert witness's name was Darrell Huff, author of How to Lie with Statistics. His brand of scepticism, shading into cynicism, was absolutely perfect for the tobacco industry. It's exactly the kind of message they wanted, so they hired him. They paid him to work on a sequel called How to Lie with Smoking Statistics, and they persuaded him to testify in front of the Senate. And the senator leading the committee hearing said: do you honestly mean to tell us you think there is as casual a connection between cigarettes and lung cancer as there is between storks and babies? And Darrell Huff replied: well, the two seem to me to be about the same.
That's when I really start to worry about How to Lie with Statistics and what it represents. Because it's right to maintain a healthy scepticism; it's right to notice that statistics are often used to deceive us. But when we descend into that mode of thinking, that it's all a trick, it's all a joke, that you can't believe anything, very quickly you end up siding with a guy who sits in front of a Senate committee and says that cigarettes and lung cancer are basically just the same as storks and babies. We end up in this sort of defensive crouch where we're basically not willing to believe anything on its merits. We're not willing to weigh up the evidence; we're just scrabbling around for reasons to believe whatever we want to believe.

And I think that's a shame, if not a tragedy. And I don't think it's that hard to push back, to stand up for ourselves and to say: actually, this stuff isn't so difficult. I can tell a clear argument from an unclear argument. I can tell the difference between someone who's seeking after truth and someone who's just trying to win some political debate or score a few points. I can tell the difference between truth and lies. It's not that hard.

There is an old story about Galileo and his telescope. At the time he was looking at the moons of Jupiter and the rings of Saturn, the Catholic Church was persecuting him, and he said to the cardinals: look, I've got my telescope here. Just take a look.
I'll show you what I see of the solar system. And the cardinals wouldn't look. They thought the telescope itself was full of trickery, some kind of magician's trick. Now, that story has been exaggerated over the years, but we still tell it, and we tell it in this sort of smug way, dismissing these outdated beliefs and these religious extremists. I don't think we should be so smug, because I see people behaving like that around me every day.

Remember, I described statistics as being like a telescope: a way to see facts about the world. Facts about the human brain, the human body, the economy, the environment, about this incredibly complex world in which we live; a way to see things we can't see in any other way. That's why I described them as a telescope. And every day I see people who won't pick up the statistical telescope, who won't use it, who won't look through it, for exactly the same reason that, allegedly, the cardinals wouldn't look through Galileo's: because they were afraid of being tricked.

So don't be afraid. I urge you to have a bit more confidence in yourself. I don't think it's as hard as we sometimes make it out to be, and the ten rules of thumb that I've outlined in the book are, I hope, rules that anybody can use. So have some confidence in your ability to distinguish truth from fiction. Have some confidence in your ability to think clearly about the world.
A lot of it is simply about wanting to know, wanting to understand. And I think all of us should be willing to pick up this statistical telescope and to look around. You'll be amazed what we see. Thanks for listening.

I have received many, many questions on Twitter, because I asked for them. Some of the questions were along the lines of: why are you taking questions on Twitter? Why can't you give the talk live? Well, you know, there are different ways to give talks; we're all trying to experiment and adapt in this COVID era. I'm really delighted to have been asked to speak at the Maths Institute, and I hope you've enjoyed what I've said. But I do have these questions, and I wanted to leave quite a bit of time for them, because the questions that have come in are, I think, a really great way to get into some of the ideas in the book and some of the practical tips that I outline in it. I should wave the book around, shouldn't I? How to Make the World Add Up: Ten Rules for Thinking Differently About Numbers.

So, OK, questions. A question from Sarah Brighton: what are the best questions to ask when we see statistics used in the news, to help understand them? It's a great question, partly because the ten rules in the book (they're not really ten commandments, more ten habits of mind), plus the golden rule at the end, are my answer to it. The book is basically the answer to this question.
What questions should we ask when we see statistics used in the news? Let me give you a few examples, and let me take a little bit of time to focus on this question, because it is so central to the idea of the book. The first question I would ask, I think, might surprise some people.

I begin the first chapter of the book by telling the story of Abraham Bredius. Bredius was a great art critic in the late 19th and early 20th century, and I describe him sitting in his villa in Monaco, enjoying a well-deserved retirement and the admiration of the art world for his expertise, when, in the 1930s, a charming Dutch lawyer named Gerard Boon visited him, showed him a painting and asked his opinion. Because, Boon said, this painting has recently been discovered, it comes from a private collection, and we think it may be by Vermeer, Johannes Vermeer, the great Dutch interior painter. And Mr Bredius, you are the world's leading expert on Vermeer, so could you tell us what you think of the painting?

And Abraham Bredius was spellbound, almost literally. Shortly afterwards he wrote, in an art magazine, The Burlington Magazine, that when he saw this painting, "I had difficulty controlling my emotion". He said he felt that he was in the presence not just of a Vermeer, but of the masterpiece of Johannes Vermeer of Delft, quite different from all his other paintings
251 00:29:40,680 --> 00:29:52,110 And yet every inch a Vermeer. He also described it with a Dutch word meaning virginal: pure, uncorrupted and untouched, 252 00:29:52,110 --> 00:29:56,340 which was ironic, because basically it was a fraud. And it was not only a fraud, 253 00:29:56,340 --> 00:30:05,550 it was a really nasty, vicious fraud by a really nasty, vicious man, whom I describe in the book. 254 00:30:05,550 --> 00:30:13,620 And it wasn't even a very good painting. And the one thing you can say about Johannes Vermeer is that his paintings are good. 255 00:30:13,620 --> 00:30:18,980 He's really good at what he does. The paintings are magical. 256 00:30:18,980 --> 00:30:30,470 So how was it that Abraham Bredius, this great critic, this great expert, was fooled by this crude forgery that wouldn't fool you or me? 257 00:30:30,470 --> 00:30:35,750 The simple answer is in that line he wrote for The Burlington Magazine: 258 00:30:35,750 --> 00:30:44,510 when I saw the painting, I had difficulty controlling my emotion. By exploiting Abraham Bredius's wishful thinking, 259 00:30:44,510 --> 00:30:51,110 by painting the picture that he knew Abraham Bredius wanted to see, 260 00:30:51,110 --> 00:30:55,100 the forger managed to bypass Bredius's expertise. 261 00:30:55,100 --> 00:31:01,220 It didn't matter how much he knew about the technical details of painting. 262 00:31:01,220 --> 00:31:08,180 He wanted to be fooled, and in fact his expertise in this particular case only made things worse. 263 00:31:08,180 --> 00:31:16,370 It gave him new reasons to believe in a painting that he should never have believed in. You or I would never have fallen for this. 264 00:31:16,370 --> 00:31:23,180 But Abraham Bredius, the expert, did. Now why am I telling you a story about art forgery? 265 00:31:23,180 --> 00:31:28,880 Why am I beginning the first chapter of my book, which is supposed to be a book about numbers, right? 
266 00:31:28,880 --> 00:31:35,600 That has no numbers in it? That is purely about painting and about perception. 267 00:31:35,600 --> 00:31:39,590 It's because emotional reactions come first. 268 00:31:39,590 --> 00:31:46,040 If we've learnt anything from the last few years, the Brexit referendum, the election of Donald Trump, we've learnt that. 269 00:31:46,040 --> 00:31:54,320 We believe what we believe because of who we are and how we feel, our emotional reactions and our preconceptions. 270 00:31:54,320 --> 00:32:00,410 So there's no point in me writing a book that will tell you how to solve technical problems in statistics 271 00:32:00,410 --> 00:32:09,890 unless it can also give you the emotional tools to get past your wishful thinking and see the world clearly. 272 00:32:09,890 --> 00:32:20,390 I think the one thing that we've all learnt over the last few years is that what we believe is overwhelmingly determined by what we want to believe. 273 00:32:20,390 --> 00:32:26,780 We're all influenced by our friends, by our cultural identity, by our political preconceptions. 274 00:32:26,780 --> 00:32:38,030 And so, of course, our emotions matter. So the very first question I would ask, in answer to Sarah, is: how is this statistical claim making me feel? 275 00:32:38,030 --> 00:32:44,360 Is it making me feel angry or defensive or vindicated? 276 00:32:44,360 --> 00:32:50,270 Do I see this as ammunition for some argument that I want to make? 277 00:32:50,270 --> 00:32:56,690 Because if that's how you're feeling, you're probably not thinking clearly. Of course, we should be influenced by our emotions. 278 00:32:56,690 --> 00:33:00,860 Our emotions are important. We're social beings. It's fine to have preconceptions. 279 00:33:00,860 --> 00:33:05,990 It's fine to pay attention to what our friends think, but we have to be aware of that. 280 00:33:05,990 --> 00:33:12,770 So I'm advocating statistical mindfulness. 
I realise I now sound like yoga with a calculator. 281 00:33:12,770 --> 00:33:18,110 Statistical mindfulness. Notice how the claim makes you feel. 282 00:33:18,110 --> 00:33:28,910 And if you are aware of your emotions, if you're aware of that emotional reaction, take a few seconds to observe it. 283 00:33:28,910 --> 00:33:30,530 Maybe let it subside. 284 00:33:30,530 --> 00:33:38,550 I think you're going to think more clearly about how to then go about evaluating the claim and whether you should be sharing it or not. 285 00:33:38,550 --> 00:33:45,020 And the second question I would ask (Sarah's question, remember, was: what are the best questions to ask when we see a number in the news?) 286 00:33:45,020 --> 00:33:53,730 is: what does the number actually mean? By which I mean something quite specific, like: what is actually being counted? 287 00:33:53,730 --> 00:34:06,940 So for example, we see these daily coronavirus case numbers reported every day in the UK. To understand them, 288 00:34:06,940 --> 00:34:13,600 you need to know that these are official cases: the tests are being done and processed, 289 00:34:13,600 --> 00:34:17,950 and they may be being processed with a delay of several days. 290 00:34:17,950 --> 00:34:25,750 A test might be gathered before the weekend, and the result published after the weekend. 291 00:34:25,750 --> 00:34:33,070 That is just straightforward knowledge about what this number represents, versus, for example, 292 00:34:33,070 --> 00:34:39,490 the Office for National Statistics Infection Survey, which is not an official case count. 293 00:34:39,490 --> 00:34:48,910 This is an attempt to randomly sample the population and to estimate how many people out there have infections. 294 00:34:48,910 --> 00:34:51,070 It's a different methodology, I would say. 
295 00:34:51,070 --> 00:34:58,540 Probably a better methodology for estimating how much of the virus is out there. So just know what's being measured, 296 00:34:58,540 --> 00:35:02,540 what it means. During the financial crisis, 297 00:35:02,540 --> 00:35:08,650 I got so, so many claims that mixed up the deficit and the debt. 298 00:35:08,650 --> 00:35:12,040 People would argue about whether the claim was true, whether the number was right, 299 00:35:12,040 --> 00:35:17,730 without even understanding what it was that they were talking about. 300 00:35:17,730 --> 00:35:27,670 Or another example, it's in the book: an article in The Guardian that talked about suicide and self-harm. 301 00:35:27,670 --> 00:35:31,150 Suicide is actually different from self-harm, 302 00:35:31,150 --> 00:35:41,710 and the definition that was being used for self-harm was actually ambiguous; the article didn't discuss what was meant by self-harm. 303 00:35:41,710 --> 00:35:45,280 So I went and talked to the researchers, and the researchers in the end said: well, actually, 304 00:35:45,280 --> 00:35:53,650 we don't know what's meant by self-harm either; self-harm is as defined by the people who responded to our survey. 305 00:35:53,650 --> 00:35:57,040 So it's whatever they think. Now, there's nothing wrong with that. 306 00:35:57,040 --> 00:36:02,710 But just understanding what the definition is, what's being measured, what's being described, 307 00:36:02,710 --> 00:36:09,370 before you get into the maths, before you get into evaluating the number, that's really important. 308 00:36:09,370 --> 00:36:15,850 And the third question that I would ask, a very simple question, is: how does this fit into the bigger context? 309 00:36:15,850 --> 00:36:23,590 So for example, if you see that there have been 6000 daily cases of coronavirus, ask questions like, 310 00:36:23,590 --> 00:36:28,240 well, how does that compare to a week ago? 
311 00:36:28,240 --> 00:36:31,780 Is there a rolling average? How does that compare to other countries? 312 00:36:31,780 --> 00:36:36,070 What about France? What about Spain? How does it compare to six months ago? 313 00:36:36,070 --> 00:36:43,330 Can we make the comparison with six months ago? Are there reasons to believe that the comparison doesn't work? 314 00:36:43,330 --> 00:36:47,440 Now, of course, you know, as an individual, 315 00:36:47,440 --> 00:36:54,190 some of this stuff is a lot of hard work. You don't want to have to go through all this work every time you see a number, 316 00:36:54,190 --> 00:36:59,020 but you should certainly see some evidence that somebody else is doing that work for you. 317 00:36:59,020 --> 00:37:05,590 So if you're reading a social media post, listening to the radio, watching the TV, reading a newspaper article, 318 00:37:05,590 --> 00:37:14,350 and you're hearing that sort of explanation, that sort of context: what was counted, what was measured, is the number going up or going down, 319 00:37:14,350 --> 00:37:16,900 what is it per person or per million people? 320 00:37:16,900 --> 00:37:24,520 If you're seeing that, I would be reassured that you're hearing from someone who's trying to help you understand the world. 321 00:37:24,520 --> 00:37:30,550 And if you're not getting that context, you're hearing from someone who's trying to get you excited or riled 322 00:37:30,550 --> 00:37:35,870 up, or to win your vote, or to defend themselves against a political attack. 323 00:37:35,870 --> 00:37:40,700 And that's not the same thing, and it's not helpful. Right. 324 00:37:40,700 --> 00:37:45,310 Another question. Thanks, Sarah, for that question. Dave Bradshaw. 325 00:37:45,310 --> 00:37:54,910 This leads on from the previous question. Dave Bradshaw asks: how much of a worry is it, and he puts this in scare quotes, 326 00:37:54,910 --> 00:37:55,900 how much of a worry is it 
327 00:37:55,900 --> 00:38:06,310 when leading epidemiologists put out misleading information about the infection fatality rate of the virus and never bother to admit the mistake? 328 00:38:06,310 --> 00:38:09,610 And he links to a particular video. 329 00:38:09,610 --> 00:38:17,020 But I'm not so interested in that particular video, because there's lots of this about. 330 00:38:17,020 --> 00:38:22,750 There are a lot of people around, some of whom are eminently qualified and some of whom 331 00:38:22,750 --> 00:38:25,720 are not, amateurs or politicians, 332 00:38:25,720 --> 00:38:35,290 but a lot of people who have become very concerned with winning an argument, with making a particular case for a particular view of the world. 333 00:38:35,290 --> 00:38:43,060 And whenever I see that, I start to worry, because it becomes harder to really evaluate what's going on, 334 00:38:43,060 --> 00:38:49,180 it becomes harder to see clearly, and it becomes very tempting to score cheap points. 335 00:38:49,180 --> 00:38:51,740 It becomes very tempting to play on people's emotions. 336 00:38:51,740 --> 00:38:59,260 And as we already saw with my answer to Sarah, you know, getting people emotional does not help clear thinking. 337 00:38:59,260 --> 00:39:05,890 And you start getting tempted to cherry-pick data, to strip things of context and to be unfair. 338 00:39:05,890 --> 00:39:16,570 Some of this sort of debating, this point-scoring, is done by experts with some expertise, but it still makes me uncomfortable. 339 00:39:16,570 --> 00:39:19,450 And some of it is just complete nonsense. 340 00:39:19,450 --> 00:39:32,770 So for example, on the side of people who say we need to be very, very worried indeed about the virus, I see lots of stuff about long COVID. 
341 00:39:32,770 --> 00:39:44,060 In other words, people who have lasting and maybe serious symptoms, who nevertheless didn't die, but very little about how common this is. 342 00:39:44,060 --> 00:39:46,120 And we need better data, of course. 343 00:39:46,120 --> 00:39:54,340 But if you're just tweeting examples, look, this person and this person suffered this long-running debilitating disease, 344 00:39:54,340 --> 00:39:58,090 we haven't really learnt anything, because millions of people have had the virus. 345 00:39:58,090 --> 00:40:00,700 Tens of thousands of people have died of the virus. 346 00:40:00,700 --> 00:40:07,600 The fact that you can give me a single example of someone who's suffering long-term health effects doesn't help me understand. 347 00:40:07,600 --> 00:40:11,740 Now, I want to understand long COVID and I want to know how prevalent it is. 348 00:40:11,740 --> 00:40:16,030 I think it could be a very serious problem, and it's certainly something we should investigate. 349 00:40:16,030 --> 00:40:24,070 But I want to explore that with the mindset of someone who's seeking after truth rather than someone who's trying to make a particular point. 350 00:40:24,070 --> 00:40:30,850 And you see it on the other side of the debate, the lockdown sceptics, as I think they often call themselves. Again, 351 00:40:30,850 --> 00:40:38,080 they skip from one point to another: the infection fatality rate might be super low, because 352 00:40:38,080 --> 00:40:46,930 maybe everyone's kind of already had it, or we're just on the brink of herd immunity, or died with COVID versus died of COVID. 353 00:40:46,930 --> 00:40:55,390 And now all this nonsense about false positives. And in each case, somewhere underneath those arguments, 354 00:40:55,390 --> 00:40:59,710 there's a really important point. False positives are a thing. 355 00:40:59,710 --> 00:41:03,070 Asymptomatic, undetected cases are a thing. 
356 00:41:03,070 --> 00:41:11,500 Herd immunity might be more prevalent than we think. But when it's used, again, to make a particular argument, we're getting stupider. 357 00:41:11,500 --> 00:41:19,540 We're not getting smarter. So it disturbs me when I see anybody arguing for a particular case rather than just trying to explore the truth. 358 00:41:19,540 --> 00:41:29,230 And it's not hard to see who's doing what on this sort of thing. 359 00:41:29,230 --> 00:41:40,390 A question from Sally Stephens. Sally Stephens asks: should it be mandatory for MPs to have some statistics training? 360 00:41:40,390 --> 00:41:47,980 Probably not, because, you know, in the end it's a democracy, and we voters need to take responsibility for who we elect. 361 00:41:47,980 --> 00:41:55,540 Right. But I think it would certainly be helpful for MPs to have some statistics training. Two separate points. 362 00:41:55,540 --> 00:42:02,380 One is that we just need a more diverse group of MPs. And by diversity, I refer to all kinds of things. 363 00:42:02,380 --> 00:42:04,990 You know, the things that you're probably thinking of. 364 00:42:04,990 --> 00:42:11,020 So, people from different ethnic backgrounds and from different classes, different parts of the country, 365 00:42:11,020 --> 00:42:20,380 different educational backgrounds, men and women, different sexual orientations, and people with disabilities. 366 00:42:20,380 --> 00:42:27,130 We need all of that. We also need people with different educational backgrounds. 367 00:42:27,130 --> 00:42:34,600 There are a lot of lawyers in parliament and quite a lot of people who studied history and classics, and even a few who, like me, studied economics. 368 00:42:34,600 --> 00:42:37,180 There's nothing wrong with any of that. 369 00:42:37,180 --> 00:42:47,560 But it would be good also to see MPs with mathematical training, statistical backgrounds, chemistry, physics, biology, medicine. 
370 00:42:47,560 --> 00:42:53,530 We need all of these things. The world is a complicated place. The decisions that MPs have to make on our behalf 371 00:42:53,530 --> 00:43:02,360 are complex, and you're only going to get the best decision-making when you have that real mix of different expertise in the room. 372 00:43:02,360 --> 00:43:14,740 So the second point to make is about the research done by Phil Tetlock, Barbara Mellers and Don Moore, which became famous as the superforecasting research. 373 00:43:14,740 --> 00:43:22,150 There's a great book called Superforecasting by Tetlock, with Dan Gardner as his co-author. 374 00:43:22,150 --> 00:43:29,950 One of the things that they find is that even quite brief statistical training, like an hour of statistical training, 375 00:43:29,950 --> 00:43:37,840 really helps forecasters to make better predictions about the world, more robust predictions, and really understand what's going on. 376 00:43:37,840 --> 00:43:42,310 So there probably is a case for some statistical training. 377 00:43:42,310 --> 00:43:46,540 I don't think it's that hard. I mean, another thing we could do is maybe 378 00:43:46,540 --> 00:43:52,210 everyone could mail a copy of How to Make the World Add Up to their local MP. 379 00:43:52,210 --> 00:44:01,240 Great idea, now you suggest it. But this stuff doesn't always require a PhD in statistics or advanced mathematics. 380 00:44:01,240 --> 00:44:09,850 A lot of it is basic critical thinking skills, plus the motivation to understand what's happening rather than trying to win an argument. 381 00:44:09,850 --> 00:44:17,320 And of course, MPs are highly motivated to try to win arguments, which is a problem, and not a problem I can entirely blame them for. 382 00:44:17,320 --> 00:44:25,480 That's just the way things are, right? Oh, a question from Will Moy, who runs Full Fact. 383 00:44:25,480 --> 00:44:30,070 What stat do you wish you knew, and why? 
384 00:44:30,070 --> 00:44:35,170 I don't know if Will is aware of this, but he's in the book, 385 00:44:35,170 --> 00:44:51,040 quoted as saying that we know more about golf than we do about victims of serious crimes such as rape, sexual assault and murder. 386 00:44:51,040 --> 00:44:54,910 Why do we know more about golf than we know about the victims of these crimes? 387 00:44:54,910 --> 00:45:04,000 It's because the survey that measures physical activity, sporting activity, is much bigger than the crime survey, 388 00:45:04,000 --> 00:45:12,910 so it's higher resolution: it enables you to make statements about local areas, it enables you to make statements about rare activities. 389 00:45:12,910 --> 00:45:19,120 But this isn't because some civil servant or some politician at some stage said, you know what, 390 00:45:19,120 --> 00:45:24,520 we've got the statistics budget, and we really think sports are more important than crime. 391 00:45:24,520 --> 00:45:32,560 It's just because the victims-of-crime survey has been running for a long time at a particular size. 392 00:45:32,560 --> 00:45:36,820 And then when the London Olympics came along, there was this big push for sporting participation, 393 00:45:36,820 --> 00:45:43,990 a big budget to encourage sporting participation, and therefore a big survey of sporting participation. 394 00:45:43,990 --> 00:45:49,720 Nothing wrong with that, but it just goes to show that sometimes 395 00:45:49,720 --> 00:45:56,860 we collect these statistics almost by accident. We're not very considered about what we measure and what we don't. 396 00:45:56,860 --> 00:46:05,020 And I should give a shout-out for Anna Powell-Smith's blog, Missing Numbers. I think it's missingnumbers.org, but you can find it. 397 00:46:05,020 --> 00:46:11,410 It basically just tries to track things that maybe the government should be measuring and isn't. 
398 00:46:11,410 --> 00:46:20,110 And in particular, things the government used to measure and then stopped measuring, often without even notifying anybody that it had stopped. 399 00:46:20,110 --> 00:46:27,850 The data just disappeared. The power not to collect numbers is very important. 400 00:46:27,850 --> 00:46:35,620 All that said, OK, so to answer Will's question: what stat do you wish you knew, and why? 401 00:46:35,620 --> 00:46:43,780 Well, when I was finishing off the book in late April, the answer was very clear: it was the infection fatality rate. 402 00:46:43,780 --> 00:46:53,500 I wanted to know the infection fatality rate of coronavirus because it was still, I think, highly uncertain and hugely consequential. 403 00:46:53,500 --> 00:46:55,450 Some people were emailing me and saying, 404 00:46:55,450 --> 00:47:05,320 you said in your latest column that the infection fatality rate might be one percent, but it's clearly one in two thousand. 405 00:47:05,320 --> 00:47:14,450 I mean, if it's one in two thousand, then everybody in the UK must have had coronavirus twice in March, 406 00:47:14,450 --> 00:47:20,480 April and May; there is no other way to make the deaths add up. 407 00:47:20,480 --> 00:47:25,100 It is clearly not one in two thousand. So that was the number I really wanted. 408 00:47:25,100 --> 00:47:33,140 I think we've done a lot of work figuring out what it is. I think we now know it was about one percent, and it has been coming down. 409 00:47:33,140 --> 00:47:38,360 It's been coming down because older people, who are much more vulnerable, and people who 410 00:47:38,360 --> 00:47:43,250 are vulnerable for other reasons are doing a better job of hiding from the virus, 411 00:47:43,250 --> 00:47:50,000 and we're doing a better job of shielding them from the virus. And also because treatment is getting better. 
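[Editor's note: the back-of-the-envelope check Harford describes above (dividing deaths by an assumed infection fatality rate and comparing the implied number of infections to the population) can be sketched in a few lines. This is a minimal illustration; the population and death figures below are rounded assumptions for the example, not official statistics.]

```python
# Sanity check: deaths / IFR = number of infections that IFR implies.
# If the implied infections exceed the entire population, the assumed IFR
# is too low. All figures here are rounded, illustrative assumptions.

UK_POPULATION = 67_000_000       # roughly the UK population
DEATHS = 50_000                  # rough order of magnitude of UK COVID deaths, spring 2020

def implied_infections(deaths: float, infection_fatality_rate: float) -> float:
    """Number of infections implied by a death count and an assumed IFR."""
    return deaths / infection_fatality_rate

# At an IFR of 1 in 2,000 (0.05%), the deaths imply:
low_ifr = implied_infections(DEATHS, 1 / 2000)
print(f"IFR 1/2000 implies {low_ifr:,.0f} infections "
      f"({low_ifr / UK_POPULATION:.1f}x the UK population) -- impossible")

# At an IFR of about 1%, the same deaths imply a plausible number:
print(f"IFR 1% implies {implied_infections(DEATHS, 0.01):,.0f} infections")
```

With these illustrative figures, the 1-in-2,000 assumption implies around 100 million infections, well above the entire UK population, which is Harford's point: there is no way to make the deaths add up.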
412 00:47:50,000 --> 00:47:56,600 Possibly also because we're doing a better job of shielding people from high exposure to the virus. So 413 00:47:56,600 --> 00:48:01,970 these various things are helping to bring the infection fatality rate down from one percent to half a percent, 414 00:48:01,970 --> 00:48:12,660 which is obviously important. It's still a really nasty virus, and incredibly dangerous for people over the age of 70. 415 00:48:12,660 --> 00:48:17,870 So that's what I would have said when I was writing the book. 416 00:48:17,870 --> 00:48:22,910 If you asked me what number now? Well, there's a lot in the book about algorithms. 417 00:48:22,910 --> 00:48:30,080 There's a long chapter on algorithms, and one of the things I really wish we knew more about is the effectiveness of 418 00:48:30,080 --> 00:48:37,850 the algorithms that our politicians are often deploying to make decisions about us. 419 00:48:37,850 --> 00:48:43,040 So we have algorithms that are deciding who gets bail and who doesn't get bail. 420 00:48:43,040 --> 00:48:50,690 And we saw this summer algorithms replacing A-level and GCSE exams. 421 00:48:50,690 --> 00:48:57,380 And I think the algo-shambles of the summer really indicated part of the problem: 422 00:48:57,380 --> 00:49:05,780 somebody had managed to persuade politicians that this very painful decision they'd made, 423 00:49:05,780 --> 00:49:10,310 which was to cancel everyone's exams, didn't have to be painful at all, because there 424 00:49:10,310 --> 00:49:15,410 was going to be this magic algorithm that would just give everyone the right grade. 425 00:49:15,410 --> 00:49:17,750 Which, when you think about it, is clearly impossible. 426 00:49:17,750 --> 00:49:24,820 How could an algorithm give you the right grade for an exam that you haven't sat and are never going to sit? It is impossible. 
427 00:49:24,820 --> 00:49:31,250 And I think if politicians had recognised that back in March, they could very quickly have said: 428 00:49:31,250 --> 00:49:38,540 OK, what are we going to do about the fact that there is no way to give people their grades fairly? 429 00:49:38,540 --> 00:49:43,490 What are we going to do to survive in a world where we don't have that information any more? Could the 430 00:49:43,490 --> 00:49:48,920 exams be postponed? Could we start expanding higher education? 431 00:49:48,920 --> 00:49:53,450 What do we do? But instead, people just thought, well, the algorithm will be fine. 432 00:49:53,450 --> 00:49:59,660 So I want politicians to have a better sense of what these algorithms can and can't do. 433 00:49:59,660 --> 00:50:07,550 But that then leads into this statistic that is so often missing, the kind that Will Moy asks about, 434 00:50:07,550 --> 00:50:13,340 which is that very often we don't have any proof of the effectiveness of an algorithm. 435 00:50:13,340 --> 00:50:19,580 We don't see algorithms evaluated with, for example, a rigorous randomised trial, 436 00:50:19,580 --> 00:50:24,020 which you would often want to do if you've got a medical decision-making algorithm or a 437 00:50:24,020 --> 00:50:30,260 policing algorithm: to compare what the algorithm is doing versus what human judgement is doing, 438 00:50:30,260 --> 00:50:36,980 and to say, yes, the algorithm is or is not making the decision more quickly, more accurately, 439 00:50:36,980 --> 00:50:46,010 and so on. Instead, we get these grand claims for algorithms that are commercially secret and can't be evaluated by independent experts. 440 00:50:46,010 --> 00:50:52,910 We want to see what's going on inside the algorithm. We want independent scrutiny. And as much as anything else, 441 00:50:52,910 --> 00:50:59,810 we want proof that it actually works. We don't take vaccines or drugs without proof that they work. 
442 00:50:59,810 --> 00:51:07,130 Why on earth should we let an algorithm make decisions without proof that it works? 443 00:51:07,130 --> 00:51:11,030 I'm looking at my clock. I've talked for too long. I've talked for far too long. 444 00:51:11,030 --> 00:51:16,040 I'm going to shut up now. Thank you to everyone who sent in questions on Twitter. 445 00:51:16,040 --> 00:51:21,290 I'm sorry I didn't get to all of them. There were loads of good ones. The book answers some of them. 446 00:51:21,290 --> 00:51:26,300 You can always drop me an email if there are others that you want answered, and I will do my best. 447 00:51:26,300 --> 00:51:34,040 Thank you so much to the Oxford Maths Department, the Mathematical Institute, for inviting me to speak. 448 00:51:34,040 --> 00:51:39,080 Thanks again to Blackwell's books for supporting How to Make the World Add Up. 449 00:51:39,080 --> 00:51:44,480 And a final word to all of you: be curious about the world. 450 00:51:44,480 --> 00:51:54,320 Try to think clearly. Try to ask the questions that you want answered whenever you see one of these statistics. 451 00:51:54,320 --> 00:51:59,810 Don't use statistics as weapons; don't stand for other people using them as weapons. 452 00:51:59,810 --> 00:52:05,930 Use them instead as a kind of telescope to see the world more clearly. You can do it. 453 00:52:05,930 --> 00:52:13,370 I believe in you. I think we can all do it, and we're going to make it easier for each other if we all band together on this point. 454 00:52:13,370 --> 00:52:35,219 So thanks very much.