I guess it falls to me, in a slightly different kind of setting here tonight, to introduce our speaker, who is a fellow and senior education evaluation specialist at RTI. He has extensive experience across some thirty educational projects of different kinds in various international contexts. Most of the projects he has conducted focus on early intervention and the promotion of early learning. He is contributing to projects in Malawi and Tanzania aimed at improving the quality of pre-primary and primary education in those countries. More specifically, his work includes culturally relevant approaches to the assessment of social and emotional learning in Tanzania, improving teaching through an understanding of the cultural basis of teacher-child interactions, methods for setting reading proficiency benchmarks, and frameworks to improve evidence-based decision making, which he will be talking about today.

Before joining RTI in 2016, he was senior director of global research, monitoring and evaluation at Room to Read for about four years, and before Room to Read he was an associate professor at the Harvard Graduate School of Education, where he taught courses on evidence-based decision making and on the role of culture and health in effective education. That is where I met him for the first time, in 2006. Throughout his professional career he has worked closely with various international organisations, including the World Bank, UNESCO, USAID and Save the Children. He holds a BA in Physics from the University of Oxford, an MSc in Experimental Psychology from Sussex, and a DPhil from Oxford. So I am really pleased to present our speaker today, and I look forward to listening to his talk on a rational approach to evidence-based decision making in education policy. Thank you very much.

Thanks very much. That introduction reminded me that I have spent nine years in Oxford, in three different departments, including epidemiology, but not yet this one. And it wasn't until I got lost today that I realised where this lecture room actually was; it sometimes helps to get lost. Now, I played around with the title, because I'm aware that calling this a rational approach to evidence-based decision making somewhat implies that everyone else's approach is irrational, which is not really the point I'm trying to make.
But I'll explain the rationale behind it as we go; this draws on work I've been doing with a political scientist. I'll spend a little bit of time setting up what I think the problem is, and then in the next two sections I'll introduce the ideas behind the approach, which are all about how to think about uncertainty. The whole problem is that we cannot be certain about the consequences of a policy action on the basis of a given set of evidence. So first we'll talk about how to think about uncertain evidence, then about when it is appropriate to act on it, and I'll finish with some implications.

I just want to set out what is and is not included in the scope. This applies to all kinds of evidence-based decision making, but today I'm mostly talking about the situation where you look at an evidence synthesis or review and then decide how to act on what it tells you. Also, this is a normative framework. I know there's a lot of good work on how to convince people to listen to evidence, and we were chatting before we started about how often people ignore research evidence; I know that's common. But for now I'm residing in a fantasy world where people do listen to the evidence. So this is aimed mainly at people who are already sold on the idea of evidence, but who recognise that acting on it is complex, and it tries to provide some guidance on how to do so.

I'm talking about education, but I think this is relevant to social policy in general, particularly because wherever human behaviour is involved the evidence becomes complicated and complex; and compared with medicine, for example, social policy does a relatively poor job of assessing potential harm. Most of all, this applies to international development. Uncertainty is rife in international development, probably because we keep setting ourselves really ambitious goals, such as the Sustainable Development Goals: we want to eradicate extreme poverty everywhere by 2030; we want to ensure quality education and lifelong learning for every child. We have less information to go on, so inevitably there is more uncertainty. And in the current political environment there is a lot of pressure to be effective, or to cut aid spending. If you are public about your uncertainty, if you say, well, we don't really know what we're doing, but we'd like to carry on spending 0.7 per cent of national income on international aid, then there will be a lot of pressure groups telling you to stop.
So there's a demand for what Charles Manski calls incredible certitude. There's a quotation I like, apparently from Mr Johnson, that gives you the idea of this pressure to appear certain.

OK, so that's the scope. Now I want to talk a little bit about what I see as the problem. I think there's been increasing recognition in the last couple of decades of what constitutes rigorous evidence, and a big move towards basing policy on rigorous evaluation, with a massive expansion in the use of randomised trials in international development, and in Britain too. So there's a high bar that evidence needs to clear in order to qualify. What that means in practice is that we review a lot of studies for systematic reviews; some of the studies make it into the review and are carefully considered, and the rest are, in effect, ignored and thrown in the bin. One of the questions I keep asking is how much useful information is being lost from these kinds of reviews. In some cases people conduct studies aiming for this high bar of rigorous evidence, and it creates a binary way of thinking: the evidence either clears the bar or it doesn't.

In many cases it's difficult to reach that bar, because education is complex, particularly in international work. There are lots of studies out there, including experimental studies, that look at the impact of a set of interventions on early literacy, for example, and they will tell you whether it works or not. But we know that teaching literacy actually involves hundreds of different actions across dozens of different activities each day; teachers are all different from each other; children of different ages behave differently. Working out what to do that works is quite a complex question that is not amenable to a simple trial. Similarly, all of this operates within a large education system with lots of interacting components, which is also tough to address with trials. I've seen estimates that only about five per cent of interventions in international development are actually amenable to randomised trials.

So here is what actually qualified as strong evidence in education in low-income countries in a major review from about eighteen months ago.
Imagine that you're a Ministry of Education official looking at this, trying to decide what reforms to adopt in your country. The green ones are the things that are slam dunks: we know these definitely work, and we've got two to choose from. Cash transfers, which are given to households as an encouragement for families to send their kids to school, and structured pedagogy.

These two illustrate some of the points I was just making. Cash transfers are a relatively simple, discrete intervention: you give some money to a set of families and you watch what happens. It's easy to amass a lot of evidence for cash transfers because the intervention is simple and lots of people are evaluating it. It's much harder to produce evidence for a more complex kind of intervention. Structured pedagogy: well, what this finding really tells us is that it's a good idea to structure lessons. There's evidence that if you provide lesson plans for teachers in areas of low capacity, that is effective. But the finding is completely agnostic about what the pedagogy is. It could be any of lots of different approaches to teaching, with lots of debates between them, and it clearly says nothing about which approach is effective. So there are lots of uncertainties even about these two. Beyond these two findings there were a few others, such as merit-based scholarships, that looked promising.

The point is that if this were the only thing you based your policy on, there's clearly not a lot to choose from. So you have to look beyond the slam-dunk evidence and think about what else you could possibly reform. But then how do you think about it? There are a couple of positions I'd contrast with the thinking I want to introduce, and these really are positions you hear people take in the field. One is: we could just think harder and do additional studies, and not actually intervene until we can be sure that what we want to do is going to work. The other, which I hear expressed in different ways, is: we can't be tied to the evidence; we have to just go ahead and use our judgement, because the evidence is too restrictive.

Here is Descartes, arguably the father of scientific scepticism.
His is an incredibly powerful idea, and not one I'm against at all: we only make progress by accepting findings when we can show beyond doubt that they are true. But that is a very different kind of thinking from the position of a minister of education, who cannot suspend judgement and has to use whatever evidence is actually available to make some kind of decision.

OK, I just want to get you to discuss one thing before we continue. This is an almost-real example of a conversation I had with a few of the people advising a foundation about a decision it had to make about investing in girls' education in lower-income countries. The plan was to implement programmes, and let's say the shortlist came down to these three things. Merit-based scholarships: there's reasonable evidence that merit-based scholarships work to improve achievement, and it's pretty clear how you would deliver them. They're also considering programmes to develop social and emotional learning: there's good evidence that those can be effective, though not much of it from low-income countries, so it's a reasonable bet. And they're also considering systemic reform of education systems in developing countries to deliver on this agenda, for which there is no really strong evidence of effectiveness, and it's not even abundantly clear how you would do it.

What I'd like you to think about is what you would do in this situation. What would you recommend? It would be great if you could all think about that, and some of you could even share it.

[Audience discussion.]

Well, let's imagine that that's up for grabs, and that as the policy adviser you can push them to be more specific. I wonder, just based on what I've given you, whether you have any thoughts about which would be a bad idea and which would be a good idea, and hopefully some arguments either way.

[Audience question about whether the investment has to be in education as such.]

Right, so this wouldn't just be a sort of welfare intervention; it would have to be funded as an education investment.
[Audience discussion: suggestions include considering need-based rather than merit-based scholarships, combining the options, the value of building critical skills, and the strategic opportunity cost of passing up the scholarships.]

So the idea is that you don't just do the things that are potentially most impactful; you want to do at least something you know will work, and you have to weigh the opportunity cost of giving up the scholarships. These are some of the considerations we had around the table too. The main objection in the debate is that there's no evidence for the system reform, and there are further arguments you could make as a result: it has a pretty weak track record. On the other hand, you could argue that just implementing a relatively limited programme for so much money would be a sticking plaster with no sustainability, whereas system reform really has the potential to be sustainable. But there's no evidence for it.

So how do you think about that as an evidence-based organisation? One of the people around the table actually said: sometimes I think we have to look beyond the evidence and think about the other implications. And that confuses me a little bit. It's like a slide I showed a couple of weeks ago that in effect said: let's forget about the evidence, because it's not helpful here; let's go with some other considerations.

This is the problem that got me thinking. How do you then think rationally about this? How do you find a way of taking the evidence into account when it doesn't settle the question? Part of the answer, in some situations, is moving away from binary thinking about evidence towards a sort of sliding scale of uncertainty. That is what I mean by a rational approach.
[Brief exchange with the audience about the slides.]

So what is the rationale for this approach? Part of it is that we don't ignore the evidence: in my definition of rational, we make the best use of the available evidence to get the best expected outcome, so it is an improvement on simply setting the evidence aside. There are two elements I want to introduce for thinking about this: the uncertainty about the consequences of a policy, and the scale of those consequences.

Before we go on, let me say what I mean by uncertainty. Those of you who spend a lot of time doing quantitative work will probably think first of sampling uncertainty: you estimate the impact of a programme and there are confidence intervals around it. That's included in the definition, but it's not the only kind. There's uncertainty about methods: if it's not a well-controlled study, you're not sure whether you've really found the effect of the programme, because assumptions are built in. There's uncertainty across populations: you ran your study in one place, and you're uncertain whether the result holds somewhere else, or at another time. And there's even structural uncertainty, which is about whether the way you've framed the whole problem reflects the real world. So when I use the term uncertainty, I'm talking about any reason why you might be unsure that a policy you're about to implement will have the outcome you expect.

Then think about a continuum of consequences. At one end, we might kill people with a drug that hasn't been tested; and I say that because a lot of the thinking around evidence comes from a medical model. At that end of the spectrum we want controlled evidence that a treatment at least doesn't kill people and probably works. Somewhere in the middle, we might not be sure a policy is effective, but the worst that can happen is that we waste money. And at the other extreme, think of a government that has to decide between this textbook and that textbook: they don't know much about either, the costs are roughly the same, and it boils down to something close to an arbitrary decision. If that's where we are on the scale, we should feel more comfortable weighing up less certain evidence.
The government has to choose something either way; the main thing is to make sure that once we act, we have some way of finding out whether it was worth it.

OK, I'm going to present a few graphs just to get us thinking in this way. Imagine a couple of drugs that we want to test, and what we're not sure about is the cure rate. For the first drug, some studies say the cure rate may be as high as 70 per cent; other studies say it's completely ineffective; we need more evidence to place it somewhere in that range. For the other drug, there are some pretty good studies that it's effective, and we're fairly sure about the cure rate, though there's some uncertainty about that too. Now, if you're looking at this as a healthy person choosing between the two, you might say: I'll take the well-evidenced one, because at least I know where I stand. But you might make a different decision if you were seriously ill and the safe option wasn't enough.

Let me translate these pictures into odds ratios; I don't know how many people are used to them, but epidemiologists typically present odds ratios. An odds ratio of one means no effect: the treatment has no impact on the likelihood of cure. An odds ratio of ten means you could be as much as ten times more likely to be cured if you take the drug. And you can draw the uncertainty as a distribution over the odds ratio, with the best estimate in the middle.

So take a moment with these graphs. The first drug has a lot of uncertainty about how likely it is to cure you, but the whole distribution sits above one, so you're pretty sure the patient is more likely to be cured than not. The second has a tighter estimate, but the fact that part of the distribution comes under one here means there's a chance the patient is actually less likely to be cured. So you can see which you'd probably choose.

Now let me translate that into what I hope are some useful concepts from decision theory, which has thought a lot about what approach you take. The realist would ask: what is the expected outcome, the most likely result? The optimist takes a maximax approach and asks: which option offers the best possible outcome? A realist would never buy a lottery ticket.
But under the maximax approach you might: if you need to raise a million pounds by tomorrow, the lottery is the only option whose best possible outcome gets you there, even though it is no rational approach to ordinary investing. And the pessimist takes a maximin approach: look at the worst possible case of each policy and choose the one whose worst case is least bad. So there are different approaches, and you can see them on this graph.

Here are two different interventions. If you're an optimist, you look for the best possible outcome and choose intervention one, because given all the uncertainty it could be extremely good. If you're a pessimist taking the maximin approach, you say: if I go with the other one, the worst I could possibly get is here, which is better than the worst case of intervention one.

Now contrast our two options from the girls' education example in these terms. This is how you might think about system reform: it has a lot of potential to be beneficial, with a lot of uncertainty, but we think it could be transformative. The scholarships we think have a modest effect that is pretty tightly estimated. That's essentially our choice, and you can see how, in the maximax direction of thinking, you'd decide to go for the best possible outcome and choose the reform. You'd think differently about it, though, if the uncertain option also had serious potential downsides: then you might say it's not worth it, in the same way you'd feel differently about buying a lottery ticket for a possible jackpot if the ticket cost $150.

And finally there's one more concept I want to bring in, which is expected utility. It is really just a sum, over lots of different possible outcomes, of the probability of each outcome multiplied by how much you value it.
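To make these decision rules concrete, here is a minimal sketch in Python, not from the talk, of how the realist (expected value), the optimist (maximax) and the pessimist (maximin) would each choose between two options like the scholarships and the system reform. All of the numbers are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical beliefs about the effect (arbitrary units) of the two
# options from the girls'-education example. All numbers are invented:
# scholarships = modest effect, tightly estimated;
# reform       = possibly transformative, possibly useless or harmful.
options = {
    "scholarships": rng.normal(0.20, 0.05, 100_000),
    "system reform": rng.normal(0.40, 0.50, 100_000),
}

for name, draws in options.items():
    lo, hi = np.percentile(draws, [2.5, 97.5])
    print(f"{name:13s} expected {draws.mean():+.2f}, "
          f"worst case {lo:+.2f}, best case {hi:+.2f}")

# The three decision rules from the talk, using the 2.5th and 97.5th
# percentiles as stand-ins for "worst" and "best" possible outcomes.
realist   = max(options, key=lambda k: options[k].mean())
optimist  = max(options, key=lambda k: np.percentile(options[k], 97.5))  # maximax
pessimist = max(options, key=lambda k: np.percentile(options[k], 2.5))   # maximin
print(f"expected value -> {realist}; maximax -> {optimist}; maximin -> {pessimist}")
```

With these made-up beliefs, the expected-value and maximax rules back the reform while the maximin rule backs the scholarships, which is exactly the tension in the example.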
If it helps, let me go through a numerical example of this from a paper in evidence-based medicine, on women who have had one pregnancy affected by a neural tube defect and the decision whether or not to take folic acid before the next pregnancy. If you don't take folic acid, there is roughly a three and a half per cent chance of recurrence. It feels a little distasteful to put money on this, but purely to set a scale, let's say we would be willing to spend $100,000 to avoid that outcome. Then the expected disutility of not taking folic acid is 3.5 per cent of $100,000, which is $3,500. If you do take folic acid, the chance is reduced to about 1.0 per cent, which is an expected disutility of $1,000. So the calculation says you should take folic acid: the expected cost of not taking it is roughly three times higher. It's a rational decision; you probably don't need the calculations to realise that, but it's worth going through the motions.

You could apply the same logic to a policy, with the gross return treated as a matter of funding. Suppose the uncertainty means the policy would deliver a $200 return in the worst case and considerably more in the best case, and the intervention costs $220. Then you say: well, there's a chance this is not going to be worth it, the returns could come in below the cost, but there's a much bigger chance they come in above it. So maybe it's worth going ahead. In many cases, though, before we make decisions we don't know as much about the costs as we'd like. Researchers are very interested in impacts and much less interested in costs, so typically there's a lot of uncertainty on the cost side of the policy too, and that also changes the decision.

OK, so I'm done with the abstract framework. Let me draw out a couple of implications and then give you some examples. One is that we need to think deliberately about the worst possible outcomes. There are situations in which it can be rational to pursue a policy with uncertain outcomes: if you can rule out serious negative effects, so that nothing disastrous is going to happen; if the theory behind it is plausible and logical; and, even better, if it's low cost. A related implication is that we need to do a better job in social policy and education policy of evaluating both the positive and the negative outcomes. People typically evaluate the intended outcome and not so much the unintended ones.

This also tells you where to gather evidence. Where there's a lot of uncertainty about an important element of the decision, that's where you should focus your research effort, particularly when information about that element would do the most to change what you decide.
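One way to formalise "gather evidence where it could change the decision" is the expected value of perfect information. Here is a toy sketch, not from the talk, using invented beliefs about two policies: one we know a lot about and one that is promising but uncertain.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented posterior beliefs about the benefit per $1 spent on two policies.
benefit_a = rng.normal(1.10, 0.05, 100_000)  # well studied, tight
benefit_b = rng.normal(1.20, 0.60, 100_000)  # promising but very uncertain

# Decide now: back the option with the higher expected benefit.
value_now = max(benefit_a.mean(), benefit_b.mean())

# With perfect information we would back the better option in every state
# of the world, so the attainable value is the mean of the pointwise max.
value_perfect = np.maximum(benefit_a, benefit_b).mean()

evpi = value_perfect - value_now  # expected value of perfect information
print(f"decide now: {value_now:.3f}; with perfect info: {value_perfect:.3f}")
print(f"EVPI = {evpi:.3f} per $1 spent, an upper bound on what research is worth")
```

The gap is largest when an uncertain option might plausibly overtake the current favourite, which is exactly where new evidence pays off.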
Before the next section, let me give you an example I used to work on a lot: deworming, treating children at school for intestinal worm infections. Ten years or so ago, this was the state of the evidence. On the one hand, we had a Cochrane systematic review saying there's no good evidence that mass deworming improves education outcomes. At the same time, economists who had run randomised trials were producing policy briefs saying school-based deworming is one of the best buys in education. I've spent a lot of time thinking about and trying to explain why, and I do believe, both of these positions can be held at the same time, and the framework can help us figure it out.

To begin with, the impact of deworming on education is uncertain; it's not a slam dunk. But there is some reasonable evidence to think it might be effective, and a nice theory behind it: treat the infection, and children's health, attendance and concentration improve, and results follow. And there's a reason the systematic review evidence is a mess: this is a highly context-dependent intervention, so it works in one area and not in another. When you look across a lot of the randomised trials, the average test-score effects are weak, but for certain subgroups there are effects, so you could make a case for this being something worth considering. Meanwhile, we know that, at the individual level at least, deworming treatment doesn't have serious side effects; we know it improves health; and it's pretty cheap: delivered through schools, the cost has come down to around 10 cents per child. Add all of these things up and, at least compared with doing nothing, it would be rational to test it out. And testing is crucial here, because deworming is relatively easy to evaluate as you roll it out in context.

That brings me to the next section. Ideally we want every decision to be backed by a lot of strong evidence, so I want to think through where it would be acceptable, where it would appear rational and reasonable, to proceed with less. Here are the first couple of cases. One is when there's an urgent decision: a focusing event, or a policy review that comes around only every five years so it's now or never, or a genuine emergency, such as education in a humanitarian crisis. The question to hold on to in those moments is: what is the alternative? You shouldn't just say, well, there's a lot of uncertainty about this course of action.
There's also uncertainty about what happens if we do nothing, so it has to be a balanced consideration of both. The second case is about how feasible it is to get better evidence. For example, for the system reform idea we discussed at the beginning there's no good evidence, but if you hold out for a rigorous impact evaluation you're probably not going to get one: the intervention is complex, and there may be ethical issues as well in doing a controlled study. So you may need to embrace uncertainty about some courses of action if you know that better evidence isn't coming.

I have a couple of things under the next heading, which I'll spend more time talking about, and they start with this statement: the outcome of a policy decision cannot be described by a single parameter. We have studies, and institutions like the What Works Clearinghouse, that in effect sell a silver-bullet approach to evidence-based policy: do a bunch of studies that show the impact, therefore this works. But I would argue that every decision you make rests on a combination of different kinds of evidence. Maybe the main piece is an impact evaluation of some kind, an efficacy study, but you really have to piece together all the other facts and evidence around it. The critical thing is how important those other factors are. If they matter more than the headline impact evaluation, then that's where your attention belongs.

Let me give you a couple of examples. At RTI we just started a project, two weeks ago, funded by a foundation, asking the question: what makes instruction effective at scale? We're trying to understand instructional approaches for improving learning in large-scale programmes, mainly in Africa and South Asia, and which aspects of the system support them. You can break the problem down in this way: the probability that a programme is going to work at scale is the probability that the pilot programme works, multiplied by the probability that you can replicate the pilot conditions across the whole nation, for example. And typically we have much better evidence on the first term: we have lots of evaluations of pilot programmes that tell us about it. But we rarely have good information on the second.
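As a minimal sketch of that decomposition, with invented numbers: treat each term as a probability we hold beliefs about, multiply them, and see which term's uncertainty dominates the answer.

```python
import numpy as np

rng = np.random.default_rng(3)

# P(works at scale) = P(pilot works) * P(pilot conditions replicate at scale)
# Invented beliefs: pilot efficacy is well evidenced, replication is not.
p_pilot = rng.beta(20, 5, 100_000)      # centred near 0.8, fairly tight
p_replicate = rng.beta(2, 2, 100_000)   # centred on 0.5, very uncertain

p_scale = p_pilot * p_replicate
lo, hi = np.percentile(p_scale, [5, 95])
print(f"P(works at scale): mean {p_scale.mean():.2f}, "
      f"90% interval ({lo:.2f}, {hi:.2f})")

# Which factor drives the overall uncertainty? Compare interval widths.
for name, draws in [("pilot efficacy", p_pilot),
                    ("replication at scale", p_replicate)]:
    a, b = np.percentile(draws, [5, 95])
    print(f"{name:20s} 90% interval width {b - a:.2f}")
```

The same decomposition applies to the context-transfer question that comes next: the probability it works for you is the probability it worked there times the probability it transfers.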
In fact, I hardly know of any studies that even try to isolate one of these probabilities from the other. And the consideration here is: what if the second term matters much more? Contrast it with a vaccine. If you already have a delivery system, you don't have to worry much about replication; you're really only interested in whether the vaccine works, so the first term dominates. But arguably, in a lot of the situations we look at in low-income countries, the second term is at least as important, because of all the difficulties of delivery. There are a bunch of instructional approaches for which we have evidence of efficacy, but the big uncertainty is: we know it works if a particular group of teachers does it, but how do you get all the teachers to do it? In that case, adding yet another efficacy study to an already solid evidence base and meta-analysis literature buys you little, because efficacy is not where most of the uncertainty is. What we'd like to do is break the problem down and ask two questions. First: when teachers apply the methods, does this instructional approach improve learning? And second: when we implement a nationwide programme of teacher professional development, do teachers actually apply the methods? If there's a lot of uncertainty on the second question, then being very certain about the first relationship may not move you much.

And this is not just hypothetical. There's a great study, published recently, which looks at the impact of hiring contract teachers in lower grades to improve student achievement. Earlier randomised trials had shown positive effects, and then the programme was scaled up in Kenya, delivered by an NGO in one arm and by the government in another. The impact turned out to be completely dependent on who implemented it. So that's an example where the uncertainty about scaling is as important as the uncertainty about efficacy.

The last example is context itself. You're looking at evidence of a programme that worked in, say, Massachusetts, and you're trying to judge whether it's going to work where you are. Obviously, you can decompose this in the same way: the probability it works for you is the probability that it worked there, multiplied by some consideration of how similar your context is. And again, as with the previous example, lots of money is put into the first term, with a high standard of evidence, but we don't really know that much about how to think about the second question, and there are certainly far fewer clear studies of it.
Again, it's the same point about where the uncertainty sits. With things like vaccines there are some population differences in response, but broadly the response is the same everywhere; for education programmes it is not. There are some great papers addressing exactly this issue of whether you can predict impacts across contexts, and here's a real example of it: class size, which is debated in large parts of the world.

The authors reviewed dozens of studies on the impact of reducing class size on achievement, in low- and middle-income countries I think. They found one randomised trial, which showed very small positive effects of reducing class size on achievement; a few quasi-experimental studies, using designs such as regression discontinuity, with a smattering of positive and some negative findings; and a whole slew of correlational studies where people have just looked for associations between class size and achievement, again with mixed findings. The authors then set up a model to combine all of this evidence, which I'm going to ask you to take on trust, and they came to this conclusion: findings from the right context were a better guide for this particular decision. The issue of contextual similarity mattered more than getting really high-quality, rigorous evidence from somewhere else.

OK. So that was the question of when it can be rational to pursue a policy under uncertainty: when the decision is urgent but the evidence is hard to improve, and, the point of these last two sets of slides, when the uncertainty isn't mainly about the efficacy of the intervention but about other considerations, context and scale.
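The paper's actual model isn't reproduced here, but a toy sketch can show the flavour of trading rigour against relevance: inflate the variance of studies from dissimilar contexts before pooling, so that a precise but remote study no longer dominates. The numbers and the weighting scheme below are invented purely for illustration.

```python
import numpy as np

# Toy class-size evidence base (all numbers invented): effect estimate,
# standard error, and a judgement of contextual relevance in [0, 1].
studies = [
    {"name": "RCT, distant context",      "est": 0.15,  "se": 0.04, "rel": 0.3},
    {"name": "quasi-experiment, similar", "est": 0.02,  "se": 0.08, "rel": 0.8},
    {"name": "correlational, local",      "est": -0.05, "se": 0.10, "rel": 1.0},
]

# Inflate each study's variance as relevance falls, so out-of-context
# precision stops dominating the pooled estimate. tau2 sets the penalty.
tau2 = 0.02
w = np.array([1.0 / (s["se"] ** 2 + (1.0 - s["rel"]) * tau2) for s in studies])
est = np.array([s["est"] for s in studies])

pooled = (w * est).sum() / w.sum()
se = w.sum() ** -0.5
print(f"relevance-adjusted pooled effect {pooled:+.3f} (se {se:.3f})")

# For comparison: plain inverse-variance pooling, rigour only.
w0 = np.array([1.0 / s["se"] ** 2 for s in studies])
print(f"rigour-only pooled effect        {(w0 * est).sum() / w0.sum():+.3f}")
```

With these invented inputs the precise out-of-context trial is heavily discounted and the pooled estimate moves towards the local, less rigorous studies, which is the qualitative behaviour the talk describes.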
Let me end with a few ideas about what you would do differently if you take all this on board. The first two are really from Charles Manski, who has written a book on policy decisions under uncertainty. His starting point is: if you acknowledge that there's more uncertainty in these decisions than is evident from just looking at a systematic review, then actually you can't be sure about a whole lot of things. So a good idea would be, instead of making one decision about what policy to adopt for the entire country, to adopt policy diversification: try different variants of something out in different places, placing small bets on different ideas. The second, related idea is adaptive implementation: implement programmes through lots of iterations of testing, which makes the content context-specific and also lets you learn as you go. There's a toy simulation of this at the end of this section.

Next, when you design research and evaluation, it makes sense to focus on the data that provide the most information for the decision. I'm amazed how rarely conversations like that happen. It seems like common sense, but if you look at the average monitoring and evaluation framework, for USAID in particular, the instinct is to measure everything, when it makes much more sense to think about where the real uncertainties are. If the big uncertainty is about scaling, break the problem down and measure the scaling step.

Then I think there's a lot of work to do on research methods that really systematically assess and reduce uncertainty. I don't know if you're familiar with this, but I've read lots of studies that make some strong claim and then, in the limitations section, effectively say: this wasn't part of the study, so everything we just said might actually be wrong. And there are not many studies that say: let's try to think together about how likely it is that the assumptions I made to draw these conclusions are wrong. So there's work to do on assessing assumptions and their plausibility, and on what follows from them.

And there are still some caveats. I think it's very important to distinguish the case where there is strong evidence that something simply doesn't work, because that often gets put in the same bucket as inconclusive evidence. We could take a big step forward just by saying: this has been tested twenty times and doesn't work, so we're not doing it; that is not the same as uncertainty. And none of this is about lowering the bar: the idea of a high bar for what constitutes strong evidence is really important, and I'm not arguing against it. But in addition to that, we need to think about ways to make better use of uncertain evidence.
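As promised, here is a toy simulation of the diversification idea, not from the talk and with all numbers invented: compare committing the whole country to one guessed policy against piloting several variants in small areas and scaling the best-measured one, even when the pilot measurements are noisy.

```python
import numpy as np

rng = np.random.default_rng(4)

# Compare two strategies across many simulated worlds:
# (a) commit the whole country to one guessed policy variant;
# (b) pilot every variant in small areas, then scale the best-measured one.
n_variants, n_worlds, pilot_noise = 5, 10_000, 0.15

true_effect = rng.normal(0.10, 0.10, (n_worlds, n_variants))  # unknown truths
commit = true_effect[:, 0]                       # (a) back variant 0 blindly

estimate = true_effect + rng.normal(0.0, pilot_noise, (n_worlds, n_variants))
best = estimate.argmax(axis=1)                   # (b) scale the pilot winner
adaptive = true_effect[np.arange(n_worlds), best]

oracle = true_effect.max(axis=1)                 # unattainable benchmark
print(f"commit blindly  : mean effect {commit.mean():+.3f}")
print(f"pilot then scale: mean effect {adaptive.mean():+.3f}")
print(f"oracle best     : mean effect {oracle.mean():+.3f}")
```

Even with noisy pilots, placing small bets and then selecting recovers most of the gap between a blind guess and the unknowable best variant, which is the sense in which diversification is rational under uncertainty.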
A lot of the arguments I've presented here rather crudely would be more useful if we could actually quantify them: put numbers on the probabilities and weigh them against each other. But more important than the numbers are the methods of thinking.

I just wanted to end on an analogy for how to think about this. What you need is to see into the distance, and all you have is a pair of broken spectacles. If they're not too broken, there's a reasonable chance that putting them on is going to help you see: not perfectly, but better than before. There is, of course, a point at which they're so broken that they make things worse. And that's the question I think we need to ask of evidence. Not: is it perfect, is it conclusive? But: does it put us in a better position than we were in before?

This is very much a work in progress. I'm working on getting it into a paper, and I've enjoyed presenting it to you today. Thank you.