Good afternoon, everybody. If you don't know me, I'm the Director of the Centre for Evidence-Based Medicine, Professor of Evidence-Based Medicine, and at weekends and in the evenings a GP. And I'm also a dad of two.

I'm going to take you on a journey around the idea of real-world critical appraisal in evidence-based medicine. I know many of you have had your introduction to EBM earlier today, and David will be talking to you about some of the issues around critical appraisal, study design and so forth. But what I'm really interested in is how we solve the problem of having a piece of evidence at the bedside: can you apply it to patient care there and then, make decisions really efficiently and quickly, and do that in real time?

That's what I mean by real-world critical appraisal, as opposed to the critical appraisal I often see taught. I've been to sessions that were four hours long, built around a checklist of 20 items. (Do come in, sit down.) It's a checklist, and by the end of it everybody thinks, "I'm never going to do a critical appraisal again." Some of you will have been on that journey; I've been there, and I found it demotivating and a real barrier to evidence-based medicine.

So this is a bit philosophical, but in my 24th year of teaching this I'm thinking slightly differently about some of the issues.

Okay, let's start. This is a really unique book and it's worth reading: Effectiveness and Efficiency, by Archie Cochrane. It starts by criticising the lack of reliable evidence behind many of the commonly accepted healthcare interventions, and I think that's a unique starting point for evidence-based medicine. It's not that long ago, either: 30, 40 years. So evidence-based medicine is relatively new in terms of where we are.

Moving on in that journey, you'll have come across David Sackett's traditional definition of evidence-based medicine. It's still useful to think about: the integration of research evidence with patient values and clinical expertise. I always emphasise that, because where most of it has gone wrong in the last 20 years is that people have taken that definition, stripped out the values and the expertise, and turned evidence-based medicine into the idea that all practice can be based on research evidence.
And once we've accumulated all the evidence and produced all the guidelines, then somehow we'll be in a position where we'll know what to do.

So, thinking about where you are: in terms of the steps of EBM, you're taught to ask questions efficiently, which you've been doing today, and to search for evidence effectively, and you need to be really skilful at this. The first thing is that it's a skill, and what defines whether anybody is any good at a skill is whether they practise it and whether they do it regularly. A lot of people say to me, "Well, how do you know where to look?" And I go, "Well, once you've been practising a lot, you'll get a feel for it and you'll be really skilful at it."

But once you find some evidence, the problem is that there are huge shortcomings in a lot of it. This is an editorial by Doug Altman in 1994 that sets out one of the problems with evidence. It would be ideal if you could ask a question, find some evidence, and it was all so well done that anything indexed in PubMed you could simply use, because you could believe it. But that's not true, is it? There are huge shortcomings in lots of the evidence. He published this in 1994 and said that what we need is less research, but better, higher-quality research.

And it's interesting. (You've come in, that's all right, nice to see you. We've got presents as well? Wow, that's never happened to me before.)

So, 1994: what has happened since? There has actually been a three-fold increase in the number of randomised controlled trials in that time, so arguably lots more higher-quality evidence. However, when you look at observational research, it's on a similar trajectory: you go from a hundred thousand to nearly 400,000 observational studies. Well, that's not better quality if you're thinking about therapy. And when you look at Medline overall, you've gone from 450,000 to about 1.2 million currently. So the first thing is that it's really easy to get published, because a lot of us are publishing a lot of stuff on a daily basis.

Now, with all that in mind, let's just go back to the 32,000 trials a year. Bear with me. Let's say that, on average, they have no effect, and that it's a normal distribution. When I'm in a room of researchers, they go, "What do you mean, no effect? My research always makes a difference to patient care."
But if you say that on average they had no effect and it was a normal distribution, then you should expect about 800 trials a year to impact on clinical practice, because two and a half per cent would be positive, two and a half per cent would be negative, and all the stuff in the middle would be non-significant. Wouldn't it balance itself out? (A rough version of that sum is laid out after this passage.)

Well, if we could produce 800 new trials a year that impacted on clinical practice within about four or five years, we'd be revolutionary in our approach to medicine. Things would be changing: that would be about two to three changes in practice on a daily basis, and you'd have to stay up to date so often that you'd need a better way of doing it. But everybody in the room who is close to clinical care will be thinking: that's not what happens. When you actually ask people in clinical settings, they generally notice they do about four or five things differently in a year, and maybe four of those are probably harmful and perhaps one is beneficial.

In fact, when you talk to researchers, you would expect, given everything we do, that the distribution should be skewed: more of it should be beneficial at the outset, because we design research really well and it is funded at the outset. So surely the number of trials we expect to impact on clinical practice should be more than 800 a year. But it certainly isn't.

Now, whenever you ask a question like that, somebody out there, if you go backwards in time, will have looked at it before. This is a study that compared new treatments with established treatments in randomised trials and asked: how often do we find that new things make a difference to patient care? In effect, it found that new treatments are only slightly superior to established treatments when tested in RCTs, and this has stayed stable over time. Over the last 50 years there has been no incremental improvement in our ability to show that we make a difference to patient care. So, if it's slightly more than half, the number becomes about 1,000 to 1,500 trials a year that should impact on clinical practice, which is a huge number when you think about it.

That means we should be in a position to really radically change health care, a health care system where every day we should be going, "Right, what are we doing differently today?" But we all know that's not the case. So there's a significant problem, a disconnect between evidence and evidence-based medicine. And one of the key things is: why does that occur? Where does that come from?
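[A rough version of that back-of-the-envelope sum, using only the figures quoted above; the layout of the arithmetic is mine, not the slide's.]

% If roughly 32,000 trials a year had, on average, no true effect and the
% results were normally distributed, about 2.5% would fall in each tail.
\[
0.025 \times 32\,000 \approx 800 \ \text{"positive" trials a year}
\]
% If, as argued, well-designed and well-funded research should skew towards
% benefit, the expected number rises towards the 1,000 to 1,500 a year
% mentioned above.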
And why does so little of that research translate into practice? It's a really interesting phenomenon. What's going wrong? Is it all perfect research? Why are we not translating it into effective care on a daily basis?

Traditionally, we've thought of this in terms of three main problems. The first is external validity: the results of the trials just do not apply to the populations we see in practice; somehow all that research is in different populations. The second is internal validity: there's something wrong with the structure and how the methods are applied, so although the results are positive, there must be huge shortcomings in the methods. The third is to say, okay, we know there are outcomes, and they may be positive, but maybe they're just not significant enough to improve patient care; that's clinical significance. Although the results are statistically significant, they may not be clinically significant.

Okay. When I have problems like this, what I tend to do a lot is go backwards in time and read papers about what other people have thought. This is David Sackett who, when I first came to Oxford in 1993, was the first director of the centre here, and I worked with him until he died a few years ago. He wrote a paper I found fascinating: there's only one formula you're ever likely to meet in the whole of evidence-based medicine, really, to understand what's going on. And the formula, he said, is really easy; he called it ridiculously simple, and it looks like this.

I have to say, I do not find it ridiculously simple. What it says is that the confidence in the benefits and harms of a treatment is directly proportional to the signal, inversely proportional to the noise, and a factor of the square root of the sample size. I don't find that simple, and I'm a simple person; I like to simplify things in my life. So I played around with it. What we get to is that confidence in the balance of benefits to harms is proportional to the effect size and inversely proportional to the bias. That's where we got to, and then we developed it some more and ended up with a very simple equation for how you might first start to think about research (both versions are written out below).
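[Written out as formulas, as I read the descriptions above; the notation is mine rather than the original slide's.]

% Sackett's version, as described: confidence in the benefits and harms of a
% treatment is proportional to the signal-to-noise ratio and to the square
% root of the sample size n.
\[
\text{Confidence in benefits and harms} \;\propto\; \frac{\text{signal}}{\text{noise}} \times \sqrt{n}
\]
% The simplified version used in this talk: patient benefit is proportional
% to the size of the outcome (the effect), inversely proportional to the
% bias, and conditional on having reached the optimal information size.
\[
\text{Patient benefit} \;\propto\; \frac{\text{outcome}}{\text{bias}}
\quad \text{given the optimal information size}
\]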
You have to have a simple reference point, because when you put all this critical appraisal to work, you need a visual way of thinking about what's going on. So, if you want more patient benefit: it's directly related to the size of the outcome, isn't it? The bigger the outcome, the more benefit. Everybody happy with that? And it's inversely proportional to the bias: the more bias, the less patient benefit. Happy with that? And then it's proportional to the optimal information size. That's why you do systematic reviews. It's not about 32,000 more data points; you have to get to the point where you've got enough data, and then you can stop. Some people would like to say that patient benefit equals big data, but actually it doesn't. So when you're thinking about patient benefit, whether you're doing a systematic review or you have a randomised controlled trial in front of you, you're thinking in your mind: do I have enough information?

Okay. So patient benefit is only directly relevant when you've got the right amount of information in front of you. If you can take that on board, you can start to think a bit more simply. When you see a piece of information, a piece of evidence, you can start to ask some simple questions, and I'll take you through them. You think about the outcome: to what extent is this outcome important? You want to optimise the outcome, minimise the biases, and achieve the optimal information size. That's what you really want to do, in a really simple way.

So when I think about this, for me: when I look at a piece of research and I'm asked about confidence in the result, about patient benefit, I start with the outcome. It doesn't matter how many times I've done this; I've tried to force myself to look at the biases first, but I can never do it, because it's too dull and too boring. The immediate thing I do is ask: to what extent does this outcome make a difference to my patient? It doesn't matter what the piece of information is, even if I'm just looking at an abstract. Notice what you do next time you look at a piece of research: the first thing you do is go straight to the outcome. You do not say, "Let me look at the method of allocation concealment." And the reason this matters is that if the answer is no, it doesn't make a difference, you might consider stopping immediately; I'll come back to that. So that's the first thing.
You've now got in mind a very simple equation: patient benefit is proportional to the outcome, inversely proportional to the biases, and you need to achieve the optimal information size. So you've got a very simple way of looking at all research; you can plug a study into it and then apply it.

Okay. So when people are engaging with questions, when I'm starting to teach, I ask them, in groups: does this make a difference? You've got about 30 seconds now. I want you to ask the person next to you: do vitamins increase or decrease mortality? You've got one of three answers: yes, no, or don't know. I'm going to give you 30 seconds to talk to the person next to you and just decide what your answer is going to be, and everybody has to vote.

[Audience discussion.]

Okay. In your pairs, who said yes? Who said yes in the room? Okay, and one more question on that: increase or decrease? Okay, well spotted. Who went for no? And who said don't know? Every answer is right in this one, more or less.

Well, it's interesting when you look at the evidence. It increases mortality: patients consuming antioxidants were about 1.3 times as likely to die as the controls, or slightly more. But if you said no: when all of the trials were combined, antioxidants may or may not have increased mortality, depending on which statistical combination was used. And if you said don't know: the increased risk of mortality was associated with beta-carotene, and possibly vitamin A and vitamin E, but was not associated with the use of vitamin C or selenium.

So it's really interesting: when we talk about effects, we're much more engaged in the thought process. We're interested in the effects. If I'd asked you to judge whether there was allocation concealment or not, you might have gone, "Oh, this is very interesting," but that's not the premise we start from; we start from the effects.

Okay. Question two. You've got 30 seconds again, and you can answer this one: should you give oxygen to people who have had a heart attack?

[Audience discussion.]

Right. How has he voted?
Yes. Okay, I'm going to come back to you. He's voted no. And he voted don't know. So we've got a yes, a no and a don't know. So, the people who voted yes: why did you vote yes?

[Audience member: if a person's oxygen saturation is below 95%, you should give oxygen.]

Okay. So you're saying yes because you're using specific pieces of information: it's only of use in people below a certain saturation. You're picking out specific pieces of information so that you can say yes; you're adding that in. And the further back in time you go, the more interesting this gets. When I was a medical student, we felt oxygen was a lifesaver; it was the drug of choice in a heart attack. Now you're saying you're not sure, and you want to put specific limitations on it.

All right, what about the people in the room who said no?

[Audience member: because it's only recommended when the saturation is less than 90%.]

Okay. Even if you said no, you would still be right, actually, because there were similar death rates in both groups, suggesting oxygen neither helps nor harms in routine use in people who have had a heart attack. So if you said no, you're saying no because routine use in this population is not indicated.

What about the don't-know group? Why did you say don't know?

[Audience member: the question isn't clear.]

Okay. And if you said don't know, you could also say, well, actually, we still don't know the answer to the question. But it's also about being specific about the actual scenario. Every answer has a population it applies to, doesn't it? You picked that out: this research has to apply to somebody. It's not just an effect; it has to apply to somebody, and you're applying it already; you've already done it in this small scenario. In effect, you've applied this to a specific population, below 90% saturation; that's my PICO. Above 94% is a different population. So when we think about the outcome, we have to start to think about who it applies to. We don't just decide on the effect; it's about who it applies to.

So that makes me think, really interestingly, when I look at evidence: it's not just about us finding an effect.
It has to apply to somebody, and that's quite interesting.

Okay. So when I do this, look, here we are back again: does it make a difference? It's much more engaging when we think about it that way. If the answer is no, consider stopping, and I'll come back to that, because it's not always true.

Now, one of the things we found concerns the quality of the outcome. When I talk about optimising outcomes: there are huge problems with outcomes. An effect may have appeared, but we published a piece of research on this because every time I looked there were huge numbers of surrogate outcomes, problems with composite endpoints, problems with how the data were analysed, outcome selection, patient-reported outcomes where 70% of the outcomes don't mean anything to patients and are not the outcomes they're interested in, and selective reporting. So we ended up with a whole classification of problems with outcomes. Even when you look at the outcome, you've got to be aware, and these are my top three: surrogate, composite and subjective outcomes. They're so prevalent now that they're an epidemic, and people aren't even aware of them. They don't say, "Well, it found a significant effect," and then go, "Oh, actually, this is a weird composite outcome," or "This surrogate won't actually make any difference to patient care." It may be a relative measure, there may be multiplicity: with so many outcomes, you're bound to get some of them positive. It could be selective reporting. So the point is not just that there is an outcome; it has to be a high-quality outcome that's going to affect patient care.

So you can radically change how you think when you look at a study: does it make a difference? If you've already said, "Well, it applies to somebody," then what's the quality of the outcome? It's not just a statistical effect; you're actually starting to think, "Does this make a difference to my patient? In my head, do I think this is applicable?" And when you do that, you end up with more than one question. So for patient benefit, although I'm thinking about outcomes, I've ended up with three questions: who does this apply to, does it make a difference, and is the treatment feasible? That's my third point, because you can be thinking about whether it makes a difference when you can't actually deliver it. And you see this all the time. You see it in a diabetes exercise programme delivered in one-on-one sessions.
And I think: who's going to pay for that in our CCG? Not a single person. End of assessment. So this is a sort of three-item filter that I think can get rid of a significant amount of research really quickly. When you're faced with your next piece of research, just ask: who does this apply to, does it make a difference, and is it feasible? My current estimate is that this gets rid of about 95% of all research, and that's probably an underestimate. I'm going to try to test that: we're going to apply it prospectively to a set of studies and see how it performs.

There are some caveats. I said that if the answer is no, consider stopping, but there are times when you get a no and you shouldn't stop. It may be something that doesn't make a difference to patient care but that we're still actually doing, and that might be an example of too much medicine. We looked at one recently, and I wrote a piece about it: self-monitoring in type 2 diabetes. It doesn't make a difference to patient care, but we're still doing it. The second caveat is that sometimes you miss an important outcome. You might have a treatment that is potentially less risky or less invasive, but you're looking at mortality, which is no different. A good example might be the new oral anticoagulants, which are no better than warfarin, but where the patient-relevant outcome of interest is that they don't require monitoring. You've missed an important patient outcome, so you often still have to check.

Okay, so that's what I do when I see a piece of evidence, and I get rid of a load of evidence really quickly. (You can't quite see this slide at the back; we're going to publish it.) It plugs into your PICO: the population, the intervention, the comparator and the outcome, so it fits really well with what you've been taught. The first question, once you know the population you asked the question about, is: does this evidence apply to the same patients I've got in front of me? The second question is: does this outcome make a difference? And there you're checking a few important things. You've probably got relative measures, but you need an absolute effect (a quick worked example of that conversion follows below), and you're looking for poor-quality outcomes, of which surrogate, composite and subjective are the three most important to look out for; those are the top three.
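[A quick, purely illustrative example of turning a relative measure into an absolute effect; the numbers are invented for illustration, not taken from the talk.]

% Suppose the control event rate (CER) is 10% and a trial reports a relative
% risk (RR) of 0.8 for the new treatment.
\[
\text{ARR} = \text{CER} \times (1 - \text{RR}) = 0.10 \times 0.2 = 0.02,
\qquad
\text{NNT} = \frac{1}{\text{ARR}} = 50
\]
% So a "20% relative reduction" here means treating about 50 such patients
% for one to benefit: the absolute framing you would quantify for a patient.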
If the answer is no, why would you carry on, apart from my two caveats, too much medicine or a missed important outcome? You could stop right there, and you get rid of a load of research really quickly.

Now, people begin to worry at this point, because they've been taught whole courses about critical appraisal, and frankly I think appraisal is really important and really relevant. When you move through to the yes stage, you have to be aware of the other treatment options, and you have to be aware of the comparator. If you're satisfied it's an adequate comparator, you should move on, and it could be potentially practice-changing at that point, if you're happy with it. The reason you need to know about the alternatives is that often you might be seeing one antidepressant against placebo, and if you're aware of the other treatment effects already in place, you might be able to say, "Well, this looks like an important benefit, but actually it's far worse than what we're already doing."

Now, why is this important? Wherever I go, I go into a room and say, "Tell me about the last treatment you used. Does it make a difference to your patient?" And nobody can answer that question. Most clinicians in their practice don't understand which treatments they use and how much difference they make, and they can't quantify that for the patient, yet we should automatically be able to do that. We should all be able to say, "If I take patients like you and give them this treatment, this is how many will benefit and this is how many will be harmed." That's "does it make a difference?".

So you can revolutionise your approach, and the way you teach this stuff, by asking whether it really makes a difference, and that's not just a function of the effect size, as I've shown you. Then you move through this flow. What's interesting about this flow, and this is the 20 years of teaching it, is that this step was never in "ask a clinical question, acquire some evidence". There's a drop-off there, isn't there? Because you ask lots of questions, but you can only acquire the evidence for some of them; you physically can't do more, and there's not enough time in the day for me to find evidence for all the questions I have when I'm in clinical practice. But when I do acquire the best evidence, I can very quickly do these three things, in probably 60 seconds, and mainly from the abstract. [A toy sketch of that filter follows below.]
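[A toy sketch, in code, of the rapid three-question filter and its two caveats as described above; this framing is mine, not a published tool from the talk.]

# A minimal sketch of the 60-second triage described above, assuming you can
# answer each question from the abstract plus your clinical knowledge.
def worth_appraising(applies_to_my_patients: bool,
                     outcome_makes_a_difference: bool,
                     feasible: bool,
                     too_much_medicine: bool = False,
                     missed_important_outcome: bool = False) -> bool:
    """Return True if a study merits going on to full critical appraisal."""
    if applies_to_my_patients and outcome_makes_a_difference and feasible:
        return True
    # Caveats: a "no" can still matter if we are already doing something that
    # doesn't work (too much medicine) or a patient-relevant outcome was missed.
    return too_much_medicine or missed_important_outcome

# Example: applicable population, meaningful outcome, deliverable treatment.
print(worth_appraising(True, True, True))                            # True
# Example: no patient-relevant benefit shown, but the practice is widespread.
print(worth_appraising(True, False, True, too_much_medicine=True))   # True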
I might have some knowledge gaps where I'm like, "Oh my God, I'm not aware of what we're currently doing in practice and what the effects are," but that's a limitation of my clinical practice, and it's relatively easy to deal with. And it means less should get through to the full quality appraisal, because by the time you're doing that, you're thinking about stuff that really matters: do I believe this effect size? And if I do believe it, I should be thinking about applying it in practice.

And this is what's really interesting: you're also making a judgement about the size of the effect and its importance, and then you're looking at the appraisal to ask, what's the impact of the biases on this? If you've got a very small effect, you'll start to think, "Wow, this could be overturned by any introduction of bias." Whereas if you've got a much larger effect, one you think will have an impact on patient care, you may tolerate a bit more bias at this point. But the problem I see is that most people do the appraisal as a job to get through: a few more GRADE criteria, call it moderate, end of story. They're not relating it to the outcome: what impact does this appraisal have on the outcome of interest? Is it going to overturn these results? That's really important in your journey of thinking about this, because there's a real disconnect between people's ability to understand bias and its impact on the outcome. Generally, people disconnect the two, as if they were two different functions and bias doesn't affect the outcome in any way.

And in fact, often you don't even have to perform an appraisal. I'm often being asked to assess an effect, and at this point I'm saying, "Well, I'm looking at observational research; I don't need to do much more. I'm looking at the wrong type of research." Obviously there's a huge potential introduction of bias, and we've only got an observational finding with a small effect, so it's not that important. Somebody tells me about an observational study and I'm like, "Well, it's interesting, but it's not even worth carrying on to do the appraisal at this point," because I would want a randomised controlled trial, and I would definitely want to see a systematic review, before being able to say we will implement this in care.

And it's really interesting. So here's an example. Last week, for the first time, the Science Media Centre put me on this list where they send out a study, and they go: here's a study from the BMJ, it's coming out tomorrow, published on the 4th of October. It's the 3rd of October and it's 3:30 p.m.
"Could you get back to us by 4:00 with your quote?" Mm hmm. And I need a cup of tea, because I've just come from a meeting, and I'm like, oh my God, and it's going to take me five minutes to open up my computer and get to it. So literally, if you boil it down, I've got about ten minutes to give a quote and think about this study. And the study is about the availability of evidence of benefit, on overall survival and quality of life, for cancer drugs. And I go, "Oh, this is stuff I'm really interested in." I could, at this point, plug in a full critical appraisal and spend three hours on it, but realistically I've got about eight to ten minutes. So what do I do? I plug in my formula. I'm basically plugging in my formula: I think about the outcome. I did write one thing about the bias, but I started with the outcome. What does this outcome mean?

If you pick up The Guardian from last week, you'll see that I'm quoted: the professor of evidence-based medicine at the University of Oxford described the lack of improvement as disappointing, saying it's hard to understand why half the drugs were approved in the first place if they provide no clinically meaningful benefit. Basically, what that shows you is that, in terms of cancer drugs, you could get rid of 50% of the drugs immediately. Do they make a difference? What's the outcome? So you can plug in a system where you go, "Oh, I could get rid of half of all cancer research just like that," as you're reading it. What tends to happen instead is that we think it's really technically difficult to engage with research, and so we don't bother. But you can actually read this stuff really quickly once you have something to plug it into.

You could have started with the biases, and I did give another quote about the biases, about the lack of robust studies, but nobody picked that up. They preferred the outcome, just as we do. Journalists prefer outcomes, the public probably prefer outcomes, we prefer outcomes; that's what really switches us on. And when you think about outcomes, most of the things we see are very small, statistically tiny effects.

So again, there are smart people out there who have written about this, and this is John Ioannidis's work, which says basically that when you're assessing tiny effects, which are ever more common in the literature, cautious interpretation is warranted, since most of these effects could be eliminated with even minimal biases, and their importance is uncertain.
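[To make that concrete with a purely hypothetical illustration; these numbers are mine, not from the study or the talk.]

% Suppose a trial reports event rates of 10.0% in the control arm and 9.5%
% in the treatment arm: a "tiny" effect.
\[
10.0\% - 9.5\% = 0.5\ \text{percentage points} \approx 1\ \text{event per } 200\ \text{patients}
\]
% So misclassifying or losing roughly one outcome per 200 patients in either
% arm, a very modest bias, is enough to erase or reverse the whole effect.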
So once you understand the size of the effect you're working with, and if it's particularly tiny, which is highly prevalent at the moment, you understand that if you detect any biases, which is what you learn about in appraisal, you start to think, "Oh, this is going to overturn this effect size really quickly." So often you don't need to do an exhaustive appraisal before going, "Oh my gosh, there are some biases here; I'm already worried." What you're trying to do when you do that critical appraisal is to see what the impact of the bias is on overturning that effect. That's what's really interesting about a system where you can do it really rapidly.

Thank you very much.