So you're all here for resuscitation? Anybody do resuscitation in the room? Good, we've got A&E here. This talk is all about resuscitation, so you should help us out. It's very busy tonight, and people keep coming in.

Well, a lot of this is still my thought process. I've been involved in evidence-based health care, evidence-based medicine, for 25 years, and it's really interesting: I'm only now starting to get the hang of it after 25 years. So lots of this is just my thinking. If you asked me about research methods, I could probably spend the whole week here talking to you and we'd still be going, and I'd be interested. But this talk is about what the crux of evidence is: how we think about evidence, about high-quality evidence, and what it means for practice.

And at the moment, I've got to switch this on. Is it recording? So be careful: if I say anything stupid, please let me know, and if you say anything stupid, it'll be on record.

Okay, all right. How many of you are on the EBM course, or thinking about going on it? It was Archie Cochrane who came along and said: look, we need a summary of the critical evidence that underpins health care. That was in the 1970s, about 50 years ago. People were looking at health care and going: there's so much fundamentally wrong in how we practise, lots of it is opinion-based, we have to get some way of summarising the evidence to improve and understand what we need to do, and we need randomised trials.

Second, as many of you know, the definition we use today, and the bit that matters in the debate, is "best research evidence". It wasn't just evidence: it was the best research evidence that underpins evidence-based medicine and the practice of health care. And when David Sackett arrived in 1994, when I was here as a medical student, there was lots of activity: the birth of the Cochrane Library in 1993, the Centre for Statistics in Medicine in about 1994, and the Centre for EBM, whose first ever website went up in 1994-95. So lots was happening at that time.

But I want to draw your attention to one piece in particular, written by Doug Altman, who is still professor of statistics here. He wrote an editorial in the BMJ in 1994, and if you want to get a paper cited a lot, it's all about the title, isn't it? "The scandal of poor medical research." And in particular: we need less research, better research, and research done for the right reasons. There are some really nice sentences in it; that's what got this paper thousands of citations. If you read it, it's really interesting.
What he was saying at the time is that developing research evidence had become almost a hobby, and particularly a hobby for many clinicians, and it was so poorly done that nobody could actually tell what was good research and what was bad research, and very little of it was translated into practice.

But now let's jump forward. If we say we need less research, better research and research done for the right reasons, you can go into PubMed, search on the number of randomised trials, go back to 1995, and trace the increase in trials. There's been about a three-fold increase in the number of randomised controlled trials. So you could say: well, that's better research, we've had much more of the better research done in that period, so we've done the good thing.

But you can also do the same thing for observational research: search over time, look at the terms, and you see there's been a nearly four-fold growth of observational research in that period. We've gone from about 100,000 observational studies, when Doug said there was a huge problem, to about 400,000 now. And in fact it's more than that, because that count only runs to 2014; they get indexed each year and then you can pull them through.

And if you look at the whole of the evidence, all the research in PubMed, it's a huge phenomenon. Research is a huge industry, a huge business: over a million articles published each year, and it's growing at a rate that's almost exponential. So the first thing to say is that it's not difficult to get published. Does anybody in the room think it's difficult to get published? Your first article is always the hardest, but after about the first five or ten it becomes ridiculously easy, and there are a million per year. So there's been a huge growth in research.

Okay. But take those 31,000 trials. Any statisticians in the room? We'll see. This is a normal distribution; can I look at the stats people and ask, happy with that? Now suppose that, out of the 31,000 trials, on average, before you start, they make no difference to practice. We sum up all the trials, positive and negative, and on average they make no difference. Then if you take your normal distribution and take two and a half percent in the upper tail, you would expect to see about 792 positive trials in that distribution just by chance. (There's a quick sketch of that arithmetic below.)
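To make that tail arithmetic concrete, here is a minimal sketch. It's my illustration, not from the talk; note that the quoted 792 implies a base of roughly 31,700 trials rather than a round 31,000.

```python
from statistics import NormalDist

# Under the null that trials make no difference on average, effect
# estimates scatter symmetrically around zero, and a two-sided 5% test
# flags about 2.5% in each tail by chance alone.
trials_per_year = 31_700                  # 792 / 0.025 ~ 31,700 ("about 31,000" in the talk)
upper_tail = 1 - NormalDist().cdf(1.96)   # ~ 0.025, the area beyond z = 1.96

print(round(trials_per_year * upper_tail))  # ~ 792 "positive" trials by chance
```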
Okay, everybody happy with that? Following the same logic, you would also expect to see about 792 negative trials in that distribution. However, you would think that with all the money we spend, with biomedical research centres that get hundreds of millions of pounds, we would do a bit better than "on average, no benefit". Wouldn't you think we'd be a bit positive?

Hello there, come in; you've come to the right talk. Okay, very good. Shall we start again? All right, this is difficult, but you have to follow. Are there any basic scientists in the room? One person at the back has put their hand up. Hello. On average, the research you do makes no difference; is that fair to say? Would anybody in the room like to claim that, on average, their research is slightly better than average, or slightly worse? What do you think? Because if you think about it, the distribution shouldn't look like that, should it? On average, we should be positive.

And if that were so, you would then be able to say how many trials should impact on clinical practice. Out of the 31,000 it might be 30%, so ten to fifteen thousand; you could say the line sits at about a 70% chance at the outset, on average. Do you follow? So that's interesting when you think about it: how much research should translate into practice?

Well, if you're in trouble in Oxford, you should always talk to one of your neighbours and say: I've got this issue, I'd like to talk to you. So I went to talk to Rafael Perera about this issue, and he said: oh, we've published on this already. There's always somebody smarter than you in the room you can turn to in Oxford. So this is about the proportion of proposed new treatments that are successful. How often are new experimental treatments, evaluated in trials, superior to established treatments? That's what we really want to know: the sum of all that research, what difference does it make?

Well, this is what they did. They looked at cohort studies; it's a systematic review, trying to collect all the studies that have looked at this one question: consecutive series of randomised trials, registered before study onset, comparing new against established treatments in humans, to look at the effect. And what they found was four cohorts of randomised trials that provided data from 743 trials involving 297,744 patients. Happy with that? Okay. And what did they find?
They found that, basically, new treatments are only slightly superior to old treatments. You can go and read the paper and dig into it. And what's interesting, over the last 40 or 50 years, is that the ability to produce new positive results looks pretty stable over time. The other thing to say is that, when you look at it, it's also a normal distribution, and if you translate what they say in the discussion: slightly more than half of new treatments will prove to be better, and slightly less than half will prove to be worse. So the line shifts just a little bit this way.

So out of our 31,000-plus trials, you can expect about a thousand to be positive. But when you think about that, it means: wow, aren't we pretty bad at doing research? Because about 30,000 trials are of no benefit. Immediately we get rid of a huge wave of research. There must be some fundamental problem right at the outset, before we even get to the stage of randomised trials, for so many to fail.

So that's the first issue. However, track back a bit. It's still not a bad number, is it? A thousand to fifteen hundred trials are positive each year and should impact on clinical practice. That's quite a lot, if you think about it. It means that, as clinicians, we should be looking up new evidence almost all the time, going: oh, I'm going to change my practice this week, some new evidence has come out. But how many clinicians are in the room? Okay. Clinicians beyond, say, first-year house officers, who will probably change their practice daily anyway.

Let me ask you a question: when was the last time you changed your practice? [An audience member describes switching to a different antibiotic for a drug-resistant infection.] And that was within the last month? Okay. [Another audience member describes a change, partly inaudible, made in the past three months.] So you've done it in the last year; you've changed practice once, and that's it. And the same for me: once in the last year.

Nobody has ever done the survey, but it would be great to have an ongoing one: I've changed my practice, put it on the register, I've done something different based on a bit of evidence. Because when we look at it, it's actually a very small number, so something fundamental must be going wrong. And generally the change is based on evidence from three or four years ago, not on evidence as it comes out.
So our ability to take up evidence is already delayed. When you ask clinicians, we don't actually translate evidence into practice very often, do we?

So that's an interesting concept. We've already got rid of about 95% of all research; that's the bit that needs resuscitation. So let's just focus on the 5%, maybe 3 to 5%, that actually could make a difference.

Okay. But this is the issue: the reason that bit doesn't translate is that there are significant problems in the evidence that exists within EBM. Even then, that's 1,000 to 1,500 trials a year. We can forget the 95%; it's that thousand to fifteen hundred we want to know about, to see if we can translate them into practice.

When you think about why so little translates, my first thought on the issue was always that you can pull out a paper that says: well, it's external validity, or internal validity, or some problem with clinical significance. For many years I would have said that to you. And then I thought: what does that actually mean? External validity should mean the trial applies to the populations we see in practice; internal validity should mean the trial is robust and has valid methods; and clinical significance should mean it makes a difference.

Sounds good: you have to learn it and you have to do it. But it's not quite right, and interestingly, while all this is going on, even the CMO is so worried about the translation of evidence that she has come out and said we must be able to restore public trust in the safety and effectiveness of medicines. The CMO, Sally Davies, has called for a report on what to do about the evidence and how we should inform the public, because they don't trust us anymore. So while we clinicians are developing the evidence and creating a lot of the problems, the patients are starting to see through the issues and are doing much of the work for us, saying: hey, there's a problem. We don't believe you when you come about vaccines; we don't believe you when you come about medications; we don't believe you when you say I shouldn't take antibiotics because of resistance; I don't believe you, I still want my antibiotics. The patients don't trust us, and the media doesn't trust us either.

So this is a really interesting paper; I would put it on your reading list. It's a paper from 2001 by Dave Sackett, and it struck me when I was going through lots of Dave's articles.
He wrote an article like this about once every 18 months, and they're really interesting to read, because he was obviously thinking a lot about these issues. This one is "Why randomized controlled trials fail but needn't". And it contains a really interesting thing: the only formula any clinician ever needs to know about trials. This is the only formula, and then I'm going to make it a bit simpler in my own thinking.

So this is the formula: our confidence, meaning how confident we are in the effect, is directly proportional to the signal, divided by the noise, times the square root of the sample size. Now, stay with me; I'm going to translate a few words for you. Confidence means confidence in the balance of benefits and harms of a treatment, because it's about the benefits and the harms. The signal is the effect size. And the noise is the amount of bias. Happy with that? As for the square root of the sample size, I'm going to leave that to one side: I understand it, but it would take me five minutes to explain here, and if it takes five minutes it's of no value, because I'm going to make it even simpler, hopefully. Read the paper and you will understand it. Everybody happy with that for now? Careful, I might start asking you about it in a minute, to see if you've memorised it.

Now, I realise a lot of what we write is in there. When we wrote the editorial with Ben Goldacre about how medicine is broken and how we can fix it, what I realised is that a lot of the time we're describing the biases. That's what we do. And because confidence is proportional to one over the bias, as the bias increases you have to have a bigger effect size to overcome it. Do you see that? It's really simple to think about. If you want to claim an effect based on an observational study, you should demand a rather large effect, shouldn't you? Because we know observational studies are biased compared with randomised trials. If you have defects in a randomised trial, you should demand larger effect sizes. It's a simple way of thinking about what you're looking at.

So a lot of our thinking is that there are these systemic problems, conflicts of interest, entrenched problems, and EBM is in deep trouble because they have become so problematic. What do we do about it? We're trying to provide a system so people can think simply about what to do. Okay, now here's how I rewrote it; both versions are set out just below.
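(For reference, here are the two formulas as spoken, set out as math. This is my reconstruction of what was said; the notation in Sackett's paper and on the slide may differ.)

```latex
% Sackett's formula, as spoken:
\[
  \text{confidence} \;\propto\; \frac{\text{signal}}{\text{noise}} \times \sqrt{\text{sample size}}
\]
% The rewritten version, as spoken:
\[
  \text{patient benefit} \;\propto\; \frac{\text{outcomes}}{\text{bias}} \times \text{optimal information size}
\]
```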
So: patient benefit is proportional to the outcome, divided by the amount of bias, and multiplied by what we call the optimal information size. Optimal information size means you've got to the point where you have enough information to answer the question, and you can stop.

Because what happens now is people say: we've got big data, haven't we? And our numbers are so big that it's going to overwhelm all this; we don't need to care about it. Patient benefit equals big data; big data equals patient benefit. But actually, in any scenario you can calculate the optimal information size, where you can say: based on what we'd expect to see, this is how much data we'd like to see in our randomised trials and in our systematic reviews. And at that point, if we haven't established any certainty, we can reject the treatment, because it's not worth carrying on. Or sometimes you can stop and say: actually, this is beneficial, we don't need another trial, we can stop.

Now, that's an important concept that has been talked about very little. I have a PhD student about to start publishing on this, because it's really important: how many treatments reach the threshold of optimal information, where you can say we're happy to be certain about this treatment?

Okay, so with these three quantities, what you want to do is optimise the outcomes, minimise the bias, and achieve the optimal information size. That's it, really; for any treatment, that's what you want to achieve. And if you achieve that, and you can say we've optimised the outcome and it's an important outcome, we've minimised the biases, and we've achieved the optimal information size, then we don't need guidelines; we should just get on with it.

A very good example of a treatment that meets all three is aspirin in MI. The outcome is important because it's death, quite a big outcome. At one month there's about a 3% or 4% absolute reduction in mortality, somewhere around there. The bias is minimised because it's been replicated in about 200 different randomised trials, and those 200 trials have been put into an IPD collaboration to achieve the optimal information size. You don't need a guideline: 99% of people get aspirin for a heart attack. Clinicians will respond when the evidence meets these criteria. The problem is that evidence often doesn't meet them, and we're in an uncertain world where the outcomes are meaningless, the biases are quite high, and we haven't quite got enough information. (A rough sketch of the optimal-information-size idea follows.)
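As a rough illustration of the optimal-information-size idea, here is a minimal sketch using the standard two-proportion sample-size approximation. The event rates are hypothetical numbers in the spirit of the aspirin-in-MI example, not figures from the talk or from the IPD collaboration.

```python
from math import ceil
from statistics import NormalDist

z = NormalDist().inv_cdf  # standard normal quantile function

def n_per_arm(p_control, arr, alpha=0.05, power=0.9):
    """Approximate patients per arm needed to detect an absolute risk
    reduction `arr` from a control-arm event rate `p_control`
    (two-proportion formula): a crude proxy for the point at which
    you have "enough information to answer the question"."""
    p_treat = p_control - arr
    variance = p_control * (1 - p_control) + p_treat * (1 - p_treat)
    return ceil((z(1 - alpha / 2) + z(power)) ** 2 * variance / arr ** 2)

# Hypothetical: ~12% one-month mortality, 3% absolute reduction.
print(n_per_arm(0.12, 0.03))  # ~2,190 patients per arm
```

Once the accumulated trials comfortably exceed that kind of number, further trials add little; that is the sense in which you can declare the question answered and stop.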
When evidence is in that uncertain state, we end up with a guideline. Okay? And people hate guidelines.

Right. Now let's take just one of the three. I can't give you more tonight, though we could, and I can tell you what we're going to do about all of this at the end. So let's think about outcomes. How do you optimise outcomes? What do we mean? Because there are actually major problems with outcomes, and it's really interesting when you think about it. If I said to you: I've done a new randomised trial and I'm about to publish my outcomes, here they are; can anybody think of one type of problem that might occur with the outcomes?

[Audience suggestions.] Yes, that's a really good one: they're of no value to patients, are they? You look at it and think: that's not important to me, why do I care? That's probably number one. You didn't measure it right; yes, that's another. Or you didn't measure it at all. You didn't measure it at the right time. It works for some people and not for others, and we don't know; we have a mess there. I'm happy with the word mess. How significant is it, how big is the outcome? You don't quite understand how big it is. And you didn't link it to patient benefit from the beginning; yes, that's fair.

So what I'm forcing you to start thinking about now is: what do we mean by important outcomes, by valid outcomes? All of these are problems with outcomes, yet they all just get reported as "outcomes". Now think about it: we've got a world with millions of people involved in health care, in research, in epidemiology, and nobody really thinks about this, and whole research budgets are planned around it.

So this is what we've been thinking about. When you look at the recent papers, here's one about the problems of surrogate outcomes. You understand what surrogates are: a surrogate is an outcome on the way to the real disease of interest. On the way to cancer, we may look at disease progression. And often people would like to assume that surrogates predict the outcome of interest, but often they don't, and they're of limited value.
This one is based on Rita Redberg saying that, first, the FDA should promptly withdraw approval for cancer drugs that prove to have no clinical benefit, because they're essentially all approved on the basis of surrogate outcomes. That's one problem.

Here's another. There's a whole research agenda from a priority-setting exercise among people who run research: consensus was achieved among clinical trials unit directors across the whole of the UK. Number one was research into methods to improve recruitment into trials; then methods to minimise attrition; and choosing appropriate outcomes to measure came third in the Delphi exercise. So in 2006, the clinical trials people themselves were telling you we need research into how to choose appropriate outcome measures. We don't really know what we're doing. Huge issues, huge problem.

There's more of this stuff. Here's the surrogate problem on tour: John Yudkin telling you, in diabetes, how things like HbA1c and fasting glucose are not what you're interested in; what you're really interested in is important vascular outcomes.

Then you get composite endpoints. These are a fantastic way of not optimising outcomes, because what you do is combine a number of different endpoints to try to reach the optimal information size. That's what you're trying to do. But when you do it, you're often combining endpoints of different severity, which makes them meaningless and of no value to patients.

There's more if you keep going: "Deviation from intention to treat analysis in randomised trials and treatment effect estimates: meta-epidemiological study". I love that word, meta-epidemiological. Trials that deviated from the ITT analysis showed larger intervention effects than trials that reported the standard approach. So you have these biases that inflate outcomes which don't really make a difference.

And it keeps going: outcome selection and the role of patient-reported outcomes in cardiovascular treatment trials. Many trials in which patient-reported outcomes would have been important or crucial for clinical decision-making did not report such outcomes: 122 of 174 trials, 70%, did not report the critical patient-reported outcomes. That shows you've got a huge problem. So we've got all this research which is fundamentally distorted just by the quality of the outcomes, and you can keep going. (Before moving on, there's a toy illustration of the composite-endpoint problem below.)
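To make the composite-endpoint point concrete, here is a toy sketch with entirely made-up event counts (my illustration, not data from any trial): the composite looks impressive even though the component that matters most, death, barely moves.

```python
# Hypothetical per-arm event counts in a trial of 5,000 per arm
# (assumes, for simplicity, no patient has more than one event):
events = {
    "death":           {"treat":  50, "control":  52},  # essentially unchanged
    "hospitalisation": {"treat": 150, "control": 158},  # essentially unchanged
    "chest pain":      {"treat": 300, "control": 420},  # drives the composite
}
n = 5_000

composite_treat = sum(e["treat"] for e in events.values())
composite_control = sum(e["control"] for e in events.values())
print(f"composite endpoint: {composite_treat / n:.1%} vs {composite_control / n:.1%}")
# -> composite endpoint: 10.0% vs 12.6%
# The apparent benefit on the composite comes almost entirely from the
# mildest component, while mortality is essentially the same in both arms.
```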
Then there's this one: selective reporting bias of harm outcomes to maximise apparent benefit. You report the beneficial outcomes and you "forget" to report the harms. So you've got a problem of publication bias and a problem of reporting bias when it comes to outcomes. Somebody earlier said you might not have collected them; here, we may have collected them, but then decided not to report them. Okay, huge problem. And you can do that in two ways: you can say I'm not publishing the paper at all, or you can publish the paper and take them out. And do you know what? Nobody can do anything about that whatsoever.

Putting it all together, I thought: wow, I didn't realise, when I started thinking about it, that there were all these outcome problems. So this is something we've just submitted to Trials, and it's come back, with Ben Goldacre and Kamal Mahtani, thinking about trial outcomes. Surely there's a better way of putting all of these together, because they're difficult to interpret and they have a huge impact. What happens at the moment is that the articles sit one over here, one over here, and one over here, and nobody is connecting the dots across all the problems.

And for all the researchers here, this is how I go about it when I'm writing an article: a big Post-it note and small Post-it notes. I stick them on my wall, write on them, and move them around. You can see how it starts out, and how you work as you try to sharpen the title of an editorial or commentary. So if any of you are involved in working up ideas, this is one way. And that becomes the table: patients in co-design; poorly chosen; poorly collected, the EQUATOR concerns; selectively reported; inappropriately interpreted. That translates into the final analysis, and it's like magic. What it shows you is that there are four fundamental issues, we think: at the design stage, outcomes are badly chosen; at the methods stage, badly collected; at publication, selectively reported; and then inappropriately interpreted, and we all do that a bit.

So you can pick any one area of this. Here's the problem at the design stage, as one particular aspect: at the outset, you really are designing the outcomes in a way that maximises your chance of a positive result. And that's where the bias comes in, along with a lack of relevance to patients and decision-makers. Then, at the methods stage, you're collecting them badly.
273 00:25:23,380 --> 00:25:26,770 Well missing data, poorly specified outcomes is a really important issue. 274 00:25:27,340 --> 00:25:34,630 You may think you're sometimes looking at an outcome like pneumonia. And when you look actually in the in the protocol, it's not actually pneumonia. 275 00:25:35,320 --> 00:25:43,450 You wouldn't define it. That publication, selective reporting, publication bias reporting by underreporting of adverse outcomes and switched outcomes. 276 00:25:44,290 --> 00:25:49,690 And then finally, relative measures, spin, multiplicity. 277 00:25:49,960 --> 00:25:55,240 You can just publish as many outcomes as you want and at some point you'll have one positive one, won't you, by chance, alone? 278 00:25:56,020 --> 00:26:00,880 Yeah, I'd be happy with that. If you have a if you're the slave to the p value, you're going to get a problem. 279 00:26:01,180 --> 00:26:05,229 And then this lack of core out concepts makes a huge problem for us in evidence synthesis, 280 00:26:05,230 --> 00:26:11,250 because often you're looking at seven trials with seven different outcomes and nobody can bring anything together than what you have that. 281 00:26:11,440 --> 00:26:15,780 But what do you think? So we'll talk some more. He's still up for tomorrow. 282 00:26:16,920 --> 00:26:20,290 All right, good. We can leave now if you want. 283 00:26:20,470 --> 00:26:24,490 But if you've really everybody said at the back, somebody said, how can we? 284 00:26:24,580 --> 00:26:32,170 So this is when I go to even commissioning groups, the MSI, I'd start for the minimal, clinically important difference. 285 00:26:33,730 --> 00:26:39,550 At what point would you accept this difference to make it make a decision to treat somebody or not? 286 00:26:40,540 --> 00:26:46,450 And it's really interesting if you, I suppose to hear some of that statistically significant take it or leave it, 287 00:26:47,320 --> 00:26:54,969 what how much difference do you want to change? Practice. Before we stop him, before we look at evidence, what's the minimum? 288 00:26:54,970 --> 00:26:59,110 And then with clinicians, we will do that. We will treat people. 289 00:26:59,380 --> 00:27:05,530 I will treat people tonight when they may have a 50% chance of pneumonia or a 30% chance. 290 00:27:07,090 --> 00:27:11,560 Knowing that actually if I leave them, the utilities, they may go on to develop pneumonia and die. 291 00:27:11,590 --> 00:27:17,160 So actually I'm going to treat you at 30% jump. It's alright by me. Otherwise to get 200, then I've got standing to hospital. 292 00:27:17,200 --> 00:27:20,470 They've got to have the x ray, got to have the gold standard cost of thousands. 293 00:27:20,480 --> 00:27:25,660 I'm going to treat people. But when it comes to important differences in treatment, nobody has a clue. 294 00:27:26,890 --> 00:27:31,420 Wherever have been and said right and left treatment you used when you decided the ones that we decided. 295 00:27:32,110 --> 00:27:34,860 What was it in terms of the difference between treating somebody, 296 00:27:34,880 --> 00:27:38,590 not treating that made a difference to you saying going to do this as opposed to know what 297 00:27:38,590 --> 00:27:43,569 is it you as a group of doctors deciding together to say this benefit is so significant, 298 00:27:43,570 --> 00:27:53,860 we are going to change? What we're looking for is a 1% reduction in mortality with this new treatment, 0.1%, 0.01%. 
And nowhere I've been can anybody actually pick that up and say: yes, we thought about that, we really did go for it. And when people understand the cost of new treatments, you realise that sometimes they're adding cost for no benefit whatsoever.

So that's the doctors' side. But when you go and ask patients the same question and say, well, this is what the doctors think, they actually want really important differences, about three to four times larger than what the doctors say is important. This was in a group of rheumatology patients: their values were three to four times greater than the doctors' MCID values. The patients want more than what we would accept. So what we actually do, often, is stop research too quickly, without optimising the intervention and thinking: where do we want to get to?

Huge issue. And it should be really important for people developing new services, for commissioning: what's our target? What do we want to achieve? How much difference do we want to make? And I can tell you, I've never yet been with a clinical commissioning group that thinks like that, ever.

The only two organisations I've seen do it: one is the WHO. It's interesting; you go to the lowest-resource settings in the world, almost as if you've got to get to the place where they've got nothing, and somebody has set an NCD target: a 25% reduction in mortality from cardiovascular disease and the other NCDs by 2025. The second group I've noticed is Cancer Research UK, who have set themselves a target: by 2034, they want three in four people to survive cancer. They've tied it to 2034, which is quite smart when you think about it, because at least they've set themselves a clinically important difference that I think you could answer.

And when I go and talk to people, I ask: can anybody in the room tell me, in any of your settings, a clinically important difference your organisation is working towards right now? Anybody? [An audience member offers a 20% improvement target.] Good, so you can actually see it happen: your organisation can talk to you, or your practice can, and say this is what we're working towards, this is what would be important to us. But almost no organisation can tell me what's important to them, and so what we've got is a distortion of what's important. Right now it seems to be whether you wait half an hour or forty minutes, not whether you get a cure or a treatment.
325 00:30:28,080 --> 00:30:33,510 So we get distortions in what's important to us because. No, no, no organisation is that what's important to us? 326 00:30:34,350 --> 00:30:38,800 Okay. So much of what we do in is an example is compare. 327 00:30:39,220 --> 00:30:46,810 We work towards trying to fix some of the solution. And so this has been a project led by Ben Goldacre that's included medical students. 328 00:30:47,080 --> 00:30:52,659 And what we tried to do is there's been this long term problem of outcome switching in randomised trials. 329 00:30:52,660 --> 00:30:54,670 That was one of my problems in the outcomes. 330 00:30:55,520 --> 00:31:04,000 You what you do is you see in the systematic review, 31% of trials have had discrepancies between the pre-specified in reported primary outcomes. 331 00:31:04,600 --> 00:31:09,990 We had ten trials just basically say here's our primary outcome, but actually it's now different. 332 00:31:10,000 --> 00:31:16,630 When we published the paper, we switched it and you can switch from the one with negative or inconclusive to one that's positive. 333 00:31:16,990 --> 00:31:20,590 Well, that for me, things are ridiculous scandal, isn't it? That's what I feel about it. 334 00:31:20,590 --> 00:31:29,300 And that's what we all feel about it. And 15%, about one in ten, one in 15 introduces some outcome that they didn't even pre-specified. 335 00:31:30,580 --> 00:31:37,150 So one of the things we did with try and repeat this, but actually having a third feature with the see, could you correct science? 336 00:31:37,720 --> 00:31:41,210 Can the journals correct themselves? And these are the outcome report. 337 00:31:41,240 --> 00:31:50,050 So 67 trials checked, nine were perfect fringing 54 outcomes not reported infringed and 57 were silently added. 338 00:31:51,740 --> 00:31:56,530 So you can basically pull out and put in. You can basically say this is a mess, 58%. 339 00:31:56,540 --> 00:32:00,199 This was quite a big operation because that meant we had to check them all to students, 340 00:32:00,200 --> 00:32:04,130 would check them independently, and then they get one of the senior people like myself to go, 341 00:32:04,340 --> 00:32:11,210 Oh yeah, let's go over this, because we've got to do it within the window of a journal, which again seems absolutely ridiculous. 342 00:32:11,600 --> 00:32:15,320 So some of them are two weeks, a maximum four weeks. If you send after that, they'll send you a letter. 343 00:32:15,650 --> 00:32:20,720 In fact, you very much time the window. So you've got to get really organised to correct sign. 344 00:32:22,100 --> 00:32:26,990 You can't come a month later and go, there's a problem. They'll just reject the letter apart from one journal. 345 00:32:27,570 --> 00:32:32,299 Okay, so we wrote to these all them and here's some of the responses. 346 00:32:32,300 --> 00:32:39,110 So the BMJ is interested. They didn't have many trials, but they did have some with small problems and some of them led to correction. 347 00:32:40,410 --> 00:32:44,610 Okay. But I know that we wrote to no correction. 348 00:32:45,480 --> 00:32:48,540 That seems to be up to the journal editor. Same problem. 349 00:32:49,140 --> 00:32:57,510 Same issues when we correct. When we didn't. Different editors of The Lancet, they published. 350 00:32:58,020 --> 00:33:04,560 Many of them took over 160 days for some of them to be published and they're all in the off with replies. 
They got the authors to reply, but there was no comment from the editors saying there's a correction problem with the paper; just a letter, some 150 days later. JAMA rejected all the letters, not interested whatsoever: "too vague". But the letters were only 250 words, because there's a word limit, and we couldn't be that vague in 250 words. And "repetition between letters": well, one of the things we did, so we couldn't make any mistakes, was use a standard template letter: "this paper [insert name] has four primary outcomes, three of which were...", and so on. It was a standard letter precisely so we couldn't make errors, and that's what they meant by repetition.

Annals of Internal Medicine went to war with us, in a way: they started writing editorials, pointing out all sorts of things, in their own journal, where they could write whatever they liked. You see, they are the editors, so in response they could write as much as they wanted. They wrote a whole diatribe about one trial. Here are the editors on that one trial: "COMPare apparently considered the protocol published well after data collection ended. However, they did not consider the protocol published two years before the primary trial was published." That's what they said. But that protocol was published one year after the trial started. You can see the problem in research. We go to the registry; those are the rules. We take the registry as the gold standard, unless there's a protocol published prior to the trial, and then we look at the protocol and ask: when was it published? That's what we do.

"Protocols were available for none of the 12." Yes, that was their statement. Okay. And that's the reply we got when we emailed. Outrageous, when you think about what's going on in science, isn't it? So we have a reproducible statement: "this is available from Dr Edmondson". If you email Dr Edmondson, that's the response you get. Try it, if you're interested to see what happens.

The New England Journal of Medicine? Fantastic: "any interested reader can compare the published article, the trial registration, and the protocol with the reported results to view discrepancies themselves." It took two people on the team up to four hours per paper to do that; the quickest we managed was about an hour. So if you want to go down that route, it'll take you somewhere between one and four hours per paper. Okay, so that's where we are.
That's an example of poorly reported outcomes, showing you that there really are four fundamental features, how outcomes are chosen, collected, reported, and interpreted, and that there are so many fundamental problems. The next time you see a research outcome, just step back and think: I wonder how it was chosen, how it was collected, whether it has been fully reported, and how I am interpreting it. And particularly think about it in terms of a clinically important difference.

Part of this is what's led to the evidence manifesto, which has a lot in it about better outcomes, and to this piece on resuscitating evidence from the deluge of poor-quality research. If you want to leave a response, think about what we've said tonight; any issues would be great, and we'll probably quote you in some way.

Interestingly, however, everything I've said is now up in the air because of the way the modern world is working in terms of fabrication, research fraud and trials. This is from the newly established Chinese FDA equivalent: 80% of the data in Chinese drug trials has been made up or fabricated. And that's because the overwhelming market forces are about people trying to make money. When we've got such issues, with so much money involved, then whatever we do, there is still going to be the need for vigilance.

Thank you very much.