I've had a busy day — actually, a busy start of term. You've seen all these brand new students running around. This morning I started teaching the year four clinical medical students an introduction to evidence-based medicine, and about half of them were nearly asleep in a lecture like that. So if you fall asleep, I'm likely to do to you what I did to them: walk right up to you and ask you a question. So be careful.

Okay. I'm going to take you on a tour of some of the work we do in the Centre and how it applies to trying to improve the quality of evidence for better health care. And while I'm talking, if you're online — anybody in between emails — you'll be in trouble, because I might ask you a question. But you can go to evidencelive.org, see the manifesto there and follow what I'm talking about. You can look at the manifesto now or afterwards, and it's all freely available: even though the BMJ version is behind a paywall, we published it in full on the Evidence Live site.

So here are a couple of papers. When I put a paper up like this, I'm basically saying: go and read it, because I've read it and it's really useful. This one is by Doug Altman, a 1994 editorial called "The Scandal of Poor Medical Research". In it he says there are huge shortcomings in the way that evidence-based medicine operates today — and that's 1994. And here's a second paper that I was involved with. Trish Greenhalgh wrote this paper, called "Evidence Based Medicine: A Movement in Crisis?". It argues there are structural problems with how evidence is used, how it's applied and its influence on health care, and that this is actually distorting health care in many ways, much of it detrimental to patients.

And then finally I published this, which I'm going to come back to, with a chap called Ben Goldacre, whom some of you might have heard of. It says that, increasingly, the public is aware of the shortcomings in evidence-based medicine. As doctors and clinical researchers we've known for many years that this is going on; the problem is that the public is now finding out, and once the public finds out, we have to do something about it.

So, just coming back to the Altman paper — it's a really interesting paper, and this line is what has seen it cited thousands of times: we need less research, better research, and research done for the right reasons. That paper was written in 1994. And if you go to PubMed and run a search filter, you can look at what's happened since.
When that paper was published, in about 1994 or 1995, there were around ten thousand clinical trials — randomised controlled trials — a year. Look at the increase: it reached 32,000, and it's now about 38,000. So it's going up, and you could say, well, there's been a threefold increase in randomised trials, everything's okay. However, there's also been a massive increase in observational research, nearly a fourfold increase; that number is now more than 400,000 articles per year. So there are 30,000-plus randomised trials, more than 400,000 observational studies, and about 1.3 million articles per year in total on PubMed.

Amongst all that, there must be something that's making a difference to patient care — otherwise what's the purpose of it all? What's going wrong? There must be something in there that clinicians can pick up tomorrow and say: I can apply this to patient care. Now, I don't know if there are any clinicians in the room — there are a few. Do you feel like when you turn up to work tomorrow you'll go, oh, here's all the wonderful evidence that will help me do my job better? Wry smiles around the room. In fact, what's happening is that we're completely lost in all of this evidence, and it's even harder now than it has ever been to tell the good from the bad.

So I'm really interested in that. Take the 32,000 trials and suppose that, on average, they have no effect on clinical practice. So all the research we spend billions of pounds on globally — hundreds of billions; it's probably about a hundred billion — on average makes no difference to clinical practice. Now, if you have a bunch of researchers in the room, they start to go: what do you mean? Surely everything I do is beneficial? What do you think? So if on average it makes no difference — some of it is harmful, a little bit of it is beneficial — and you assume a normal distribution around no effect, you can work out how much falls in the beneficial tail. All of you on the course, all of you statisticians and researchers, would go: oh, that's about two and a half percent. And if it's two and a half percent, then there should be about 792 trials that impact on clinical practice every year — just by chance, remember, given that we're saying on average there's no effect. You'd hope it's actually skewed towards benefit, so that the number would be higher; but let's stay with, on average, no effect.
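To make that back-of-envelope arithmetic explicit — a rough sketch using only the round figures from the talk (roughly 32,000 randomised trials a year and a 2.5% beneficial tail):

\[
0.025 \times 32{,}000 \;\approx\; 800 \ \text{trials per year}
\]

The quoted figure of 792 corresponds to about 31,700 trials a year, consistent with the 32,000 above — in other words, a couple of trials every day that would look beneficial by chance alone, even if the true average effect were zero.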
However, when you look at new treatments, what we find in the published literature is that they're only slightly superior to established treatments when tested in trials, and this small incremental benefit has stayed pretty stable over the last 50 years. We only make small differences now. Everybody in the innovation world would like you to believe that round the corner is a brand new treatment that's going to make a difference to your life. And if slightly more than half of new treatments prove to be better, and slightly less than half prove not to be, then we should expect loads of things every day making a difference to practice. If slightly more than half are better, and that's somewhere between 1,500 and 2,000 trials per year, that means there should be about four or five things each day that clinicians should be doing differently — things that should improve practice. That's not happening. The reason it's not happening is that the quality of the evidence, and of what we report, has structural problems.

So the question you want to ask is: why does so little research translate into actual clinical care and practice? It's minuscule compared with the 1.4 million articles published per year. It's really interesting when you think about it. You can go and talk to clinicians: new trainees will tell you everything's new, but if you talk to wily old clinicians, some of them will say, do you know what, I've not done very much differently for the last ten years. I know the managers are trying to make me do lots of different things, but I actually do pretty much the same, apart from, on occasion, one or two treatments that come through.

Now, traditionally we look at research translation in three areas — external validity, internal validity and clinical significance — and this is the sort of structure we teach. External validity: trials have problems with external validity because they don't apply to the populations we see in practice; there are systematic differences between trial populations and the general population. Internal validity: there are biases in the trials that prevent us from applying the research in practice. And third, there's statistical significance — there's loads of that — but actually there's a problem with clinical significance.

Now, it's not only me who thinks there's a problem.
In 2015, Sally Davies, the Chief Medical Officer, released a statement saying we need a review to restore public trust in the safety and effectiveness of medicines, because patients see doctors over-medicating and clinical scientists afflicted by conflicts of interest. And she based it on two really important issues: cholesterol-lowering treatment and antivirals. She based it on antivirals for the following reasons.

Now, this is work I've been involved in — the systematic reviews of neuraminidase inhibitors in the Cochrane Library. I've worked with Tom Jefferson and colleagues, we looked into this, and there are real problems with tracking down the data. So let me take you back in time. In 2003, the Archives of Internal Medicine published a systematic review, and that review says oseltamivir — which is Tamiflu — reduces your risk of lower respiratory tract complications, antibiotic use and hospitalisation in both healthy and at-risk adults. It doesn't do that.

That review included ten randomised trials, and one of them was attributed to this chap, Professor John Treanor. The only publication that exists for the largest ever single trial of Tamiflu is a 300-word abstract, with John Treanor cited as the first author — and you can see what he said when we tried to track down the data. He basically said: well, I don't even remember participating in this trial; I don't know — you should speak to the manufacturer. You can read it in a piece by Deborah Cohen in 2009. A 300-word abstract, for a trial with over a thousand people included, was used to inform what we do now. What we found out was that 60% of the data was never published, never in the public domain. Yet we stockpiled this drug and spent nearly £500 million.

And this is quite an influential finding, because it also tells you about what's been called "below the waterline". Does this work? It didn't want to do that... here we go. Above the waterline, all you see is journal publications and conference papers. Below the waterline is a whole set of data — case report forms, clinical study reports, electronic patient-level data — that exists around a trial and is available if you go asking for it. So we went asking for it, to try to work out what was going on, and in 2013 they agreed to release all of the trial data on Tamiflu.

Now, remember the one I showed you — the trial called M76001, with its 300-word abstract.
Is that what you base your decision to stockpile on, or do you base it on the clinical study report? We actually have 9,809 pages of data on that trial, its benefits and harms. I hadn't realised that all of this information and all this evidence existed, and that what we were doing was basing our decision-making on very short abstracts.

It led me, in 2014, to come out with a position saying millions had been wasted on Tamiflu. That's my statement: no better than paracetamol — in fact, worse than paracetamol. And I thought that was it; we would change practice and policy on the basis of this. In fact, when I went to the Health Select Committee and put the arguments, a week later we spent another £50 million on the basis that the stockpile was going out of date. And we still stockpile this drug today, despite its lack of effectiveness and all of these issues.

So, setting that aside, what was going on there led me to a position where I realised there is a whole set of reasons why we need better evidence. I could find 20 reasons underpinning the need for better evidence — these are the 20, and they're all in the published literature. And when I looked at Tamiflu, I realised the Tamiflu literature was exhibiting quite a number of these features: ghost authorship, conflicts of interest, under-reporting of harms — you can go on. They were all present in the evidence base and are still present today. So if you want to make good decisions and you've got all these problems and flaws, it becomes very difficult to pass the evidence on to clinicians and then make a clear decision.

At the same time, going back to 2015, there was the CMO review of statins and Tamiflu I told you about, which is ultimately where Ben Goldacre and I wrote that editorial. And all the time, we realised, we were talking about problems that were biases. A lot of these things are just introducing bias, all the time, into what we do, and there's a lot of it — a tsunami of bias in research, and not just financial conflicts of interest, academic conflicts of interest as well. I meet a lot of researchers who think research is about using research to back up what they already think is the correct thing to do. What we should do is start from the null hypothesis: this doesn't work. But much of what we research starts from the alternative hypothesis: this works, and the point of the research is to help me show it. That's not how it should work.
So another paper you should read: "Why randomised controlled trials fail but needn't", by a very important person called David Sackett. If you're interested in evidence-based medicine and you're an EBM nerd like me, you should read everything Dave published — it'll take you about 20 years, because he published a lot. But he published this and said it was so simple; when I read it, I thought, oh, it's a bit complicated. He said that confidence in a result is proportional to the signal divided by the noise, times the square root of the sample size. Now bear with me. I've tried to play with this: confidence in the balance of benefits and harms is directly proportional to the effect size and inversely proportional to the amount of bias. And you end up with a very simple way of thinking about it. When you think about patient benefit, the bigger the outcome — the bigger the size of the effect — the more likely you are to have patient benefit. Is that a fair comment? However, the bigger the bias — the more bias that's present — the less likely you are to have patient benefit. Got that? And then, if you have the optimal information size — not more data, the right amount of data; that's why we do systematic reviews, to get the optimal information size — you're more confident about patient benefit. That helps me think about what it is we're trying to do: optimise the outcomes, minimise the bias and achieve the optimal information size. Quite simple really, when you think about it.

And so we've looked at each of these. I'll take one example: optimising outcomes. Well, look, loads of people have written about outcomes — there are loads of papers that tell you about the problems with outcomes. There's one on selective reporting bias of harm outcomes, with study findings from a cohort of systematic reviews, which tells you that loads of harms are never published. So there are loads of papers. And one of the jobs we do — and your job too — when you see that much information, is to try to provide structure to it. And here's my latest technology for providing structure. All you tech-savvy people: this is how I do it, with Post-it notes on my wall, and then I edit. So you can see it said "why most outcomes do not translate into clinical benefits", and then it got edited into "why clinical trial outcomes often fail to translate into clinical benefits". And then you have a paper. That's how you do it.
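Just to write Sackett's rule of thumb, and the adaptation of it above, in symbols — a rough sketch rather than a formal derivation, with n standing for the sample size:

\[
\text{confidence in a result} \;\propto\; \frac{\text{signal}}{\text{noise}} \times \sqrt{n},
\qquad
\text{confidence in patient benefit} \;\propto\; \frac{\text{effect size}}{\text{bias}}
\ \ \text{(given the optimal information size).}
\]

So, back to the paper on outcomes.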
And in the paper we provide a structure to the problem with outcomes, based on our thinking about why outcomes are important and what's going wrong with them. The problems fall into four areas: the design, the methods, the publication and the interpretation. In the design phase, outcomes are badly chosen — I could spend the rest of this lecture talking about surrogates, composite outcomes, subjective outcomes, complex scales and lack of relevance. On methods, I could talk about selective reporting, or about interpretation. The interpretation one is unbelievable, isn't it? I spent this morning with a group of medical students who, three years into their medical education, did not understand the difference between relative and absolute effects. They're in year four and nobody has even bothered to teach them this and how it's used. Outrageous.

But coming back: I could have talked to you about publication bias and reporting bias, because that's what happened with Tamiflu — 60% of the data was withheld. Instead I'm going to give you an example on switched outcomes. The sort of stuff we do in our centre is to try to fix the world in a completely mad way: with people like Ben Goldacre and a bunch of medical students, we set about trying to track switched outcomes in clinical trials. To date there have been 27 systematic reviews of cohorts of trials, and what they've shown is that in about a third of trials there are discrepancies between the pre-specified and the reported primary outcome. You would think that in a randomised controlled trial you set out an outcome in a protocol and then report what you said you would; a third of the time, that doesn't happen. And about one in ten times, people introduce outcomes that they never even thought about in the first place — they just put them in anyhow. Now, those of us here who do a bit of research and understand it might go: well, that's nonsense. If you talk to a room full of the public, they go: that's nonsense — how can you get away with that? What's the purpose of trial registries and protocols if a third of the time you can just make it up?

That's what's happening. So this is what we did: we went to trials in the top five medical journals over six weeks and we looked at the pre-specified versus the reported outcomes. But we also did something else. We thought science is supposed to be self-correcting — so if you point out the error, you send them a letter, they'll correct the science, and then we'll have a journal publication with zero problems. This is actually quite difficult to do: most journals have rules, and if you're going to get a letter published you have to do it within two or four weeks.
They have word limits, and certain things you have to do and achieve. So we did it, and this is what we found. We checked 67 trials: nine were perfect, 454 outcomes were not reported and 357 were added. That's a lot of outcomes, isn't it, not reported or added? We sent 58 letters — because nine were perfect — and 18 were published, eight were unpublished after four weeks and are still unpublished, and 32 letters were rejected by the editor of the journal. These are journals you'll recognise; everybody wants to get published in them — this is what makes your career. And here's what happened: JAMA rejected all our letters. They said they were too vague and that there was repetition between letters — we only had 250 words, and we had to be very clear: "you reported additional outcomes", "this trial [insert title] reported...", because we had a standard template. So they just said, we're not accepting that.

The Annals of Internal Medicine were even better. They responded and said: we rely primarily on the protocol for details about pre-specified primary and secondary outcomes. They even wrote and published an editorial about how crap we were, and said our methods were crap.

So what do we do when we get something like that? What do you think we did? What would you do at this point — go away, sit down, forget about it, project over? What would you do? Try to get more data? What data would you specifically look for? Just do the same exercise again and improve it? Now, what we actually did is we said: can we get the protocols? Because we had used the trial registry — so you're not far off. We went looking for the protocols, because they said "you used the registry entry". Which is mandatory by law, by the way: after one year you have to report all the data on the registry, and we used both the primary and secondary outcomes as measured on the registry at the time of registration, so it's supposed to be correct by law. They said: we don't use the registry, we use the protocol. So we went for the protocols. We couldn't find any of them — none were available. We had a statement from Dr Everson. So what do you think we did? We emailed Dr Everson, and this is what he said: I regret to say these protocols are confidential documents, but we would be able to send you the originals if you sign a confidentiality agreement. That's outrageous, isn't it?

And then we went to my favourite journal in all the world, the New England Journal of Medicine, and they basically said: you can get lost. How can there be space constraints in an online journal? And then they said this: any interested reader can compare the published article, the trial registration and the protocol, and review the discrepancies. If you have four hours available and some friends who know what they're doing, you can do that too. It's very difficult to do, and it's utter nonsense.
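As an aside — and purely as an illustration, not anything taken from the study itself — the comparison being described here (registry-prespecified outcomes versus outcomes reported in the paper) is simple to express in code once you actually have both lists. The trial names and outcomes below are invented for the example:

```python
# A minimal sketch of the outcome-switching check described above.
# Trial names and outcome lists are hypothetical, purely for illustration.

prespecified = {
    "trial_A": {"all-cause mortality at 12 months", "hospital admission"},
    "trial_B": {"pain score at 6 weeks"},
}
reported = {
    "trial_A": {"all-cause mortality at 12 months", "quality of life"},  # one dropped, one added
    "trial_B": {"pain score at 6 weeks"},                                # perfect match
}

for trial, pre in prespecified.items():
    rep = reported[trial]
    not_reported = pre - rep      # pre-specified outcomes that never appear in the paper
    added = rep - pre             # outcomes reported without having been pre-specified
    status = "perfect" if not (not_reported or added) else "discrepant"
    print(f"{trial}: {status}; not reported: {sorted(not_reported)}; added: {sorted(added)}")
```

The hard part, of course, is not this comparison — it's getting hold of a reliable list of prespecified outcomes in the first place, which is exactly where the registries and the unavailable protocols come in.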
So why does all this matter? Coming back to our thought process: another great paper, by this chap John Ioannidis. This guy publishes so widely that even I can't keep up to date with him, but a few of his papers are real nuggets, and this one is very interesting because in it he says this — and it's where I started: most effects are small. Remember I said patient benefit is proportional to the size of the effect. He backs that up, because he's looked at it and says: if you introduce even minimal bias, you'll overwhelm the outcome — you'll nullify the effect. So he's basically saying you have to be cautious about introducing any element of bias. Our job is to eradicate bias, and then if we see a small effect, it could be a really important effect; we may want to put it into health care and inform people. But if you've got even minimal bias, you're in trouble.

And that's what led to this, the manifesto. In doing that, as part of Evidence Live, I spent a year on the road talking to people about the issues. What do they think the biases are? What are the issues in the evidence base? What should we think about correcting? And if we corrected it all and we all got to work, we'd be in a position where we'd have some high quality evidence. We published that in 2017, and here it is — if you can't read it, it's available on the website, as I said.

Here are the nine recommendations. Each one is specifically designed as a task, so people could look at it and go: hmm, in my organisation, while I'm doing all this stuff that everybody wants me to do for the REF and an impact factor, I could actually get to work and do something useful. So: expand the role of patients, health professionals and policymakers in research. That's quite an obvious one, but actually it's quite difficult to do really well. I've picked out these three because they're what we're focusing on at the moment.

We're focusing on increasing the systematic use of existing evidence. That's really important at the moment. You might see things like: we're going to be doing social prescribing in primary care.
In fact, I got up this morning and realised we're going to be doing group consultations. People in policy pick these things up and say they're a good idea, and they'll often go: right, what's the existing evidence, where does it exist, and how might I use it to inform what I do? So that's why we place such importance on doing systematic reviews — though for policy we may have to take a much more rapid approach, to make sure there's at least a little bit of evidence behind the design of services.

Next: make research evidence relevant, replicable and accessible to end users. I think that's really important, and it covers a lot of things. If it's relevant, that means it has to be designed so that the outcomes matter at the end of the day — not outcomes that are useless. You have to be able to replicate it in some way; we have to do it more than once. At the moment we've got a problem: you might see one single large clinical trial for a new drug, and that's it. There is not enough money now for publicly funded trials of many interventions, because it's not seen as worthwhile. We have to replicate far more of the trials that are happening, in different settings, and then we have to make the evidence accessible and available, don't we?

And then the third: reduce questionable research practices, bias and conflicts of interest. That's what I'm really interested in; I think it's really fascinating. Questionable research practices are endemic in university organisations. In fact, universities train all the people who then go into industry, and we basically train them to be, in some ways, pretty poor at what they do — because academic organisations fare much worse on publication bias than the pharmaceutical industry. If you go online and look at one of the products of the EBM DataLab, the TrialsTracker — and at a paper that came out a couple of weeks ago on the European Union trials register — it will show you that academic organisations fail to publish their results on a registry within one year of completion, which is the law, far more often than industry, because industry now sees it as a quality marker, an easy one they can fix. But academic organisations — we are terrible. In this institution we do it about two thirds of the time; in many institutions it's less than a third. So somebody funds the research, an ethics committee approves it, you do the trial — you give people a treatment and a placebo — and then after one year you fail to make the results freely available on a registry accessible to all. Which is the law, by the way, as of this year.
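The TrialsTracker-style check being described boils down to a very simple calculation. As a minimal sketch — the trial records here are invented for illustration, not real registry data:

```python
from datetime import date

# Hypothetical trial records: completion date and the date results were posted (None = never posted).
trials = [
    {"sponsor": "academic", "completed": date(2016, 3, 1),  "results_posted": None},
    {"sponsor": "academic", "completed": date(2016, 5, 10), "results_posted": date(2018, 1, 2)},
    {"sponsor": "industry", "completed": date(2016, 4, 20), "results_posted": date(2016, 11, 5)},
    {"sponsor": "industry", "completed": date(2016, 7, 1),  "results_posted": date(2017, 5, 30)},
]

def reported_within_a_year(trial):
    posted = trial["results_posted"]
    return posted is not None and (posted - trial["completed"]).days <= 365

# Share of trials reporting within 12 months, broken down by sponsor type.
for sponsor in ("academic", "industry"):
    group = [t for t in trials if t["sponsor"] == sponsor]
    ok = sum(reported_within_a_year(t) for t in group)
    print(f"{sponsor}: {ok}/{len(group)} reported within 12 months")
```

The arithmetic is trivial; what the real trackers add is the registry data itself, kept up to date and matched to sponsors.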
And then, conflicts of interest. This was at Evidence Live — I don't know if anybody knows this lady, Margaret McCartney. Amazing, right? A GP, an activist, someone who can make your life a nightmare when she chooses to, and a great speaker. And she said this at Evidence Live: we have to do something about conflicts of interest. She's really worried that in primary care it's not only in research, it's in clinical practice now: people are involved in developing apps and all sorts of interventions, then getting federations to take them up with no evidence, and they're not even declaring that they have commercial conflicts of interest. And I think conflict of interest is a huge problem. In the next couple of days there's going to be material coming out about transvaginal mesh, which shows you the huge problem of conflicts of interest in surgery — a real problem in how we operate.

So we have decided, with next year in mind — by the time we get to Evidence Live next year — that we're going to try to sort out the mess of conflicts of interest, so we can get to a statement that makes sense. Because I think some conflicts of interest are all right. You have to accept that people involved in discovery will have conflicts of interest; it's okay, if you're involved in discovery, to take money from industry to fund the research, to discover things. I think that's okay — we can separate that out. The problem comes when people involved in discovery say: I want to be involved in effectiveness as well, can I cross the line? I want to do the systematic review, and be on the guideline, and tell people why this is amazing — and not tell you that I've also got a financial interest in the outcome at the end of the day. To me, that presents a problem, because by the time you've been involved in discovery, you're in a position where you're academically and financially invested in the outcome. And so we've got this problem that is, I think, now so entrenched that we can't see a way out of it.

However — what did I say? One of the things we do is try to solve things: stir things up, cause a bit of trouble, and see if we can make things shift a little. So we started. She sent me an email saying: I'd like to write to Sarah Wollaston, who chairs the Health Select Committee, pointing out this issue with conflicts of interest. I wrote back saying: let's get to work, then.
And in the evening we constructed this email. It said: Dear Sarah, we think there's a huge problem with conflicts of interest and the government should take it up. And we think the GMC actually has the register for it: as a doctor, every year I have to give them money — I don't know why I give them money, but I give it — and they put me on a specialist register. You can look up my number — it's 4731643 — and you can see that I'm allowed to practise as a GP. So if you come and see me as a GP and you think, is he really a doctor — you might say, God, he can't be — you can go online and I'm there: I've paid my money, I'm on the specialist register, I've been appraised. However, what you can't see is who's paying me. People have tried to do this in various ways — we do it in journal publications — but it's all a bit of a mess. Why, at least for doctors, couldn't we start with the GMC register and say: every year you have to declare who has paid you in the last year? Over time you'd be able to see where I stand. You could look back over ten years and say: he was being paid by GSK, or he was being paid by Roche, or whoever it was. That makes sense, because you would want to know. If you were having a surgical procedure, wouldn't you want to know whether your surgeon was being paid by Johnson & Johnson? Shouldn't you have the right to know that?

So we wrote to her, and only last week she sent us an email back: thank you for your email about doctors' declarations of interest; because of the dissolution of Parliament I am sending a copy of my letter, and I will aim to ensure the committee keeps you up to date when I receive a response. She sent a letter to the GMC saying: I want to know whether you will sort this out and, if not, why — because otherwise we will need primary legislation. Either you're going to do it, or we'll have to do it.

And the purpose of telling you that is to say: if you start to take small steps — thinking about the quality of research, thinking about these issues — you start to think not just about the problems but about how you might fix them. That was the editorial where Ben and I said, let's start fixing it. For 20 years we've been talking about the problems; in 2015 we started to think about solutions. And if we can solve some of these structural issues in the evidence base, we'll be in a much better position, when we produce evidence, to give it to clinicians who can share it with patients. At the moment, there are too many problems.
So I want to finish with this: the poor quality of medical research is widely acknowledged, yet, disturbingly, the leaders of the medical profession seem only minimally concerned about the problem and make no apparent efforts to find a solution. What's interesting, when you look back in time, is that it was already there twenty-odd years ago. People like Doug Altman — who unfortunately died earlier this year — were telling you what you should do. Just before he died I met him in the pub, and he said — and this is my last slide — we have to get universities to take this seriously. At the moment it is a bit of a joke: it depends on who you are mentored by, and which group you are in will give you a different view of what quality means. I know that for a lot of groups in this university it doesn't really register, but we're probably way ahead of the curve compared with many other universities. So I think that's a very important statement in terms of what I should be doing — and what you should be thinking about doing next. Thank you very much.