Hi. Welcome, everybody. Just to say who I am: some of you I know, some of you know me. My name's Carl Heneghan. I'm director of the Centre for Evidence Based Medicine and also professor of evidence based medicine here at the University of Oxford. I'm also a GP, a practising GP, and over the weekend, for instance, I was working in the urgent care setting. But what I'm going to talk to you about today is the distillation of my 20 years of thinking in evidence based practice and where we are now, to a group of us who are really interested in solving some of the issues around evidence based medicine, particularly the E in EBM.

So the concept is that I want some of you to read while I'm talking. I'm going to talk for about 35 minutes about what I think the issues are, and then we're going to have a discussion, and you're all going to help me and help the group solve some of these issues. That's part of the mission today. Just to say, for those on Twitter, the hashtag is #evidencelive.

Okay, let's start. About 20 years ago, a chap called Doug Altman wrote a piece in the BMJ, an editorial called "The Scandal of Poor Medical Research." He basically outlined that there were huge, significant problems at the time in how research was published and how it was disseminated, and this was a scandal because much of research was going to waste.

The problem is, over the 20 years since, there have been so many initiatives in different areas that have tried to solve this problem, and they tend to be isolated and not work together. Some of that might be reporting standards, for instance, which are things like CONSORT, which try to bring together a system to improve the reporting of research. But one of the problems is that it's all been a bit piecemeal in what it's done to date: there have been little sporadic issues and little sporadic changes.

There's also a long history of people trying to fix these problems, but we want to pull them together through a clear set of achievable goals and a stronger overview of the strategies that work best to help deliver change. Some of this was precipitated by an editorial about two years ago, which Trish Greenhalgh led, on evidence based medicine as a movement in crisis. And just recently Ben Goldacre wrote an editorial about how medicine is broken and how we can fix it. So there has been a 20-year problem, if you like, with the quality of evidence.

Some of you here are in the surgical sciences and some of you are on the MSc in EBM.
But remember the definition of evidence based medicine: the application of the best available research evidence to clinical practice, integrating the values of patients. That's one of the important concepts, isn't it: the best available research evidence, with your clinical expertise and the patient's values. That still holds true. When anybody comes to me and says there's a problem with evidence based medicine, my first thing is to say there is not a problem with evidence based medicine as an ideology, as what we're trying to achieve. There's a fundamental issue with the quality of the evidence that goes into our decision making. And over the next ten or twenty minutes I'm going to show you that I think it's actually got worse: since that editorial, things are actually worse than they were 20 years ago in lots of areas.

So, those of you who are online at the moment, if you want, you can read this manifesto while I'm talking, because it's gone up today on the Evidence Live site. It's evidencelive.org/manifesto. If you just go to evidencelive.org, you'll be able to click through on the picture which says EBM 2.0 manifesto. So if you're online right now, feel free to read it, because at some point we're all going to read it: I'm going to bring it up so you can think about some of the issues while I'm talking.

All right. So, this piece of work: why do we need better evidence? I'm going to take you on a little journey, of about five minutes, through one of the most important pieces of work I've been involved in, which was on Tamiflu and the use of antivirals in influenza.

In 2009 I was asked, actually by my sister, about antivirals for my children. She'd gone on the NHS phone line and been told to drive to Northampton to get some antivirals for the children and to give them, because that was important in preventing complications in the swine flu outbreak. You'll remember 2009, when you were all going to die. Yeah, okay, you shouldn't really be here; you should have all died by now. But remember, this is why we need better evidence. At the time I went: oh, actually, I don't know the answer to that.

And we decided to do the systematic review. We did the systematic review in children, which showed a small benefit in symptom reduction, mainly fever, about equivalent to what you'd get if you used paracetamol or one of the anti-inflammatory drugs. There was no effect on any secondary complications whatsoever, and there were quite serious harms.
In effect, one in 20 children would have serious vomiting that might precipitate another visit to the doctor. So I couldn't quite work out why we were using it, why we were stockpiling it. And I thought we would produce some of the evidence to show how this should change practice.

At the same time, the adult team, led by Tom Jefferson, were looking at the adult review. One of the key issues with the adult review was that they hit a problem when they tried to find some of the evidence. What happened is that much of the evidence for adults had never been published, and they started to try to contact some of the authors and track down the data. That's when we combined our forces, thinking: this is a really important public health drug. We've spent millions on this drug and it has been promoted on the WHO essential medicines list. Surely the evidence should be available.

As part of that process, one of the first people they went to was Professor Fred Hayden. You can see here, at the bottom, one of the biggest ever meta-analyses: this meta-analysis, done by a chap called Laurent Kaiser, with Frederick Hayden as the corresponding author, pooled ten trials. But only two of those trials were ever published in a journal; eight were unpublished. So we went and asked: could we have that data?

So you send an email. And his response was: "I cannot find the original files related to this 2003 publication. The questions are not clear to me." They were very clear to me: could we have the data, please? "But if the original data and probably study reports are required, they will likely need to come from Roche, the sponsor of the studies."

So we moved on. We then went to Professor John Treanor, who was actually the lead author on the largest single trial ever done on this drug, which was only ever published as a 300-word abstract at a conference in America; it's called M76001. When asked about this trial that he had supposedly presented, he couldn't actually remember it; as far as he could remember, the trial published in JAMA was the only large study of oseltamivir, of Tamiflu, he'd been involved in. We went to the media, and Channel 4 News put it to Roche that Professor Treanor said he didn't actually participate in this study, M76001, and doesn't remember presenting it at the meeting. He actually wasn't there, didn't present it, didn't participate in the study. Yet his name's on it. So, you know, you start to get worried about what's going on in the research.
So we did go to one of the people who'd published one of these studies, a chap called Professor Nicholson. He was actually on one of the two studies that were published. And his response was that he did not recall seeing the primary data. He said that the statistical analysis had been conducted by Roche and he had analysed the summary data. So the lead author on a trial had never actually even looked at the data. He couldn't verify whether it existed; he couldn't even vouch for whether it was correct or not.

Now, with all of that, anybody in the room, anybody out there, should start to have alarm bells going off. Because when you tell patients this, they get really worried. But somehow in research we start to feel that this is just the way it's done, this is how we do it, and, you know, we have to keep it quiet from everybody. But when you start to tell people, boy, do they get worried.

At this point, what did that mean? It meant that 60% of the data was never published. And the FDA talked about how it's not only what is published: there is this whole system of research where what they call "above the waterline" is the journal publications, and "below the waterline" is all the unpublished data: clinical study reports, individual patient data, adverse events. All of that exists, but nobody ever gets to look at it for important decision making.

Okay. So what did that do? We had a breakthrough: in 2013 Roche agreed to release all the trial data on Tamiflu. We started to make progress and move forward. But look, this is the difference. Look at the difference. This is M76001, which I showed you as the 300-word abstract. Yep. And this is the exact amount of data that exists behind that abstract: 9,809 pages in the clinical study report.

See the difference now? Which one do you think will provide adequate information to even do a critical appraisal, never mind base World Health Organisation decisions on? Because that's the one that was used by everybody until 2014, when we did the analysis: used by NICE, used by the FDA, by all these bodies around the world to make decisions. The largest ever trial: a 300-word abstract versus 9,809 pages. So you get the idea: there's a fundamental structural problem in how evidence is produced, disseminated and reported.

I have to say, having worked on this, it's almost as if, when you work in an area for long enough, you get to a point where you just cannot believe how bad the system is and how much we tolerate it. But I think we tolerate it because we never really get under the bonnet. We never quite know what's in the engine of our car; we just drive it until it goes wrong.
And when you think about it, that's what we do, until we start looking under the bonnet and think: wow, this is a really old engine, it doesn't work that well.

And, you know, by April 2014 the newspapers were reporting that we were wasting millions. In the UK we had spent £473 million on the drug; in fact, we've now spent over half a billion, and we still stockpile it. I can come back to the issues around that at the end. But that's one of the bits, and I'm going to show you more brief examples, of my own personal journey, when I got to the point where I said: enough is enough. This is actually unacceptable. And this has been a recurring theme in many areas of evidence and its use in evidence based medicine.

So one of the things we've done: if you go on the manifesto, you'll see I outline 20 reasons why we need better evidence, and each one of them you can search. If you go through on the website, you can pull out relevant publications and relevant systematic reviews that show you these are real issues. And when I was doing that, I thought: well, how many of these 20 problems do the Tamiflu trials exhibit? You can keep going through, and they exhibit numerous problems: publication bias, poor quality research, reporting bias, the ghost authorship I showed you, lack of shared decision making, regulatory failings, surrogate outcomes, too much medicine, the prohibitive cost of drug trials to replicate them. So actually they have a significant number of the fundamental problems that everybody has shown are a real issue. And yet, irrespective of all those bits in yellow, governments still stockpile the drug. So there's a problem. If you had a trial, or a series of trials, with that many problems, surely you would say the results are not valid, irrespective of what they show. You would get to a point where you go: there's something wrong with the evidence.

So that's my own journey in terms of Tamiflu. Equally worrying, however, is that the growth in the volume of evidence has been accompanied by this corrosion in the quality of evidence. And it has compromised, as I said, our ability to provide affordable, effective and high value care. That's what led to this version of the manifesto, and I'll explain in a minute how we get to version 2.0.

Okay. Here are some other examples that have influenced me along the way.
When I started medicine, going back to 1999, 2000, there was a particular drug called Vioxx that was marketed as one of the most effective pain relievers on the market, and it had added benefits over other anti-inflammatories because of a reduction in GI side effects. Very quickly that drug became one of the leading blockbusters in the world, with sales in the billions of dollars a year.

However, only when it got to litigation did we see that the company, who knew about this, had withheld data, and that they had only used the on-treatment analysis, which minimised the appearance of any mortality risk with the treatment. That means they only reported the effects in the people who were still on the treatment. Well, the people who were having heart attacks were stopping treatment and were being removed from the analysis. And that is what was reported, not only to the public and the journals, but to the FDA. And that minimises the apparent risk. Only when the intention-to-treat analysis came to light did people suddenly discover that this drug tripled your risk of heart attack. It subsequently got withdrawn, but they had withheld that.

The editorial at the time from some of the leading editors said it was not only the drug companies: part of the problem was that many of the doctors involved in the trials were complicit in this failing, if you like, and they were also to blame.
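To make that on-treatment versus intention-to-treat point concrete, here is a minimal sketch in Python. The numbers are invented for illustration, not taken from any Vioxx trial, and it simplifies real on-treatment analyses; it just shows how dropping events that occur after patients stop the drug can shrink an apparent tripling of risk into something far less alarming.

    # Illustrative sketch only: hypothetical numbers, not trial data.
    # Intention-to-treat (ITT) keeps every randomised patient and every event;
    # an "on-treatment" analysis drops events that happen after patients stop
    # the drug, for example because an early cardiac problem made them stop.

    drug_arm    = {"randomised": 1000, "events_on_drug": 6, "events_after_stopping": 9}
    control_arm = {"randomised": 1000, "events_on_drug": 4, "events_after_stopping": 1}

    def risk_itt(arm):
        # All randomised patients, all events.
        return (arm["events_on_drug"] + arm["events_after_stopping"]) / arm["randomised"]

    def risk_on_treatment(arm):
        # Events after stopping are simply not counted (a deliberate simplification).
        return arm["events_on_drug"] / arm["randomised"]

    print("ITT relative risk:          %.1f" % (risk_itt(drug_arm) / risk_itt(control_arm)))
    print("On-treatment relative risk: %.1f" % (risk_on_treatment(drug_arm) / risk_on_treatment(control_arm)))

With these made-up numbers the intention-to-treat ratio comes out around 3.0 while the on-treatment ratio is about 1.5, which is the essence of why the choice of analysis population can hide harm.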
Here's a second example. Has anybody here heard of Study 329? Anybody in the room? David's right: here's your homework for this week. It's even got its own Wikipedia page. Study 329 is now quite famous, and for the EBM nerds out there, by the end of the week you will qualify as a nerd if you know everything about it. Study 329 relates to a drug called paroxetine, which is an antidepressant, one of the SSRIs, a newer type of antidepressant, and it was marketed as relevant to adolescents.

Again, the company, when they did the trial, minimised the harms in adolescents, in an effect called reporting bias: some of the events where children had suicidal events were put down to other outcomes and removed from the analysis. It was only in litigation against the company that it was discovered that the rates of suicidal events in adolescents were twice what was reported in the published paper. And in fact a recent reanalysis and restoration of the trial, under what's called Restoring Invisible and Abandoned Trials (RIAT), has shown that it might be even higher. That was in the BMJ only a few months ago: another blockbuster, used in adolescents, that failed and actually increased rates of suicidal events in adolescents. So it was doing the exact opposite of what it was trying to achieve. But clinicians didn't know what was going on, because there were failings, again, in the evidence. So: homework, Study 329.

One of the things, again, that we started as a consequence of Tamiflu was AllTrials. AllTrials was started because half of all research is never published. At some point you go: if half of the research is not published, we know there's a major structural problem already. By the time you're thinking about how much research makes a difference, well, you know, 50% of what happens is likely never going to be looked at. So AllTrials says that all trials should be registered and reported in full. There have been major breakthroughs, and you can see even the companies now are starting to understand they need to be involved in this. In fact, there's going to be a European Medicines Agency announcement in the next week or so saying they're making all clinical study reports available as part of a new drug application. What AllTrials showed me was that people working together can fix problems. This was a group of which the Centre for EBM was one part, but it included people like Sense About Science, the BMJ and the Cochrane Library, working together to try to make a difference.

It might sound like I'm painting a picture of doom and gloom about the whole world, but actually I'm just painting a picture of my journey through medicine as a doctor and as an evidence based medicine expert, if you like, keeping an eye on what's going on. Here's another example: rosiglitazone, another blockbuster drug, which used a surrogate outcome in diabetes combined with death. It's a drug that lowers your blood sugar, and apparently, because it was in a composite with death, it lowered your risk of death. It never did lower your risk of death. It lowered your blood sugar, but combine the two endpoints and people thought it lowered your risk of death. In fact, what it did do is increase people's risk of heart failure: for every 250 people on the drug, one extra person would have heart failure, because it causes fluid to be retained and it has all sorts of problems. Again, not known at the outset, the harms not reported, not put out there. And what happened to this drug?
It took ten years before it was withdrawn in Europe. In America it is still on the regulatory approval list, but with restrictions on its use.

One of the things I say about keeping up to date with evidence based medicine is that the truth will get there in the end. If you understand the evidence, you'll probably get there a bit quicker than everybody else, because at some point we will understand the truth. But on the journey there, we can harm a lot of people.

So one of the problems is that we want to think about independent trials and how you do them. But the other aspect, while I've been doing this, is that the cost of trials has been escalating hugely at the front end, making them almost impossible to do. You have to do it in a big trials unit, in a massive team. You have all this regulation at the front end, and then at the back end nobody cares what you do. To get the trial off the ground will be immensely difficult for you: you probably need hundreds of thousands of pounds and you have hundreds of forms to fill in. And then when the trial finishes, nobody cares whether you publish, whether you disseminate to the people who participated in the trial, what you do with your results whatsoever. How that can be ethical is beyond me. So there's a real problem: if we want to replicate trial results, how do we lower the cost of trials?

Are there surgeons in the room? There are a few surgeons; I thought there'd be a few in here. Because the other thing is, when it comes to devices (and I'm a world expert on the evidence for devices, by the way), that's because there's hardly ever any evidence to go with many medical devices that are used, particularly newly approved devices, because they're often approved through a process called equivalence. Equivalence means: my device is a bit like yours, so it can have access to the marketplace. And to show this, we made up a whole device and a company and got it approved by a notified body, to the point where they would have said: here we go. Not only did we do it once, we did it twice, with a metal hip. It's very easy to get devices that have no evidence approved. So it's not only drugs: in medical devices the problems are just as bad. However, nobody knows, because often there's no evidence to go with any of the devices.

Here's another one; this is quite recent. And again, this is a drug called rivaroxaban.
The EU approved rivaroxaban for secondary prevention after an acute coronary syndrome; that's the European Medicines Agency. Okay. So in Europe it's the European Medicines Agency and in America it's the FDA: two approval routes. This is what the European Union reported in May 2015, and this is what the FDA decided back in 2012. Approval; not approval. Approval; not approval. So we've got situations where we've got drugs around the world that in America are on the market and in Europe are not, or the other way round: in Europe they're on the market and in America they're not. And in America they've seen huge problems with missing data here, and I've just shown you that we understand that problem already. Yet our regulators are going: yes. There's a real mantra to approve drugs, to get them on the market, to keep business going. How can we have a system in the world where we have Europe saying yes and America saying no, and then not talking to each other? Are they looking at the same evidence or not? I have no idea what's going on, but there's a huge issue to pursue there.

And this is an editorial, which I was discussing with some of you this morning, that we wrote about rivaroxaban, asking where we are now: six years or so into the approval of this drug, around four years for ACS, and it's still under uncertainty, because there are key problems with the trial. In the trial it has been found that, in the control group who were taking warfarin, the machine used to monitor the warfarin arm was defective up to one third of the time. So you're comparing a new intervention to a crap comparator, and that's always been shown to be a problem in determining the effect of an intervention. It's one of those problems that should set alarm bells ringing for regulators: oh my gosh, we can't accept this, we need a new trial. That's not happened; it's been approved.

And here's another example. This is one led by Ben Goldacre with a group of students. It's called COMPare, the Centre for Evidence Based Medicine Outcome Monitoring Project. What we did was take the top five journals and, starting about this time last year, about October last year, say: for every randomised trial published in the top five journals, we are going to check whether the outcomes that were reported were the same as the outcomes that were pre-specified at the outset, in the trial registry or in the protocol that's available. Okay? Everybody happy with that?
Because at the start you put in an ethics application, you put the trial on a registry, you state your outcomes, and then you should get to the end and measure those outcomes. That's good clinical practice: the purpose is to specify your outcomes in advance. Okay. So we tracked 67 trials. Nine were perfect. Okay, that means 58 weren't. Across the 58 that weren't, 354 outcomes, primary and secondary, that were said to be measured were not reported, and 357 new outcomes were silently added.

So do you understand what's happening? You're doing your trial and you're saying: well, let's pull some of the outcomes, and let's put some new ones in their place, and maybe we'll put in new ones that are positive. You won't know that, because you don't go back and check; and we didn't, until we asked this question and went back and checked.

But what we also did was send 58 letters to the journals. And this was a bit of work, actually, because if you look at the journals, and if you think science should be self-correcting, they give you a window of about two weeks to submit a letter; some of them give you three weeks, but after that you can't submit the letter any more. That's all very well, but what if you see the problem in week four? So we met that window. Of the 58 letters, where there was an error and we were pointing out an error, only 18 were published; 40 were rejected out of hand. We've started publishing all of this now. I have to say, the journals in this area, the way they're acting, are not compatible with the scientific method, not compatible with high quality evidence, and there are huge problems in what they do. And this is the sort of simple project that you can do with a team, and it's starting to show that.
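As a toy illustration of the kind of check COMPare did (this is my own sketch, with invented outcome names, not COMPare's actual code or data), comparing the outcomes pre-specified in a registry entry against those reported in the paper is essentially a set comparison:

    # Toy sketch in the spirit of an outcome-switching check; the outcome names are invented.
    prespecified = {"all-cause mortality", "hospital admission", "serious adverse events"}
    reported     = {"all-cause mortality", "symptom score at day 7", "quality of life"}

    not_reported   = prespecified - reported   # pre-specified but never reported
    silently_added = reported - prespecified   # reported but never pre-specified

    print("Pre-specified outcomes not reported:", sorted(not_reported))
    print("New outcomes silently added:        ", sorted(silently_added))

Tallying those two sets across every trial is, in effect, where counts like 354 unreported and 357 silently added outcomes come from; the hard work is in reading the registry entries, protocols and papers carefully enough to build the sets.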
And finally, here's the one that really gets me. Everybody goes on about shared decision making as the way forward: we are here to inform patients. This is a systematic review of shared decision making and patient outcomes, available in Medical Decision Making. It's a well-done systematic review, and that's their conclusion: evidence is lacking for the association between empirical measures of shared decision making and patient behavioural and health outcomes. That's because nobody's investing in the research and nobody's doing it. There are so few trials in this area, it's unbelievable. So we all talk about how we want to do this, but actually nobody's doing it. And so we have a conclusion that there are no tools; there's no way I can advise you what to do. If somebody asked me how to inform my patients better, I'd go: I don't know. I don't know what to do. I'm doing it from opinion, the lowest level of evidence.

Okay. And then, finally: look, things have got so bad that the Academy of Medical Sciences has had to launch a new project on evaluating evidence. They've done this at the request of the Chief Medical Officer in the UK, Professor Dame Sally Davies: "There seems to be a view that doctors over-medicate so it is difficult to trust them, and that clinical scientists are all beset by conflicts of interest from industry funding and are therefore untrustworthy too." You can see what she cited: Tamiflu in particular comes into this, but also the other one, if you've seen the huge issues right now around statins, the cholesterol lowering drugs, which are causing all sorts of controversy. "I have therefore reluctantly" (I don't know why she should be reluctant) "come to the conclusion that we do need an authoritative independent report looking at how society should judge the safety and efficacy of drugs as an intervention."

Well, we come to a different conclusion. Actually, I don't want another ivory tower report that says let's just whitewash it. We've come to the conclusion that everybody interested in evidence based medicine should be contributing to the solution, should be identifying one problem that they're interested in, and going and fixing it, as opposed to just talking about the problem.

So that's where the Evidence Live EBM manifesto 2.0 comes in: the creation of a manifesto that is developed by people engaged at all points in the research ecosystem. That's everybody in this room, I assume, because you're all here and want to engage in fixing the problem, including, above all, patients and the public, who indirectly fund but are directly affected by the outputs of the current system.

Okay, so let me try and bring this up. Here we go, by the magic. So if you go online now, you get to the manifesto; it gives you the preamble of what I've done, and you can jump in. There are nine points at the moment. Okay. I'm just going to take you through them; I'm not going to read them all out, but for each we have specifics on what we want to achieve.

So: informed decisions. We want clinical trials; we want to think about independent clinical trials and reducing the cost of a trial. We want to solve publication and reporting bias. We want to ensure that we've got outcomes that matter to patients.

Evidence synthesis.
We want to understand that these are really important tools in decision making, but they should be informed not just by journal publications; they need fuller evidence underpinning them to inform decision making.

There's a short section here on research methods, but look: research methods are in a mess right now in clinical trials. I was having a chat at the back with Martha about this earlier: we are like the headmasters of clinical trials, in effect, because our job is to ensure that trials are done with rigour and that bias is kept to a minimum. And to do that you have to be really officious, rather like a headmaster, and you have to come and beat people over the head and say: you're doing it wrong. And they go: it's all right, we've always done it like this, what's the problem? Well, if you think about it, the problem is that effect sizes are getting smaller and smaller and there's less room to play with, which is what should happen if you're adding a new drug on top of an existing one. It's really important that these trials are done so well that, even if the effect is small, you can believe it. In fact, often that's not the case.

Policy. We'd like to see policy that's based on evidence and that has a strategy to inform and educate policy makers. It's really important that our policy makers use rigorous and robust evidence to inform what they do, including routine data. Everybody believes routine data is the answer; everybody wants to move away from trials to observational data, because big is better.

There are huge issues with the communication of evidence, whether that's in the media or in how we go from press releases into the journals and what gets in front of the public. We've got to do a better job on that.

And then our final two. Individual accountability: we perceive a real problem with conflicts of interest in all aspects of medicine, in terms of who's paying whom to do what, particularly when it comes to guideline development. Twenty years ago they used to pay individual doctors, and they still do that round the world in places; now they pay them to be on the guideline, pay them to be involved in policy. Huge problem. And then there's a huge problem with the concept of regulatory evidence and how we inform and do that better.

Okay. That's the manifesto, which you all have access to. What I'd like to do, now that I've described the manifesto to you, is to have your feedback at the end. We are going to iterate and develop this manifesto with input from people who give us feedback.
And when you give us feedback, to the people who do, we're going to write at the end and say: would you like to add your name to this manifesto when it gets published in the BMJ? So you'll be credited for having input. But you have to give us feedback, and you have to tell us things like: what changes do we want to achieve? What measurable changes could we achieve? Do you know about actions currently underway that could achieve this?

And when we do that, it's going to be a large collaborative project. We're going to go out and do many talks; we've already set out and talked about where we're going to do them, and we're going to record them and get some of the people who really matter in the room to tell us what they think. But the most important people, the people that matter, are people like you. Because if I sit up here and you go away and think, that was a nice talk, I'm going to go to the pub tonight and forget all about it, we're in trouble. Because if you guys don't think this is important enough to even warrant a comment, I'm in trouble, and what we're trying to do is in trouble. Because this is really important stuff. If we're going to spend the next ten years fixing the evidence, as opposed to talking about all the problems that exist, we need to do it in a collaborative way, and we need more people to start to think about how they are going to contribute.

Now, that's the point where I go over to you, and let you know that at next year's Evidence Live we will have a final version of this manifesto, which will be published in the BMJ, and we will have an action plan. It will be the main focus of discussion: right, how do groups of people contribute and take this forward? How do we achieve some change in the evidence base? Thank you.