Thanks for coming, because it is actually terrible out there, isn't it? It's quite nice in here. Terrible out there. So we're talking about safe and effective drugs. And thanks for coming forward so I can see your faces and you don't fall asleep. And if you do, I might ask you a question, so be careful.

But I want to take you on a bit of a journey. This journey starts for myself in about 2009, but it really starts about 40 years ago. So this is Archie Cochrane, 1979. He was telling us that what we need is a critical summary of all of the evidence, adapted periodically, of all the relevant randomised controlled trials. So if you really want to know about medicine, we need to have all the trials, and we need to get them together on certain topics and understand them.

And generally that led to this important document. This is Effective Care in Pregnancy and Childbirth, which Iain Chalmers and other people brought together on pregnancy and childbirth. It was the forerunner of the Cochrane Library, and it was an attempt to bring together all the evidence, systematically review it and put it into one volume. So this is 1989, the state of the art: a real milestone in the history of the evaluation of care. And it led to the Cochrane Library, because three years later an editorial set out, here's the Cochrane Library.

But at the same time, work appearing in a number of publications (think of Dickersin in Controlled Clinical Trials, and Chalmers himself with some other people in JAMA in 1990) was starting to show a worrying concern: although the results rested on randomised controlled trial evidence, there was a particular issue happening called publication bias. It seemed that some of the trial data was never published, and this was the start of the issues that arose at that time.

So that was 1989. Very quickly, we've gone from an overview of the evidence into a problem with publication bias. Fast forward 20 years or so: in 2013, I became involved in an organisation setting up the AllTrials campaign, which basically said we would like all trials registered and reported in full. A very simple ask, if you like. And it's been signed by 9,750 people (people in this room have signed) and 748 organisations.
So one of the fundamental principles that led to my involvement in AllTrials, all trials and why we want all trials, was the particular research I got involved in on a drug called Tamiflu. Tamiflu was the drug that was used in influenza outbreaks, and it had lots of claims.

Now, to take you back a little bit into the history of Tamiflu: it was evaluated around 2000, and by about 2005 Tamiflu use was minimal, hardly used. In fact, Roche were thinking of giving the licence back to Genentech.

But in 2006, Donald Rumsfeld, the man of the "known unknowns", was part of a report in the US that said: we're going to look into how many deaths would be predicted to be caused by bird flu. You remember all that hysteria about bird flu. So let me ask you a question: how many deaths was bird flu predicted to cause worldwide? Or in the UK? Anything? Hmm. Very little, you'd think. Even at that point in time there had only been about 40 to 50 deaths in the world, so you'd expect the prediction to be very little.

In fact, the number is 150 million. And in the UK, 700,000 deaths were predicted. And at that point we started stockpiling antivirals, because people said: we need to have something that will help us.

And in 2009, at about the same time as we published this review with Matthew Shun-Shin, the report from the UN had come out predicting 150 million deaths. Okay. And that's when we started to procure 14.6 million courses of the drug, to deal with some of the 700,000 deaths that might be expected. That's a lot of deaths, isn't it? That's about one in every 800 people, so roughly one of you in this room should be dead, basically. That's a lot of people, and you think about what that means for planning health care. Okay.

Now, interestingly, fast forward to 2009 and we've moved on from bird flu to swine flu. We survived the bird flu, and now we're into the swine flu problem. And again we've got apocalyptic warnings from officials. This is Liam Donaldson, who was the Chief Medical Officer at the time, saying that 50,000 could be killed in the UK. And then there's the World Health Organisation; this is Margaret Chan saying: we have evidence to suggest we are in the first pandemic of the 21st century. And if you remember, in 2009 there were lots of echoes of 2005. This is a serious crisis. It's an epidemic. It's a pandemic.
And therefore, in this country, what we had were situations where you phoned up a telephone line and, if you had a fever, you would automatically be given Tamiflu, with no doctor's appointment.

Very interesting. We did the systematic review in children, and at the same time there was also another review published. This is a timeline of the history of the Cochrane reviews and systematic reviews on Tamiflu. Here's the children's one, which I'm part of, and this is a chap called Tom Jefferson, who was doing the adult review. And one of the things this led to was a new collaboration, where we worked together to try and get all the evidence and data from children and adults.

This is an evolving story; that was my mind at the time, and all of our thinking at the time. But what had happened in 2003 is that this paper had been published in Archives of Internal Medicine, and it wasn't quite a systematic review. What it said is: we analysed prospectively collected data on lower respiratory tract infections and antibiotic use from 3,500 patients enrolled in ten placebo-controlled, double-blinded trials of oseltamivir. So that was a large dataset at the time.

This analysis included ten randomised controlled trials; those ten randomised controlled trials had been included in the adult Cochrane review, and the data from those trials had been included in the systematic review. So when you look for randomised controlled trials, sometimes you find these pooled analyses, and then you take the data from that overview.

What happened is a chap from Japan submitted a comment to the Cochrane review. Anybody who's done a systematic review with Cochrane knows that if you get a comment you have six months to respond and post a reply; the editorial team come to you and say, tell us what you found, within six months. And his comment was to say: of the ten trials in that pooled analysis, only two had been properly published in journals; the other eight existed only as conference proceedings or data on file and were never published in full. And then he went on to say: well, look, you have publication bias, because the two published trials were the only two positive ones. And he was right. That meant 60% of the data was never published whatsoever.

And that led to a whole affair of trying to track down the data, and we used the media to try and get to it. One of the things we did was write to the authors of the study for the original data, and this is what one of the study authors said (this is to Deborah Cohen, who was working at the BMJ as an investigative journalist). First, he said he did not recall seeing the primary data.
The statistical analysis had been conducted by Roche, and he had analysed the summary data. But just as concerning was the other chap, Fred Hayden, who then went to the chap who was the corresponding author of the original ten-trial analysis and said: I cannot find the original data for the 2003 publication; you may need to go to Roche.

And in the proceedings is one of the largest trials ever done. If you go into the data, it tells you the number of study patients: about 1,400 patients were in this study. And this is the abstract, presented in 2000 in the international abstracts of the annual Infectious Diseases Society of America meeting, about 300 words. Okay: 1,459 patients were enrolled. Yet the chap at the top of the page said he didn't go to the meeting, doesn't even remember the abstract or participating in the writing of the abstract: I cannot recall the meeting whatsoever.

So in 2013, Roche agreed to release all of the trial data on Tamiflu. And in doing that, we had a long contractual back-and-forth about how we would get the data. You think about information governance and all the issues that come with information governance, and eventually they simply put the CDs in the post, unsecured. One day I got all these CDs, and that was it: 170,000 pages of documents.

Which brings us back to M76001. Remember that study I showed you, the 300-word abstract? M76001 is the sort of ID you give a trial before it gets published, before it gets into the public domain, before it ever becomes an actual paper. So for M76001 there is the 300-word abstract, and the clinical study report, if you can see the chart, is 9,809 pages. There's no publication in the middle. So to this date we had been using that abstract, and yet there were 9,809 pages available.

So the first thing is to think: well, let's look at the amount of information and what we might glean from this. Up to this point, and probably until about 2010, it was deemed acceptable to use these sorts of abstracts in constructing systematic reviews.

Now, I could go into the effects of Tamiflu that we found in the systematic review of the clinical study reports. It took us quite a bit of time to evaluate, but what we found is that it reduced the duration of symptoms by about half a day (we already knew that), it had no effect on rates of hospitalisation, which was different because they had claimed effects, and it had no effect on bacterial infections in children.
And importantly, we were able to say that for every hundred people treated, three fewer self-reported pneumonia. Self-reported, as opposed to the five trials that used an objective diagnosis of pneumonia, where there was no effect. So the clinical study reports allowed you, instead of taking, say, eight trials and putting them together and finding an effect on lower respiratory tract complications, to basically take them apart, and the effect rested on the trials that were positive, the ones where people self-reported pneumonia. Well, anybody who knows anything would know that's an impossible thing to do. Do you think you've got pneumonia? In a clinical setting the answer is always yes, because that's what gets you the antibiotics. It's a ridiculous question. But in the trials that actually used an x-ray, there was no effect.

But here's the bit where it gets interesting: unexpected findings. The two publications I showed you, the ones where people had put their names to them but didn't actually look at the data, or didn't analyse the data, or didn't have access to the data, stated there were no drug-related serious adverse events. And this had been replicated by NHS organisations and international organisations around the world: no serious adverse events were noted in the major trials, and no significant changes were noted in laboratory parameters. That would be reassuring, wouldn't it, if you went to a meeting and said we're going to use this drug and there's no safety problem?

Yet when we went into the clinical study report, in the back (this is what's in the 9,000-odd pages of documents, the individual tables within it), I hope you can see it from the back: serious adverse events. A listing of serious adverse events. And there were ten of them. So how do you go from the words "no drug-related serious adverse events" in the publication to ten of them listed in the clinical study report? Again, that's deeply concerning when you think that this is an issue of reporting bias.

Let me give you some other examples of what this allows you to do. At the time, a professor from Public Health England was defending the strategy of using Tamiflu to prevent an outbreak. One of the arguments is that in the very elderly you would have an outbreak: somebody with flu, somebody does a test and says this person has flu, therefore you should use Tamiflu. So Tamiflu as prophylaxis: if you give it to everybody else who has been exposed, you will reduce the risk by 8%. Now, I'm not going to go into the data I could show you on that. I want you to think about the generalisability of this.
If we were really going to have a situation akin to a care home, all right, you would expect in research that you would be able to understand whether the research applied to the population you were going to use it in. So if you picture somebody in a nursing home, that is generally going to be somebody who is elderly, might have co-morbidities, lots of problems. And that generates the question: do these results apply to my patients?

Now you can construct, because all of this data is missing from the publication and nobody knows about it, the exclusion criteria. Think of the type of people you've seen: they can't have cancer, immunosuppression, chronic liver or renal disease; they can't be taking steroids; they can't have unstable or uncontrolled renal, cardiopulmonary, vascular, neurological or metabolic disease, diabetes, or thyroid or adrenal disorders. Think about that. Not only that: if you've got any of these, or if you've got cardiac failure, you're out. Now, anybody who knows a person in a nursing home knows that person would have been excluded. So there's a real problem.

And when you look at the study that Public Health England is using as being akin to a nursing home setting, there were actually only 17 contacts, of whom 16 were included in the study. So the whole of our Public Health England strategy for prophylaxis after exposure is underpinned by 17 people in a study. And what it says in the short publication is just "across all ages"; it didn't even detail any breakdown of who those people were.

So you start to ask, and I do when I'm in a briefing with people: is it acceptable to apply a treatment to people who were completely excluded from the study and say that's okay to get on with? Yet we still do that today.

Now, if you go into the Tamiflu promotional material, it's really quite interesting. Going back: these are the misleading efficacy claims that Roche submitted in America. The pill with the power to stop the flu; taking Tamiflu will reduce the duration of the flu by 31%; similar claims about the severity of influenza. All misleading, the FDA said. And in fact, because they'd done a bit more work than in Europe, what the FDA said is that Tamiflu has not been shown to prevent such complications; that wording was there all along. And in a recent update in 2010 they even restated this in America. But that is not what it says in Europe and in the UK.
There, it's the complete opposite of the American wording, which says that serious bacterial infections may begin with influenza-like symptoms, or may co-exist with or occur as complications during the course of influenza, and that Tamiflu has not been shown to prevent such complications.

I could go into the 500 million pounds we spent stockpiling this drug at this point, but I'm not going to go there. The message is to start thinking about a significant problem, reporting bias: the amount of information around a drug is limited when we talk about journal publications, and also in many systematic reviews, and I'll come back to this point.

But going back, what also happened at this point, which is really interesting: around 2013-14 it started with AllTrials, Ben Goldacre was part of that, and he came to Oxford and that created the DataLab. One of the things he was interested in was the publication bias aspect, because, he said, it's still not gone away; it's still a major problem. Despite the fact that we've been going on about it now for nearly 30 years, not much has changed.

And we published a few papers that started looking at some of the issues. So we looked here in Oxford at the Biomedical Research Centre; Oxford receives probably the most research funding in the UK. In the Biomedical Research Centre we identified 286 trials and set out to try and find them; we couldn't find registration for 44. Of the trials that were registered with a completion date before January 2015, just over half were published, and half of those were published within 12 months. Because what was happening is the European Union was also saying we should now be publishing trials within one year of completion. In 2014 the law changed to say: if you finish your trial, then within 12 months you should post your results in full. And as you can see, Oxford, supposed to be the bastion of quality and research integrity, was only doing that about half the time, and some of the information, such as the dates, wasn't any more accurate. We tried to influence that. Some people worried about it, but at least we were starting to do audits on our own work.

Now, this also led to what's called the EU Trials Tracker, because the DataLab team are so smart they can pull the data from the trial registries, with the dates, and produce what's called the trials tracker. And that's because the law did change: from December 2016, all trials on the EU Clinical Trials Register (EUCTR) should post results within 12 months of completion. So from December 2017, it's the law.
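That 12-month rule is essentially the check the Trials Tracker automates. Here is a minimal sketch of the idea; it is not the DataLab's actual code, and the file name and column names are made up for illustration. It assumes you have a registry export with a completion date and a results-posted date for each trial.

```python
# Minimal sketch of the check the EU Trials Tracker automates (not the
# DataLab's actual code). Assumes a hypothetical registry export,
# "euctr_export.csv", with one row per trial and the columns used below.
import csv
from datetime import date, timedelta

DUE_WITHIN = timedelta(days=365)   # results due within 12 months of completion
AS_OF = date(2019, 3, 1)           # assumed snapshot date for the report

def parse(value):
    """Return a date for an ISO string, or None if the field is empty."""
    return date.fromisoformat(value) if value else None

due = reported = 0
with open("euctr_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        completed = parse(row["completion_date"])
        posted = parse(row["results_posted_date"])    # blank if never posted
        if completed is None or completed + DUE_WITHIN > AS_OF:
            continue                                   # results not yet due
        due += 1
        if posted is not None and posted <= completed + DUE_WITHIN:
            reported += 1

if due:
    print(f"{due} trials due; {reported} reported within 12 months "
          f"({100 * reported / due:.1f}%)")
```

Group the same count by sponsor and you get, roughly, the figure the tracker displays for each institution.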
And remember, this is not something optional: it's been the law in America since 2007, and here it came into effect from 2017, so a year after that first date you could see the results.

Okay, now for the results. This is the traditional journal approach, what it looks like in a journal. This is the paper in the BMJ, which is a top journal, and this is what researchers normally do, publish a paper with the results. And it said 49.5%, roughly half, reported results. Interestingly, though, trials with a commercial sponsor were substantially more likely to post results than those with a non-commercial sponsor: 68% versus 11%. Look at the difference, 68 versus 11. And don't get me started; I can't stand odds ratios, people, and you're going to learn about why the odds ratio is misleading. It's not really 50 times greater, or whatever the odds ratio suggests, and the same goes for those that conducted a large number of trials. I'll come back to a rough calculation in a moment.

But what happened next is that if you go online you can go onto the EU Trials Tracker, you can look at it tonight, and you'll see that the numbers have got slightly better: 62%, well, 61.8%. And you can go online and look at that. Now, I'm doing a talk in Manchester in about a month's time, at Manchester University. I looked Manchester up, and they've got three institutions in Manchester, and wherever I am, when I do a talk, I show them. And they're shocking; they're so bad that hopefully they'll be embarrassed into doing something about it in Manchester, because only about 25% of their trials are reported at the moment. In fact, they do want to do something about it.

[In response to an audience question.] No, what happens is that it accumulates. Once you report, you come within the count: if you report at 14 months, you'll be considered to have reported, but you should have reported by 12 months. And it continues to update, so if a trial that hadn't reported then reports, it will show up. Yep. And you can type in any of your institutions, you can type in where you are, and it will show you that, in fact, academic institutions are pretty dire. Oxford has got better, not least because they know we're auditing it all the time. And recently the HRA and the government have said they're going to start auditing academic institutions, to start to look at their performance.
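Just to show why I say the odds ratio misleads, here is a rough back-of-envelope using only the headline 68% and 11% figures quoted above. It is crude and unadjusted, so it will not match any adjusted figure from the paper; it simply contrasts the simple ratio of proportions with the odds ratio for the same comparison.

\[
\text{risk ratio} = \frac{0.68}{0.11} \approx 6.2,
\qquad
\text{odds ratio} = \frac{0.68/0.32}{0.11/0.89} \approx 17
\]

Commercial sponsors reported roughly six times as often, but quoted as an odds ratio the same comparison sounds nearly three times more dramatic, which is exactly how odds ratios overstate differences when the outcome is common.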
And coming back to those audits: if institutions don't start to meet this requirement, then I think we'll get to a position where they say, we're going to withdraw your funding. Because the question is: is it ethically acceptable, if you've still got 40% of your trials unreported, to receive more money to do more trials and publish even less of the results? It doesn't ethically make sense to me. Surely an ethics committee should go and check online how you're doing, because you can look at this by institution and you can look at it by industry sponsorship. And if you're not reporting in full, surely somebody should say: well, what about the sort of commitment we need here? So that's where we are. We can say that only a bit more than half of all trials get reported. And the thing is, can you make informed decisions about a treatment when 40% of the data may be missing?

So we're now in a position where we've got two significant problems: reporting bias and publication bias.

Now, some things have got better, in a very short period of time. In 2014 the European Medicines Agency said: what we're going to do is provide online access to clinical data for the medicinal products that we approve. And to do this, they are going to make clinical study report data available. So when they approve a drug, the clinical study report will be made available and you will be able to access it. So if you're doing a systematic review of a treatment or intervention, you will be able to get the reports for new drugs, and you'll also be able to request some from before that date.

Now, the reason I raise that is that there are a number of studies that have looked at these types of documentation. It's all very well me giving you one example, Tamiflu, but there are more examples of this. This is from IQWiG in Germany; IQWiG is like NICE, the health technology assessment body in Germany, and this is Beate Wieseler, head of their drug assessment department, investigating to what extent different types of documents reporting clinical trials provide sufficient information for trial evaluation. Because when you get to the whole aspect of this, you really need the important information in front of you; with Tamiflu, they had just looked at the journal publication. If you go to this paper, it's in the BMJ, 2015, with a very interesting table, and the point is this piece of work.
So this row here is study results, and across the columns you've got journal publications, then what's posted on the registries, like the EUCTR, and then the clinical study reports. Okay, our findings on the information. In journal publications, the information was complete for 52% of specific outcomes; in registry reports it was 71%; and by the time we get to clinical study reports, it's 91% of specific outcomes. So you can see the shift in information as you go from journal publication, through registries, to clinical study reports. Registry entries are incredibly useful when they're populated, which, as I just showed you, only happens about 60% of the time. And clinical study reports show you the problem very well, though the paper points out their disadvantages: they're not publicly available and they're voluminous, really big to get your head around.

But there have been other examples. Here's a recent one about access to clinical study reports in line with the EMA policy, and it reduced reporting biases for quality-of-life outcomes: important information that was missing in the journal publication. But one of the problems, as has been pointed out, is that regulators unfortunately only see a fraction of the documentation. In fact, no regulator asks for all of the evidence; they only ask for evidence sufficient to support a marketing approval. So companies can submit a little bit of the evidence, they can submit different bits to different regulators around the world, and they all get to see different bits, but nobody creates a table of all of the studies and says: here are all the studies, and you're looking at only four of them. That's a huge problem.

Well, here's another example, from the orlistat trials. If people know orlistat, it was a drug used to reduce obesity, but it had significant problems with its adverse effects. And one of the things this shows is that when you compare the clinical study reports with the journal publications, the benefits seem to align; the major differences were in understated adverse events. Which would make sense, wouldn't it? If you have a journal publication and you want to accentuate the positive, you are going to put the positive outcomes in and play down the adverse events. And that is what seems to have happened when you compare against the clinical study report.

So, last year: one of my other roles is editor-in-chief of BMJ Evidence-Based Medicine.
And here's a good example of what's happening right now: a systematic review of the HPV vaccine published in the Cochrane Library that used just the publications, but then tried to use a little bit of the registries and a little bit of the clinical study reports, without being comprehensive and without doing it in a systematic way. And this caused a lot of controversy, because basically there's a table in the critique that says: of the 20 trials we think they missed, 16 additional trials were definitely eligible for the review. This is work by a chap in Denmark, Lars Jørgensen, working with his supervisor and Tom Jefferson; he was doing his PhD on this whole subject. So it's pretty intense; it's not a few weeks' work or a rapid review knocked together. And here they are trying to match the study IDs in the sponsor's funding database with the trial IDs in the registry, and then trying to match those up and say: here's what we think this study is. (I'll sketch what that sort of matching looks like in a moment.)

The way we show this is that it's a bit like you're flying a plane and half of the manual is missing. We don't know which half, and you don't quite know when you'll need that bit of the manual, but it will probably be when the safety issues happen, when the plane is just about to go down in the night and you really want to know: what do we do now? And all of that is missing.

And I think it shows a worrying position. If you want to trust the evidence, because many of these decisions are based on small balances of benefits against harms, you really would want all of the evidence in front of you, and you probably want somebody to look at it in a way that is impartial and puts together the benefits and the harms. To do that takes a long time; this is not a straightforward job. You can't do it for all evidence, but you would expect that we might do it for the public health drugs, the ones like the HPV vaccine or Tamiflu, which are likely to be used by a wide spectrum of patients and where it's important to know the benefits and the harms.

Now, I'm just going to finish with a few things, because this story carries on. It's a very interesting story and it's not going away any time soon. Here's the problem with the EMA right now: they set this out in 2016, they had a delay of about a year, and it hasn't quite worked as hoped. First of all, they have suspended all new activities related to clinical data publication. And this is the problem of good old Brexit: because they've relocated to Amsterdam, they've said it's too much work and they have to concentrate on the business continuity plan.
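On that matching exercise: the mechanical part looks something like the sketch below. To be clear, this is not Jørgensen and colleagues' method; the file names and column names are hypothetical. It simply illustrates cross-checking a review's included studies against a registry export using trial registration IDs.

```python
# Minimal sketch of cross-checking a review's included studies against a
# registry export by trial registration ID. Hypothetical file and column
# names; an illustration only, not the method used in the HPV critique.
import csv

def load_ids(path, column):
    """Collect non-empty trial IDs (e.g. NCT numbers) from one CSV column."""
    with open(path, newline="") as f:
        return {row[column].strip().upper()
                for row in csv.DictReader(f)
                if row[column].strip()}

included = load_ids("review_included_studies.csv", "registration_id")
registered = load_ids("registry_export.csv", "trial_id")

missing_from_review = registered - included    # registered, but not reviewed
no_registry_match = included - registered      # reviewed, but no registry hit

print(f"{len(missing_from_review)} registered trials not in the review")
print(f"{len(no_registry_match)} included studies with no registry match")
```

The hard part, of course, is the manual work that follows: chasing each unmatched ID to see whether any results exist anywhere at all.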
Anyway, back to the EMA. So clinical data publication is a peripheral activity for them right now; it's not important to them. They said it would resume by the end of July, but it hasn't happened yet: requests will not be processed until further notice, and we are still on further notice, which is deeply concerning.

The flip side, though: is anybody from Canada here? Anybody from Canada? Well, you guys are leading the field; you should really be aware of what is happening in Canada. In Canada, within 120 days of a decision, Health Canada will post clinical study reports on a new government online portal. And as I understand it, Health Canada is just going to put them out there, which is an amazing step forward. Because the problem at the moment, even with the EMA, is that you've still got to go through an access system to request them, and then they go through a sort of holding pattern where they say: we're going to check there are no data protection issues, we're going to do some redactions, and in about a year you get the data. So Health Canada have potentially moved the game forward quite considerably. And if those reports become available, we can all get to work in a way that means you're going to be very busy sorting out the problems of publication bias and reporting bias.

Thank you very much.