1 00:00:00,690 --> 00:00:09,420 It's great to be here, and to present on one of my hobby horses: systematic reviews. 2 00:00:09,870 --> 00:00:15,120 I appreciate you're nearing the end of your first weekend, so you're probably a little bit tired. 3 00:00:15,120 --> 00:00:19,830 I have to tell you now that at the end of my 30-minute talk, 4 00:00:20,070 --> 00:00:28,350 I'm going to ask you to spend a bit of time discussing what I've presented and then to give me some feedback. The reason, in part, 5 00:00:28,680 --> 00:00:32,730 is that Carl, 6 00:00:32,880 --> 00:00:40,260 myself and Jeremy Howick are writing a paper on why we think the current system of systematic 7 00:00:40,260 --> 00:00:45,750 review production is wrong, and we've jotted down some thoughts; actually, we've spent a fair bit of time on it. 8 00:00:46,020 --> 00:00:53,850 And in a way, you're the guinea pigs for our ideas, so we want you to point out all the obvious errors we've made before we go to print. 9 00:00:54,120 --> 00:01:00,000 But we also think it will challenge a lot of what is currently thought about systematic reviews. 10 00:01:01,740 --> 00:01:05,580 A bit of background on myself: I'm a scientist by background. 11 00:01:05,610 --> 00:01:12,659 I worked in what was clinical effectiveness in the mid-1990s. 12 00:01:12,660 --> 00:01:20,820 That's where I started. From there I moved to South Wales, where I was told to make the GPs more clinically effective. 13 00:01:21,690 --> 00:01:25,499 So I went and spoke to the GPs and they said: we don't want to learn searching skills, 14 00:01:25,500 --> 00:01:28,980 we don't want to learn critical appraisal skills, we haven't got time, just answer our questions.
15 00:01:29,460 --> 00:01:37,320 And so that's what I tried to do. This was pre-Google days, when Medline was on CDs and the like, 16 00:01:37,350 --> 00:01:43,710 and I basically taught myself to answer clinical questions. As I was answering these questions, 17 00:01:44,100 --> 00:01:50,969 it dawned on me that it would be so much easier if we could search all these resources of high-quality evidence from one place. 18 00:01:50,970 --> 00:01:55,560 And that's how I started the database, and it's grown ever since. 19 00:01:55,740 --> 00:02:01,080 I'm not going to talk specifically about the database; if you want to ask me any questions about it, you can do so after this. 20 00:02:02,400 --> 00:02:10,290 But a lot of my thinking, and a lot of the development of the database, has come from answering questions for doctors and nurses. 21 00:02:11,520 --> 00:02:20,520 Me and my various teams, in various services, have answered over 10,000 clinical questions, mostly for primary care doctors and nurses. 22 00:02:20,670 --> 00:02:26,220 And it was not too far from here that I had a conversation with Iain Chalmers, who of course created the Cochrane Collaboration. 23 00:02:26,670 --> 00:02:37,500 We've got a great relationship now, but I said to him: I struggle when I'm answering questions for GPs. 24 00:02:37,770 --> 00:02:43,680 I struggle with the notion that on one hand I'm hearing Cochrane is great, systematic reviews are great, and on the other hand they don't answer too many questions. 25 00:02:43,860 --> 00:02:45,000 That was my experience. 26 00:02:45,690 --> 00:02:53,340 We did an audit, and less than 1% of the questions we received could be answered by a single systematic review, which made me ask: what are they for? 27 00:02:53,640 --> 00:03:00,690 Surely they should be useful. And that was the starting point for my scepticism about systematic reviews.
28 00:03:01,350 --> 00:03:04,680 And here we are now; I think it was about eight years ago 29 00:03:04,830 --> 00:03:10,799 that I had that conversation with Iain, and since then that view has developed, and more 30 00:03:10,800 --> 00:03:14,730 and more information is coming out about what I think are the flaws of systematic reviews. 31 00:03:15,010 --> 00:03:21,000 I'm sure you've always been told systematic reviews are great, and they can be, but I just think a lot of the flaws are hidden. 32 00:03:21,470 --> 00:03:28,470 You've invariably heard of Archie Cochrane, and in 1979 he complained that medicine had not yet managed to produce a 33 00:03:28,470 --> 00:03:35,190 critical summary, by speciality or subspeciality, adapted periodically, of all relevant randomised controlled trials. 34 00:03:35,520 --> 00:03:43,230 He felt it was important at the time, and I think most people still think it's important; irrespective of my cynicism about the current methods, 35 00:03:43,440 --> 00:03:48,970 I still think systematic reviews are an important tool. But we're nowhere near that. 36 00:03:49,160 --> 00:03:56,340 We're nowhere near synthesising all randomised controlled trials, and many of them are just floating about 37 00:03:56,350 --> 00:04:03,610 with no attempt at synthesising them, because the cost per systematic review is huge. 38 00:04:03,910 --> 00:04:08,290 I don't know if anyone here has actually tried to do a systematic review and found it easy. 39 00:04:09,130 --> 00:04:14,950 The data I could find suggests each systematic review takes approximately 1,100 person-hours. 40 00:04:15,760 --> 00:04:17,229 And if you look at a Cochrane review, 41 00:04:17,230 --> 00:04:26,320 it typically takes 23 months from registration to publication, and there's obviously a fair bit of work that goes on before it's registered. 42 00:04:27,340 --> 00:04:33,370 So we're looking at well over two years for a systematic review, from idea to publication.
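To put those figures in perspective, here is a quick back-of-the-envelope sketch using the two numbers quoted in the talk (1,100 person-hours and 23 months). The 165 working hours per month is my own illustrative assumption, not a figure from the talk:

```python
# Figures quoted in the talk
person_hours_per_review = 1_100
months_registration_to_publication = 23

# Illustrative assumption (mine): roughly 165 working hours in a month
hours_per_month = 165

# Raw effort, expressed in person-months
effort_months = person_hours_per_review / hours_per_month  # about 6.7

print(f"~{effort_months:.1f} person-months of effort, spread over "
      f"{months_registration_to_publication} calendar months")
```

On these assumptions the raw effort is only around seven person-months, yet the elapsed time is more than three times that, which is part of why reviews are stale by the time they appear.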
43 00:04:33,700 --> 00:04:38,950 And as a result of this work, those of you who have done a systematic review, feel free to contradict me: 44 00:04:39,280 --> 00:04:44,370 most people who do a systematic review don't come out of it thinking, oh, I'd love to update it, 45 00:04:44,380 --> 00:04:52,000 I can't wait to start again and update it, because most people I know seem to look at the thought of updating their systematic review with horror. 46 00:04:52,240 --> 00:04:57,490 It's not easy. I've not done one, but I've seen enough people near the end of their systematic reviews 47 00:04:57,790 --> 00:05:03,250 who are horrified by the work that's needed to do it, and to do it properly. 48 00:05:03,520 --> 00:05:09,280 And so most systematic reviews aren't kept up to date. I'll show you the slide later on, 49 00:05:09,280 --> 00:05:14,440 but about two thirds of Cochrane systematic reviews are currently out of date, which is a huge number. 50 00:05:14,680 --> 00:05:17,680 I've spoken to Paul Glasziou, who was Carl's predecessor here, 51 00:05:17,680 --> 00:05:25,390 and to quote him: the methodologists have raised the methods and process barriers so high that most reviews are not being updated. 52 00:05:25,810 --> 00:05:32,530 My view is that the methodologists are running amok: they're trying to squeeze out every single little bit of bias, 53 00:05:32,530 --> 00:05:36,790 but with no reference to the cost of producing them. 54 00:05:37,110 --> 00:05:42,420 I feel they are producing Ferraris where most people would be happy with a Golf or a Mini. 55 00:05:42,690 --> 00:05:47,370 Now, I might be seen to be slagging off Cochrane here. 56 00:05:47,790 --> 00:05:55,320 I'm not. I'm just using Cochrane's information because they are transparent, a lot more transparent than most other systematic review producers.
57 00:05:55,680 --> 00:05:59,880 And so they're being picked on, in a way, because they're transparent. 58 00:05:59,970 --> 00:06:05,280 I'm saying that as a huge positive, and Iain Chalmers is always very happy for 59 00:06:05,280 --> 00:06:09,870 people to criticise Cochrane; he set up a prize for overt criticism of Cochrane. 60 00:06:10,230 --> 00:06:12,300 So they're big enough, they can take it. 61 00:06:12,690 --> 00:06:20,580 This is taken from the Cochrane Library Oversight Committee and the annual reports and financial statements. 62 00:06:21,450 --> 00:06:25,679 What we've got here is the funding, the income, of Cochrane. As you can see, it's gone up, 63 00:06:25,680 --> 00:06:31,590 and there was a significant increase in 2008-2009; that's probably when they started working with Wiley. 64 00:06:32,400 --> 00:06:39,300 That was my assumption, anyway. And this line here is the proportion of systematic reviews in the Cochrane Library that are actually up to date. 65 00:06:39,690 --> 00:06:48,150 In 2009 it was 39.8%, and the last time they published the information it was down to 35.8%. 66 00:06:48,750 --> 00:06:52,110 So the problem is getting worse, not better. That's my reading of it. 67 00:06:52,590 --> 00:06:56,370 I have read that correctly, by the way; I was so surprised I had to keep rereading it. 68 00:06:56,370 --> 00:07:01,439 Is that out of date or up to date? That's 35.8% up to date, for clarity. 69 00:07:01,440 --> 00:07:04,110 So nearly two thirds are out of date. 70 00:07:04,440 --> 00:07:10,170 And I think one of the biggest problems with Cochrane, and Cochrane-style systematic reviews, is that they rely on published trials. 71 00:07:11,040 --> 00:07:17,369 There's a fairly recent paper by Schroll, Bero and Gøtzsche; a bit of a rubbish reference, but you'll be able to find it from that, 72 00:07:17,370 --> 00:07:20,909 now you've done your searching skills.
73 00:07:20,910 --> 00:07:26,670 That paper highlighted that the vast majority of Cochrane systematic reviews rely on published trials, 74 00:07:26,820 --> 00:07:33,660 and when they do try and look at unpublished trials, it's a very unsystematic way of going about it. 75 00:07:33,840 --> 00:07:36,930 I'll talk more about the difference between published and unpublished trials. I don't know if 76 00:07:36,930 --> 00:07:43,770 we've done that yet, but there are lots of problems with relying on published journal articles only, 77 00:07:44,430 --> 00:07:48,130 and two clear ones are reporting bias and publication bias. 78 00:07:48,150 --> 00:07:53,639 Have you done biases yet? There are some weak nods and very few enthusiastic ones. 79 00:07:53,640 --> 00:07:59,340 But anyway: reporting bias, when you rely on published trials. 80 00:08:00,300 --> 00:08:06,060 I'll get the explanation out of the way now. What you typically have with a trial is a huge amount of data. 81 00:08:06,720 --> 00:08:11,010 This data is then typically summarised into something called the clinical study report. 82 00:08:11,580 --> 00:08:18,810 Has everyone heard of clinical study reports? Clinical study reports are hundreds of pages long, massive documents. 83 00:08:19,020 --> 00:08:21,570 They are typically unstructured and they're a nightmare to work with. 84 00:08:22,650 --> 00:08:26,280 What happens is you have a clinical study report, and then someone who wants to get a 85 00:08:26,280 --> 00:08:32,310 publication will summarise those hundreds of pages into eight to ten pages. 86 00:08:32,700 --> 00:08:38,010 That is the journal article, and then obviously, doing the same again, they summarise that into the abstract. 87 00:08:38,790 --> 00:08:42,500 Now, most people would say, oh, you shouldn't do a systematic review on abstracts.
88 00:08:42,510 --> 00:08:48,290 I'm certainly not proposing that; the reason would probably be that you're missing big chunks of data 89 00:08:48,300 --> 00:08:55,110 if you just rely on the abstract. It's the same argument, as I see it, with relying on journal articles compared with clinical study reports. 90 00:08:55,590 --> 00:09:02,910 When a clinical study report is summarised into a journal article, it loses about 80% of the data. 91 00:09:03,630 --> 00:09:07,260 And so if it's not acceptable to use an abstract to do a systematic review, 92 00:09:07,650 --> 00:09:13,110 I sometimes wonder why it's acceptable to do a systematic review based on published journal articles, 93 00:09:13,230 --> 00:09:21,630 because you're still missing shed loads of data. It's a pragmatic decision. There was a recent paper that Carl was involved in with Tom Jefferson, 94 00:09:22,320 --> 00:09:28,860 basically looking at risk of bias in industry-funded trials. 95 00:09:29,460 --> 00:09:35,280 What they did was compare the reports in journal articles with the full clinical study reports. 96 00:09:36,030 --> 00:09:42,360 And what they found was, well, I might as well read it: we found that as information increased in the documents, 97 00:09:42,510 --> 00:09:44,280 this increased our assessment of risk of bias. 98 00:09:44,700 --> 00:09:49,830 This may mean that risk of bias has been insufficiently assessed in Cochrane reviews based on journal publications. 99 00:09:50,160 --> 00:09:54,600 And that, to me, is not trivial. Now, publication bias: 100 00:09:55,020 --> 00:10:04,200 if you're familiar with the AllTrials initiative, they report that 30 to 50% of all clinical trials aren't published, and here's a good 101 00:10:04,350 --> 00:10:11,460 time to introduce the Turner paper. When a pharma company wants to get a drug approved,
102 00:10:11,820 --> 00:10:19,230 it needs to show the FDA in America all the trials that it's done looking at the efficacy 103 00:10:19,830 --> 00:10:25,770 and the side effects associated with that particular drug, and they have to register them before they start the trials for them to be accepted. 104 00:10:26,370 --> 00:10:30,270 But there's no obligation for them to actually publish the trials. 105 00:10:30,750 --> 00:10:34,050 They've got to show them to the FDA, sure, but they don't have to publish them in 106 00:10:34,070 --> 00:10:39,889 journal articles. And Turner et al looked at the selective 107 00:10:39,890 --> 00:10:43,940 publication of antidepressant trials and its influence on apparent efficacy. 108 00:10:45,230 --> 00:10:52,580 What they did was go to the FDA, and it's not easy working with the FDA, or the EMA, the European Medicines Agency, 109 00:10:53,540 --> 00:10:57,080 but eventually they looked at all the trials that had been registered with the FDA 110 00:10:57,950 --> 00:11:03,020 and compared those to the ones that had been published in journal articles. 111 00:11:04,040 --> 00:11:11,420 And what they found: of all the studies that were in favour, that showed the antidepressants were working, 112 00:11:12,020 --> 00:11:15,500 37 were published versus one unpublished. 113 00:11:17,210 --> 00:11:23,300 And then of those that were found to be negative, that didn't support the use of antidepressants, 114 00:11:24,020 --> 00:11:27,500 only three were published and 33 were unpublished. 115 00:11:28,370 --> 00:11:36,710 So if you do a systematic review on published trials, you'll find 37 in favour and three against. 116 00:11:37,040 --> 00:11:41,990 Does that seem reasonable? I think Turner pointed out the madness of this.
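To make the arithmetic concrete, here is a minimal sketch using only the Turner counts just quoted in the talk (37 positive trials published versus one unpublished; three negative trials published versus 33 unpublished):

```python
# Turner's antidepressant trial counts, as quoted in the talk
pos_published, pos_unpublished = 37, 1
neg_published, neg_unpublished = 3, 33

# What a reviewer relying only on journal articles sees
published_total = pos_published + neg_published
apparent_positive_rate = pos_published / published_total  # 37 of 40

# What the full FDA record shows
all_total = pos_published + pos_unpublished + neg_published + neg_unpublished
overall_positive_rate = (pos_published + pos_unpublished) / all_total  # 38 of 74

print(f"Published literature only: {apparent_positive_rate:.1%} of trials look positive")
print(f"Full FDA record:           {overall_positive_rate:.1%} of trials are positive")
```

On these numbers the published literature makes over nine in ten trials look positive, while the full regulatory record is roughly an even split: the same evidence base, two very different pictures.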
117 00:11:43,490 --> 00:11:52,610 What Turner then did was look at what the effect would be of including the FDA trial data, 118 00:11:52,610 --> 00:11:59,870 the regulatory data, versus the meta-analysis done on published journal articles. 119 00:12:00,320 --> 00:12:05,330 And they found that if you just relied on the journal articles, there was a 32% increase in effect size. 120 00:12:06,860 --> 00:12:10,130 This is not trivial. You might say this is an outlier. 121 00:12:10,730 --> 00:12:17,330 It isn't. This is a paper by Hart in the BMJ, 2011. 122 00:12:17,840 --> 00:12:22,340 What she did was look at an awful lot more meta-analyses, 123 00:12:23,720 --> 00:12:30,650 on basically the same principle: comparing meta-analyses done on published journal articles 124 00:12:30,800 --> 00:12:35,210 with those that included regulatory data, such as from the FDA. 125 00:12:35,600 --> 00:12:36,469 And what they've done: 126 00:12:36,470 --> 00:12:47,750 the zero line is the effect size as shown in the meta-analysis based on published journal articles, and the horizontal bars 127 00:12:47,750 --> 00:12:52,310 show how wide the discrepancy was when you added in the regulatory data. 128 00:12:53,420 --> 00:12:56,870 So the top one was out by over 150%. 129 00:12:57,380 --> 00:13:01,250 And down here, if I can get my pointer working, there's an over 130 00:13:01,400 --> 00:13:09,650 50% reduction, and in approximately 50% of the cases the effect size estimate was out by greater than 10%. 131 00:13:10,280 --> 00:13:13,459 That's serious enough. 132 00:13:13,460 --> 00:13:18,560 But one of the things I predicted, before I actually read the article, was that 133 00:13:18,560 --> 00:13:24,140 anything involving pharma would always increase the effect size in favour of the drug.
134 00:13:24,710 --> 00:13:29,690 But that's not what I found: there was no pattern. What makes it even worse is that it's unpredictable. 135 00:13:29,690 --> 00:13:34,340 When you see a meta-analysis based on journal articles, you cannot tell 136 00:13:34,520 --> 00:13:41,629 whether it's an underestimate or an overestimate; the chances are it's going to be out by 15-20%, and confidence intervals 137 00:13:41,630 --> 00:13:47,209 don't capture this. You see the little diamond at the bottom and you think, that's a very narrow interval, 138 00:13:47,210 --> 00:13:54,560 therefore we must be pretty certain. But it doesn't take into account the unpublished trials. Misleading. 139 00:13:55,070 --> 00:14:05,570 So part of my concern with meta-analyses and systematic reviews based on published journal articles is that I didn't realise this, and 140 00:14:05,570 --> 00:14:15,050 I don't think many people realise, when they look at systematic reviews, the effects and the harms of things such as publication bias. 141 00:14:15,500 --> 00:14:21,020 What's even more scary is the number of trials, and the amount of data, 142 00:14:21,020 --> 00:14:25,909 not just trials, coming our way. 143 00:14:25,910 --> 00:14:29,750 This is often-quoted data; you can't criticise it. 144 00:14:30,260 --> 00:14:41,120 In 1976 there were about 1,000 trials per year published; in 1994, 10,000; in 2007, 25,000. 145 00:14:41,300 --> 00:14:44,300 And by 2018-19 it's estimated to be 50,000. 146 00:14:45,020 --> 00:14:49,640 Because of the success of the AllTrials initiative, 147 00:14:50,120 --> 00:14:54,440 more trials will actually be published, which will help contribute to that 50,000, 148 00:14:54,710 --> 00:15:02,180 but there will also be much greater availability of data in the form of clinical study reports and individual patient data and things like that.
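Those growth figures combine badly with the roughly 1,100 person-hours per review quoted earlier in the talk. A back-of-the-envelope sketch; the ten-trials-per-review figure and the 1,650 working hours per person-year are my own illustrative assumptions, not figures from the talk:

```python
# Trials published per year, as quoted in the talk
trials_per_year = {1976: 1_000, 1994: 10_000, 2007: 25_000, 2018: 50_000}

hours_per_review = 1_100   # per-review effort figure quoted earlier in the talk
trials_per_review = 10     # illustrative assumption: average trials synthesised per review
hours_per_person_year = 1_650  # illustrative assumption: working hours in a year

for year, trials in trials_per_year.items():
    reviews_needed = trials / trials_per_review
    person_years = reviews_needed * hours_per_review / hours_per_person_year
    print(f"{year}: ~{reviews_needed:,.0f} reviews just to cover that year's trials "
          f"(~{person_years:,.0f} person-years of effort)")
```

Even on these generous assumptions, covering the projected 50,000 trials a year would take thousands of person-years of review effort annually, which is the sense in which the current system cannot cope.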
149 00:15:03,980 --> 00:15:09,950 It's great that we've got the data, but my view is the current system can't cope. 150 00:15:10,190 --> 00:15:15,290 So why do we think adding ten times more data is going to make the system work any more efficiently? 151 00:15:15,620 --> 00:15:20,900 It's not. It's going to creak. I think at the moment the current system is coping really badly. 152 00:15:21,170 --> 00:15:25,070 So my general view is that when we add more data, it will break. 153 00:15:25,520 --> 00:15:30,470 I actually think it's broken now, but that's just me. And so we need a new system. 154 00:15:30,710 --> 00:15:35,340 We could pray that we're going to get a tenfold increase in funding, 155 00:15:35,780 --> 00:15:44,660 but in the end I'd say that's unlikely. And I think it would be a gross misuse of funds to throw extra money at the current system. 156 00:15:44,960 --> 00:15:48,260 That's actually how I approached Carl and Jeremy initially. 157 00:15:49,310 --> 00:15:58,460 I felt there was an ethical issue around the allocation of funds: should we be spending so much money doing systematic reviews which are clearly problematic? 158 00:15:58,820 --> 00:16:07,670 Anyway, that's another point. But I think that if people continue to believe that the current systematic review system is great, nothing will change. 159 00:16:08,270 --> 00:16:14,059 There are obviously organisations that are very keen on maintaining the status quo, and lots of people, 160 00:16:14,060 --> 00:16:21,410 not just organisations, make a nice career out of systematic reviews, and I suppose change is always difficult. 161 00:16:21,410 --> 00:16:29,389 What I'm saying is that the status quo will try to maintain the situation. What the new system needs to be is both quick 162 00:16:29,390 --> 00:16:35,570 and robust, and we can see no way of reconciling that unless you have two systems.
163 00:16:36,530 --> 00:16:42,950 One is quick and one is robust. The quick system will be brief and very, very short. 164 00:16:43,250 --> 00:16:50,960 It will be based on a shortened search: you take the systematic review method and significantly decrease the cost at each stage. 165 00:16:51,350 --> 00:16:54,770 An expedited search, simple quality assessment. 166 00:16:55,250 --> 00:17:03,500 The view is that this is very much about supporting clinical practice, and that we would produce very short, brief reports. 167 00:17:03,560 --> 00:17:07,730 If you look at the very first Cochrane systematic reviews, they were quite small documents; 168 00:17:07,730 --> 00:17:11,360 I think they were about 14 pages long, and that was including all the references. 169 00:17:11,780 --> 00:17:16,729 Now I can never count all the pages; it's always 170 00:17:16,730 --> 00:17:26,959 well over 200 pages by the time you scroll through it all. These brief reports would highlight key outcomes and likely effect sizes, and 171 00:17:26,960 --> 00:17:35,210 they would be easily updated using the various things in the pipeline around updating systematic reviews, 172 00:17:35,450 --> 00:17:43,069 none of which have come online yet. But I think 173 00:17:43,070 --> 00:17:47,430 if we can significantly reduce the cost per rapid systematic review, 174 00:17:47,450 --> 00:17:52,880 we can do an awful lot more, and therefore we can incorporate all randomised controlled trials that way. 175 00:17:53,570 --> 00:17:57,080 In the work I've been doing in Public Health Wales, 176 00:17:57,410 --> 00:18:02,209 we tried two very quick methods of rapid reviews.
177 00:18:02,210 --> 00:18:05,750 What we did was blind one member of staff, 178 00:18:05,990 --> 00:18:20,690 give them ten questions based on Cochrane systematic reviews, and get them to answer them using this technique, which took about 4 hours. 179 00:18:20,690 --> 00:18:30,080 85% of the time we got the same answer as Cochrane; for the remainder, was that because the question was ambiguous? 180 00:18:30,170 --> 00:18:37,160 We weren't quite sure. But there's another technique we use on the Trip database which can do rapid reviews in about 5 minutes. 181 00:18:37,460 --> 00:18:41,720 And again, that got 85% agreement with Cochrane. 182 00:18:42,290 --> 00:18:45,199 But that's just my non-scientific way of doing it, 183 00:18:45,200 --> 00:18:49,760 just in the office, mucking around with rapid reviews. 184 00:18:50,030 --> 00:18:56,149 One of the criticisms of rapid reviews is that there's no standard methodology, and people get concerned by that. 185 00:18:56,150 --> 00:19:04,130 But there have been three papers recently which have looked at reducing the time taken to do systematic reviews. 186 00:19:04,580 --> 00:19:09,110 The first is by Hemens, who works with Brian Haynes at McMaster, who you may well have heard of. 187 00:19:09,560 --> 00:19:18,080 They run a wonderful service called EvidenceUpdates, which, if you're not familiar with it, you should be; it's a great system. 188 00:19:18,650 --> 00:19:26,750 They look at about 120 of the top journals and do an internal critical appraisal stage. 189 00:19:27,320 --> 00:19:34,190 Those that fail critical appraisal are dumped from the system, and those they feel are of suitable quality to 190 00:19:34,190 --> 00:19:39,920 inform clinical care are then passed on to a network of clinicians. If it's a cardiology paper, 191 00:19:40,190 --> 00:19:47,089 it is sent off to at least four
cardiologists, might be three, I think it's four, and the cardiologists rate it on whether it's newsworthy and 192 00:19:47,090 --> 00:19:51,830 relevant to clinical practice, and it has to reach a certain threshold before it's accepted. 193 00:19:52,430 --> 00:20:00,500 That two-stage process kicks out 96% of the papers from those top 120 journals; 194 00:20:00,830 --> 00:20:04,820 they get rid of loads of stuff which is just not pertinent to clinical practice or newsworthy. 195 00:20:04,890 --> 00:20:08,130 So it's a filter that's done properly. 196 00:20:09,270 --> 00:20:16,649 When they updated reviews using the EvidenceUpdates papers, they used less than a quarter of the new 197 00:20:16,650 --> 00:20:24,510 studies found in the Cochrane update, and they said that most reviews appeared unaffected by the omission of these studies. 198 00:20:24,540 --> 00:20:35,190 Now, any Italians here, before I absolutely murder the pronunciation of this author's name? No? Good, then no one can contradict me. 199 00:20:35,670 --> 00:20:40,590 What they did was this: 200 00:20:40,620 --> 00:20:44,700 if the systematic review was on a cardiology topic, 201 00:20:44,820 --> 00:20:48,780 they would take the top journals in cardiology and just use those to update the systematic review. 202 00:20:49,230 --> 00:20:55,980 So that's a different way of sampling the randomised controlled trials, and they found 203 00:20:56,010 --> 00:21:01,170 they were able to replicate almost all the clinical recommendations of a formal systematic review. 204 00:21:01,740 --> 00:21:11,160 Again, little difference. And then there was a great paper recently where they just did the meta-analysis based on Medline. 205 00:21:11,820 --> 00:21:18,150 And again: restricting to the studies indexed on Medline did not influence the summary estimates of the meta-analyses in our sample.
206 00:21:19,410 --> 00:21:24,330 So there are three separate ways of sampling the trials, 207 00:21:24,540 --> 00:21:27,990 and none of them seems to affect the outcomes of the systematic review. 208 00:21:29,040 --> 00:21:31,919 What I would say is that's a very bold claim to make, 209 00:21:31,920 --> 00:21:39,000 and I think more research needs to be undertaken to fully appreciate when you can get away with that and when you can't. 210 00:21:39,330 --> 00:21:48,149 If you talk to Tom Jefferson, who is, I would say, more cynical than me, he mirrors what Richard Smith 211 00:21:48,150 --> 00:21:53,790 says about published journals. He says they're the publishing arm of pharma, 212 00:21:54,600 --> 00:21:57,540 a marketing tool for the pharmaceutical industry. 213 00:21:58,290 --> 00:22:06,419 And his view, when I've discussed it with him, is: if you sample the marketing message from pharma, whether you take 10% or 80%, 214 00:22:06,420 --> 00:22:11,160 you're still taking a sample of the marketing message, so you're still going to report the marketing messages they want you to see. 215 00:22:11,490 --> 00:22:15,150 That was his cynical view. So to my mind, 216 00:22:15,360 --> 00:22:20,190 we can probably do an awful lot to get very close to where 217 00:22:20,400 --> 00:22:23,490 Cochrane is today. And I appreciate the apparent fallacy: I've been saying 218 00:22:23,490 --> 00:22:30,810 that Cochrane systematic reviews are flawed, so we're getting close to a flawed system. 219 00:22:31,140 --> 00:22:36,150 But my view is, if we're going to get to an approximation, call it a ballpark figure, 220 00:22:36,390 --> 00:22:39,840 let's take two weeks doing it, or one week, as opposed to 23 months.
221 00:22:40,440 --> 00:22:46,770 But in some situations there's an acknowledgement that we need a more robust system. 222 00:22:47,400 --> 00:22:54,420 Carl has, of course, spoken to you about his work with Tom Jefferson on the Tamiflu systematic review. 223 00:22:56,130 --> 00:23:02,580 Essentially, that was a huge amount of work, much more than doing a standard systematic review. 224 00:23:02,580 --> 00:23:10,500 For those of you who have done systematic reviews: that was a walk in the park relative to this. I remember coming here last 225 00:23:10,500 --> 00:23:16,410 year and presenting, and then Carl and Tom went off to look at clinical study reports for the rest of the evening. 226 00:23:16,860 --> 00:23:22,740 Very romantic. But essentially, to move on: 227 00:23:22,920 --> 00:23:27,120 I've already mentioned the huge cost of doing a systematic review, 228 00:23:27,900 --> 00:23:34,559 but in some situations a systematic review needs to be more robust than the current Cochrane technique. 229 00:23:34,560 --> 00:23:38,580 That essentially requires the authorisation of a large amount of cost, 230 00:23:39,030 --> 00:23:46,980 and so there needs to be a very good reason for saying yes, let's spend even more resource on looking at this question. 231 00:23:48,420 --> 00:23:54,870 And there are various things we can use. I don't know if you've had a look through systematic reviews, but often the outcomes will be bizarre. 232 00:23:55,140 --> 00:24:02,850 Often the topics will seem equally bizarre. My favourite is music therapy for ingrown toenail, which I think I've actually made up; 233 00:24:03,120 --> 00:24:08,100 I'm getting confused now, but I'm sure there's probably a protocol for that somewhere. 234 00:24:08,520 --> 00:24:16,170 But often you will get a feel from doing a rapid review about whether the outcomes are actually relevant to patients.
235 00:24:17,100 --> 00:24:21,000 I'm sure you're aware of lots of studies where the outcomes are bizarre. 236 00:24:21,300 --> 00:24:29,430 I saw one the other day for e-cigarettes where it was deemed a success if it stopped someone craving a real cigarette for more than 15 minutes. 237 00:24:29,910 --> 00:24:37,200 That just struck me as a bit strange. Of all the outcomes being reported in the research, are they useful to anyone? 238 00:24:37,320 --> 00:24:40,380 Does anyone care? Is the likely effect size 239 00:24:42,420 --> 00:24:48,780 clinically important? Because lots of studies will find very marginal benefits, which are probably not going to have a huge impact on clinical care. 240 00:24:49,560 --> 00:24:53,400 And is there a significant clinical or economic reason for doing it? 241 00:24:54,240 --> 00:25:00,900 Tamiflu is a great example, and I'll talk about Tamiflu a little bit more, 242 00:25:02,250 --> 00:25:06,570 but what we think is that it should be the patients and the clinicians who make the decision. 243 00:25:07,740 --> 00:25:11,340 Patients should be the ones saying: are these outcomes relevant 244 00:25:11,340 --> 00:25:13,440 to me, to my care? 245 00:25:13,620 --> 00:25:19,530 And the clinician can make judgements as well: are the outcomes useful, are the effect sizes clinically important? 246 00:25:20,100 --> 00:25:24,950 If you ever get the chance to see Tom Jefferson talk about Tamiflu, go and see it. 247 00:25:25,230 --> 00:25:29,110 He's a brilliant speaker. He was in the Army, 248 00:25:29,110 --> 00:25:31,850 and I think that gives him a certain presence. Is he still in the Army? 249 00:25:31,860 --> 00:25:36,719 No, he's in Italy now, he's a doctor there, but he was in the Army, 250 00:25:36,720 --> 00:25:40,470 and you can tell: he really commands a room.
251 00:25:41,340 --> 00:25:50,460 And when they did the first Tamiflu review, Tom got asked the question: in your systematic review, 252 00:25:51,240 --> 00:25:58,139 you relied on a previous meta-analysis, which used unpublished trials. 253 00:25:58,140 --> 00:26:02,490 How can you trust it? And his response was: I can't. 254 00:26:03,090 --> 00:26:09,299 And over the course of him wrestling with that concern, they came up with something new: 255 00:26:09,300 --> 00:26:17,580 they basically ripped up the Cochrane Handbook. In the Cochrane systematic review 256 00:26:17,580 --> 00:26:22,380 they haven't used a standard Cochrane methodology, and they've avoided all journal articles. 257 00:26:22,530 --> 00:26:25,410 They didn't touch journal articles when they did the systematic review; 258 00:26:25,770 --> 00:26:29,819 they just relied on the clinical study reports, and they had to do a huge amount of work to 259 00:26:29,820 --> 00:26:35,940 identify those clinical study reports, and they still keep springing up. 260 00:26:36,870 --> 00:26:42,570 It's just been a phenomenal amount of work, as I say, much more than the standard systematic review methods. 261 00:26:42,690 --> 00:26:43,919 So you can't just do it willy-nilly. 262 00:26:43,920 --> 00:26:49,260 You can't just say, I want to do this kind of systematic review; the huge cost means it's just not feasible. 263 00:26:49,260 --> 00:26:54,210 So they've got to be reserved for the special cases where it's really important. In the case of Tamiflu, 264 00:26:55,080 --> 00:27:03,120 the first Cochrane review was a lot less negative than the subsequent ones, 265 00:27:03,720 --> 00:27:07,680 and it came out about the same time as the government spent £500 million on Tamiflu. 266 00:27:09,340 --> 00:27:14,920 Which is, as far as I can tell,
not much better than paracetamol, I think is the quote. 267 00:27:15,630 --> 00:27:20,530 So there's a strong economic case: if the government is thinking of spending £500 million on an intervention, 268 00:27:20,710 --> 00:27:23,140 well, shouldn't we make sure that it's actually worthwhile? 269 00:27:23,290 --> 00:27:29,439 I think that's a fairly good example of when a system like this would work, and it would 270 00:27:29,440 --> 00:27:35,640 use all the data that's available, and it would spend the time trawling through all the regulator data, 271 00:27:35,660 --> 00:27:37,060 knowing what a nightmare that is. 272 00:27:37,510 --> 00:27:43,930 But again, the AllTrials network has been great in getting the European Medicines Agency to be a little more flexible. 273 00:27:44,290 --> 00:27:50,049 There was an agreement to publish all clinical... well, I don't think it's been ratified yet, but the general view is to publish 274 00:27:50,050 --> 00:27:57,400 all clinical study reports, a bit redacted for commercially sensitive material, from next year, possibly, 275 00:27:57,400 --> 00:28:04,300 although they're still keeping hold of the clinical study reports from prior to, well, 2014. 276 00:28:05,890 --> 00:28:08,950 And these star reviews will be the proper gold standard. 277 00:28:09,460 --> 00:28:16,420 Note I use that term because I think most people view systematic reviews, and especially the Cochrane ones, as being the gold standard. 278 00:28:17,170 --> 00:28:22,900 And I hope the evidence I've presented will make you slightly sceptical, which is always good. 279 00:28:23,260 --> 00:28:31,390 We're all scientists here. So in conclusion, I have no doubt that the current system is completely broken. 280 00:28:31,840 --> 00:28:36,250 It's on its last legs. And so we need to move to a new situation. 281 00:28:36,460 --> 00:28:41,440 And we're not saying this is the only answer; this is our suggested method.
And you can discuss it amongst yourselves in a few minutes. 282 00:28:41,920 --> 00:28:51,550 But we think there are two steps; the only way we can think of is: do a rapid review for everything, incorporating all trials, in line with Archie's vision. 283 00:28:52,870 --> 00:28:56,020 And then there is a process where you decide: is it worthwhile? 284 00:28:56,230 --> 00:29:01,210 Are there good reasons why we feel we should spend more money on doing a much more robust review? 285 00:29:01,340 --> 00:29:04,930 But when you do a robust review, you do a proper robust review, 286 00:29:05,200 --> 00:29:10,780 not one that's caught between two stools, as I would characterise Cochrane reviews. 287 00:29:11,260 --> 00:29:20,650 That's all I wanted to say on that. But what I'm now going to challenge you to do is to help us perfect our paper. 288 00:29:21,040 --> 00:29:28,380 It probably needs a lot of work, and what we'd like you to do is answer these questions. Because I think my talk, 289 00:29:28,390 --> 00:29:35,990 well, I'd characterise it as being in two parts. One is rubbishing the current methods of doing systematic reviews, and the second is 290 00:29:35,990 --> 00:29:40,330 a suggested new approach: doing a rapid review, and a more robust method where appropriate. 291 00:29:40,810 --> 00:29:42,430 That's how I characterise it. 292 00:29:42,970 --> 00:29:51,700 What I'd be interested to hear, for the first part, is: were people surprised? Go into groups or talk to your neighbour about this. 293 00:29:52,120 --> 00:29:57,260 Did anything strike you as being particularly surprising, and did you feel I overstated anything? 294 00:29:57,280 --> 00:30:01,600 Was the emphasis wrong? Did I not emphasise any of the criticisms enough? 295 00:30:02,260 --> 00:30:06,940 So I want you to consider those. And also, with the suggested new approach: 296 00:30:07,730 --> 00:30:09,610 have we got it wrong? And I'm really thick-skinned.
297 00:30:09,940 --> 00:30:15,070 I really don't mind if you say that was utter rubbish for the last 30 minutes, and let me go and have my dinner. 298 00:30:15,610 --> 00:30:23,170 And also, if you think it's broadly correct, is there any way we might improve the way we position our reasoning? 299 00:30:24,640 --> 00:30:28,780 So what's the best way to do this? Are people comfortable talking with their neighbours? 300 00:30:28,780 --> 00:30:33,160 I'll give you sort of ten, fifteen minutes, and then I'll start asking you to report back. 301 00:30:34,570 --> 00:30:38,980 Do you want to just find the best way of talking with your neighbours and...