1 00:00:00,060 --> 00:00:07,740 Great pleasure to introduce Dr. Kamal Bhutani, who some of you know already and some of you will get to know because he's the 2 00:00:07,740 --> 00:00:13,410 deputy director of the Centre for Evidence-Based Medicine and a practising GP, 3 00:00:14,990 --> 00:00:20,220 and he's very kindly going to talk to us today on why on earth we waste so much research. 4 00:00:21,180 --> 00:00:26,010 Thanks, Chris. I feel sort of guilty keeping you in when it's such a nice evening. 5 00:00:26,190 --> 00:00:30,960 So we'll try and keep it brief but make the points. 6 00:00:32,010 --> 00:00:38,760 I definitely recognise some faces, and if not, hopefully I'll get a chance to meet you again soon in one of the courses. 7 00:00:38,760 --> 00:00:43,710 So as I said, I'm a GP, I'm deputy director, so a researcher and a tutor as well. 8 00:00:43,920 --> 00:00:46,559 So I thought we'd just spend a little of the evening on this. 9 00:00:46,560 --> 00:00:51,870 Talk a little bit about research, and I know you're doing your study designs module, or some of it, this week as well. 10 00:00:52,740 --> 00:00:53,790 Okay. So a little quiz. 11 00:00:54,150 --> 00:01:03,000 Some of you may know this already because you're well versed in the literature, but how much per year is spent globally on health and medical research? 12 00:01:03,000 --> 00:01:06,479 Roughly, globally? This was the figure from a couple of years ago; 13 00:01:06,480 --> 00:01:11,770 it's probably gone up since then. But have a guess. Globally. 14 00:01:12,370 --> 00:01:15,760 3 billion. Okay. 77 million. 15 00:01:16,570 --> 00:01:20,140 Yep. Okay. Good. Thanks. That's a bit higher. 16 00:01:20,680 --> 00:01:25,390 Yep. And you're going to go higher. Lower than 48. 17 00:01:25,810 --> 00:01:29,140 40 billion. Oh, I got it. Okay. High stakes. Okay. 18 00:01:33,190 --> 00:01:38,770 I think I can see where this is going.
All right. Well, actually, it's about 200 billion. 19 00:01:41,050 --> 00:01:44,080 And in fact, it's probably more than that now. That was correct 20 00:01:44,080 --> 00:01:48,970 as of a few years ago. Okay. So you probably know the answer to the next question. 21 00:01:49,540 --> 00:01:54,760 How much of the research produced is wasted? 90%. 22 00:01:54,850 --> 00:01:58,000 90% or 70? 23 00:01:58,810 --> 00:02:02,560 About 70%. About 70%. 90%. Okay. 24 00:02:02,710 --> 00:02:07,420 All right. Well, I'm going to say it's about halfway between those: about 85%. 25 00:02:07,690 --> 00:02:11,769 So in money terms, over about 170 billion is wasted. 26 00:02:11,770 --> 00:02:14,889 And you can compare that to the GDPs of certain countries. 27 00:02:14,890 --> 00:02:18,130 And as you can see, as you find out, it's a lot more. 28 00:02:18,760 --> 00:02:23,140 In fact, I've written that it's probably a bit more than this. So a lot of research is done, a lot of money is spent. 29 00:02:23,590 --> 00:02:26,829 A lot of research is wasted. Okay. 30 00:02:26,830 --> 00:02:33,040 So where does that figure of 85% come from? 31 00:02:33,100 --> 00:02:36,970 Right. So as a starting point, and I'll talk about this paper a lot, 32 00:02:37,210 --> 00:02:40,900 have a look at the Lancet paper by Ian Chalmers and Paul Glasziou. 33 00:02:41,210 --> 00:02:46,900 And this is where they get this from. So start with 100% of research. 34 00:02:47,380 --> 00:02:57,760 50% is just not published. You never see it. Of the remaining 50%, about half is not reported clearly or accurately enough to use in clinical practice. 35 00:02:58,570 --> 00:03:06,970 Of that remaining quarter, half is flawed through design problems that could have been avoided. 36 00:03:07,270 --> 00:03:10,740 Therefore, it limits your application. So that just leaves about 13%.
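As a quick aside, that cascade is just repeated halving, and the arithmetic can be checked in a few lines. This is a minimal sketch of the Chalmers and Glasziou figures, not something from the talk itself:

```python
# Chalmers and Glasziou's waste cascade: each stage roughly halves
# what remains usable.
total = 1.0
published = total * 0.5        # about 50% of studies are never published
clear = published * 0.5        # half of those published are not reported usably
well_designed = clear * 0.5    # half of the rest have avoidable design flaws

print(f"usable: {well_designed:.1%}")      # 12.5%, the "about 13%" in the talk
print(f"wasted: {1 - well_designed:.1%}")  # 87.5%, close to the 85% headline
```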
37 00:03:10,750 --> 00:03:16,390 So not even 15%, about 13%. So there's a big problem in science and health research. 38 00:03:16,600 --> 00:03:20,620 So it's a really good paper. And as I said, I will draw on it. And here you are. Here is the paper. 39 00:03:21,160 --> 00:03:27,370 So they set out a sort of stepwise guide to the sources of research waste, 40 00:03:27,700 --> 00:03:35,590 and they go on to discuss where things could be improved and resources targeted to prevent that waste. 41 00:03:36,010 --> 00:03:40,510 So a great paper if you haven't read it. Really, really interesting. And I'll draw on it, as I said, through the talk today. 42 00:03:41,110 --> 00:03:49,900 Okay. Why does it matter? Well, we end up producing, or not producing, research that could help patients and clinicians. 43 00:03:51,940 --> 00:03:56,680 There are examples, and I'll give you one today, of research about potential harms that are not fully revealed. 44 00:03:58,060 --> 00:04:04,120 And of course, we waste valuable resources in an ever-dwindling environment of resources. 45 00:04:06,100 --> 00:04:09,500 Okay. So why is research wasted? Point number one: 46 00:04:09,760 --> 00:04:15,220 choosing the wrong question for the research. You want a question, and a study design, that's particularly relevant. 47 00:04:16,660 --> 00:04:27,160 Okay. A few years ago in the UK, Sir David Cooksey was asked to do a review of UK research funding, and it's an exhaustive review. 48 00:04:27,220 --> 00:04:33,930 The Cooksey report, very famous, and he identified effectively two gaps in research. 49 00:04:33,940 --> 00:04:37,720 Anybody know this?
So if you look at the research pathway, if you like, 50 00:04:37,990 --> 00:04:45,940 from basic research, or bench science, working in the laboratory, to prototype discovery and design and so on, clinical trials, 51 00:04:46,360 --> 00:04:51,040 health technology assessment, health services research, where it's starting to involve patients, 52 00:04:51,310 --> 00:04:54,670 knowledge management, and finally actual health care delivery at the coalface. 53 00:04:55,120 --> 00:05:01,720 This pathway from basic research to practice, if you like, identified two gaps. 54 00:05:02,440 --> 00:05:05,470 The first gap is in translation from this sort of area, 55 00:05:05,740 --> 00:05:10,330 so the discovery phase and early clinical trials, to the next stage of late clinical trials. 56 00:05:10,540 --> 00:05:15,040 And the second gap is here, from technology assessment to actually getting it into practice. 57 00:05:15,700 --> 00:05:22,720 Okay. So there were two gaps in particular, but this sort of area, basic research, was fine. 58 00:05:24,550 --> 00:05:31,540 But if you look, where does the funding actually go? And again, this draws on Chalmers and Glasziou. 59 00:05:32,830 --> 00:05:37,780 Look: pure basic research receives by far the majority of funding. 60 00:05:38,440 --> 00:05:39,860 So this is pure basic research. 61 00:05:39,860 --> 00:05:46,240 So if we're talking about this sort of research here, when we start thinking about putting it into practice, we think about applied research down this end. 62 00:05:46,840 --> 00:05:53,440 And you'll see the gaps are more towards that end, yet this end gets the majority of funding. 63 00:05:53,890 --> 00:06:00,520 So they did an audit in 2004, in the mid-2000s, and found about 68.3% of funding goes to basic research. 64 00:06:01,690 --> 00:06:04,210 Slightly less later on in 2009. 65 00:06:04,450 --> 00:06:10,180 But when you compare it to pure applied research, although it's going up, it's still a minor proportion of the funding.
66 00:06:11,330 --> 00:06:15,440 And actually, why does it matter? I mean, I'm an applied researcher, so it matters to me. 67 00:06:16,220 --> 00:06:22,340 But why does it matter? Well, when they review the impact of research, and this is just one example I'll talk through, 68 00:06:22,970 --> 00:06:26,070 from basic science through to patient care, 69 00:06:26,150 --> 00:06:29,480 to direct patient care, you know what matters, what makes a difference. 70 00:06:29,750 --> 00:06:37,400 This is just one example. Again, an exhaustive review looking at mental health, tracking examples of basic research in mental health 71 00:06:37,670 --> 00:06:43,100 and how it has been transferred and transformed, or not, into applied research and actual patient care. 72 00:06:44,390 --> 00:06:46,340 And so they came up with a number of conclusions, this being one of them: 73 00:06:46,340 --> 00:06:51,620 despite significant advances in the biomedical understanding of mental health and brain function, 74 00:06:51,860 --> 00:06:55,459 these are yet to have much practical impact on the diagnosis and treatment of schizophrenia. 75 00:06:55,460 --> 00:07:00,860 So they use schizophrenia as an example, and this is actually from 2003. 76 00:07:01,640 --> 00:07:07,910 And so one of the overarching conclusions, drawn from a number of case studies, was that clinical research, 77 00:07:08,090 --> 00:07:11,330 so the more applied in nature, has a larger impact on patient care 78 00:07:11,540 --> 00:07:16,520 than the more basic research, and this analyses over 20 years' worth of data and research. 79 00:07:17,180 --> 00:07:20,870 So we need to focus on clinical research more than we do on biomedical. 80 00:07:20,870 --> 00:07:24,679 And I used to be a biomedical researcher before I became an applied researcher, 81 00:07:24,680 --> 00:07:28,370 so I understand the field to a certain degree.
That's not to say that there isn't value there, 82 00:07:28,910 --> 00:07:35,060 but perhaps one of the reasons is we're not channelling funding into research that really matters to patients. 83 00:07:35,780 --> 00:07:40,970 Okay. Problems with external validity. So now we are thinking about applied research. 84 00:07:41,240 --> 00:07:46,700 So we're doing clinical trials, we're doing studies on patients, but we've got problems with external validity. 85 00:07:47,060 --> 00:07:50,150 So is everybody comfortable with the term external validity? Yep. 86 00:07:50,570 --> 00:07:56,330 So: do the results of this study apply to my patient, or my population of patients? 87 00:07:56,630 --> 00:08:02,480 That's an issue. And when you look through randomised controlled trials, this is just one review of them. 88 00:08:04,280 --> 00:08:11,300 They took a sample of randomised controlled trials in cardiology, mental health and oncology. 89 00:08:11,870 --> 00:08:16,910 How would the patients included in those trials map to real-world patients, 90 00:08:18,030 --> 00:08:25,790 the real-world populations? And in fact, one of the conclusions was that in 71.32% of the studies included in this review, 91 00:08:26,480 --> 00:08:31,639 the individual study authors concluded that RCT samples were not representative of patients encountered in clinical 92 00:08:31,640 --> 00:08:37,460 practice, and/or that population differences may have a relevant impact on the external validity of the RCT findings. 93 00:08:38,150 --> 00:08:40,969 So even when doing applied research, 94 00:08:40,970 --> 00:08:47,330 we're excluding really important patients that the findings may apply to. And here are some of the recommendations. 95 00:08:47,690 --> 00:08:51,200 So that paper talks about ways of managing external validity.
96 00:08:51,350 --> 00:08:57,300 So for example, they suggest broadening the RCT inclusion and exclusion criteria, but we're all a bit edgy about that. 97 00:08:57,330 --> 00:09:06,530 Well, a lot of people are edgy about doing that because they think, well, it might dilute my chance of getting an effect, an effect size. 98 00:09:08,090 --> 00:09:12,110 It's something to think about really carefully when we design research. Okay. 99 00:09:12,860 --> 00:09:15,650 Why else? Doing studies that are unnecessary or poorly designed. 100 00:09:19,600 --> 00:09:26,320 This point lends itself to the question: why on earth are we doing research when the answer is probably out there already? 101 00:09:27,430 --> 00:09:33,430 And this also highlights, and this is what Ian Chalmers and Paul Glasziou do, the value of systematic reviews. 102 00:09:33,760 --> 00:09:38,360 And I'm a big advocate of systematic reviews. Does everyone know what a systematic review is? 103 00:09:38,380 --> 00:09:45,910 Yeah. So a systematic review seeks to search for all available evidence on a given topic and summarise the results in a usable format. 104 00:09:47,440 --> 00:09:54,520 And there are numerous examples where a failure to conduct timely systematic reviews wastes resources and harms patients. 105 00:09:56,900 --> 00:10:00,290 Everybody knows the story? Yeah. Oh, no. 106 00:10:00,770 --> 00:10:10,400 Okay. Do you mind if I tell you? So, my son is two and a half. 107 00:10:10,850 --> 00:10:17,780 And one thing I learned when he was born was that there are lots of baby books out there, you know, how to be a parent and all this, etc. 108 00:10:18,110 --> 00:10:21,560 But if he'd been born in the 1940s or fifties, we would have got this book. 109 00:10:22,010 --> 00:10:27,709 Okay. Dr. Spock's Baby and Child Care. It sold over 90 million copies. 110 00:10:27,710 --> 00:10:32,750 In fact, it doesn't quite fit on the slide, but it's sold more than that.
And I think it's the second best-selling book, coming in after the Bible. 111 00:10:34,280 --> 00:10:38,150 So Dr. Spock got a lot of things right in his advice, 112 00:10:38,330 --> 00:10:44,210 but one of the things he didn't get right was his advice about how to put your baby to sleep. 113 00:10:44,840 --> 00:10:49,220 So he used mechanistic reasoning: there are disadvantages to a baby sleeping on his back. 114 00:10:49,370 --> 00:10:55,460 If he vomits, he's more likely to choke on the vomit. I think it's preferable to accustom a baby to sleeping on his stomach from the start. 115 00:10:56,030 --> 00:11:01,250 So if you think about how many copies this was selling, and still sells, you can think about the impact of that sort of advice. 116 00:11:02,450 --> 00:11:08,450 And another thing: if you're not familiar with Testing Treatments, it's available freely as a PDF book or to buy. 117 00:11:08,600 --> 00:11:14,540 It's a great book with seven examples in it. But if you look at the timeline of what happened with Dr. Spock: he put out an edition, 118 00:11:14,690 --> 00:11:18,440 and in that edition he talked about putting babies on their backs. 119 00:11:19,220 --> 00:11:22,880 In this edition, he switched, changed his mind, and said babies should be put on their front. 120 00:11:23,330 --> 00:11:27,049 And if you look at the timeline of what happened: a first study suggests harm. 121 00:11:27,050 --> 00:11:30,110 A second study suggests harm. By the mid-1980s, 122 00:11:30,110 --> 00:11:38,780 three further studies suggest harm. Campaigns started to be launched to advise parents to put their babies back on their backs, 123 00:11:39,020 --> 00:11:45,710 Back to Sleep being one of them. And in fact, it wasn't until the mid-2000s that the first systematic review was published about this. 124 00:11:47,410 --> 00:11:48,879 And in fact, this is what the systematic review showed.
125 00:11:48,880 --> 00:11:54,130 So I'll talk very briefly, but these are all the studies that were done exploring this. 126 00:11:54,130 --> 00:12:00,750 These are all observational studies. And actually, if you pool them in a meta-analysis, you can combine the results. 127 00:12:01,450 --> 00:12:09,970 Yep. So if you look at the summary results here: over a fourfold increase in cot deaths from putting the babies on their front. 128 00:12:11,260 --> 00:12:15,160 But then you can reanalyse that using what they call cumulative meta-analysis. 129 00:12:15,580 --> 00:12:18,600 So what you've got here are the same studies, Carpenter, 130 00:12:18,610 --> 00:12:25,840 Carpenter, but what they've done is effectively said: well, when the next study came out, say this study in 1970, 131 00:12:26,140 --> 00:12:29,560 let's add its data to what came before and come up with a new summary statistic. 132 00:12:30,010 --> 00:12:37,060 And they did that each time, so each time your confidence intervals are getting narrower and you're building up a picture. 133 00:12:38,770 --> 00:12:43,090 So my question is: given that this is the line of no effect, 134 00:12:43,420 --> 00:12:47,560 statistically significant on either side, at what point would you have known the result? 135 00:12:50,390 --> 00:12:56,990 Yeah. So by 1970, we should have known that putting babies on their front is bad. 136 00:12:57,890 --> 00:13:02,210 And yet all these studies were done thereafter, and the advice continued. 137 00:13:02,720 --> 00:13:05,990 So that's by about 1970. 138 00:13:07,940 --> 00:13:13,010 Another example: the story of Vioxx. Some of you may have heard it. 139 00:13:13,160 --> 00:13:19,460 Okay. So Vioxx was originally marketed as an anti-inflammatory drug, 140 00:13:19,730 --> 00:13:22,820 so for arthritic pain, and basically pain relief.
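To make the cumulative meta-analysis idea concrete before the Vioxx story continues, here is a toy sketch in Python. The study numbers are invented for illustration (they are not the actual cot-death or Vioxx data), and it uses standard fixed-effect inverse-variance pooling of log odds ratios:

```python
import math

# Hypothetical studies as (log odds ratio, standard error); invented
# numbers purely for illustration.
studies = [(math.log(4.0), 0.9), (math.log(3.5), 0.6), (math.log(4.5), 0.4)]

def cumulative_meta(studies):
    """Fixed-effect inverse-variance pooling, adding one study at a time."""
    results = []
    for k in range(1, len(studies) + 1):
        weights = [1 / se**2 for _, se in studies[:k]]
        pooled = sum(w * lor for w, (lor, _) in zip(weights, studies[:k])) / sum(weights)
        se_pooled = math.sqrt(1 / sum(weights))
        lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
        # Report on the odds-ratio scale; 1.0 is the line of no effect.
        results.append((math.exp(pooled), math.exp(lo), math.exp(hi)))
    return results

for i, (or_, lo, hi) in enumerate(cumulative_meta(studies), 1):
    print(f"after study {i}: OR {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

Each added study narrows the confidence interval, and once the lower bound clears 1.0 the harm is, statistically speaking, already known; that is the point at which further trials stop being justified.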
141 00:13:23,360 --> 00:13:29,090 And some of the advertising was quite emotive: sometimes you can improve two lives with a single prescription. 142 00:13:29,750 --> 00:13:32,990 Very nice. Sometimes you can ruin lives with a single prescription. 143 00:13:33,740 --> 00:13:36,770 Vioxx is an interesting story because, years after 144 00:13:36,770 --> 00:13:41,840 we understood what cumulative meta-analysis is and the value of systematic reviews, we were still having problems. 145 00:13:42,500 --> 00:13:46,700 So this is now in the 2000s. It was withdrawn from the market in 2004 146 00:13:46,940 --> 00:13:51,290 over concerns about increased cardiovascular risk, particularly heart attacks. 147 00:13:52,310 --> 00:14:01,490 And then after its withdrawal, a systematic review looked back at all the studies published by the manufacturer and asked: 148 00:14:01,490 --> 00:14:06,590 actually, at what point should trials of rofecoxib, Vioxx, have been stopped? 149 00:14:06,620 --> 00:14:10,460 So again, it's a cumulative meta-analysis. So this is the year here. 150 00:14:11,030 --> 00:14:16,970 And as you go from 97, each one is a trial, adding its results to those of the previous trials, and so on. 151 00:14:17,810 --> 00:14:23,090 So when would you have known Vioxx was harmful? 152 00:14:24,010 --> 00:14:27,760 2000? Around 2000. Yeah, you're right. What about here? 153 00:14:28,660 --> 00:14:33,040 Well, you can see a trend emerging. So all these trials, why on earth were they done? 154 00:14:36,010 --> 00:14:42,250 Okay, so that's a lack of understanding about the value of systematic reviews and not doing them in a timely way. 155 00:14:43,120 --> 00:14:48,430 Okay. But equally, I was going to say, isn't it concerning that those trials were done, showing that? 156 00:14:51,540 --> 00:14:56,700 And all of them were not statistically significant as well?
157 00:14:56,980 --> 00:15:03,450 Well, yeah, I mean, there were flaws with some of these designs as well. 158 00:15:03,750 --> 00:15:08,460 So each tried to learn from the last. But the point is, by this point, they should have stopped doing any trials. 159 00:15:09,870 --> 00:15:13,290 All this data here was unnecessary: randomising people to harm. 160 00:15:14,250 --> 00:15:17,700 That's the question, because it looks like, I guess, that second set of trials, 161 00:15:17,700 --> 00:15:23,880 is it because it's so large that you don't see such a huge change relative to the others, where it's kind of flip-flopping back and forth? 162 00:15:23,910 --> 00:15:27,120 Which one? This, or this one from there to there? Yep. Yeah. 163 00:15:27,130 --> 00:15:34,950 And it could be the size. So suddenly you get a better-powered study and you see a true effect, or a reflection of a true effect. 164 00:15:35,190 --> 00:15:38,280 Yeah. Why did they continue? What's the reasoning? 165 00:15:38,580 --> 00:15:41,940 Why did they continue? Market forces. 166 00:15:43,080 --> 00:15:46,290 If you want to be cynical, to prove a point. 167 00:15:48,400 --> 00:15:51,900 I guess you could make an argument, not knowing the design flaws of all 168 00:15:51,900 --> 00:15:56,820 of these, that maybe that one trial, the second one of 2000, 169 00:15:56,820 --> 00:16:01,420 was the study that caused things to shift in the wrong direction. 170 00:16:01,470 --> 00:16:12,080 And so maybe one more was actually worthwhile, to show that trend continuing as the cumulative studies reached 13,000 patients. 171 00:16:12,360 --> 00:16:17,120 No, but I think what you're saying is there's a fair jump from 51932. 172 00:16:17,350 --> 00:16:20,520 Yes. Yep. Okay. 173 00:16:21,540 --> 00:16:24,960 So remember I told you 50% of trials are not published.
174 00:16:25,590 --> 00:16:33,960 So: failure to publish relevant research promptly, or at all. And this is talking about bias, one in particular being reporting bias. 175 00:16:34,260 --> 00:16:39,990 Is everybody familiar with the different types of reporting bias? I'll just talk you through some of them anyway. 176 00:16:41,080 --> 00:16:46,050 So, great resource, the Cochrane Handbook. There's a beautiful table in there on types of reporting bias. 177 00:16:46,470 --> 00:16:55,020 Publication bias: the publication or non-publication of research findings, depending on the nature and direction of the results. Time-lag bias: 178 00:16:55,680 --> 00:17:01,889 rapid or delayed publication based on the type of result you get, and so on. 179 00:17:01,890 --> 00:17:02,940 And I can go through a couple more. 180 00:17:03,150 --> 00:17:08,820 Multiple publication bias: multiple or single publication of research findings, depending on the nature and direction of the results. 181 00:17:09,000 --> 00:17:14,340 And you see this. I've seen that recently with some of the trials around anticoagulants, 182 00:17:14,340 --> 00:17:18,750 you know, chopping and slicing data into different publications 183 00:17:19,350 --> 00:17:25,480 from the same trial. It just adds to the confusion. Location bias, citation bias, outcome reporting bias, 184 00:17:25,500 --> 00:17:33,000 and I want to talk about this later: selective reporting of some outcomes but not others, depending on the nature and direction of the results. 185 00:17:33,100 --> 00:17:36,120 Altogether, it's all very, very common. 186 00:17:36,120 --> 00:17:41,759 I mean, even today. Did anybody see the report about some new dietary recommendations in the news today? 187 00:17:41,760 --> 00:17:45,060 No? You've been sitting in a classroom, which is great.
188 00:17:46,110 --> 00:17:49,940 But it's been on the news today: some collaboration talking about new dietary advice. 189 00:17:49,960 --> 00:17:54,960 They produced a document which they called a guideline, and in fact it wasn't, and it's a classic example of citation bias. 190 00:17:55,320 --> 00:18:00,270 They just cherry-picked various citations, put them together in a document and press-released it. 191 00:18:00,270 --> 00:18:04,940 Was this the one about fat? Yeah, yeah. And carbs. 192 00:18:05,190 --> 00:18:09,570 So it's an example of citation bias. There was no systematic way that they got the data. 193 00:18:10,380 --> 00:18:17,670 Okay. So if you want to know more, say when you're quoting something in any of your assignments or theses, this is a great paper to know. 194 00:18:17,820 --> 00:18:18,750 In fact, it's been cut off on the slide, 195 00:18:18,750 --> 00:18:25,889 but you're welcome to have the slides. It's heavily cited, and it basically looks systematically at all of the issues around 196 00:18:25,890 --> 00:18:34,860 publication bias, and it found empirical evidence in almost all of the areas I gave you as examples. 197 00:18:35,340 --> 00:18:39,780 So a good research point to look at. 198 00:18:40,080 --> 00:18:44,550 Dissemination of research findings is likely to be a biased process: the overall conclusion of a massive review. 199 00:18:46,500 --> 00:18:48,000 And when you look at some of the examples, 200 00:18:48,000 --> 00:18:55,780 this is one of the previous systematic reviews, where they looked at publication bias in clinical trials due to the direction of effect, 201 00:18:55,800 --> 00:19:02,790 whether the results were positive or negative, published in the Cochrane Library as a Cochrane Review. 202 00:19:03,000 --> 00:19:06,479 They examined the association between publication and statistical significance or direction.
203 00:19:06,480 --> 00:19:07,620 The results were examined: 204 00:19:07,950 --> 00:19:14,430 trials with positive findings were more likely, about four times more likely, to be published than trials with negative or null findings. 205 00:19:15,030 --> 00:19:21,240 Positive findings tended to be published after 4 to 5 years, negative findings after 6 to 8 years. 206 00:19:21,300 --> 00:19:26,580 So there's that example of the delay. Okay. 207 00:19:27,090 --> 00:19:31,260 Biased or unusable reports: another reason why research is wasted. 208 00:19:33,120 --> 00:19:40,110 So this is about the publication of data and reports that we simply cannot use in clinical practice. 209 00:19:41,040 --> 00:19:44,460 Okay. Who knows the story of Study 329? 210 00:19:45,930 --> 00:19:52,770 Okay. Well, I'll tell you the story. So Study 329 refers to the study report, 211 00:19:53,400 --> 00:20:04,560 and this is the publication from that study report. It was a clinical trial sponsored by GSK of the drug paroxetine. 212 00:20:05,580 --> 00:20:15,900 And it was looking specifically at its use as an antidepressant versus placebo in adolescents with major depression. 213 00:20:16,950 --> 00:20:19,950 So quite an important area. A very important area clinically. 214 00:20:23,410 --> 00:20:28,030 The publication is here, published in 2001. 215 00:20:28,030 --> 00:20:33,730 So, you know, this is not years and years ago. This is within reasonable memory. 216 00:20:34,420 --> 00:20:39,819 So in 2001 the report was published, and it presented four outcomes, all favouring paroxetine, 217 00:20:39,820 --> 00:20:45,190 all showing paroxetine works for this, this, this and this. As I said, 218 00:20:45,190 --> 00:20:50,020 the trial was sponsored by the manufacturer, and the report contained the words 219 00:20:50,200 --> 00:20:54,580 that the drug was generally well tolerated and effective. And on face value,
220 00:20:54,700 --> 00:20:58,160 okay, fine. If you were to appraise that: yep, 221 00:20:58,180 --> 00:21:03,639 it seems a reasonable trial, done fairly well. And two million, 222 00:21:03,640 --> 00:21:09,030 I mean, this is the sort of impact: two million prescriptions issued for children and adolescents in 2002 alone. 223 00:21:09,040 --> 00:21:12,010 So that just goes to show the widespread use of this particular drug. 224 00:21:13,180 --> 00:21:19,570 But when we critically appraise, we're only really critically appraising the data that we're seeing. 225 00:21:19,750 --> 00:21:25,780 And if we're appraising, you know, a PDF version of that, we're only appraising the data we see. 226 00:21:25,810 --> 00:21:30,430 What about the data that we don't see? And this is a great example, or a bad example, if you like. 227 00:21:31,060 --> 00:21:34,090 So this is an example of selective reporting. 228 00:21:34,270 --> 00:21:37,360 So I told you that they reported four outcomes, all favouring paroxetine. 229 00:21:38,410 --> 00:21:46,210 If you go back in time and look at the actual study protocol for that trial, the study titled 329, and the study data, 230 00:21:46,780 --> 00:21:48,670 they actually said not four outcomes: 231 00:21:49,450 --> 00:21:55,900 they actually stipulated that they'd have two primary outcomes and six secondary outcomes in this trial. 232 00:21:56,740 --> 00:22:04,750 And what they also did, which they sort of didn't fully elucidate, was say that they were going to measure 19 additional outcomes in the study. 233 00:22:06,220 --> 00:22:12,670 Now, of the two primary outcomes and the six secondary outcomes, none of them showed favourable results. 234 00:22:14,810 --> 00:22:25,650 And of the 19 remaining items that they just happened to measure, all but four showed no favourable result. 235 00:22:26,490 --> 00:22:33,090 So the four that did ended up getting into the paper.
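Those 19 "additional" outcomes are exactly where multiplicity bites. A quick simulation, with invented numbers rather than Study 329's actual data, shows how often chance alone hands you at least one "significant" outcome when you measure 19:

```python
import random

random.seed(1)

# Under a null effect, each outcome still has a 5% chance of crossing
# p < 0.05. Measure 19 outcomes and the odds of at least one "hit" soar.
def n_false_positives(n_outcomes, alpha=0.05):
    return sum(random.random() < alpha for _ in range(n_outcomes))

trials = 10_000
at_least_one = sum(n_false_positives(19) >= 1 for _ in range(trials)) / trials
print(f"P(at least one 'significant' outcome of 19): {at_least_one:.2f}")
# Analytically this is 1 - 0.95**19, roughly 0.62.
```

So with 19 outcomes on the table, a drug that does nothing still has better-than-even odds of producing something publishable, which is why reporting only the outcomes that "worked" is so misleading.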
236 00:22:35,780 --> 00:22:41,570 So the pre-specified primary and secondary outcomes disappeared. 237 00:22:44,730 --> 00:22:53,910 And as I said, it wasn't until 2015 that the full data analysis was properly put together, 238 00:22:54,090 --> 00:22:59,579 in this article in the BMJ, Restoring Study 329, on the harms of paroxetine. They 239 00:22:59,580 --> 00:23:03,810 also looked at another antidepressant, imipramine, in the treatment of major depression. 240 00:23:05,640 --> 00:23:09,120 It concluded that neither paroxetine nor high-dose imipramine showed efficacy for major 241 00:23:09,120 --> 00:23:13,320 depression in adolescents, and there was an increase in harms with both drugs: 242 00:23:14,070 --> 00:23:20,220 a clinically significant increase in harms, including suicidal ideation and behaviour and other serious adverse events in the paroxetine group. 243 00:23:21,120 --> 00:23:30,960 So again, all because of selective outcome reporting. Right. 244 00:23:31,170 --> 00:23:34,050 So another thing. We talked about selective outcome switching. 245 00:23:35,310 --> 00:23:39,870 Another issue is the way that studies are designed, and the flaws in their design. 246 00:23:40,380 --> 00:23:45,150 So this, again, is a really nice paper looking empirically at that. 247 00:23:45,390 --> 00:23:47,010 And again, I'll put the link there if you're interested. 248 00:23:47,300 --> 00:23:54,030 So it looks at flaws in design, conduct and analysis in RCTs and the systematic reviews that include those RCTs, 249 00:23:54,990 --> 00:24:01,380 and basically the effect of that: potentially erroneous conclusions with serious consequences for patients. 250 00:24:02,070 --> 00:24:10,680 And what did they find when they did this empirical work? So they searched for sources of bias from poor study design within a defined sample of studies. 251 00:24:11,670 --> 00:24:15,720 And they identified nearly 1,300 trials.
252 00:24:16,650 --> 00:24:22,950 And of these, about 43% had at least one domain at high risk of bias, i.e. from poor methods. 253 00:24:23,460 --> 00:24:29,850 And they went on to explore the reasons and found that much of this, if not all of it, could have been avoided. 254 00:24:30,930 --> 00:24:34,170 So people are still producing research that is flawed in its design, 255 00:24:34,410 --> 00:24:39,420 which could have been prevented, and which affects our ability to use, or potentially use, that research. 256 00:24:39,720 --> 00:24:46,080 And we should have learned by now, from examples like Study 329, about the importance of actually 257 00:24:46,080 --> 00:24:49,290 saying we're going to do something, then actually doing it and presenting what we've done. 258 00:24:51,060 --> 00:24:54,390 Okay. There must be some good news, surely. 259 00:24:54,960 --> 00:24:58,710 So I thought I'd just talk to you about some of the good news that's out there as well. 260 00:24:59,520 --> 00:25:02,580 And there are changes, and I'm sure you'll know about some of these already. 261 00:25:03,360 --> 00:25:07,490 Okay. So I'll give you a few examples of what the NHS is doing. 262 00:25:07,500 --> 00:25:13,890 So a lot of our funding comes from the National Institute for Health Research, which is the research arm of the NHS. 263 00:25:13,950 --> 00:25:17,280 Some of you may have funding from them as well and be very familiar with them. 264 00:25:18,150 --> 00:25:22,320 And again, their aim is to improve the health and wealth of the nation through research. 265 00:25:22,320 --> 00:25:25,490 In fact, they've just celebrated their ten-year anniversary. 266 00:25:25,500 --> 00:25:28,170 They've been around for ten years now, and it feels like they've been around for a lot 267 00:25:28,170 --> 00:25:32,670 longer, given the amount of development that's come from their work, 268 00:25:32,820 --> 00:25:36,720 their support.
Just to give you a bit of background. 269 00:25:37,110 --> 00:25:43,440 So in the 2014/15 annual report, they spent about a billion, isn't it, about a billion? 270 00:25:43,950 --> 00:25:51,509 And you can see how they break it down. They spend a lot on building capacity through training, individual programmes and infrastructure. 271 00:25:51,510 --> 00:25:58,650 So that's big, big grants for, you know, setting up units, research schools and so on. 272 00:25:59,160 --> 00:26:02,010 And then you can see the sort of areas that they spend their money in. 273 00:26:02,460 --> 00:26:07,650 So cancer, mental health, which is important to see, that mental health is well recognised, and so on. 274 00:26:08,190 --> 00:26:16,710 So they spend a lot, and they're reflective of needs as well, or they try to be, certainly. 275 00:26:16,830 --> 00:26:23,940 But they also take on what we've been talking about: the value and the importance of research and adding value to research. 276 00:26:24,240 --> 00:26:31,320 So building on what Iain Chalmers and Paul Glasziou originally put forward in 2009 in The Lancet, 277 00:26:31,530 --> 00:26:40,409 they have a whole section on ensuring that every research project that goes through them meets at least most, 278 00:26:40,410 --> 00:26:44,129 if not all, of these pillars for adding value. And I thought that's quite nice. 279 00:26:44,130 --> 00:26:45,240 So when you're designing 280 00:26:45,240 --> 00:26:51,270 research yourself, either for an assignment or when you are a principal investigator yourself, if you're not already, 281 00:26:51,540 --> 00:26:54,989 this is actually a great resource to have a look at, to kind of think about: actually, 282 00:26:54,990 --> 00:26:58,110 if I want to go for funding, you know, I've got to make the case for my research. 283 00:26:58,260 --> 00:27:02,130 It needs to be of value.
So they've got some great resources there as well. 284 00:27:03,720 --> 00:27:07,590 And actually, you can see there the sort of mindset we were talking about right at the start: 285 00:27:07,590 --> 00:27:12,090 the value of basic research and that translational gap to applied research. 286 00:27:12,450 --> 00:27:16,379 So if you look just at the top line here, it runs from invention, 287 00:27:16,380 --> 00:27:23,280 so we're talking about basic research here, to evaluation, and then to adoption and then diffusion in health systems. 288 00:27:23,280 --> 00:27:31,679 Right from the basic research, you can see where the funding is now coming from and how much the National Institute for Health Research focuses on 289 00:27:31,680 --> 00:27:34,350 bridging that gap that we talked about right at the start. 290 00:27:34,530 --> 00:27:40,620 So the Medical Research Council in the UK is focused mostly on basic research and developmental research, 291 00:27:40,710 --> 00:27:47,130 so innovation in basic sciences, and then they sort of have this joint funding stream where they both fund this, 292 00:27:47,370 --> 00:27:50,399 where they start to evaluate mechanisms, and so on. 293 00:27:50,400 --> 00:27:56,910 These are the big units and the different programmes, right through, as you see here, to commissioning and patient care. 294 00:27:57,160 --> 00:28:01,170 So this kind of gives you the way that things are changing for the better. 295 00:28:03,570 --> 00:28:08,520 Another important point, and we talked about this, is about doing research that really matters. 296 00:28:09,390 --> 00:28:14,040 And this is something that the NIHR picked up on as well. So are the questions being researched the most important to patients, 297 00:28:14,040 --> 00:28:19,530 the public and clinicians as well? Has anyone heard of the James Lind Alliance? 298 00:28:20,370 --> 00:28:27,450 No. Okay.
So it was set up by Iain Chalmers, who was involved with setting up the Cochrane Library. 299 00:28:27,900 --> 00:28:35,400 In the 2000s he set up the James Lind Alliance, which has subsequently been adopted by the NIHR, 300 00:28:35,430 --> 00:28:41,430 the National Institute for Health Research, as a way of getting a more formalised process for setting research 301 00:28:41,430 --> 00:28:49,080 questions for the research community like us that actually stem from issues that really matter to patients. 302 00:28:51,620 --> 00:28:56,179 So one of the main innovations was the idea of setting up what they call priority setting 303 00:28:56,180 --> 00:29:01,250 partnerships, which specifically focus on bringing patients, carers and clinicians 304 00:29:01,610 --> 00:29:05,870 together as groups. And they discuss and identify their different uncertainties. 305 00:29:06,260 --> 00:29:11,479 It's more than just discussion. It's a sort of formalised process to identify, in certain clinical conditions, 306 00:29:11,480 --> 00:29:15,860 where the uncertainties are, the uncertainties in practice or in care. 307 00:29:17,030 --> 00:29:20,479 And the aim is to ensure that those who fund health research, like the NIHR, 308 00:29:20,480 --> 00:29:24,710 and all those who apply for research funding, actually know what really matters to patients. 309 00:29:25,790 --> 00:29:29,329 And just as an example: they're increasingly starting to publish their process. 310 00:29:29,330 --> 00:29:36,350 So if you are interested in patient and public involvement in research and its value, it's worth looking at some of the work that they're doing. 311 00:29:37,040 --> 00:29:46,070 So here's an example. This was published in BMJ Open, around setting research priorities for patients undergoing shoulder surgery. 312 00:29:46,790 --> 00:29:53,779 And this sort of talks about the process.
So they send out a survey, gather the responses, and put them together 313 00:29:53,780 --> 00:29:59,209 to see which uncertainties could be taken forward to the next stage. 314 00:29:59,210 --> 00:30:02,660 So they identify those that are out of scope and those that are relevant. 315 00:30:02,960 --> 00:30:06,050 It's quite a systematic process. They remove duplications, 316 00:30:06,530 --> 00:30:13,970 compare the uncertainties brought forward against the existing evidence, and actually identify from that where there are evidence gaps. 317 00:30:14,600 --> 00:30:18,709 They take that to the next level, whereby they'll have a bit more of a roundtable meeting, 318 00:30:18,710 --> 00:30:25,910 a steering group reviewing the results of the above, and then finalising a set of core research priorities in a particular area. 319 00:30:26,060 --> 00:30:29,600 And that's just for patients needing shoulder surgery. 320 00:30:29,990 --> 00:30:37,040 And this is an example. So what they then come up with, what they call the top ten, are the research priorities from this sort of process. 321 00:30:38,180 --> 00:30:41,899 And I won't read it all for you, but you can just sort of see 322 00:30:41,900 --> 00:30:47,240 some of the things we're doing where there actually is an evidence base, or where it's a question that matters to patients 323 00:30:47,240 --> 00:30:51,740 but there isn't an evidence base there. So, for example, let's have a look. 324 00:30:51,800 --> 00:30:52,160 Here we go. 325 00:30:52,190 --> 00:30:59,600 Are all patients, including older age groups, with rotator cuff tendinopathy, that's a tear in your shoulder, best treated with surgery or physiotherapy? 326 00:31:00,590 --> 00:31:07,610 We don't have the answer. There is no answer out there. And yet both of those are in clinical practice, but there's uncertainty.
327 00:31:07,790 --> 00:31:11,929 So it's just highlighting the method that they use to identify these research priorities. 328 00:31:11,930 --> 00:31:18,379 So here's a top tip. If you're a researcher interested in identifying areas, go and have a look at their website, 329 00:31:18,380 --> 00:31:23,250 because of the richness of uncertain, unanswered questions for you to explore. 330 00:31:23,810 --> 00:31:31,460 And again, they're highlighted by the fact that they have been through a process that matters to patients. 331 00:31:32,360 --> 00:31:34,420 That said, there's still room for improvement. 332 00:31:34,430 --> 00:31:43,820 So in this scoping audit, again, sorry, the reference has been cut off, they looked at registered clinical trials in a number of clinical areas. 333 00:31:44,140 --> 00:31:52,450 They compared what the JLA priority setting partnerships had set as priorities against the registered trials in those areas. 334 00:31:52,460 --> 00:31:56,120 So just to simplify it, 335 00:31:56,330 --> 00:32:04,760 basically, the long and short of it is that registered trials in a particular area were much more focused on drugs, vaccines and biologicals. 336 00:32:05,090 --> 00:32:12,680 But actually what mattered to the partnerships, the PSPs, and patients was more around disease management: education, 337 00:32:12,680 --> 00:32:14,870 training, service delivery, psychological therapy. 338 00:32:14,870 --> 00:32:20,120 So it just highlights the fact that a lot of the commercial trials are not necessarily meeting the needs of patients. 339 00:32:22,180 --> 00:32:25,180 Okay. Last few minutes. Has everyone heard of AllTrials? 340 00:32:26,830 --> 00:32:32,560 Okay. If you haven't, we talked earlier about the fact that around half of all clinical trials have never been reported.
341 00:32:33,340 --> 00:32:40,360 This was a campaign set up, excuse me, a few years ago now to really push the point that this is just not on. 342 00:32:40,540 --> 00:32:45,310 And there's Ben Goldacre, who works with us at the CEBM, one of the founders, 343 00:32:45,640 --> 00:32:52,060 pushing the agenda that all trials conducted should have their full reports published. 344 00:32:52,240 --> 00:32:55,000 There should be no hidden data. 345 00:32:55,420 --> 00:33:03,250 And, you know, there's a catalogue, as I said, and we just looked at a few examples of where that has caused harm and confusion. 346 00:33:04,150 --> 00:33:11,380 And so they want all pharmaceutical companies, academic institutions and so on to register their trials, 347 00:33:11,800 --> 00:33:16,120 to publish their findings, and for those to be transparent and accessible. And they say on their website 348 00:33:16,120 --> 00:33:22,659 that one of their landmark moments was when they got a massive pharmaceutical company to sign up to AllTrials, to say, 349 00:33:22,660 --> 00:33:26,620 yes, okay, we agree with your policy, 350 00:33:26,620 --> 00:33:33,420 and we will ensure that all our registered trials are published and freely available. 351 00:33:33,430 --> 00:33:38,620 Now, what's interesting is we're looking at whether that's currently being done. 352 00:33:38,620 --> 00:33:41,680 So we're auditing that openness, and we're helping and supporting it. 353 00:33:42,160 --> 00:33:46,030 So it is a step in the right direction, but there's still much, much more work to do. 354 00:33:48,240 --> 00:33:49,750 Increasing the value of systematic reviews. 355 00:33:49,750 --> 00:33:55,030 Now, I've recently argued that every health researcher should do a systematic review at the start of their training. 356 00:33:55,300 --> 00:33:57,370 Some may disagree, but there's so much value in it.
357 00:33:57,580 --> 00:34:02,770 It's a lot less expensive than doing a clinical trial, and it will probably make you do a clinical trial a lot better if that's what you want to do. 358 00:34:03,970 --> 00:34:08,980 But I'm not the only one who appreciates the value of systematic reviews. 359 00:34:09,280 --> 00:34:17,770 So this, from the NIHR, is about ensuring that every trial they fund, particularly through the Health Technology Assessment programme, is funded 360 00:34:19,090 --> 00:34:23,830 only when you know what evidence is available already. 361 00:34:24,190 --> 00:34:29,290 Only when you know what's available already will they look into supporting your trial. 362 00:34:30,070 --> 00:34:39,070 And so this is one of the pillars, again, from that website I showed you about adding value to research, and they provide guidance notes as well. 363 00:34:39,250 --> 00:34:40,389 So what do we mean by that? 364 00:34:40,390 --> 00:34:48,340 So they explain to you: ideally a systematic review, but if not, at least a systematic assessment of the current evidence. 365 00:34:48,490 --> 00:34:52,540 And they produce this guidance note for applicants to make sure that happens. 366 00:34:53,680 --> 00:35:00,129 Is it happening? Well, in fact, people are auditing it now. So looking at just one programme, Health Technology Assessment 367 00:35:00,130 --> 00:35:06,400 trial planning and design from the NIHR, they looked to see that, of 368 00:35:06,400 --> 00:35:14,559 47 funded trials during this period here, only five didn't reference a systematic review, and all those five gave a clear reason why they didn't, 369 00:35:14,560 --> 00:35:17,800 which was justified. When they re-audited in 2013, 370 00:35:18,520 --> 00:35:23,800 all trials funded in 2013 referenced a systematic review in their proposal. 371 00:35:24,310 --> 00:35:27,340 So things are improving, but they're not quite there.
372 00:35:27,340 --> 00:35:35,830 But they're heading in the right direction. Making reports more useful, reducing outcome switching: is anyone familiar with COMPare? 373 00:35:37,360 --> 00:35:38,950 So this is work that we're doing at the moment, 374 00:35:40,150 --> 00:35:47,140 led by Ben, with five medical students who are putting their time in to support it, and two clinical academics, one of whom is me. 375 00:35:47,800 --> 00:35:50,680 And the idea behind COMPare 376 00:35:51,550 --> 00:35:58,209 was to prospectively look at trials as they're being published in the top five major journals and actually 377 00:35:58,210 --> 00:36:06,250 compare, hence the name, the protocol for that trial to what they published, 378 00:36:06,250 --> 00:36:13,420 and, like we talked about with Study 329, to see whether any outcomes have been switched, lost, changed, altered and so on. 379 00:36:13,810 --> 00:36:16,120 And if a reason is given and it's justifiable, okay, fair enough. 380 00:36:16,120 --> 00:36:20,500 But if they've just suddenly mysteriously disappeared into that bin that we showed earlier, why is that? 381 00:36:21,010 --> 00:36:26,980 So the next stage was to write to the journal, and you often have to do that within two weeks of the publication, 382 00:36:27,550 --> 00:36:34,030 to say: this publication is in your journal; we've noticed that the protocol says this; we'd like to 383 00:36:34,030 --> 00:36:38,109 highlight it to the authors. And sometimes the authors replied back and said, 384 00:36:38,110 --> 00:36:43,110 you know, this is something that we have considered, but these are the reasons why we switched, 385 00:36:43,180 --> 00:36:47,290 or they've sometimes said, you've misunderstood, you've got the trial wrong, you've got the understanding wrong.
386 00:36:47,770 --> 00:36:54,760 But it's actually been really interesting to see what sort of responses come back, and of the 67 trials checked, only nine were perfect. 387 00:36:55,630 --> 00:36:58,630 And this was only done last year, and it's still ongoing: 388 00:36:59,110 --> 00:37:05,570 the correspondence between all these trial authors and the journals. For each of the trials, 389 00:37:05,890 --> 00:37:10,120 if the protocol is available for you to look at, and it should be, though it's not always, 390 00:37:10,810 --> 00:37:14,560 then you've got to take the time to go through and check that what was published fits the protocol. 391 00:37:16,960 --> 00:37:20,790 Well, five medical students and two clinical academics. 392 00:37:21,520 --> 00:37:24,550 But, okay, well, let me flip the question. 393 00:37:24,580 --> 00:37:28,820 You tell me who should be doing it. Should it be the reviewers? 394 00:37:29,440 --> 00:37:32,650 Yeah. The funders? If it's going to be published in the journal, 395 00:37:32,680 --> 00:37:37,440 you should check that before you publish it, that it's on a legitimate basis. 396 00:37:37,720 --> 00:37:40,870 So, I ask, is that not the point of peer review? 397 00:37:41,470 --> 00:37:42,940 And in fact, that's what we put to some of the journals. 398 00:37:42,940 --> 00:37:49,090 And that's why some of the most vitriolic, if you like, exchanges have been with the journal editors, not so much with the authors. 399 00:37:49,540 --> 00:37:51,780 And I'll just give you an example. Have a look at COMPare. 400 00:37:51,790 --> 00:37:55,750 You'll find there are lots of blogs around what we're doing, why we're doing it, and so on. 401 00:37:57,010 --> 00:38:01,630 It has taken an enormous amount of resource to do, but I think it's a very valuable thing. 402 00:38:02,770 --> 00:38:07,749 And we talked about that same point.
So one of the interesting discussions we've been having is with the Annals of Internal Medicine, 403 00:38:07,750 --> 00:38:14,860 one of the top five journals, and they were fairly resistant to what Ben and the rest of us were doing. 404 00:38:15,340 --> 00:38:17,860 And actually there was quite a heated exchange. 405 00:38:17,860 --> 00:38:21,850 We've logged all the heated exchanges in the blogs, so you can read all about it very transparently. 406 00:38:22,210 --> 00:38:26,410 But actually, only recently, because of the pressure and the dialogue, 407 00:38:27,610 --> 00:38:32,050 they've actually changed their policy, only in the last few weeks, maybe a month or so. 408 00:38:32,260 --> 00:38:36,550 And this has been reported, in fact, just a few days ago: 409 00:38:36,910 --> 00:38:39,070 the journal will publish clinical trial protocols. 410 00:38:39,080 --> 00:38:48,160 So they've now actually changed their policy to make sure that protocols, as you say, are published next to the trial. 411 00:38:48,310 --> 00:38:52,120 And I suspect internally they'll make sure, as you were saying as well, 412 00:38:52,300 --> 00:38:58,510 that through the peer review or editorial process the outcomes are much, much more closely scrutinised. 413 00:39:00,310 --> 00:39:12,250 Okay. So a broader way of reducing waste is through a new initiative called the REWARD Campaign, 414 00:39:12,880 --> 00:39:19,810 and it's made up of a number of journals and research organisations, including the CEBM, and 415 00:39:19,810 --> 00:39:24,340 the list goes on down here, talking to each other, partnering up, 416 00:39:24,520 --> 00:39:30,670 and actually making a commitment to do whatever they can to reduce research waste.
417 00:39:30,820 --> 00:39:35,559 So again, a nice resource, and all the related content, if you want it, 418 00:39:35,560 --> 00:39:40,100 goes on for pages below this, about some of the reasons behind it, 419 00:39:40,120 --> 00:39:44,890 the citations and some of the clinical examples that we've talked about today as well. 420 00:39:46,990 --> 00:39:50,980 Okay. And I think that's 45 minutes, so this is the last minute. 421 00:39:51,220 --> 00:39:54,780 I'm going to leave you with one last paper, which I think will be well worth reading, 422 00:39:55,270 --> 00:40:04,030 and a final thought. It was written by Doug Altman, a professor here in Oxford, about the scandal of poor medical research. 423 00:40:04,030 --> 00:40:07,300 And this was written in 1994, 22 years ago now. 424 00:40:08,110 --> 00:40:12,400 But just one overriding line, which, you know, is something never to forget: 425 00:40:12,700 --> 00:40:17,170 we need less research, better research, and research done for the right reasons. 426 00:40:18,640 --> 00:40:21,580 Really good paper. Okay. Thanks very much for listening.