So thanks for coming. I'm going to talk to you about the Better Evidence for Better Healthcare Manifesto: why we need it, why it's so important, why we do it in the Centre, how it might relate to you and how you might contribute.

I'm going to take you back to an article from 1994 that I think you should all read. It's an editorial by a chap called Doug Altman, who was a professor of medical statistics here in Oxford, at the Centre for Statistics in Medicine. If you've studied here, you'll have heard that name talked about ten thousand times. He wrote an editorial called The Scandal of Poor Medical Research, and in it he said there are huge shortcomings in the way research operates: too much research, too much poor-quality research, and too much research done for the wrong reasons. And that was in 1994.

Add to that, a few years ago, in 2014, Trish Greenhalgh, a professor then in London who's now here, held a two-day meeting here in Oxford to say, basically, that evidence based medicine is a movement in crisis. There's been a long history of people trying to fix these problems, and the meeting set out both the problems and some solutions for practice.

And then just recently, with a colleague of mine, Ben Goldacre, we wrote an editorial in the BMJ again (you'll see there's a theme with the BMJ) about how medicine is broken and how we can fix it. The important point now is that patients and the public are increasingly aware of these shortcomings, so it's about time we did something about them.

While I'm speaking, if you want to go online, go to evidencelive.org and type in "manifesto", or type that into Google, and you can read the manifesto and have a look while I'm talking.

Wherever I go, I think you've got to start somewhere: how did I get to the point of saying we need better evidence? What's the story? What was the seminal moment when I went, oh, there's a real problem?

So I'm going to talk you through one area where we've worked, which is Tamiflu. Tamiflu is an antiviral drug, which you may or may not be aware of. In the 2009 pandemic, remember, when we were all going to die? Some of you could be given the antiviral, as a public health measure, on the strength of a telephone call here in the UK. I'm a GP myself, and we had a situation where you would ring up if you had a temperature, or one of your children had a temperature, and on the basis of that phone call they would send you off to pick up the antivirals and take them.
And that was the point where my sister asked me: do these make any difference? And at that point I was, well, uncertain; not sure.

One of the basic principles of evidence based medicine is to reduce uncertainty: to use evidence to make better decisions. And in 2009 we did a systematic review, working with Tom Jefferson, a colleague of mine, to try and find the evidence on Tamiflu and reduce that uncertainty. And what we did was come up against significant barriers.

One of the things at the time concerned one of the Cochrane reviews. If you ever publish a Cochrane review and somebody puts a comment on it, you have six months to respond to that comment. And a Japanese clinician called Hayashi sent a question in saying: I want to question that you used ten trials to inform this systematic review, and of those ten trials only two had ever been published; eight were unpublished. How did you not question that? In the two that were published there was no significant difference, and in the ones that were unpublished there was a significant difference. How did you come to your conclusion? Why didn't you investigate where that evidence was?

And that conclusion was based on a review published in 2003 in Archives of Internal Medicine, which said that oseltamivir treatment reduced lower respiratory tract complications, antibiotic use and hospitalisation, in both healthy and at-risk adults.

So where were the ten trials published, and what had happened in the Cochrane review? What we used to do was take the abstracts of the unpublished trials and just reproduce their results in the Cochrane review.

Here's an example of one of those abstracts; what we did was try to ask for the evidence behind it. This is the abstract of the largest treatment trial ever done of the drug: 459 patients, published as an abstract of 300 words. When we asked the person whose name was on it, John Treanor (it was published as a conference abstract), he said that he didn't actually participate in study 76001 and doesn't remember presenting it at the meeting. So for the largest ever trial, the person whose name is on the conference abstract wasn't at the meeting, doesn't remember participating in the trial, and doesn't know anything about it. That sets the alarm bells off, doesn't it? And we got the media involved, because at first we were being stonewalled, and getting the information we required meant involving the media.
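Hayashi's worry, that the pooled result leaned on unpublished trials pointing the opposite way to the published ones, is easy to make concrete. Here's a minimal sketch of a fixed-effect meta-analysis in Python, with invented effect sizes and standard errors rather than the actual Tamiflu data:

```python
# A minimal sketch (hypothetical numbers, not the Tamiflu data): pool a
# treatment effect with inverse-variance weighting, first over published
# trials only, then over all trials, to see how much the answer moves.
import math

# (log odds ratio, standard error, published?) -- illustrative values only
trials = [
    (-0.45, 0.20, True),   # published trial 1: looks protective
    (-0.50, 0.25, True),   # published trial 2: looks protective
    (-0.05, 0.15, False),  # unpublished trials: near-null effects
    (-0.10, 0.18, False),
    ( 0.02, 0.22, False),
]

def pooled_log_or(subset):
    """Fixed-effect inverse-variance pooled log odds ratio."""
    weights = [1 / se**2 for _, se, _ in subset]
    est = sum(w * lor for w, (lor, _, _) in zip(weights, subset)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return est, se

for label, subset in [("published only", [t for t in trials if t[2]]),
                      ("all trials",     trials)]:
    est, se = pooled_log_or(subset)
    print(f"{label}: OR = {math.exp(est):.2f} "
          f"(95% CI {math.exp(est - 1.96*se):.2f} to {math.exp(est + 1.96*se):.2f})")
```

The point is simply that a reader of the published-only pool sees a convincing benefit that the full set of trials does not support.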
We asked one of the chaps who was involved in one of the published trials: where's the data? And he said, well, unfortunately I only saw the summary data; for all of that you would have to ask Roche, because they conducted the trial. The named author had only looked at the summary data.

So even for the published data we were worried: who has access to the data, and who is really doing these trials? All of this looked really odd. Issues of conflict of interest, ghost authorship, and how do you access the data? And it meant 60% of the data had never been published, for a drug that we were stockpiling and using in a public health emergency.

And this is a really helpful slide. What it illustrates is what the FDA calls above the waterline and below the waterline. Above the waterline you see journal articles and conference proceedings: very small summaries of what is really out there. For any trial, below the line is all this other information: completed case report forms, patient-level datasets, clinical study reports. These are documents submitted to regulators as part of the regulatory proceedings. Yet what we rely on is only the small snippets of information above the line to make huge decisions in health care. And when I realised this, and we realised this, it was clear this can't be right.

What happened in 2013 is that Roche agreed to release all of the trial data, and that was an important statement, largely as well because of the publication at the time of a book called Bad Pharma, which basically said there are a lot of bad practices. If you haven't read that book, buy a copy and read it: some really interesting issues about what we mean by bad trials and all the fundamental problems.

So let's come back to what they really objected to. You read a lot about confidentiality, patient confidentiality and the problems it presents, and the pharmaceutical industry's lawyers would say to you: well, actually, we can't do this for commercial confidentiality reasons. Yet Roche and GSK just posted the CDs to me in a package and I got them in the post. It wasn't even recorded delivery. It was a really odd sensation, after all these supposed issues.

But remember study 76001, the largest ever treatment trial, where all that was ever published was an abstract? This is the actual clinical study report that underpinned 76001. So would you rely on those 300 words, or would you rely on this for the information about all the benefits and all the harms? It's 9,809 pages. So you can see the huge difference between what we currently see and what might be available for a drug treatment.
Now, interestingly, what did that lead to? Because we now had all the benefits and all the harms, it led us to say: it doesn't reduce hospitalisations or complications. It doesn't reduce antibiotic use. It doesn't reduce lower respiratory tract complications. It does still reduce your symptoms, because it has an antipyretic effect, but it increases some adverse events that are important, like renal and psychiatric adverse events.

So I haven't prescribed this drug, and clinicians who've looked at this haven't: I don't see that the evidence is there for me to prescribe it. There are still some open questions, such as in people on ICU or in hospital, but in primary care the evidence is that it's of limited benefit, if any, and the adverse events outweigh the benefit. Yet we still stockpile the drug. And when I went to present this to the Parliamentary Select Committee, we still spent another £50 million stockpiling it the month after, because the existing stock was going out of date.

So there's a huge issue about how we make decisions. When anybody says to me we haven't got enough money in health care, I can't believe there isn't enough money: £473 million on stockpiling one drug. And what would happen if we had more money available in health care? We'd probably stockpile more useless treatments, just in case.

The reason that stockpile is there is that influenza is on the government's national risk register, so it sits at the same level as climate change. We have to have a risk register and a risk mitigation plan, and Tamiflu is our risk mitigation plan. Not handwashing, where the good evidence is. Tamiflu is our plan.

So, thinking about all that, if you go to the manifesto you'll see why we need better evidence. My first point is that you can see 20 reasons, at least 20, that underpin the need for better evidence for better healthcare, and these are structural, systemic problems that have existed in evidence based medicine for the last 20 or 30 years. So take a drug like the one I showed you, the neuraminidase inhibitors, oseltamivir (Tamiflu) and zanamivir (Relenza), and ask: how many of those problems do the trials exhibit? We can just take the 20 problems as we go through them. I can say they suffered from publication bias. They had poor-quality research. They had research that was more likely to be false than true, and reporting bias. They certainly had ghost authorship. They had financial and non-financial conflicts of interest. They had underreporting of harms.
If you look at journal publications compared to clinical study reports, there's about an 8% difference in the reporting of harms. They certainly had a lack of shared decision making strategies: if the advice is ring up, and if you've got a fever, take Tamiflu, where's the shared decision making in that? The trials lack external validity: who do they apply to? Regulatory failings, certainly. Surrogate outcomes, definitely. An unmanageable volume of evidence: how can you look at 160,000 pages of evidence? Then clinical guidelines, like Public Health England's, with no understanding of the evidence, and the prohibitive cost of doing the actual trials. And they certainly give us too much medicine. So you can see £500 million worth: if people had understood the limitations of the evidence and the problems that existed, we might have come to different decisions and said, we won't stockpile this, and we'll save that £500 million for something else.

So it's not that clinicians don't want to practise evidence based medicine; it's actually very difficult for them if trials exhibit all of these problems. Somehow we've got to fix this if we want people to use evidence better. And if we don't fix some of these problems, we'll still be in a situation where there's a huge amount of evidence in front of people, and it's very difficult, and in my mind getting harder, not easier, to tell the good evidence from the bad evidence. Because if we go back to Altman's time, there's been a huge increase in research since, and there's more all the time.

So that's where we start from: the growth in the volume of evidence has been accompanied by corrosion in the quality of evidence, which has compromised medicine's ability to provide affordable, effective, high-value care. Health care is the fastest growing business in the world; in some parts of the world, like America, it's 20% of GDP. So if you wanted to create a new business tomorrow, would you go into health care, or would you go into making cars, where the market is a smaller proportion of GDP? And what do business people in health care think? They think it should be more than 20% of GDP: there's more to go for; we can make money out of health care. It's a great driver of the economy, but actually that corrosion and uncertainty helps businesses thrive, when we want to roll back and say: hold on a minute, we need to develop a high-quality evidence base.

And there are many examples of this. I could spend the next day giving examples, and in any field, wherever I go, people can give me more.
But here's another example: the Vioxx trials, from when I first qualified, going back to 2001. Evidence was withheld from regulators for a drug that was a global blockbuster, and people who simply understood the difference between intention-to-treat and per-protocol analysis could have come to a completely different understanding. What they did was submit to the FDA only the on-treatment analysis. So if you stopped the treatment because you had a heart attack, you were removed from the information sent to the regulator. It was two years before somebody clicked and went: oh, there's a problem here; something is being withheld in the analysis.

Here's another classic example: Study 329. It's got its own Wikipedia page; go and have a read about it. Everybody should know this example. Paroxetine, marketed to adolescents. Information was withheld on important suicidal behaviour in adolescents, which meant that a drug that sold billions could actually have been stopped, had people been able to use better quality evidence. Structural problems again.

And then here's the flip side: I've worked with organisations where you try to fix things. AllTrials tries to fix things, and I'm going to come back to that. Just one specific ask: we want all trials registered and reported in full. A very simple statement, and that's the commitment: all trials should be registered and should be reported in full. It doesn't seem contentious, until you realise half of all trials are never registered and reported in full; half the information is missing. Now imagine you were the airline industry, in the business, like British Airways, of buying planes, and the manufacturer said: we're not going to give you half the information about this plane, so if it drops out of the sky it's because we didn't tell you the things that had gone wrong. We just would not accept that. So when we purchase a medicine, are we purchasing just the chemical compound, or are we purchasing all of the evidence that's built up around it? At the moment we accept a ridiculous scenario in which half of that evidence can be hidden, no problem. But what trying to fix this taught me is that it takes a lot of groups of people working together.
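To make the Vioxx point concrete, here's a toy sketch of why an on-treatment analysis can hide harm that an intention-to-treat (ITT) analysis would show. All the numbers are invented for illustration and bear no relation to the actual Vioxx data, and this is deliberately crude: a real per-protocol analysis would also adjust the denominator, which this version does not.

```python
# Toy illustration: each record is (arm, had_event, event_after_stopping).
records = (
    [("drug",    True,  True)]  * 30   # events occurring after discontinuation
  + [("drug",    True,  False)] * 20
  + [("drug",    False, False)] * 950
  + [("placebo", True,  False)] * 25
  + [("placebo", False, False)] * 975
)

def event_rate(arm, on_treatment_only):
    group = [r for r in records if r[0] == arm]
    if on_treatment_only:
        # on-treatment style: censor events that happened after stopping
        events = sum(1 for _, event, after in group if event and not after)
    else:
        # ITT: count every randomised patient's events, whatever happened
        events = sum(1 for _, event, _ in group if event)
    return events / len(group)

for on_treatment in (False, True):
    name = "on-treatment" if on_treatment else "ITT"
    rr = event_rate("drug", on_treatment) / event_rate("placebo", on_treatment)
    print(f"{name}: relative risk = {rr:.2f}")
```

With these made-up counts, ITT shows a doubled risk on the drug (RR 2.0), while the on-treatment analysis makes the drug look protective (RR 0.8), purely because the patients who stopped after a heart attack have had their events censored.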
And then here's a recent example: Xarelto, rivaroxaban, a drug designed to reduce the risk of clots in people at risk of thrombosis, for instance with atrial fibrillation or a risk of deep vein thrombosis. But here it's in secondary prevention after an acute coronary syndrome. The European Union, through the European Medicines Agency, approved rivaroxaban for secondary prevention after acute coronary syndrome. However, in America the regulators voted against the same drug for acute coronary syndrome.

So how can you have a global system where the two major regulators come to completely opposite conclusions about the evidence in front of them? One of them saying: well, we've got a few papers in front of us, it looks quite nice, we'll trust it. And the other one going: we see some of the issues emerging in the evidence base; we understand the problems of missing data, because we've been here before. I look at this and think: that is ridiculous. This could be an amazing treatment. Equally, it could be that it isn't, and that it's a harmful treatment. We need to reduce the uncertainty. How can we have the EMA and the FDA coming to opposite conclusions, looking at the same bits of evidence, and still say we've got a legitimate regulatory system? It's absolute nonsense to me. But that's the problem, because we've got an issue with the evidence, and that's why we end up writing editorials all the time about the same issue: there's too much uncertainty to make a decision. Could we reduce the uncertainty?

One of the things we said, and it was 2010 when these drugs became available: we could have run our own independent trial in the meantime, which would have cost far less than what we've already spent on the drugs, and come to our own independent conclusion, getting rid of all the problems that exist in the evidence. But we don't want to do that. So we have a problem of replication, of independent replication, and that's what we need here: independent scrutiny of key trial data. Everybody involved in these trials has a commercial conflict of interest; how would we expect them to come to an independent position? We know that even if you want to act with the best will in the world, you're not going to be able to do that if you have a financial tie.

And then we have a world where what we keep saying to everybody is: we should be informing the public, you should be sharing decisions, clinicians and patients should work together. And then I have clinicians who work with me saying: so, Carl, what can we do to operationalise this tomorrow? Where are the tools that are going to help us do this?
And I say to them: well, actually, if you go and look at the evidence, this is an area where there's so little research that it's impossible to inform anybody about what you might do or how you might go about it. There are no tools that can help us when you ask: how might you help somebody with cardiovascular risk make a better decision about whether to take treatment or not? It's a really difficult area to operationalise and work in, and I can tell you that the funders are not taking it seriously.

And then, finally, we get invited into areas to work with people where we see this huge problem. We worked with Panorama at the BBC, before Christmas, to look at the evidence for IVF treatment add-ons. And it astounds me: you go to an area like this and we say the most patient-relevant outcome here has to be live birth, and all of the people in the field go, no, that's not the important outcome. And we're like, well, what is it? And they go: no, no, no, you need to settle for a surrogate, maybe the six-week pregnancy or the implantation rate. And we say: why? Because live birth is too difficult to measure. How can it be difficult to measure whether there's a live birth or not? Surely that's what you want to know when you've decided to have some additional treatment that could cost you £7,000 to £10,000 extra, because this is outside the NHS.

And this is the telling part: if you want to see what the NHS would look like run as a private sector, you'd have this mish-mash of treatments on offer, operating in a world of uncertainty, where we'd go: we're not quite sure, but we think it's a good idea; give us £5,000 and you'll find out. And people spend up to £60,000 or £70,000 extra on this. What we said, very clearly, is that there were 42 different treatments you could be offered in IVF beyond standard treatment. Of those 42, only three had some form of evidence, within a Cochrane review, where we would say they showed some benefit. However, the evidence for those three was limited by real issues, which meant they needed a bigger trial. So, to be honest, there were zero that we would recommend above and beyond the placebo effect you might get. And the key finding was that some of them were actually harmful, and we couldn't understand why they were still available and being offered in the private sector. But they were. What it taught us is that there are huge issues across all the different sectors.
215 00:22:18,900 --> 00:22:26,639 Wherever you are, there are issues in the quality of evidence, which is probably not that ridiculous, to be honest, 216 00:22:26,640 --> 00:22:32,270 because you can think we've only about 25 years into the idea of high quality developing evidence, 217 00:22:32,280 --> 00:22:35,220 and we're only at the beginning of the journey of evidence based medicine. 218 00:22:40,050 --> 00:22:47,280 And while this has been going on, not least, the Academy of Medical Sciences has launched a new project on evaluating evidence. 219 00:22:48,180 --> 00:22:52,980 So there are other people coming to the fore. And not least Sally DAVIES, who is the CMO. 220 00:22:53,430 --> 00:22:55,980 Some of you may be aware, Professor Dame Sally DAVIES. 221 00:22:56,370 --> 00:23:01,769 There seems to be a view that doctors over medicate so it is difficult to trust them and that clinical 222 00:23:01,770 --> 00:23:07,170 scientists are all beset by conflict of interest from industry funding and therefore untrustworthy too. 223 00:23:09,330 --> 00:23:16,469 And she picked out one of our favourite treatments in the world to say this is often a huge problem in 224 00:23:16,470 --> 00:23:23,130 understanding evidence because what they don't want to do is be putting out a policy where some people are going. 225 00:23:23,640 --> 00:23:28,200 Hold on a minute. There's a problem with the evidence. The problem with the evidence, they don't like it. 226 00:23:29,040 --> 00:23:35,490 We need an authoritative independent report looking at how society should judge the safety and efficacy of drugs of an intervention. 227 00:23:36,450 --> 00:23:40,260 How can anybody come up with an authoritative, independent report? 228 00:23:42,000 --> 00:23:46,690 That's a really bold statement, isn't it, by very nature in a shared decision making world. 229 00:23:46,710 --> 00:23:51,450 Nothing can be authoritative when we're operating in a world of uncertainty. 230 00:23:54,490 --> 00:23:57,570 So that's how we ended up with that sort of editorial. 231 00:23:57,600 --> 00:24:02,040 How can we how can we go from this? Because we wrote this in response to that. 232 00:24:02,280 --> 00:24:06,540 How can we fix it? Not just say, okay, we're going to do more of the same? 233 00:24:07,230 --> 00:24:12,030 And what realised when we were doing that, often what we were pointing out is we want to reduce the biases. 234 00:24:12,450 --> 00:24:15,990 That's what we really talking about. There's so much bias out there. 235 00:24:16,170 --> 00:24:23,379 How do we reduce bias? It was interesting last week of us talking about this from wherever I go. 236 00:24:23,380 --> 00:24:30,640 And I thought I'd leave the thing. And with this couple of flight with at the Centre for Evidence-Based Dermatology in Nottingham and 237 00:24:30,640 --> 00:24:35,980 we have the director of the Centre for evidence based Veterinary Medicine in Nottingham here. 238 00:24:35,980 --> 00:24:40,360 So to go with him one way and there's a chap there how. 239 00:24:40,360 --> 00:24:47,259 William I don't know if you've ever met how he's now also not only head of the Centre for Evidence based Dermatology, 240 00:24:47,260 --> 00:24:52,840 he's head of the Natural History Programme, not the Trials program. 241 00:24:53,470 --> 00:25:01,990 So really influential person knows a lot about evidence, about the problems in evidence and also is really interested. 
And we had a lively discussion about some of this. When I went there I picked out this article; he wasn't aware I would, and he was in the audience. So I said: look, "Strengths and limitations of evidence-based dermatology", and look at what it says. Evidence-based dermatology can have its limitations: for example, meta-analyses, the pooling together of data, are still susceptible to some degree of dishonesty and bias. And in the article he makes this point: it is possible for those with a vested interest to twist the methods and conclusions of a systematic review to their own advantage. One could conveniently fail to include one crucial study, for instance, or use unpublished data in a way that favours a particular product. And, the bit I like: readers need to develop a good nose for what constitutes a good systematic review and a good clinical trial.

So when you're teaching tomorrow on the MSc, you'll have to be asking: do you have a good nose or not? What does that mean? It's really ridiculous that we have to base our decisions, our most important policy decisions right now, on whether you've got a good nose. And yet he's right, because you have to be able to detect the subtleties, the biases, the problems that exist. And if you're not aware of them, or you don't understand them, or you're conflicted in some way, you can easily dismiss them and come to an alternative viewpoint. If you dismiss all the biases and dismiss the poor-quality outcomes, you can still come to a conclusion that says: we should use this treatment. But the problem is, we can't afford to do that any more, can we? We just have not got enough money in the pot. Can we invest an extra £2.5 billion in health services tomorrow and, out of that, spend another £50 million on Tamiflu? Is that what we should do? Or should we stop and go: actually, let's work out what's worth investing in and what isn't?

So we've tried to fix some things. And when we've tried to fix some things, it's been fun, but it's actually provoked a lot of thought in our unit about what we do. So I'm going to talk to you about clinical outcomes that fail to translate into benefits for patients. We published this article last Monday in Trials, so you can go and read it freely, open access. One of our conclusions is that there's a huge problem with outcomes. And if you go to outcomes, here's me brainstorming; I use the latest technology to brainstorm.
And as you can see, it's Post-it notes, plus a large Post-it note that sticks on your wall. You put them up and you think: what does this look like? And then you think: inappropriately reported, selectively reported, poorly chosen, failure to work, and you end up with a nice little diagram like the one in the report. That's what it looks like. And I'm going to talk to you about one thing in there: switched outcomes.

So we think there are all these problems; there are about 15 of them in the report: problems in the design, badly chosen outcomes, badly collected, selectively reported and inappropriately interpreted. And you could pick any one of them and say: we need to optimise and improve what we do. If we don't, that's why you have so many trials, 35,000 a year, and so many of them don't improve outcomes.

So COMPare is a project led by Ben Goldacre with a group of medical students and a couple of us in the Centre. There's been a long history of reporting the discrepancy between what people pre-specify at the outset of a trial and what they report at the end of the trial. Because if you go to members of the general public, take anybody aside who knows nothing about research or being in a university or academic setting, they think it's absolutely obvious that you should say: this is what I'm going to do, this is how I'm going to measure it, and then you should report exactly that. And if you change from that, they go: well, isn't that like cheating? That's what they consider it: cheating. They don't say it's only slightly off.

And what's been shown over time, in a systematic review of 27 studies, is that about a third of trials have discrepancies between pre-specified and reported primary outcomes, and 15% of trials introduce outcomes that were never previously specified; they add brand new ones in. Somebody said to me: it's a bit like betting on a horse. If you could place your bet after the race had finished, you would, wouldn't you? And if you could, we'd all be very rich. Imagine the Grand National, and at the end you go: now, please, I want to put my £10,000 on that one. It's obvious why that isn't allowed. In other arenas, pre-specified actually means something: there's a time stamp on your betting slip, and it has to go in before the race starts. Once the race has started, all bets are off. That's what should happen in research. But it doesn't.
I've shown you that a third of the time, in effect, you can move your bet after the race. So what did we do? We took all the trials in the top five medical journals over six weeks and checked the pre-specified outcomes against the reported outcomes. But what we did differently, instead of repeating the 27 studies that had already shown the problem, was to say: we're going to try and fix it. So we sent a letter to each journal, which is actually really difficult to do, because the first thing you realise is the rules: you've only got so many words, 250 for one journal, 400 for another, and only two, three or four weeks to respond. If you miss the window, you can't submit a response; even if there's a glaring error, you're out of time. So you've got to get your act together, and you've also got to be right. We had a system where two students would do the check independently and then a clinician would verify it. That's quite a lot of work in a short period of time, so we had to be organised.

And this is what we found. We checked 67 trials. Nine were perfect. Hundreds of pre-specified outcomes were not reported, and just as many new outcomes were silently added, which was shocking to us. We sent 58 letters; 18 were published; after four weeks, eight letters remained unpublished and 32 were rejected outright by the editors. So the idea that science automatically corrects itself is ridiculed by this.

Here's what the different journals did. The BMJ: everything is online and freely available; you can look at all our data sheets and read the letters. Interestingly, they published all our letters within 24 hours, which is good. And here's one, from Henry Drysdale, and it led to a correction. But interestingly, for another letter, which you can see includes my name alongside Henry Drysdale's, there was no correction. So you've got journals operating in this really odd way. In fact, I presented this at the BMJ editors' meeting, with all the editors from the big journals in the room, and I said: what the [INAUDIBLE] is going on? This is supposed to be the scientific record, and you seemingly make it up as you go. You are the bastions, the top five; people believe these are the upholders of the scientific record. It got so ridiculous that there was a back-and-forth about showing this in public, and the editor-in-chief was embarrassed enough to ask me to send it all to him. But at least it got to that level.
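The core of the check the students were doing independently is mechanical enough to sketch. A minimal example, with hypothetical outcome lists and field names, comparing what a registry entry pre-specified against what the paper reported:

```python
# A minimal sketch of a COMPare-style check (field names and outcome lists
# are hypothetical): compare outcomes pre-specified in a registry entry
# against outcomes reported in the paper, and count the switches.
registry = {
    "primary":   {"hba1c at 6 weeks"},
    "secondary": {"weight change", "hypoglycaemic episodes"},
}
paper = {
    "primary":   {"hba1c at 24 weeks"},                 # switched time point
    "secondary": {"weight change", "quality of life"},  # one added, one dropped
}

prespecified = registry["primary"] | registry["secondary"]
reported     = paper["primary"]    | paper["secondary"]

print("not reported:   ", sorted(prespecified - reported))
print("silently added: ", sorted(reported - prespecified))
print("primary outcome unchanged:", registry["primary"] == paper["primary"])
```

The comparison itself takes seconds; as the talk makes clear, the hours go on hunting down a dated protocol and registry entry to populate the pre-specified side in the first place.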
So, the Lancet. They published the letters, but it took up to 180 days. That's a ridiculously long time; by then, people have forgotten what the trial was. The authors would publish an awful reply, but there'd be no correction, no fix to the record.

JAMA rejected all our letters. They said they were too vague and there was repetition between letters. That's because we only had 250 words, and you can imagine you need to be very specific about where the problem sits. So our letters had exactly the same structure: was the primary outcome measure reported correctly, yes or no; secondary outcomes, there were seven, of which five were incorrectly reported and two were not. Again, we were trying to embarrass JAMA into sorting this out, but it's ridiculous: they have outcome switching, they know about the problem, and they're doing nothing about it.

And the Annals of Internal Medicine: they published some letters and didn't publish others; they did all sorts of things, including publishing a 1,000-word editorial trashing our methods without letting us respond to it. And we keep going back and forward. We know they've changed their policy, but they don't want to admit why. But they have changed their policy. And they say this: "We relied primarily on the protocol for details about pre-specified primary and secondary outcomes." What they did is go after COMPare and say: your methods are wrong, because we rely on the protocol. And what we said is: we will use the protocol if it's dated and pre-specified. But if it's published two years after the event, or, as often happens, a month before the trial finished, and there's no date or time stamp on it at all, how are you supposed to call that pre-specified? They tried to say that the protocol is the definitive document. And we said: yes, we understand that, but if you understand good clinical practice, every protocol should have a version number on it, with version control and a time stamp. Yet if you go and look at the protocols in the Annals, they've got no version control and no dates on them. So somebody could have written one a week before publication and made out that this was the pre-specified outcome. We said that's utter nonsense, and anybody who's been trained in GCP would know that. Every document in every trial I've been involved in has a time stamp and a version on it: this is version 1.9, and so on. And all we're saying is: if you've made a change, highlight it in the protocol and we'd accept it. If you say, this is two years on and there have been no changes since version one, we would accept it. But that's not the case.
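The rule we were arguing for is simple enough to write down. A minimal sketch, with hypothetical dates and field names, of the GCP-style test: a protocol only counts as pre-specified if the version is dated before the trial started recruiting.

```python
# A minimal sketch of the pre-specification rule (dates are hypothetical):
# no date, no version control, no claim to pre-specification.
from datetime import date

def counts_as_prespecified(protocol_version_date, trial_start_date):
    """Version must be time-stamped on or before the start of recruitment."""
    if protocol_version_date is None:
        return False  # undated document: cannot be called pre-specified
    return protocol_version_date <= trial_start_date

print(counts_as_prespecified(date(2012, 3, 1), date(2012, 6, 15)))  # True
print(counts_as_prespecified(date(2014, 9, 1), date(2012, 6, 15)))  # False
print(counts_as_prespecified(None,             date(2012, 6, 15)))  # False
```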
But look, we went looking for the protocols as well, and they were available for none of the Annals trials. And even more worrying are statements like this reproducible-research statement, which says: study protocol available from Dr Aveton; statistical code and data set not available. Okay. So we emailed him, and this is his response. That's not what the statement says, is it? This is outrageous. You go back and they say, we rely on the protocol; you ask somebody for the protocol, and they tell you you have to sign a commercial confidentiality agreement. This sort of stuff is just outrageous. And then we ask: can we trust the evidence base?

So, the New England Journal of Medicine, one of my favourite journals in the world: how can space constraints not allow for all outcomes to be reported? They say any interested reader can compare the published article, the trial registration and the protocol with the reported results to view the discrepancies. That is impossible. Their own editors can't do it, because they don't know how; we've just shown that you can't even get the protocol for some trials; and it took us up to four hours per trial to do it well, partly because some of the results turn up in the discussion. When you go checking, you're astonished to find secondary outcomes, and even primary outcomes, reported in the discussion. Why is that?

And they'll do things like say: we reported the primary outcome correctly. And we'll go: but you had seven time points, and you reported all seven. The protocol might say we're going to report HbA1c at six weeks, and then all of a sudden they introduce six, 12, 18, 24 and 30 weeks. And anybody will know what happens when you report multiple outcomes: the potential for inflated false positive findings, the multiplicity argument, that with too many outcomes, by chance alone one of them will be positive. That's the type of thing you see. So no, it is not possible for any interested reader to compare the published article, the registry entry and the protocol: it takes too long, and it needs three of you. What we're saying at the moment is: look, why can't you just have a summary table of primary, secondary and exploratory outcomes, with anything changed flagged in a footnote? Then we'd all be able to do it really easily.
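The multiplicity argument is easy to demonstrate with a quick simulation. A minimal sketch, with invented parameters, treating the seven time points as independent for simplicity (real repeated measures are correlated, which softens but does not remove the effect):

```python
# Simulate trials with NO true effect, tested at seven time points, and
# count how often at least one comparison comes out "significant" at p<0.05.
import math, random, statistics

def one_trial(n_per_arm=100, timepoints=7):
    for _ in range(timepoints):
        a = [random.gauss(0, 1) for _ in range(n_per_arm)]
        b = [random.gauss(0, 1) for _ in range(n_per_arm)]
        # two-sample z-test; by construction there is no true difference
        z = (statistics.mean(a) - statistics.mean(b)) / math.sqrt(
            statistics.variance(a) / n_per_arm + statistics.variance(b) / n_per_arm)
        if abs(z) > 1.96:
            return True  # at least one time point looks "significant"
    return False

random.seed(1)
hits = sum(one_trial() for _ in range(2000))
print(f"trials with at least one false positive: {hits / 2000:.0%}")
```

With seven independent looks at a null effect, roughly 1 - 0.95^7, about 30% of trials, will throw up at least one "significant" time point, against the 5% a single pre-specified outcome would give.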
Okay, so those are the journal problems; here's the segue into the manifesto. As a group, we've been doing things like Evidence Live for a long time, and we've been teaching for a long time about what we mean by high-quality evidence and how we get there. And we've decided that we now need to provide commitments: statements that we put together, that people then put action to. You might say: I'm going to try and do number one; or, as an organisation, three, four and five. So if you go to the manifesto, what we're trying to do is facilitate, so it's for everybody, not just a few people, to improve the quality of evidence in health care. Here are the 16 actions as they stand, the manifesto commitments. You'll see they're all verbs, so they're all about doing something, and you can pick any one. They've grown from a small number to a larger number as we've spoken to people, and they may collapse back again; I'm not sure where they'll end up. But the idea of having commitments is that if you commit to one thing, say, I'm going to eradicate publication bias, you then have to ask: who are the people to work with, and what are the actions?

So where have we committed as an organisation in CEBM? We're probably doing too many; I think we're involved in about eight. We're really interested in cultivating the next generation of leaders in evidence based methods. We're interested in reducing unwarranted variation in practice, and in overdiagnosis and unnecessary medicalisation; we do work on these, and I'll talk to you a bit about them. We're interested in others too, but we can't cover everything. It might be too many, maybe too many things. And now I'm interested in how we put these into a form that actually communicates to groups like yourselves, or other people: oh yes, we should be doing something about this.

So take something like eradicating publication bias; that's probably one of the things we're best at. We've helped develop AllTrials, and we've written about it. We're working with the World Health Organization to improve the reporting of trials globally, to increase the number of countries that register trials, because that's one of the easiest fixes, and we're trying to get an action plan for all countries to sign up to, to register trials. We've developed a TrialsTracker, so that online you can look at how well your organisation is doing. We've done audits of the University of Oxford, to say we're not as good as we think we are. It comes down to two simple things: do you register your trial, and do you publish your summary results within 12 months of the completion of that trial?
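Those two questions are exactly the kind of thing a TrialsTracker-style audit automates. A minimal sketch, with a hypothetical record format rather than whatever the real tracker uses:

```python
# A minimal audit sketch (record format is hypothetical): for each completed
# trial, was it registered, and were summary results posted within 12 months
# of completion?
from datetime import date, timedelta

trials = [
    {"registered": True,  "completed": date(2015, 1, 10), "results": date(2015, 9, 1)},
    {"registered": True,  "completed": date(2015, 3, 5),  "results": None},
    {"registered": False, "completed": date(2015, 6, 20), "results": None},
]

registered = sum(t["registered"] for t in trials)
on_time = sum(
    1 for t in trials
    if t["results"] is not None
    and t["results"] - t["completed"] <= timedelta(days=365)
)
print(f"registered: {registered}/{len(trials)}")
print(f"results within 12 months: {on_time}/{len(trials)}")
```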
But actually it takes a huge number of people. If you create a commitment like "we're going to eradicate publication and reporting bias", you need to work with more than one group. You have to have a plan and you have to have actions. So our key task is to get this right, in the right structure, and then from there say: right, who's going to action it? The next five years of work is going to be about fixing the problems, not more research saying "here's the problem". Apart from areas like veterinary medicine, where you might still have to state the obvious, we certainly don't need another cohort study saying there's a problem with outcomes. We need to think about how we fix it, and that will take time. So one of the things we are doing is not going away: we'll keep on at the journals until it's fixed, and we'll keep trying to make their lives a nightmare.

So of these commitments, when you go online, you might think: okay, I'm really interested in this one. You might be in regulatory decisions, or in policy. You might be interested in clinical and patient benefit, or in shared decision making. But what are the commitments? Is there something missing? And then, once we've set them out, the key question is: what actions do we need to take?

And it keeps coming back. Because when you're in this game of trying to introduce action: this was last week, the Commons Select Committee launched its Research Integrity inquiry. When you're in the job of fixing things, you do things like submit responses to this and say: this is how we should go about fixing it. My response was that there are two ways of looking at this problem. You could say: well, there are people who are massive fraudsters; the number of them is small, and their actual impact is limited. Whereas if you consider the questionable research practices, like not publishing your results, the impact of that is huge. We tend to go after the former, because it looks bigger and makes more interesting news, but we definitely need a strategy around the latter.

Everything we do, we record, and we put the themes up. About 100 people have responded so far. Be careful if you do, because then we start quoting you. And that's really helpful; that's how you do it. Give us your feedback, and it would be really helpful if you want to say something or respond, because it all feeds in.
We are going to publish a final version of this, where everybody who's contributed will be part of the team, an organisation of people who are saying: I'm an author. And we'll do that at Evidence Live 2017, where we're going to move the focus to the action plan. Once we've got the commitments, what are the fixes, and who are the groups of people who want to talk to each other? You might say: I'm really interested in working on this or that problem; how do I go about it, and who do I talk to? Because fixing it is a lot harder than I first realised. With solutions, I tend to think in five-year terms: it will take me five years to fix something, whereas I could write another paper about the problem in the next three months.

All right. Thank you very much.