1 00:00:00,330 --> 00:00:04,410 Welcome, everybody. It's interesting. We're going to talk about the news. 2 00:00:05,100 --> 00:00:10,950 This gentleman here actually has a newspaper. Many people don't know what one of those is any more, I'm pretty sure. 3 00:00:11,670 --> 00:00:18,750 So I'm going to talk about... I do a lot of work in evidence-based medicine and I do a lot of work with the media. 4 00:00:18,930 --> 00:00:23,100 And I'm going to talk about why you all should do a bit more of it yourself. 5 00:00:23,370 --> 00:00:30,720 So this is all aimed at getting you involved. Okay. So the first thing is, there's a lot of news out there. 6 00:00:32,190 --> 00:00:35,970 These are the views per month for the individual papers. 7 00:00:36,180 --> 00:00:44,130 About 80 million people view the number one paper in the UK, the Daily Mail, followed by the Sun and the Mirror, down here. 8 00:00:44,640 --> 00:00:50,010 And then you get to the Express. And in the middle is the Sunday Times, and the Independent for lots of news. 9 00:00:50,010 --> 00:00:52,560 Our consumption of news is actually going up. 10 00:00:53,490 --> 00:01:01,110 Think about how it was 20 years ago, with much more print: with the advent of computers and then mobile phones, 11 00:01:01,110 --> 00:01:11,399 we're consuming more news than ever. But some of my favourite papers... if I start with the Daily Mail, the Daily Mail's a 12 00:01:11,400 --> 00:01:19,410 great paper for healthcare headlines: loads of really interesting stories about processed meat, 13 00:01:19,440 --> 00:01:26,100 heavy web use harming a child's health, being poisoned by everyday life, the lethal cigarette. 14 00:01:26,940 --> 00:01:32,040 Now, it's interesting to ask: why does this appear, and why does it keep happening? 15 00:01:32,550 --> 00:01:42,720 Healthcare news headlines are incredibly prevalent because the editors know that you will click on them.
16 00:01:43,770 --> 00:01:48,690 So you are part of the problem, in that you know what it's like. 17 00:01:48,690 --> 00:01:53,040 You know this is fake, but you can't wait to get in there and find out. 18 00:01:53,070 --> 00:01:59,400 So on the phone, we all go in, and this just keeps increasing. 19 00:02:00,030 --> 00:02:05,450 And even in Oxford we contribute to some of these stories: that obesity in women is dangerous, 20 00:02:05,460 --> 00:02:12,810 the terror threat, and (this is an Oxford study) don't drink more than three glasses of wine a week. 21 00:02:13,080 --> 00:02:16,860 And so what's also interesting is how these stories get interpreted. 22 00:02:17,850 --> 00:02:21,120 And I'm very fascinated by healthcare headlines. 23 00:02:21,810 --> 00:02:29,970 And when I first did this talk, I even sent out a request on Twitter for people's favourite headlines. 24 00:02:31,320 --> 00:02:42,780 And this is David, who's running this week; this is David's favourite healthcare headline. 25 00:02:43,140 --> 00:02:51,820 So David has an interest in exercise and lifestyle, and this is the sort of evidence he spends his time looking up. 26 00:02:54,660 --> 00:03:01,110 But, you know, it's interesting, isn't it? As we look at this stuff, you think, well, how does this impact the decisions we make? 27 00:03:03,120 --> 00:03:06,480 And as I've shown you, here's my favourite. 28 00:03:09,090 --> 00:03:12,420 Your fridge is a major killer. You just don't know it. 29 00:03:13,380 --> 00:03:17,640 But here's my other favourite paper in the whole world: the Daily Express. 30 00:03:18,270 --> 00:03:21,510 The Daily Express definitely trumps the Daily Mail, 31 00:03:21,960 --> 00:03:28,680 and I'll show you why; I'm going to come to a headline I know well. But this is just about dementia.
32 00:03:28,680 --> 00:03:35,700 I started collecting news stories about dementia from the Daily Express, thinking, what could I learn? 33 00:03:36,430 --> 00:03:40,590 Simple way to fight off dementia: actually, it's a stress-free lifestyle. 34 00:03:41,190 --> 00:03:46,800 And then I started to think, well, this is a bit odd. Chocolate can halt dementia, but lose weight to beat dementia. 35 00:03:47,250 --> 00:03:51,330 So how can anybody... can you eat lots of chocolate and lose weight? 36 00:03:51,810 --> 00:03:55,629 It's quite an interesting issue. You know, actually, if you don't eat anything else, 37 00:03:55,630 --> 00:04:01,830 you could go on the chocolate diet to stave off dementia. And it keeps going: daily drink, aspirin, 38 00:04:02,700 --> 00:04:06,900 new way to fight dementia, and even red wine pills stop dementia. 39 00:04:07,800 --> 00:04:11,040 And so I was getting really excited. You know what you're like as a researcher. 40 00:04:11,430 --> 00:04:14,550 You like collecting data, don't you? We like to collect things. 41 00:04:14,880 --> 00:04:22,230 But what happened is this started to become a bit of a serious addiction for me, in that they started to increase and increase. 42 00:04:22,560 --> 00:04:26,700 Now, the qualitative researchers will love 43 00:04:26,820 --> 00:04:31,920 this. I thought, well, actually what we'll do is we'll take the headlines and try and create some themes. 44 00:04:32,490 --> 00:04:34,140 So I started putting them into themes. 45 00:04:34,150 --> 00:04:43,139 So here there's something about healthy living, over here lifestyle, something about drugs, and chocolate, simple ways to fight dementia. 46 00:04:43,140 --> 00:04:46,170 There's something about tests, and a spicy diet. 47 00:04:46,170 --> 00:04:49,770 So I started to organise them. Is it a thematic analysis? 48 00:04:50,040 --> 00:04:56,400 Is that fair? What would you call that? It's the beginning of being a qualitative researcher.
49 00:04:56,580 --> 00:04:59,610 And when I did that, I noticed I started to be able to fill in the themes 50 00:04:59,940 --> 00:05:09,270 even more, and ended up like this, with these eight different ways of looking at how we're going to beat dementia: so you can lose weight, 51 00:05:09,960 --> 00:05:17,400 eat chocolate, keep busy, and have a healthy life. You can do puzzles, take some drugs, have some tests and watch your diet. 52 00:05:18,030 --> 00:05:23,550 And the Daily Express tells you how to do that. But, you know, this is all [INAUDIBLE]. 53 00:05:25,260 --> 00:05:29,460 When you look at all of them, you know this is what's going to happen in ten years' time. 54 00:05:30,510 --> 00:05:34,360 And that's the issue here, isn't it? You start to go... and this is mine; 55 00:05:34,380 --> 00:05:46,740 anybody can add in another one. There are loads of really interesting headlines that we're all, all the time, looking at and pressing. 56 00:05:47,430 --> 00:05:55,170 Let's have a look. And I know that the editor of the Express has a favourite topic: statins. 57 00:05:57,180 --> 00:06:04,500 His wife took a statin, and his wife then got some muscle pains and decided to stop taking the statin. 58 00:06:05,010 --> 00:06:06,960 And so he became obsessed by statins. 59 00:06:08,100 --> 00:06:15,210 And what happened in doing that is they, I think, about tripled their readership when they put a story about statins on the front page. 60 00:06:15,660 --> 00:06:21,030 So they're always looking for statin stories. But what interested me is the circularity of all this. 61 00:06:22,170 --> 00:06:30,660 So this is the one headline, Take Statins to Save Your Life, and then you go, oh, let's put out a health alert. 62 00:06:31,410 --> 00:06:37,560 Interesting. And part of that is the next one: statins cause diabetes. 63 00:06:39,030 --> 00:06:42,090 However, in causing diabetes, we're going in that direction.
64 00:06:42,480 --> 00:06:43,860 It's going to cause chaos. 65 00:06:45,330 --> 00:06:52,799 But in causing chaos, we're back to proof that statins save millions, and we come back round in a circle again, and we get the whole thing. 66 00:06:52,800 --> 00:06:55,590 And this is what happens in about a year. 67 00:06:56,550 --> 00:07:04,770 And these news headlines keep getting recycled whenever a piece of evidence emerges that actually fits this narrative. 68 00:07:06,810 --> 00:07:12,000 And you can keep going. You can do the same with the Express: you can do cancer, you can do whatever you want in headlines. 69 00:07:12,810 --> 00:07:19,850 And some of these are verging on the ridiculous. Drug to cure 70 00:07:19,860 --> 00:07:28,200 nine in ten cancers; rhubarb can save your life. And they all feel a bit like fun until you meet the first patient who says, 71 00:07:28,590 --> 00:07:32,010 I stopped taking treatment and now I'm taking rhubarb instead. 72 00:07:33,150 --> 00:07:42,870 And I was describing a patient I've seen who basically came off his chemotherapy; a real-life case, he started to take coffee enemas instead. 73 00:07:44,040 --> 00:07:47,879 And anybody in medical care, at some point, 74 00:07:47,880 --> 00:07:54,030 will find somebody who goes on one of these news headlines to take an alternative path. 75 00:07:55,170 --> 00:07:59,100 And I'll come back to what I think the issues are. So it's interesting. 76 00:07:59,110 --> 00:08:05,420 So, from Google's official blog: about 1% of all searches on the Internet now are about symptoms. 77 00:08:07,140 --> 00:08:14,220 And that's, you know... that's millions, if not God knows how many, people just looking up symptoms. 78 00:08:16,380 --> 00:08:24,420 And that includes you. When you have a symptom, you go, ooh, I'm going to look it up and see if there's a miracle cure or what's going on.
79 00:08:24,780 --> 00:08:32,010 But interestingly, they've sort of started to take a position, started to think about their role in health care. 80 00:08:33,450 --> 00:08:40,620 Now, you might think it's because they're going to do health care well, because they want to do the right thing. 81 00:08:40,860 --> 00:08:47,670 They've realised, like the Daily Express, this is a business, and in doing it better you'll come to Google more often. 82 00:08:48,450 --> 00:08:56,370 But again, on the evidence for the self-checkers, there is this paper in the BMJ that shows, when you look at the symptom checkers, 83 00:08:56,370 --> 00:09:06,810 there were 23 symptom checkers, and the correct diagnosis was only provided about a third of the time in these online tools. 84 00:09:07,830 --> 00:09:11,010 And you start to think, what's going on here? 85 00:09:11,670 --> 00:09:17,159 If you think about the drivers of health care: when is it appropriate for people to check something and go, 86 00:09:17,160 --> 00:09:19,020 well, that means I need to go and see my doctor, 87 00:09:19,080 --> 00:09:27,150 or actually, I can stay at home? Interesting in its own right: the potential to create more work, or inappropriate work. 88 00:09:29,310 --> 00:09:33,750 So I'm going to take you back in time. Here's my guilt again; those on the MSc course have heard this one. 89 00:09:34,050 --> 00:09:38,610 This is going back to 2009. In 2009, 90 00:09:39,510 --> 00:09:43,470 there was a moment when we were all going to die. Okay? 91 00:09:44,970 --> 00:09:49,740 Remember that? Daily Express: we're on alert for the new super flu. 92 00:09:52,170 --> 00:09:59,820 It's not about bird flu any more; swine flu is already here, and now the battle to contain 93 00:09:59,940 --> 00:10:04,680 it starts. Then it gets really serious. 94 00:10:04,710 --> 00:10:09,870 Now you're really interested in the news, because it's going to kill 350 people every day. 95 00:10:11,160 --> 00:10:19,500 Yeah.
We are in panic mode, and you can't go on your holidays. 96 00:10:20,170 --> 00:10:23,670 It's just like when you think about Brexit: you won't be able to go on holiday. 97 00:10:23,670 --> 00:10:29,549 You couldn't go on holiday in 2009. What was the problem? And then, if you really think about it, it got even worse. 98 00:10:29,550 --> 00:10:33,450 Where are you going to go? We're all going to die. It's interesting. 99 00:10:33,570 --> 00:10:35,219 So what's happening now? 100 00:10:35,220 --> 00:10:45,150 Interestingly, some of that was based on fake experts and some of it on real experts. And what do we do now? Going back to the swine flu, 101 00:10:45,150 --> 00:10:54,870 remember, some of you should now be dead. Donald Rumsfeld: remember the known unknowns and the unknown unknowns, whatever you know you don't know. 102 00:10:55,290 --> 00:10:59,549 Well, actually (this is an article in The Independent), 103 00:10:59,550 --> 00:11:06,120 he actually had shares in a company called Gilead, who had the intellectual property to Tamiflu. 104 00:11:07,110 --> 00:11:16,530 And this is an article about him, saying he ordered this UN report into how many predicted deaths bird flu would cause. Now, 105 00:11:17,870 --> 00:11:23,560 would you like to have a go at a number? How many deaths do you think were predicted for bird flu? 106 00:11:25,090 --> 00:11:32,760 You can go either UK-based or worldwide. Ten? Twenty? 107 00:11:33,950 --> 00:11:38,550 A couple of hundred? A couple of thousand? Are you going to go 5,000? 108 00:11:39,030 --> 00:11:44,310 That feels pretty high, doesn't it? And would you like to go higher or lower? 109 00:11:45,840 --> 00:11:51,060 Because this is important if you're going to think about stockpiling drugs and thinking we're all going to die. 110 00:11:51,330 --> 00:11:55,010 How many people in a UN report were predicted to die? 111 00:11:55,010 --> 00:12:01,410 We have 5,000. How many of you think lower? And how many of you think higher?
112 00:12:02,220 --> 00:12:07,460 Oh, you're all a bit higher. How many? The lady there with the green scarf? 113 00:12:07,470 --> 00:12:11,160 170 million. 170 million. 114 00:12:11,160 --> 00:12:14,990 Bloody [INAUDIBLE], man. I knew it. 115 00:12:15,900 --> 00:12:23,730 Not too bad, actually. In fact, there were some 700,000 deaths that might be expected in the UK alone. 116 00:12:26,430 --> 00:12:29,790 So it was 700,000 deaths worldwide that were predicted. 117 00:12:30,600 --> 00:12:33,660 So actually, although it sounds laughable, you weren't that far out. 118 00:12:34,420 --> 00:12:42,840 Yeah. So that's what was predicted. No wonder everybody goes into panic mode. But this stuff really gets into our psyche and has an impact on policy. 119 00:12:43,110 --> 00:12:46,820 So you can calculate what's called the news-to-death ratio. 120 00:12:46,830 --> 00:12:57,570 So from the 24th of April 2009, in the first 13 days of the swine flu outbreak, you can look at the number of deaths from swine flu. 121 00:12:57,990 --> 00:13:07,470 So there were 31 deaths from swine flu; in the same 13 days for TB, there were 63,066 deaths. 122 00:13:08,970 --> 00:13:15,510 The number of news articles for swine flu was 253,442, and about 8,000 for TB. 123 00:13:16,530 --> 00:13:23,070 So that gives you a news-to-death ratio of 8,176 news articles for one death. 124 00:13:23,760 --> 00:13:27,330 But actually, every time you see a news article on TB, there'll be ten deaths. 125 00:13:29,490 --> 00:13:33,360 Now, that was done by a guy called Hans Rosling. 126 00:13:33,360 --> 00:13:39,960 I pinched that off him, but it's got a very important message in the distortion of how the news comes at you, 127 00:13:40,320 --> 00:13:47,340 because TB is a major killer and it's really important in terms of its health impact. 128 00:13:47,850 --> 00:13:53,089 But nobody gives a monkey's about it. 129 00:13:53,090 --> 00:13:58,250 Nobody cares about it in any way.
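The news-to-death arithmetic above is easy to check. A minimal sketch in Python, using the figures as quoted in the talk (the swine flu article count is back-calculated from the stated ratio of roughly 8,176 articles per death across 31 deaths; the TB count is the rounded figure given in the talk):

```python
# News-to-death ratios for the first 13 days of the 2009 swine flu
# outbreak, after Hans Rosling's comparison as quoted in the talk.
deaths = {"swine flu": 31, "TB": 63_066}
news_articles = {"swine flu": 253_442, "TB": 8_000}  # approximate counts

for disease in deaths:
    articles_per_death = news_articles[disease] / deaths[disease]
    print(f"{disease}: {articles_per_death:,.1f} news articles per death")

# Swine flu works out at roughly 8,176 articles per death; TB at about
# 0.1, i.e. many deaths for every single news article.
```

The point of the ratio is the four-orders-of-magnitude gap between the two diseases, not the exact counts.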
And I think all of us go into this area and think, where's the excitement? 130 00:13:58,430 --> 00:14:02,510 But this is a huge killer. Now, we published the Cochrane review in this area, 131 00:14:02,720 --> 00:14:10,390 got the unpublished data, did lots of work, and published all sorts of stuff that says I wouldn't give this drug: 132 00:14:10,400 --> 00:14:11,560 it's got huge problems. 133 00:14:11,570 --> 00:14:22,670 We published all of the research and I thought we would change practice, and we ended up, by 2014, with £550 million spent on stockpiling this drug. 134 00:14:23,120 --> 00:14:29,810 And it's a waste of time. The reason it's a waste of time: it does have a symptomatic effect, a bit like paracetamol. 135 00:14:30,020 --> 00:14:34,520 The evidence shows it reduces your temperature and reduces some of the symptoms of flu, 136 00:14:35,180 --> 00:14:41,180 but it doesn't reduce the important complications of flu, like bacterial infections 137 00:14:41,480 --> 00:14:45,710 and hospital admissions: all of the things you would want it to do if you were going to stockpile it. 138 00:14:47,240 --> 00:14:50,750 It doesn't actually prevent you from getting flu when you look at the evidence. 139 00:14:51,020 --> 00:14:59,599 I went to a parliamentary committee in 2014, and this is when it hit home: we spent about 2 hours laying all this out, and I left that committee 140 00:14:59,600 --> 00:15:02,270 thinking all of the people in the government get this issue. 141 00:15:02,840 --> 00:15:12,170 And a week later, we spent £50 million more stockpiling it, because the stockpile we had was going out of date and they said, we can have it cheap. 142 00:15:12,560 --> 00:15:14,780 You can have it at half price if you renew your stockpile. 143 00:15:16,010 --> 00:15:23,209 So I also think there's a huge agenda with our policy people: thinking about how we inform policy in an evidence-based way.
144 00:15:23,210 --> 00:15:31,610 That's part of our job in research: to think about how we arm the next generation of policy people to look at evidence. 145 00:15:33,380 --> 00:15:40,880 So some people here will have been on the MSc course and seen this graph, where there are four steps to evidence-based practice: 146 00:15:40,910 --> 00:15:46,130 ask a question, acquire some evidence, appraise the quality, and apply it in practice. 147 00:15:46,880 --> 00:15:53,090 Now, when I'm thinking about applying this to the media, and thinking about how we go about a 148 00:15:53,090 --> 00:15:58,760 process of informing the news about what we do in health care, I flip that round. 149 00:15:58,760 --> 00:16:04,700 And I use this in a different way. I have an extra step, so I focus. 150 00:16:06,230 --> 00:16:10,250 You can see, instead of asking questions here, this is an extra step that's going in. 151 00:16:11,000 --> 00:16:16,580 So I think about acquiring the best evidence and assessing the effect, all the time. 152 00:16:16,790 --> 00:16:24,380 When I get asked media questions about healthcare evidence, I think: is this the best evidence, and what's the size of the effect? 153 00:16:25,640 --> 00:16:30,080 This step is deliberately smaller because we never get to the biases, 154 00:16:30,500 --> 00:16:35,630 often, because by the time you're into the biases, most people don't understand what we're on about. 155 00:16:37,040 --> 00:16:44,149 If I'm having a discussion in a newspaper about confounding by indication, 156 00:16:44,150 --> 00:16:47,720 people are going, we're not interested; nobody's going to understand this. 157 00:16:48,020 --> 00:16:51,499 So I focus on these two areas, and I'll show you how I do it.
158 00:16:51,500 --> 00:16:55,820 So the first step, which I always think about when I see some evidence: 159 00:16:56,510 --> 00:17:01,430 I ask a general question. Is this the highest level of evidence to answer this question? 160 00:17:03,020 --> 00:17:12,860 That's my first step. And often what you'll see is, for instance, within the sort of process of evidence, you'll see lots of news headlines. 161 00:17:13,370 --> 00:17:20,749 But down here, it might actually be just an animal study. And if it's an animal study, 162 00:17:20,750 --> 00:17:25,610 it'll take 20 years to get into practice, and only about one in ten of them will get into practice. 163 00:17:25,880 --> 00:17:31,670 So when they say we've got a new cure for dementia and it's in mice, you're thinking, well, one in ten chance, in 20 years. 164 00:17:32,210 --> 00:17:36,710 So what you're thinking is: where are we in the evidence hierarchy? 165 00:17:37,070 --> 00:17:42,920 And to inform that, you generally want to be up the hierarchy, don't you? You want to be up at the systematic review. 166 00:17:42,920 --> 00:17:49,670 And to do that, you need a very good understanding of this one table, which is the levels of evidence. 167 00:17:51,050 --> 00:17:55,490 What type of question is it, and where are we in this matrix? 168 00:17:55,790 --> 00:18:04,790 And if we're down here, we know we've got some issues, whereas I want to be at the higher end of the evidence, thinking about the systematic reviews. 169 00:18:05,210 --> 00:18:13,640 So I'll give you an example of how we applied that. We did this for the evidence for what are called IVF add-ons. 170 00:18:14,990 --> 00:18:23,390 We were approached by the BBC, who said, we're going to do a story about all of these add-on treatments that you can get in addition to IVF. 171 00:18:24,530 --> 00:18:26,269 So we did two things.
172 00:18:26,270 --> 00:18:36,980 The first thing is we said, well, let's look at the websites and look at what the claims are, and how many times, when they stake a claim, 173 00:18:37,610 --> 00:18:45,080 they cite a bit of evidence. And then the question to ask is: is it the highest level of evidence? 174 00:18:45,680 --> 00:18:50,240 Is it actually a systematic review? Because if you were having a new treatment, you would 175 00:18:50,290 --> 00:18:57,070 want to be saying it's backed up by a systematic review. And in doing that, we showed that actually they were all miserable. 176 00:18:58,420 --> 00:19:01,569 That actually, the websites often didn't cite any evidence. 177 00:19:01,570 --> 00:19:05,230 And when they did, they actually went to lower-quality evidence or opinions. 178 00:19:06,190 --> 00:19:15,249 And when we went searching for the evidence, we could only find five cases where there was some evidence backed up by systematic reviews, 179 00:19:15,250 --> 00:19:20,020 where you might think there's an effect, and then we discussed the effect sizes. 180 00:19:20,290 --> 00:19:24,009 Now, in just doing that very simple bit, 181 00:19:24,010 --> 00:19:30,639 we got to a point where most of the Royal College of Obs and Gynae (and there are some people in here) went mad at us and said, 182 00:19:30,640 --> 00:19:35,980 you have no idea what you're on about. Well, remember my job: I'm a GP. 183 00:19:35,980 --> 00:19:40,840 So my job is to try and inform people if they come in and say, I'm thinking of having some IVF. 184 00:19:41,140 --> 00:19:47,050 What do you think about embryo glue? Well, I can tell you, most GPs have no idea what to do in that situation. 185 00:19:47,320 --> 00:19:52,420 So there has to be some way of saying, where's the evidence? And they said, what do you think you're doing, 186 00:19:52,420 --> 00:19:56,470 explaining what the evidence is when you're not an obstetrician or a gynaecologist?
187 00:19:57,160 --> 00:20:03,850 And I said, I thought that's what we're trying to do. And all we do is say, what's the evidence in terms of its level? 188 00:20:04,900 --> 00:20:12,060 I'm not even going into the effect sizes. Two years later... and this has been really interesting. 189 00:20:12,070 --> 00:20:20,860 So for about two years it went round in circles, and then right at the end of last year the HFEA, 190 00:20:20,860 --> 00:20:26,829 the Human Fertilisation and Embryology Authority, did a whole review where they brought 191 00:20:26,830 --> 00:20:30,790 together a system, which I think is a huge step forward, based on the research. 192 00:20:31,300 --> 00:20:35,230 And what they did, they said, here, we're going to provide you with the evidence. 193 00:20:35,230 --> 00:20:45,280 But they did something really smart, which I quite like. They have a traffic light system, and their traffic light system is red, amber, green. 194 00:20:46,240 --> 00:20:49,450 And the red there, this is assisted hatching. It talks 195 00:20:49,450 --> 00:20:53,139 you through the rest: what's the evidence? And it says there is no evidence, 196 00:20:53,140 --> 00:20:56,980 you can see that, to show that this add-on is effective and safe. 197 00:20:57,650 --> 00:20:58,420 That's a red light. 198 00:20:58,420 --> 00:21:06,579 Where they have a green, they say there is evidence from at least two randomised controlled trials; and in the amber, they go, it's uncertain: 199 00:21:06,580 --> 00:21:11,770 the evidence is conflicting and needs reviewing. And I quite like 200 00:21:11,770 --> 00:21:16,210 that system, because the traffic light system helps you really simply think: 201 00:21:16,450 --> 00:21:19,480 okay, what's the evidence? What's it like? Is there any evidence? 202 00:21:19,660 --> 00:21:24,250 If the answer is no, you've got a red. If there are at least two trials, you're on the pathway to somewhere.
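The traffic light logic described here can be sketched in a few lines. This is a paraphrase of the talk's description, not the HFEA's actual published criteria; the function name and inputs are invented for illustration:

```python
def traffic_light(supportive_rcts: int, conflicting: bool) -> str:
    """Rate the evidence behind an IVF add-on as red, amber, or green."""
    if supportive_rcts == 0:
        return "red"    # no evidence the add-on is effective and safe
    if conflicting:
        return "amber"  # evidence exists but is conflicting/uncertain
    if supportive_rcts >= 2:
        return "green"  # at least two supportive randomised trials
    return "amber"      # a single trial: promising, not yet enough

print(traffic_light(0, False))  # -> red
print(traffic_light(3, False))  # -> green
```

The design point is the same one the speaker makes: a crude, explicit classifier filters a lot of claims very quickly, before anyone argues about effect sizes.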
203 00:21:24,670 --> 00:21:29,590 And you can start to filter out a lot of evidence really simply, by just asking 204 00:21:29,590 --> 00:21:33,130 what type of evidence it is and whether it's appropriate to answer this question. 205 00:21:34,720 --> 00:21:44,010 Now, the second step is assessing the effect, and that's difficult, because you're really thinking about patient benefit. 206 00:21:44,830 --> 00:21:55,450 So I often think about this as an equation in my mind: patient benefit is directly proportional to the outcome. 207 00:21:56,350 --> 00:22:06,130 The bigger the outcome, the more benefit; the better the outcome, the more benefit; the more relevant the outcome, the more patient benefit. 208 00:22:07,240 --> 00:22:11,950 So if I say you get a two-year reduction in mortality, you might think, well, that's an important outcome. 209 00:22:12,760 --> 00:22:18,400 However, it's inversely proportional to the bias. The more bias, the less patient benefit. 210 00:22:19,750 --> 00:22:23,500 And if you introduce biases, you're overwhelming the patient benefit. 211 00:22:24,160 --> 00:22:32,050 Now, that's really important in your thinking, because if there's only a very small outcome benefit, any small bias will overcome the patient benefit. 212 00:22:32,890 --> 00:22:36,280 Whereas with a rather large outcome, a big outcome, 213 00:22:36,280 --> 00:22:39,880 you might accept more bias. So it's a simple way of thinking. 214 00:22:40,060 --> 00:22:43,110 And then the third thing is, you need the optimal information size. 215 00:22:43,120 --> 00:22:50,140 That's what systematic reviews are about: having sufficient information to be able to say, with some certainty, 216 00:22:50,170 --> 00:22:51,340 this is a true effect. 217 00:22:52,150 --> 00:23:00,310 It's not about big data, you know; it's not about having all the information, because people get overwhelmed with 'we've got 3 million patient-years'.
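The mental equation the speaker describes can be written out informally. This is a paraphrase of the talk's rule of thumb, not a published formula:

```latex
\text{patient benefit} \;\propto\; \frac{\text{size and relevance of the outcome}}{\text{bias}},
\qquad \text{given an adequate (optimal) information size}
```

Read this way, a small outcome benefit is wiped out by even modest bias, while a large outcome can tolerate more; and the fraction only means anything once you have enough information to estimate the numerator at all.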
218 00:23:00,790 --> 00:23:04,060 What you're interested in is: have I got the right amount of information? 219 00:23:05,620 --> 00:23:08,440 And so one of the things to think about, to start with: 220 00:23:08,710 --> 00:23:18,970 I optimise the outcome, and this is my second step, and I ask a very simple question, all the time. Does this make a difference? 221 00:23:20,470 --> 00:23:23,920 And if it doesn't make a difference, why do you even need to care about the biases? 222 00:23:24,190 --> 00:23:26,860 Why do you even need to care about whether you've got optimal information? 223 00:23:27,310 --> 00:23:33,160 Why not just stop? And you can stop, and filter out a lot of information really quickly. 224 00:23:33,580 --> 00:23:38,560 Now, there are a few bits to 'does it make a difference?'. 225 00:23:39,060 --> 00:23:44,860 Yeah. And I think about three questions when I'm doing that. 226 00:23:45,190 --> 00:23:49,630 Yeah. In assessing the effect, I have these three questions. 227 00:23:49,840 --> 00:23:54,620 Who does it apply to? Does it apply to the actual people we see, the patients we see? 228 00:23:55,280 --> 00:23:58,489 Does it make an actual difference, both positive and negative, 229 00:23:58,490 --> 00:24:01,580 the benefits and the harms? And is it actually feasible? 230 00:24:03,090 --> 00:24:09,720 Yeah. And so you have to think of the three questions I'm often putting in at this step. 231 00:24:11,550 --> 00:24:14,610 So if we come back to the IVF example. 232 00:24:15,300 --> 00:24:22,140 Yeah. The first thing you would expect is that anything that makes a claim for an intervention will be backed up by some evidence. 233 00:24:22,380 --> 00:24:26,340 Some of these treatments are of no benefit whatsoever, and some of them are harmful. 234 00:24:26,730 --> 00:24:31,590 And that went into the story. But let's go to, for instance, which ones?
235 00:24:31,590 --> 00:24:36,569 And only one treatment, called an endometrial scratch, was supported by moderate-quality evidence. 236 00:24:36,570 --> 00:24:41,840 It involves a procedure to scratch the womb lining to help an embryo successfully implant, 237 00:24:41,850 --> 00:24:44,820 although the evidence for this treatment was not itself beyond doubt. 238 00:24:45,090 --> 00:24:49,320 In fact, what's happened is, this was the only one with moderate evidence of benefit, 239 00:24:49,530 --> 00:24:54,240 and a large randomised controlled trial has come out in the last month that says it's of no benefit. 240 00:24:55,170 --> 00:24:59,670 So we often get this early stage where things look promising, but as you go along, it doesn't hold up. 241 00:25:00,030 --> 00:25:05,339 So you get that by asking. Here's another one; I'm thinking about the outcome. 242 00:25:05,340 --> 00:25:10,710 Does it make a difference? So this is an interesting paper that came out in the BMJ. 243 00:25:11,490 --> 00:25:15,780 They got us to quote on this. I don't know if anybody here 244 00:25:15,780 --> 00:25:23,490 gets quoted for the media. When you're on the Science Media Centre list... anybody else get asked to make a quote? 245 00:25:23,730 --> 00:25:26,820 Yeah. Do you actually quote? Yes, sometimes. 246 00:25:27,240 --> 00:25:34,140 Is it difficult, or do you do it really quickly? You spend all that time, and you say, yeah, yeah, yeah. 247 00:25:34,320 --> 00:25:37,920 So it feels like sometimes it's going to take you forever, and that's why we don't participate. 248 00:25:38,610 --> 00:25:44,190 This is the reality: I'm going to be quoted in the news, so I'm going to spend half a day reading the evidence. 249 00:25:44,520 --> 00:25:47,580 Oh, my God. In fact, I haven't got the time to do that any more. 250 00:25:47,940 --> 00:25:54,930 So I actually think, oh, this is really interesting, I'll have a quick read while I'm having a cup of tea, and think about my framework.
251 00:25:55,230 --> 00:26:01,980 Okay. And I think, I'm going to quote on outcome. And here's me being quoted. 252 00:26:02,490 --> 00:26:07,020 Here's what I actually said. I wondered: do these make a difference? 253 00:26:07,180 --> 00:26:09,540 And I wrote a quote back to the Science Media Centre saying, 254 00:26:09,720 --> 00:26:15,330 it's hard to understand why half the drugs were approved in the first place if they provide no clinical benefit, he said. 255 00:26:15,750 --> 00:26:16,560 And that's it. Done. 256 00:26:17,610 --> 00:26:25,739 That took me about 5 minutes of reading, because I'm going in with a very structured approach to what I'm looking for in the evidence, 257 00:26:25,740 --> 00:26:29,280 as opposed to just reading it and hoping some amazing quote will emerge. 258 00:26:29,700 --> 00:26:35,790 I go in and say, does this make a difference? And as I said to you, if it doesn't make a difference, why would you carry on? 259 00:26:36,540 --> 00:26:43,679 That's my quote. So there's this sort of approach to benefit. 260 00:26:43,680 --> 00:26:47,069 And if you think about outcomes, we haven't touched any of the other two bits. 261 00:26:47,070 --> 00:26:53,300 I've told you two things: is this the appropriate evidence to answer this question, and then these three questions. 262 00:26:53,580 --> 00:26:58,350 Does it make a difference? Does this apply to my patients? And is the treatment feasible in my setting? 263 00:26:59,520 --> 00:27:02,070 Okay. And you want to think about optimising them. 264 00:27:03,030 --> 00:27:13,470 So sometimes I get things like this research, and I dismiss it immediately, because I'm asking the question: does this apply to my patients? 265 00:27:13,890 --> 00:27:15,960 And in thinking about it, I think, does this apply to me? 266 00:27:16,770 --> 00:27:23,580 So: effect of breakfast on weight and energy intake, a systematic review and meta-analysis of randomised controlled trials.
267 00:27:24,540 --> 00:27:32,850 This was a trial of whether you eat breakfast or should stop eating breakfast. And how many people don't eat breakfast? 268 00:27:34,590 --> 00:27:40,230 Okay. Do you actually care about whether you should or shouldn't, and whether it has an impact on your weight loss? 269 00:27:41,220 --> 00:27:47,070 Look, I don't care. I say it depends how much time I've got in the morning. 270 00:27:47,100 --> 00:27:51,959 If I lie in too late and I'm running late, I might not have breakfast; at the weekend, 271 00:27:51,960 --> 00:27:55,980 I have breakfast. So I'm like, this doesn't even apply to the people I see. 272 00:27:55,980 --> 00:27:59,100 Why would anybody even care about the effect of breakfast? 273 00:27:59,370 --> 00:28:02,510 Nobody's ever come into my clinic and said, Dr Heneghan, 274 00:28:02,520 --> 00:28:05,880 I'm really thinking about whether I should give up breakfast or not. What's your advice? 275 00:28:06,120 --> 00:28:08,880 I've never seen anybody. And I don't mean you personally. 276 00:28:09,150 --> 00:28:17,490 But what I'm saying to you is, this is a dramatic example of: is this the sort of evidence that applies to the people you see in practice? 277 00:28:18,720 --> 00:28:23,100 If it isn't, I wouldn't waste your time. I've moved on now. 278 00:28:23,100 --> 00:28:26,220 You could read it if you're interested. Okay. On to the next one, and for a reason. 279 00:28:27,870 --> 00:28:34,490 Now, at the moment I am feeling a little bit stressed, actually, because I got involved in something. 280 00:28:34,530 --> 00:28:38,790 I'm not sure I should have got involved in what I have done, because it's a really evidence-based issue. 281 00:28:39,240 --> 00:28:46,440 This is a piece I did with Panorama, which was about gender dysphoria and the drug treatments in children. 282 00:28:47,730 --> 00:28:55,450 And these things now are incredibly emotive in the world.
And my question is not whether they're right or wrong. 283 00:28:55,620 --> 00:28:58,920 The one thing I do is say, what does the evidence look like? 284 00:28:59,580 --> 00:29:02,570 Have we got sufficient evidence for somebody to make an informed judgement? 285 00:29:03,090 --> 00:29:07,530 And in terms of children, you'd think, well, actually, you would want robust evidence, wouldn't you? 286 00:29:07,920 --> 00:29:10,170 These are very powerful treatments we're going to use. 287 00:29:10,590 --> 00:29:19,140 So if we're going to go down this route, this is a good example of how we should frame the whole of medicine, how we should think about what we do. 288 00:29:19,410 --> 00:29:23,670 Now, the problem in health care is, boy, do we jump on bandwagons. 289 00:29:24,180 --> 00:29:30,720 There's a huge history of just trying treatments and going down a route because we think they can work. 290 00:29:31,920 --> 00:29:40,470 And in this article I said, we've got ourselves into a complete mess here, because about seven or eight years into this, 291 00:29:41,010 --> 00:29:47,820 we have got to a position where we've given out these powerful drugs, and now more people want them, 292 00:29:48,030 --> 00:29:55,200 and now we're going, Oh, what's the evidence? And I looked at the evidence and went, Oh, my God, this is abysmal. 293 00:29:55,650 --> 00:30:00,900 It's worse than the IVF evidence. Now, I can handle it with IVF, because the problem in IVF, 294 00:30:00,900 --> 00:30:04,770 if you get it wrong, is that you don't have a child. That's pretty bad. 295 00:30:04,770 --> 00:30:12,480 But if you get it wrong and somebody is having really powerful drugs, wouldn't we as a society want to be thinking about the difference we make? 296 00:30:12,930 --> 00:30:16,260 Now, all I do is say what the quality of the evidence is. 297 00:30:16,680 --> 00:30:20,310 Now, on one side of this argument, people think I'm amazing.
298 00:30:21,540 --> 00:30:28,469 On the other side of this argument, they think I'm an absolute nightmare, the devil himself. 299 00:30:28,470 --> 00:30:38,880 And it feels like that because of the ideology. And all I'm doing, I've not said anything, apart from: this should be done in a research context. 300 00:30:39,780 --> 00:30:44,940 If we're going to go down this route, we have to develop the research evidence, and we have to do it in a really robust way. 301 00:30:45,150 --> 00:30:48,150 And there are so many unanswered questions and so many uncertainties. 302 00:30:48,420 --> 00:30:51,870 We have to have a different approach, and at the moment we seem to be going, it's alright; 303 00:30:51,960 --> 00:30:56,700 nobody wants to go there, because we're all a bit scared of the backlash we'll get. 304 00:30:56,700 --> 00:31:01,380 But if we can't do it for this in children, what are we going to do in the whole of society? 305 00:31:01,650 --> 00:31:05,610 And I think these are really interesting issues we now face. 306 00:31:06,690 --> 00:31:13,050 We are bringing in treatments and innovations and new devices and technologies in a way that we've never seen before. 307 00:31:13,380 --> 00:31:20,460 Innovation is the word on the street. And in doing that, we want to ignore the evidence, apart from when it suits us. 308 00:31:21,750 --> 00:31:29,640 So one thing I've learned in the debate now is, when I'm going to go somewhere controversial, I always write a piece to go with it. 309 00:31:30,510 --> 00:31:35,770 So I've had some pieces today that have gone, You've said this, and I go, I didn't say that. 310 00:31:36,600 --> 00:31:40,139 And so instead of trying to argue back, I go, here's what I said: 311 00:31:40,140 --> 00:31:48,060 if you go here, you can read my review of the evidence and what my conclusions are at the end of the article, if you disagree with them.
312 00:31:48,060 --> 00:31:56,220 So some people right now disagree that systematic reviews are an important way of informing the public. 313 00:31:56,790 --> 00:32:01,740 They think they're terrible. And I go, Well, okay, that's alright. But we use systematic reviews, and here they are. 314 00:32:01,980 --> 00:32:07,320 And in there I took the individual studies and said, What does it look like in terms of the effects and the quality? 315 00:32:08,400 --> 00:32:10,200 And I think this is something I've learnt. 316 00:32:10,200 --> 00:32:15,779 If you're going to join the media and have a go and say things, you should always have a backup where you go, 317 00:32:15,780 --> 00:32:20,909 actually, here's my viewpoint, here it is, published in some blog or some place, where actually this 318 00:32:20,910 --> 00:32:26,910 is what I said. And that helps you a lot, because it gives you something to point to if somebody misquotes you. 319 00:32:26,910 --> 00:32:32,879 And I think that's what everybody's worried about. You join in, and suddenly somebody will translate something 320 00:32:32,880 --> 00:32:35,880 you've said into something surreal, and 321 00:32:35,880 --> 00:32:38,730 you go, I didn't say that. And that's what happens. 322 00:32:40,020 --> 00:32:48,470 So, look, I'm just going to end now, and then we can have a few questions. There's 323 00:32:48,480 --> 00:32:57,299 a whole new world out there, and that world is Facebook and social media, and it bypasses the newspapers. 324 00:32:57,300 --> 00:33:03,150 The newspapers sometimes put really good stuff up there; sometimes, as you've seen, shocking stuff. 325 00:33:03,660 --> 00:33:06,840 But a lot of the time they're sort of in the middle. 326 00:33:07,050 --> 00:33:15,959 But this type of stuff has a huge impact, because a large majority of health news shared on Facebook is fake or misleading.
327 00:33:15,960 --> 00:33:20,520 And that's everybody, within their ideology, sharing information. 328 00:33:20,940 --> 00:33:28,410 So the question in health care is, what's the role of researchers in informing the public? 329 00:33:29,010 --> 00:33:33,150 And if you look at last year's ten most shared health articles on social media, 330 00:33:34,110 --> 00:33:43,259 this was just a simple measure of trying to say: most of them are low or very low in terms of scientific credibility. 331 00:33:43,260 --> 00:33:46,980 The three that weren't were six, seven and eight. 332 00:33:47,400 --> 00:33:52,740 And six is about cold or flu symptoms: how to tell the difference. 333 00:33:53,820 --> 00:34:00,690 One was about stem cell treatment, and the other was about how cycling in old age can keep your immune system young; that was shared a lot. 334 00:34:00,690 --> 00:34:05,770 So we all like cycling in old age. The other ones are like this one: 335 00:34:06,790 --> 00:34:11,980 Marijuana is 100 times less toxic than alcohol and it's safer than tobacco. 336 00:34:12,550 --> 00:34:15,130 That's the most shared story on social media last year. 337 00:34:15,610 --> 00:34:21,850 That's because all the people, I guess, who want to take their marijuana quite like that message and want to pass it on to everybody else. 338 00:34:22,570 --> 00:34:32,530 So it's a really interesting arena, in terms of what our role as researchers is in informing the public, and how you might go about that. 339 00:34:33,130 --> 00:34:42,460 And I think we do have a huge role to play. And this is Fiona Fox, who runs the Science Media Centre, who do a great job. 340 00:34:42,970 --> 00:34:47,530 They have a system where they recruit people who will comment on news stories. 341 00:34:47,530 --> 00:34:55,299 David's on that list, and I get stories. But what she said was really insightful, and she was commenting on this lady, Jenny.
342 00:34:55,300 --> 00:35:01,450 Rohn, who said, I'm leaving the science communication world, because you can't beat fake news with science communication. 343 00:35:02,620 --> 00:35:10,660 And her argument in response was to say: actually, I am seeing plenty to reassure me that the public are more discerning. 344 00:35:10,990 --> 00:35:16,900 The latest trust ratings showed once again that scientists remain near the top of the list of most trusted professionals, 345 00:35:17,170 --> 00:35:25,690 with 83% of the public trusting scientists to tell the truth, compared with only 17% who say the same about politicians. 346 00:35:26,680 --> 00:35:32,200 So if you are involved in research, the public trust you to say the right things. 347 00:35:33,100 --> 00:35:37,149 And in doing that, I think it's helpful to say what she said 348 00:35:37,150 --> 00:35:44,800 as she finishes: no one said it was going to be easy, and I'm open to Rohn's claims that it might be getting harder, or at least more bruising. 349 00:35:45,370 --> 00:35:49,270 I think that is true. The world is more bruising than ever now 350 00:35:49,510 --> 00:35:53,110 if you just try and put out a very clear message about, here's the evidence. 351 00:35:53,500 --> 00:35:57,160 But that's all the more reason for Rohn and fellow scientists to hang in there; 352 00:35:57,460 --> 00:36:01,300 now would be the very worst time for scientists to return to their ivory towers. 353 00:36:03,840 --> 00:36:09,030 Now, I'm just going to finish with something where it doesn't matter what I do; 354 00:36:09,420 --> 00:36:15,360 I can't do anything about it. And it doesn't matter what you try, what your strategy is. 355 00:36:15,360 --> 00:36:29,560 There are some things you just can't do anything about, and here's one. Now, having finished that, just to say: 356 00:36:30,030 --> 00:36:33,749 one of the things I'm having a bit of trouble with at the moment is sleeping.
357 00:36:33,750 --> 00:36:36,810 Anybody else in the room got a problem with sleeping? 358 00:36:36,990 --> 00:36:44,220 Yeah. It's sort of like, I'm alright with it. But what I do is rely on the Daily Express to inform what I might do about it. 359 00:36:45,090 --> 00:36:50,160 And those of you who sleep too much, you'd say, you're in trouble. 360 00:36:50,190 --> 00:36:56,130 I'm alright, I'll be fine. But actually, I'm using the Daily Express to help me out. 361 00:36:56,400 --> 00:36:57,300 Thank you very much.