Thanks, everybody, for coming on this very chipper, cold day. As a thought, the launch is already on Twitter — I've just taken a photo of the backs of your heads — and the hashtag is #catalogofbias. Is that right, with the American spelling, as opposed to the English spelling with the 'ue'? So it's catalogofbias, all one word. If you want to tweet, it would be useful if you could use #catalogofbias.

So thanks for coming. I'm going to talk a little bit about why we started — why did we start? Then David is going to talk about what we did. And then Iain is going to come and talk a little bit about the integration with some of the other products, like the James Lind Library, and the importance of bias.

A few years ago I wrote a piece of reflections on Dave Sackett's time at the Centre for Evidence-Based Medicine, when he died. About four weeks before he died, he sent me an email — he sent it to about four people — and he said: "Dear Carl, have I got it wrong? Somebody introduced me to propensity scores, and I'm wondering whether, rather than randomisation, I should have been thinking more about propensity scores for the last 20 or 30 years." And all I could think was: this guy is still serious, still passionate, about bias and treatments and effect sizes; he was really interested in that aspect right to the end.

In writing that piece, I realised there are about four or five books, or series of articles, that I think everybody should read if you're interested in evidence-based medicine. David Sackett was the first director of the centre; I first met him in '94 — some people in this room met him many years before that — and worked with him. One of those is the book on basic clinical epidemiology, which ran to editions one, two and three, I think, and the first edition is fantastic — outstanding. The second is a series of articles called Clinician-Trialist Rounds, about 30 papers in that series, and there are some great papers in it. Then, obviously, the editorial on what evidence-based medicine is and what it isn't. And then there's a paper, which you can look up online, about being a mentor. He was really interested in mentoring and what it meant, and in it he talked about who to gravitate towards and who to avoid — and it seemed to be more about who to avoid than who to gravitate towards.
But there's also a paper in there — "Bias in analytic research" by David Sackett, 1979, printed in Great Britain in the Journal of Chronic Diseases — and it's a really interesting paper. I'd always been aware of it, but when I read it again while writing the reflections, there's a really interesting bit at the end where he lists his priorities for work on bias, in 1979. Interestingly, at the time case-control studies were the fashion for journals: case-control studies were on the rise, cohort studies were dropping, and there were very few randomised controlled trials. If you read the paper, there were four priorities, and, you know, you look back and think: oh, we should have done this a few years earlier. I think the advent of the Internet now allows us to do what he proposed — the continued development of an annotated catalogue of bias: "Each citation should include a useful definition, a referenced example illustrating the magnitude and direction of its effects, and a description of the appropriate preventive measures, if any. I volunteer for this task, would welcome collaboration, and would appreciate receiving nominations and examples of additional biases."

So it's a really interesting idea — actually a fantastic idea that was about 40 years too early, wasn't it? Technology now lets you look back and think: wow, we could do that with the Internet. So that's the mission of the Catalogue of Bias.

Interestingly, when you start to think about biases — and I was looking at this — there are loads of examples of really interesting bias, but I do like this piece on the James Lind Library: "What are biases?", question mark. I think that's a really important question for us always to think about: what do we mean by bias? When you go out into the world and talk about biases, they're all around us and they all affect us, but very few people really understand what we're on about and what it means. And there's a good definition — this is Peter Elwood, who did the first trial of aspirin — of biases in tests of treatments: those influences and factors that can lead to conclusions about treatment effects that are systematically different from the truth. And that's a really nice definition of why we are interested in biases.
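[As an aside, not part of the talk: to make "systematically different from the truth" concrete, here is a minimal, hypothetical sketch in Python — with entirely made-up numbers — of how unconcealed allocation can distort a treatment comparison even when the true treatment effect is zero; this is the kind of distortion the catalogue entries aim to document and prevent.]

```python
# Toy simulation (illustrative only, hypothetical numbers): unconcealed allocation
# lets a recruiter steer sicker patients away from the new treatment, so the
# treatment looks protective even though it has no true effect.
import random

random.seed(1)

def run_trial(concealed: bool, n: int = 10_000) -> float:
    """Return the estimated risk difference (treatment minus control)."""
    events_t = events_c = n_t = n_c = 0
    for _ in range(n):
        frail = random.random() < 0.5           # half the patients are sicker
        base_risk = 0.30 if frail else 0.10     # sicker patients have a higher baseline risk
        arm = random.choice(["t", "c"])         # the intended allocation
        if not concealed and frail and arm == "t" and random.random() < 0.5:
            arm = "c"                           # recruiter diverts some sicker patients to control
        # outcome depends only on frailty: the true treatment effect is zero
        if arm == "t":
            n_t += 1
            events_t += random.random() < base_risk
        else:
            n_c += 1
            events_c += random.random() < base_risk
    return events_t / n_t - events_c / n_c

print("concealed:  ", round(run_trial(True), 3))    # close to 0.0, the true effect
print("unconcealed:", round(run_trial(False), 3))   # systematically negative: treatment looks better than it is
```

[Run as written, the concealed version hovers around zero while the unconcealed version comes out consistently negative — a conclusion systematically different from the truth.]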
Interestingly as well, when you start to think about biases — and this is just one example — all of the people involved in evidence-based medicine or in Cochrane in some way, the people everybody recognises, like Peter Gøtzsche, like Kenneth Schulz, like Bill Goldman, like Iain Chalmers, have at some point gone on an excursion into bias, where they've gone: this is interesting, I'm going to write something about it. This is a great paper by Peter about reference bias.

What we've been teaching for over 20 years is the empirical evidence of bias — we've been teaching about the empirical magnitude and direction of the bias all along. More interestingly, it's not as straightforward as magnitude and direction. Bias is not something straightforward that I could teach you today so that we all walk away with the complete knowledge. There are many papers, like this one led by Lesley Wood — which I like for all the people who helped with it too — showing that the size of such bias is unpredictable: sometimes it's different for subjective outcomes than for objective outcomes. So although we might say it goes one way, sometimes we're not sure.

I put this picture up because I think this was the turning point for me — this is in about 2009-10, the Tamiflu collaboration, at New College. There's Tom Jefferson, who was one of the leads, and Chris Del Mar, myself, and other people like Peter Doshi and Mark Jones — a real international collaboration. At the time there was a sort of accepted approach that bias is quite straightforward: it's high, moderate or low. In that approach there are three or four biases you look at, and if they're present you're at high risk, and if they're absent you're at low risk. And suddenly, within the work of that review, it was a case of: hold on, there are a myriad of other biases that are really important to what we're doing — not least reporting bias, which had a huge impact and was very influential. And I was thinking: we've just taken this reductionist approach — is that appropriate? And then there are other biases that keep pervading everything I've been involved in or done. Publication bias is what this is all about — a huge project where 100,000 people supported it.
I even got sent an email the other day about that: actually, bias is good, because it keeps us in business — without bias we'd have no job. And I thought: yep, bias is what this whole project is about.

So that led us to the idea of the Catalogue of Bias. It's the sort of project that builds on what has come before — from what David Sackett was talking about, to what many other people have been talking about — to say: let's build something that allows us to put together a catalogue that will be useful for me, for you, for individuals out there, for researchers. We get asked a lot — I get asked a lot, and so do people in this room — by the Science Media Centre: could you comment on this research? Well, actually, it contains this bias. I'm sick of writing that out every time; why not just point to somewhere? So that's what it's about. And we think, as researchers, our mission is to reduce bias.

But there's also another mission out there — and "we" means the general public, not just we in this room. We, as everybody, must be aware of the different types of biases, their potential impact, and how this affects the interpretation and use of evidence in health care decision making. That's a very important concept. I'm going to hand over to David in a minute, but we think this is also very important, and one area where it's hugely important is in training a new generation of people who understand bias and its impact on decision making, and how we use evidence to make decisions. We want to equip the next generation with the critical thinking skills to make informed health care choices. There's also another project, called Informed Health Care Choices, that Iain is involved in with Andy Oxman, and there's an opportunity to teach this there. So that's why we started, and this is the project where we decided to launch the website — and David is going to show you what we did.

So, yes, I was very passionate about taking this project forward right from the word go, because, as a teacher of EBM — and there are a lot of people here from the course I'm running this week, one of the modules of our evidence-based health care master's programme — I was always thinking: why would we care about these biases? Why are we even looking at them? What does it matter? When I first started off, I was struggling to find places to point to, other than usually going to the Schulz paper, and saying: this is one of the clear-cut examples of where bias is important and how it impacts on the results. But I often struggled, or it took a long time, to dig out other examples.
What would have been really useful for me is what we're about to show you now. If I'd had this four or five years ago, I could just have pointed my students to it: look, this is why these biases are important, this is why they matter, and this is the impact they're having. So, like I said, I was very, very keen to get on board, and really happy that I was asked to come on board and be part of the project. And I managed to convince the crack team at the CEBM that this was important and that they should get involved too.

So all of these guys and girls here — we all went off for a couple of days, one of the first away-day projects, two nights, that we've had at the CEBM, and it was great fun. And we went here — anyone recognise this part of the world? Well, it's not far; the budget wouldn't stretch to a foreign holiday. Yes, there's Bristol, with the coloured houses — the West of England. And here we all are, day one. I don't know if we took another picture after day three, but it probably wasn't quite as bright and breezy as this.

Before leaving, we were obviously familiar with David Sackett's paper, in which he identifies 56 biases, listed as relevant to cohort studies, to case-control studies, and a few relevant to experimental studies — though it's quite hard to tease that out from the paper itself. So we had 56 in mind. Then you go off and do a bit of research and you realise that John Ioannidis is also interested in this area, and he comes up with 235 biases, and your heart starts to sink a little. But luckily for us, I think, he managed to whittle those 235 down into some concrete areas — there was a lot of overlap among the 235 — and 40 was what he ended up presenting. So we took those on board as well. And then, through our own work, we were obviously familiar with the Cochrane approach too; some of those biases weren't captured in either of those papers — obviously, because Cochrane wasn't around then — so there were some important biases to add, particularly in relation to clinical trials.

So, armed with that, we took our project folder and our project tracker and we ended up with roughly 100 biases. We turned up at Bristol with about 100 biases to start off with, which we then set about in the hotel bunker room. And here we are in action.
We split into three teams — the folk here — while Carl, Douglas Badenoch, and Ruth Davis, the CEBM manager, who is in the room as well, were in the control room, I think, doing something useful. It felt like we were doing all the hard work, but Ruth was definitely keeping us to task and keeping us to time.

For some reason this colour seems to follow me around, so I was in the pink group, and we had 30 biases per group. We came up with a web page template that basically had to cover the four areas we were looking at. So: what's the definition of the bias we're looking at — where are we going to look, and what are we going to write? We then had to find some examples of where the biases we were tasked with were present in the literature — some real, concrete examples, good or bad depending on how you look at it, of that particular bias in action. Key to this as well, like David said: can you show evidence that it matters? Sometimes it does matter and sometimes it doesn't actually matter — and we wanted to put that in one place. And then, as David also iterated, the catalogue must tell us how we should avoid this going forward. So that was what we were armed and tasked with completing.

We basically had a search strategy that consisted of four areas. First, looking at the papers within our own catalogue folder that I've already mentioned — the 56, the 235 and the 40 — to see what was in there that already answered some of these questions. But we also didn't want to miss anything current, so systematic reviews were key, and we looked in PubMed for any systematic reviews or anything else that would cover the area. We also did a little bit of Googling, just to see what came out top for these different types of biases. And then we were keen to link with the James Lind Library, to see what they had on each of these different biases and any links we could bring into our web pages.

And this is, if you like, the rough template that we started to complete for the different biases — a good example of allocation bias here. This is, if you like, the web template that we were going to use.
So: a definition and some background; the biases this bias relates to — when you go into the catalogue online you'll be able to link off to the similar biases each one is related to, and snowball from there; a nice definition of the background of the bias; again, a concrete example where we could find one; information on the impact, with links — here we've got our friend Schulz and Grimes, the original paper I showed you earlier; and then some nice preventive steps on how you can handle this bias in your own research, or how people doing research should handle this type of bias. And then some pointers to other linked resources that will help you, as well as the sources we've used to produce the page.

So, that said, I'd now like to show you the actual catalogue, which is ready to roll out — we hope. So here we have it. This is the home page for the catalogue, and I think it looks really, really cool; I think we've done a really great job. Douglas Badenoch, from Minervation, has done all the work behind it, the back end of it, taking those A4 sheets that you saw and turning them into something like this, which is pretty impressive. On the front page we'll have an example of a bias — probably the most recent one that's gone in — but also some links to the latest blogs we've done. In fact, I sent Carl one today, and he got it up today, saying pretty much what I've just said at the start of my talk: how, for me, when I first started in EBM, this type of thing would have been really, really useful. And what we're hoping is that for other teachers, particularly teachers of EBM who want examples and somewhere to go, this will be the place to go for stuff on biases.

And then you'll see, if you click on the top row, we get into our biases. Currently there are 25 biases, and these will be updated regularly. Like I said, we've got at least another 40 or 50 in the background, and they're constantly being checked by the team to make sure they're correct for content and ready to go out on the pages. And we're offering two different areas on the website where you can start to select the biases you're interested in, with dropdown menus.
These will become populated as we go through. You'll see some familiar areas of bias and familiar terminology, but you'll also see some strange ones, like this: hot stuff bias. And trust me, when I click on it, it doesn't go where you might think — it goes to a nice definition of what hot stuff bias is. This actually came from David Sackett's paper. There's quite a lot of obscure and now obsolete terminology that David used in that paper for certain types of biases. So you think: what's hot stuff bias, and what's that got to do with anything? But when you read what he's talking about, I think it makes clear and perfect sense — now more than ever. So there were some biases where we realised that if you went searching — and be careful if you Google this one — nothing would come through, certainly not on PubMed or any regular database. But that type of bias is very, very prominent now, and it's one that needs to be at the forefront, I think. So we included things like this because we thought they made sense, and they certainly make sense now, even if the terminology isn't what we would use today — I'm not sure what the name for that bias would be now, if it's not hot stuff bias. But we can debate that as well. So there are some in the catalogue that you certainly wouldn't see used now, but we've put them in anyway because we think some of them are actually quite relevant in terms of impact. Again, there are some statements here where we've said: funnily enough, hot stuff bias hasn't been formally investigated, and no one has looked at its impact on effect sizes — so you're making a judgement from your own experience and observations of how you think that bias is having an impact at the moment.

And then we've got a blog, which we're going to keep going. I've started the first one, and I'm going to rely on the team to keep filling it in and putting blogs up, but we'll also be inviting blogs from people outside, to keep it running — all things bias, and all things bias-related.

OK. And I just wanted to link, actually, from one of the pages I talked about, to the James Lind Library — and here's where I think I'll bring in Iain — because, wherever we could find a link, we did, as best we could.
We said: OK, there's also some information on this bias on the James Lind Library, and we put in the definition that the James Lind Library gives. When you click on that, you go off and you get a really nice definition with a lot more content than we've put in our catalogue — but that wasn't the point of our catalogue. Our catalogue is a one-stop shop, and from there you can go off and start reading more about the important biases. So, on that note, Iain, if you wouldn't mind, just give us a few words on the background to the James Lind Library. That's great, thank you.

OK. Well, first of all, I'm really chuffed to have been invited to come to this launch. It's just great. For someone who, as I have been trying to point out, believes you don't have to be a statistician to understand biases — and biases are actually more fundamental than statistics — it's just very, very good news indeed. So I want to congratulate you on having taken this initiative; really fantastic. And you can imagine that I'm particularly chuffed that you've seen that the James Lind Library is relevant to this enterprise. That's great.

I talked to Dave about this paper, and I asked him what had happened to Joanne Kavita. I was very pleased, actually, to hear that David had looked to see if he could find out who Joanne Kavita is. So far, neither of us knows — Dave didn't know, so I didn't know. So I think one of the first jobs of the Catalogue of Bias is to do a worldwide search for the current whereabouts of Joanne Kavita, because she deserves some credit, actually, for having produced the first catalogue. And by the way, the way "catalogue" is spelt there is actually the European way, not just the English way — it's the same in French — so that's important.

I eventually persuaded Dave to write a commentary on this paper, and that involved finding out a little bit about the meeting for which he had prepared it. There was a great deal of discussion at the time about case-control studies and how valid they were: some people wouldn't give them the time of day, while others realised they were the only way of addressing some questions. And we got the person who convened the meeting, an epidemiologist called Michel Ibrahim, to write an appendix to Dave's article giving the background to the meeting. I was interested because Dave, when he came to Oxford, made a somewhat bad impression on Martin Vessey, who was the professor of public health.
Martin felt that, because of this paper, Dave didn't have any time for the sort of work — the cohort studies and the case-control studies — that Martin was doing. And Martin and Dave never really got on, it has to be said, during the five years that Dave was here. I thought that something might come out of Dave writing about this exercise, so I got him to write about it. It's a very nice piece that he's written; it doesn't really uncover much about that, but it's there for people to see.

So what I thought I'd do — let me see, yes — is give you a very few examples from the James Lind Library to illustrate how it might be useful to people using the Catalogue of Bias. It has these three silos, if you like, of records and articles. There are general articles about what constitutes fair tests — that in general you need a comparison of some sort if you want to make an inference about the effect of an intervention. Then there are the biases — and you'll see we don't have 235 or whatever it was, certainly not even 100: I think there are seven there. There's also something on the play of chance. And then there's something to remind people that, at the end of the day, all of this is an attempt to make things better for patients and the public in general, and work which relates to that — so, for example, the quality of reporting, and key papers on the quality of reporting of research, go in there.

But before you start on bias, I felt it was quite important to say when you don't need to worry about bias — because there are certain treatments that have such slam-bang effects that bias is unlikely to be the cause of those effects. This is a paper that Paul Glasziou and some of us wrote, published in the BMJ, with some examples of treatments with dramatic effects and the sorts of ways in which that drama shows itself. I've just taken one from 1801, which is Astley Cooper treating conductive deafness by puncturing the eardrum. It rather hurts to look at this spear being stuck down the ear, but basically that was an example of a dramatic treatment: people who hadn't been able to hear for some time suddenly could hear. So you don't need to worry about bias there. You may need to worry about it in different sorts of deafness, but for certain types of deafness that was a dramatic effect.

The two biases which I've chosen to illustrate are allocation bias and observer bias.
Observer bias is sometimes called measurement bias, and other terms as well. The one I thought I'd start off with was done in 1809, during the Peninsular War — in Portugal, in fact, where there was a base hospital that seems to have been quite busy with soldiers, often with fevers of one sort or another. There were three surgeons there. One of them was Alexander Hamilton, who described himself as "the most beautiful man in existence" — which is the title of his biography — and who was a bit of a bounder in all sorts of ways. But he reports, in his MD thesis for the University of Edinburgh, written in Latin, that "it had been so arranged that 366 soldiers were admitted alternately, in such a manner that each of us had one third of the whole. The sick were indiscriminately received, and were attended as nearly as possible with the same care and accommodated with the same comforts. Neither Mr Anderson nor I ever once employed the lancet" — bloodletting. "He lost two, I four cases" — a mortality of about 2.5% — whilst out of the other third, where the other surgeon used bloodletting, 35 patients died. So roughly a tenfold higher mortality rate using what was then a very widely used form of treatment.

Here's another one, which is methodologically a bit more interesting in a way. Thomas Graham Balfour was a doctor in charge of a military orphanage in Chelsea, and he went on to become president of the Royal Statistical Society — so he was, in that sense, a fairly exceptional medical doctor. There was an idea that giving belladonna to children during the course of a scarlet fever epidemic would protect them from getting scarlet fever. So what he did: "There were 151 boys of whom I had tolerably satisfactory evidence that they had not had scarlatina" — in other words, he wanted to make sure they were eligible for prevention of scarlet fever — "I divided them into two sections, taking them alternately from the list" — and this is the key statement — "to prevent the imputation of selection". In other words, to prevent bias. I don't know an earlier quotation which puts what he was trying to do quite as clearly as that.

By the beginning of the 20th century, alternation was being used very widely, and it had come to be called different things in different languages. And basically alternation was the predecessor of random allocation. Random allocation's only advantage, in practical terms, is that it's easier to conceal what the allocation method is. But that doesn't mean that if you use random numbers, the allocation schedule is necessarily concealed.
It might be stuck up on the wall, so that when the next eligible patient comes in, you look at the list on the wall and decide: actually, no, I don't think I want to put this patient in, given what they're going to be allocated to. So, through lack of concealment of a random allocation schedule, you can introduce bias. Random allocation does not protect you against bias without concealment of the allocation schedule.

Now, reducing observer bias. This was a comparison of so-called tractors — metal tractors against placebo tractors made of wood — for treating rheumatism. This is from 1800. This is Gillray's cartoon making fun of this very fashionable treatment. And this is what they looked like; as you'll see, the inventor, Perkins, had patented them. The idea was that you put this little metal spike against someone's nose, or some other part of them, and you drew out of them some magnetic thing that was causing the problem leading to their symptoms. Well, this man, Haygarth, did a very nice job. He said: right, let's make some out of wood, to look like metal, which will therefore not act in the way claimed for them. And he wrote up this lovely pamphlet, Of the Imagination as a Cause and as a Cure of Disorders of the Body, Exemplified by Fictitious Tractors and Epidemical Convulsions. He was unable to find any difference between patients treated with the wooden ones and the metal ones. That's an example of observer bias — the observers being the patients, of course — being controlled by a placebo-controlled trial in 1800.

This next one is an example of a comparison which involved control of both allocation bias and observer bias. It was a comparison of antitoxin and ordinary horse serum for treating diphtheria, published in 1918. It was done by this rather serious-looking person, Adolf Bingel, and he worked in this place, which looks a bit like Wormwood Scrubs, I would say — but presumably it was the state-of-the-art hospital at the time. And this is how he puts it — he proceeded cautiously: "I began in 1912 to treat alternate adult patients with antitoxin serum and with ordinary serum" — that is, serum without the antitoxin — "exactly in the temporal sequence in which they were admitted to the ward. To make the trial as objective as possible, I have not relied on my own judgement alone, but have sought the views of the assistant physicians of the diphtheria ward, without informing them about the nature of the serum under test" — namely, whether it was the ordinary horse serum. "Their judgement was thus completely without prejudice."
"So again, I am keen to see my observations checked independently, and most warmly recommend this blind method for the purpose." So it's a nice example of a trial controlling both allocation bias and observer bias.

The last thing is to do with chance. I know this isn't part of bias, but I thought that, for those of you who weren't aware, meta-analysis was used a long time ago — in this particular case, 1928. Here are three hospitals in which alternate allocation trials had been used to test serum versus no serum, and outlined in red are the combined, meta-analysed results from the three hospitals — three trials — in 1928. The word "meta-analysis" wasn't introduced until 1976, by Gene Glass, an educational psychologist.

That's my last slide. It's basically to show the people who have been involved in the James Lind Library — and I haven't named all of the more than 100 authors of articles. Jeff Aronson is one of them; I'm trying to see whether there are any others here. I don't think there are, but basically there are a lot of authors of some fantastic articles — including yours, of course. Yes. So there we go.

I guess, as we started: one of the things I'm always a great believer in is don't wait until it's a finished product — make it happen, get it out there, get some people who are interested, and build and build, and then in about five years' time we'll have an amazing product. And I believe we've got a lot to do, because if we look at each one of these entries we could do a better job: we could probably find better examples, and for some of them think harder about the impact and about preventive measures. Jeff and I were talking today about stage two, stage three and stage four — thinking about the impact of these biases: which have had a big impact on the papers we see, how they interact, what the true impact is, how we talk about this in terms of reducing their impact in research, and what we as researchers should be doing to prevent them. So we think there's a lot to do here, and it's one of those projects where we want to just keep going and going and going. It's not just building the new biases; it's also coming back and revisiting the ones we've written and thinking: how do we improve that? I'm particularly interested in people who are interested and want to be involved. So we've got, on the contact page, a form where you can say: I'm really interested in this bias, and here's a bit of background about why. So if you're really interested, we'd love to hear from you.
And one of the things, when I click on a bias, is what I think we're really interested in as well. One of the key things is that a few of us — including Elizabeth Spencer — are trying to get a bit more money, because at the moment we can only afford to take about ten people away, and at some point we may grow that number. There's a bit, as you go down each bias page — the background is here — where we really want to grow this section. And there's the bit here, below the James Lind Library link, which we haven't filled in yet: teaching resources. We want to put with each bias the ability to pull off a couple of slides that make it easier when you're teaching it regularly, and I think that will be a huge benefit when we do it. So part of the mission, once you've written about a bias, is not just to say, right, on to the next one; it's: OK, what would we do to teach it, and how do we help a teacher out there in the world? That's the global aspect of it, and that's where it fits — not just with teachers and postgraduate health professionals: I think this is for everybody, schoolchildren in particular. So that's the mission: for people to get involved, and for us to keep going back to the biases we've written, thinking, I'm going to fight to improve this one.

So I'm going to finish there. Thanks to David, thanks to Iain, thanks to everybody for coming — but particularly to the number of people in this room who we drove to Bristol. I can tell you, they liked the beer at night, and the meal, and we had them hammering away for about 14 hours of writing, and at the time I'm not quite sure they thought this was a good idea. But now, when you see the finished product, I think it's been a really fantastic journey, and we're hoping to continue it. So, on that basis, thank you very much to everybody who has come, and to those who've helped and participated in developing the catalogue. Thank you very much.