I've been working for a long time in systematic reviews. I began my career as a researcher just up the road from here, in the Radcliffe Infirmary — at the CTSU, the Clinical Trial Service Unit, where I was employed after having been a student at Oxford, doing my DPhil at Oxford, and then not really knowing what to do. I saw an advert in Daily Information; they were trying to recruit someone. I had no idea about the CTSU or people like Richard Doll and Richard Peto — I'd never heard of them. I didn't know what randomised trials were, I didn't know what a systematic review was, and at the time we weren't necessarily calling them systematic reviews — you'd call them overviews. I didn't have a clue about any of that, and just fell into working in a place that now makes me feel like part of this process. And so many of you will know Iain Chalmers. Iain has been someone I've sort of step by step followed, through various things. Iain set up the UK Cochrane Centre; I became director of the UK Cochrane Centre after Iain. Iain set up the James Lind Library, and next year I'll be taking over as editor-in-chief of the James Lind Library. It's about the history of evaluations. And one of the things that Iain will often say is that he's not a historian — I am. Iain's a better historian than I am; he's not here, so I can say it. It just happens that my DPhil sits in history. So Iain will often regard me as more of a historian.
I think Iain knows a lot more about how historians should act than I do, so I'm going to try and set the scene. I'm going to take you on a bit of a journey. History of what? Well — evidence synthesis, research synthesis, meta-analysis, systematic reviews, Cochrane reviews: all sorts of words are used for the sorts of things I'm going to describe. Some of these words are subsets of others. Cochrane reviews are systematic reviews, but they don't all contain a meta-analysis, which is the mathematical bit. Best evidence synthesis and research synthesis are words that we might use, and we don't necessarily mean systematic reviews. So the first thing is to think about the language of it. My aim is to try and give you a framework for thinking about this history. I cannot stand here and definitively recount the history of systematic reviews. I think we're going to need to wait decades still for that, and someone more informed than me is going to really do the history of systematic reviews, because there are challenges for me doing it. I'm building on work done by other people — others have looked at this. I'm providing examples. But one of the things I do feel, wearing my historian hat, is that history is not about who was first. History is not about this date, then that date, then that date.
I will show you dates, but they're not to say "this is the first time this ever happened" — anyone who says that is a hostage to fortune. We probably know roughly what was among the first, but there is no reason to believe that something is the first time the idea was had, because the idea is not remarkable. Who was early? That's what I think we can talk about. I think we can talk about things that maybe began something — because without that thing, you know, if the Cochrane Centre hadn't been established in Summertown, then we wouldn't have a Cochrane Collaboration, we wouldn't have Cochrane reviews; it needed that thing. We might have had something else, but we wouldn't have had that thing. We wouldn't have meta-analysis if someone hadn't coined the phrase meta-analysis — but we probably would have, and we know we would have had combined analysis of studies, because other people around the same time were writing about how to statistically bring together the results of studies. They just weren't calling it meta-analysis. History is alive and we're part of it. I'm looking after the systematic reviews module this week; simultaneously, some of you are on the meta-analysis module. I'm in the fortunate position — eventually I'm going to figure out when we actually started that systematic reviews module. Two weeks from now I'll be running the randomised trials module.
I'm pretty certain we started that in 1998, 20 years ago. Who in the audience was still at school 20 years ago? Anybody? Or maybe you're just looking well for your age. Okay — but just think about some of what I'm going to show you. Those of us who do teach, and we teach undergraduates, think: I'm telling you what I was doing 25, 30 years ago, and you weren't even born when we were doing this. How does history feel? For some of us, history is going to feel like the quite recent past. Now, again, think about the evidence-based health care courses here in Oxford. They began last century — but when they began, the UK Cochrane Centre had already been set up nine years earlier. So things are moving, and I'm going to reflect on it as a reviewer, as a researcher, as someone who occasionally passes as a historian. So we can think: how far back does this whole stuff go? One of the beauties of the modern age is the Internet. One of the things the BMJ did was digitise all of the BMJ; the Lancet have done it, JAMA have done it, most of the journals have now done it. They've scanned their stuff and put it online. So you can go to the BMJ website and search for the phrase "systematic review". And this is the earliest: 1867. It's in a book review. They're not using the phrase the way we might use it now, for a scientific project called a systematic review, but they are using it in a way that is part of why we do systematic reviews.
"Daunted by the difficulty of any systematic review of this collection of monographs" — the monographs are to do with hospital statistics in London, and there are lots of those monographs. So they're saying, you know, it's too difficult to systematically review them. "So we will take a flying run through the pages, warning our readers that they will do well to indemnify themselves by procuring the volumes for systematic perusal." They're saying: we ideally should do a systematic review, but that'll be a lot of hard work, that'll take a long time, so we're going to skim it. And you can reflect that now, 150 years on, people are skimming the research that's been done and saying, "I'll quickly summarise a couple of studies for you" — and you should be thinking, well, hang on a minute, that's a good guide to whether or not I should work harder on those studies to understand them for myself; that I should indemnify myself by gathering those studies for myself and systematically perusing them. So it's a really nice example of the phrase: it's highlighting what we would now be doing, and it's warning against doing things too quickly. So, key sources — again, we'll figure out a way to make the slides available, but there are other things. Iain, with Larry Hedges and Harris Cooper, wrote a brief history in 2002. Steff Lewis led some work, which I'll show, on forest plots,
in 2001. Mark Starr, the originator of Update Software, the first publisher of The Cochrane Library — we wrote something with Mark in 2009 — and Dan Fox wrote about the impact of reviews on health policy. So these are some of the sources I will be using, plus the James Lind Library; the URL is at the top. So I'm going to take you through it. This slide could be a slide we would use on the course — on the meta-analysis course — to say: why are you here? Why are you learning about evidence synthesis? Why are you learning about systematic reviews and meta-analysis? Because of that, I'm going to try and illustrate these points about how they have come about historically. So is it about organising evidence, comparing and contrasting similar studies, doing the mathematics, minimising bias, improving access, appraising quality, building a better study in the future, identifying interventions that we should be using in the health system? Identifying things that I should buy if I go to Boots the Chemist; things, for those of you who are practitioners, that you should prescribe; surgical procedures you should do if you are a surgeon; procedures I would like done on me if I needed surgery. This is why we do evidence synthesis today. So I'm going to try and illustrate where some of these ideas came from. I begin with a very long quote, and I'll tell you where it's from at the end. It's a good rationale for why we do reviews.
"If, as is sometimes supposed, science consisted in nothing but the laborious accumulation of facts, it would soon come to a standstill, crushed, as it were, under its own weight." So it's a recognition — we've got to recognise that science is moving along, and if all we did as scientists was accumulate facts, we'd just be crushed. "The suggestion of a new idea, or the detection of a law, supersedes much that has previously been a burden upon the memory, and by introducing order and coherence facilitates the retention of the remainder in an available form." And again, that's what we do: we learn things, we assimilate the knowledge and we say, okay, on the basis of what we know and what we've just found out, here's how we go forward. "Two processes are thus at work side by side, the reception of new material and the digestion and assimilation of the old." "One remark, however, should be made. The work which deserves, but I am afraid does not always receive, the most credit is that in which discovery and explanation go hand in hand, in which not only are new facts presented, but their relation to old ones is pointed out." We could almost use that as a rallying call for systematic reviews. Lord Rayleigh, 1885, the British Association for the Advancement of Science. And I used to think: this feels like history. It's 19th century. It feels old. And it's why we do reviews.
So I can bring you more up to date, because there's an alternative approach to not being crushed by facts. And that is simply saying: when a new one comes along, I throw away an old one. I don't assimilate. I just say I can only cope with ten things at a time, so when number 11 comes along, number one has to disappear. We don't like that approach, but that is the approach of Homer Simpson. I keep meaning to find the YouTube clip — if you watch The Simpsons, Homer Simpson at one point says this. I know this as a fact because my sons bought me a calendar of Simpsons sayings, where each day you get a different saying, and on a day in November 2023 this is what the calendar said: "Every time I learn something new, it pushes some old stuff out of my brain." So Lord Rayleigh in 1885 is saying: we have to assimilate, we have to synthesise the evidence, we have to do systematic reviews — not using the terminology, but that's what he's talking about. Homer Simpson, at the beginning of this century, says: just learn new stuff, throw some old stuff away. Clearly, that's not what we want to do. But we've got to think about those two approaches. Is it about organising the evidence? Homer doesn't want to organise the evidence — he's got a shelf, and he's going to push one thing off as the next thing comes on board. Rayleigh is saying: no, no, no, assimilate. So is it about organising?
James Lind — people are potentially familiar with this, the Treatise of the Scurvy. This is the oranges and lemons for people on ships, to try and reduce or prevent scurvy. Several of you are aware of that as a thing. Is it the first controlled trial? No, it's not the first controlled trial, but it is an early controlled trial. Not many people are aware that he also tested things like sulphuric acid — you know, "here, have some sulphuric acid, that might stop you getting scurvy". It didn't work. I'm not quite sure how certain we can be that it didn't work, but I don't think it would be something we'd risk recommending. But what's important about this book from the 1750s is not just the trial, the study that was done on the ship, but that it comes "together with a critical and chronological view of what has been published on the subject" — in the 1750s, trying to bring together and catalogue what has been done on the subject. And that's part of these examples: it's not new. It has changed — the last 20 years of reviews have been dramatically different — but it's not new. James Lind is doing that in 1753. Why is he doing it? Again, look at this and think: that's what we've been learning about on the meta-analysis and systematic reviews courses. We're learning about prejudices and biases. We're learning that stuff might be rubbish.
James Lind, in 1753, is saying: before I tell you about my study, I'll tell you about what's already happened, because I need to set my study in context. But also, there's rubbish out there, and I need to tell you what's rubbish, what you might believe, and what might help you. So think about reviews you've read. Think about it, those of you who work on reviews; think about reviews you've done. Part of it is about cataloguing, placing things in order, finding out mistakes, managing prejudice. We now talk about the risk of bias — well, James Lind is talking about rooting out prejudice: same thing, same ideas, issues around quality assessment. Then there's another curiosity about this book, because if you look at the publishers, one of the publishers is Cochrane. And so you think: oh, wow — what is this, Nostradamus? Is this him sort of saying, look, I'm going to do this, and one of my publishers happens to be called Cochrane, and two centuries from now there'll be this thing called Cochrane doing these reviews? It's a bizarre coincidence that one of the publishers of James Lind's Treatise of the Scurvy is Cochrane. And so let's jump to 1979 and Archie Cochrane, the man the Cochrane Collaboration is named after. I never met Archie Cochrane — Archie Cochrane died before the Collaboration was formed.
But in '79: "It is surely a great criticism of our profession that we have not organised a critical summary, by specialty or subspecialty, adapted periodically, of all relevant randomised controlled trials." He's criticising the profession. Look: 200 years earlier, James Lind organised the literature; 200 years later, Archie Cochrane is talking to the medical profession — primarily to obstetrics and gynaecology — saying: this is terrible. We've not organised the stuff. We're doing these trials; we're not organising them, we're not learning from them, we're just doing them. And as researchers, sometimes we just get enthused by doing our own study, and we don't think hard enough about what we're supposed to be doing. So then some of this science was criticised: "It's not scientific research. All you're doing is a review — that's just like going to a restaurant and telling us how good the food was. That's not science." Well, it is science, and it's not new that people have recognised it as science. Feldman, 1971: it "may be considered a type of research in its own right — one using a characteristic set of research techniques and methods". And those of you on the courses — that's why you're here, learning these methods. It's not "go to a restaurant and tell us what you think about the food".
It could, theoretically, be: go to a restaurant, systematically appraise the food in a highly structured way, write your review of the food, and make a recommendation about whether or not other people should go there, in as unbiased a way as possible. So we can systematically review restaurants. We're not necessarily going to see that as science, but the process, the system — it's science. Light and Smith, 1971, noting that it's impossible to address some hypotheses other than by comparing and contrasting the studies. We still have to make this argument occasionally, because here in Oxford people can do DPhils that are systematic reviews; but go to some other universities in some other countries and ask whether a systematic review is okay as a PhD — "No, no, that's not research." We still have these discussions. I still share these quotes with colleagues in other countries, to say: look, here are some quotes for you, and here are some Oxford DPhils that are systematic reviews. And I think you may have heard of the University of Oxford as a place that has been doing research for a while. If they think it's worthy of a DPhil, maybe you should be thinking about it as a PhD. I often defend students in other places who want to do such research, whose supervisors are saying, "oh, it's not research". Well, it is research. It's well-accepted research.
Eugene Garfield, who is responsible for the concept of impact factors and the Science Citation Index, suggested there should even be a prize for this process. That's 1977. This is not, you know, the 2000s, when we were becoming much more familiar with reviews; these are sporadic examples over time of people talking about them. So let's get some early examples. The source for this is the work that Iain, Harris Cooper and Larry Hedges published in 2002. Iain, Harris and Larry are saying it's not until the early 20th century that the science of research synthesis as we know it began to emerge. So that's the scientific bit. You might say that James Lind was about cataloguing, looking for bias and so on; the scientific bit, maybe, is to do with the mathematics and the statistics. And the example that they cite for that: Karl Pearson's 1904 article in the BMJ pools the data from five studies of immunity and six studies of mortality, to look at the effects of vaccines — pooling the studies, saying none of the studies on their own would be enough. I am going to find the studies, and I am going to pool them. And this journey that we're on is to say: it's not new.
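Pearson's idea — combining several small studies into one more precise estimate — is, in modern terms, what a fixed-effect inverse-variance meta-analysis does. A minimal sketch (the numbers here are hypothetical illustrations, not Pearson's data):

```python
import math

def fixed_effect_pool(estimates, variances):
    """Inverse-variance fixed-effect pooling: each study's estimate is
    weighted by 1/variance, so more precise studies count for more."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    # 95% confidence interval under a normal approximation
    return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

# Three hypothetical studies: effect estimates (e.g. log odds ratios)
# and their variances
est, ci = fixed_effect_pool([-0.3, -0.1, -0.5], [0.04, 0.09, 0.16])
print(est, ci)
```

The pooled estimate sits between the individual results but closest to the most precise study, and its confidence interval is narrower than any single study's — which is exactly the "none of the studies on their own would be enough" point.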
The mathematics of pooling studies to boost the power; thinking about a word that we're all now so familiar with as reviewers — heterogeneity — Cochran, you know, was talking about that as an issue when pooling the studies. ESP — extrasensory perception: a book from the 1940s, and it's got a systematic view of the research done, to see whether or not extrasensory perception is valid. Systematically reviewing something, into a book. "The comparison of the statistics of more than one experiment suggests the question of the combination of them for an estimate of total significance." That's the analysis; that's the statistical pooling of results. It's not new. But then we hit terminology, and someone has to invent the term. Gene Glass, 1976, invents the term meta-analysis. That's where the term comes from. It's not like "systematic review", where we know linguistically what it means — a review that is systematic; "meta-analysis" had to be made up. He made up the term. In April 1976, his American Educational Research Association presidential address describes the need for better synthesis and introduces the term meta-analysis, and the lecture is then published in Educational Researcher. And it's down there — I'm not going to read all the quotes out; you can read them faster than I can. But he's saying, you know, there's meta-mathematics, there's meta-psychology, there's meta-evaluation.
I've never heard of those; I think those terms have been and gone. Meta-analysis has stuck. Meta-analysis is the analysis of analyses. Why? Look at the second paragraph: he's working in education, he's working in behavioural research, and they're saying, look — in five years' time research will have produced hundreds of studies on IQ. Again, it's typical for you now: "oh no, I've got 20 studies, 30 studies, 40 studies — how am I going to cope?" Gene Glass is saying there are going to be hundreds of studies. And those studies in isolation — that's not going to be reliable. What they're really saying is that pooling the results will give us something reliable: meta-analysis, the integration of research. Here's Mary Smith, citing Gene Glass — Gene Glass was Mary Smith's husband — at 375 studies, 833 effect-size measures. Just feel that. Those of you who are reviewers, you feel the pain of extracting data from 20 or 30 studies; feel 375 studies, 833 results being pooled. And then feel the value of that, because this is trying to drive to an average. Whether or not you believe the average — that's not our job. Our job is a systematic review, and a systematic review is to present the evidence. People might say: that's an average result; that's not a good guide to the individual. That's a valid argument. Fair enough.
But maybe you're going to have to use the average, because maybe the results for an individual are not a reliable guide. So again, there's a quote there — I'm not going to go through it — justifying it, the sense being that this is not new: what we now feel, and some of the challenges as we do reviews — the concept has been around for a while. But it didn't explode. It didn't take off when Gene Glass introduced the term. It took a little while. And that's where, I think, when the real deep history of reviews is done, people are going to start exploring: why? Why did it take off? We're going to see a graph that shows that. Now, terminology: here it's cropping up. This is in the BMJ, 1982. The term is cropping up; it's crossing over from psychology — Gene Glass working in psychology and education in the mid-seventies — crossing over by 1982. So this is the first time I've been able to find the phrase "systematic review", meaning what we would now use it to mean, in the BMJ — leaving aside that 1867 example. This is looking at hypertension, comparing non-drug treatments. And a nice little quote again — I don't know if we'd still see writing like this in the BMJ: reviewing the published work "ceases to require the judgement of Solomon and becomes a quasi-experiment" if we use the meta-analysis. So it's half saying this is a more objective way of doing things.
Instead of Solomon, or some supreme being, deciding, we're going to let the numbers decide, we're going to let the process decide; we're going to make sense of what research has already been done — citing, quoting, Gene Glass. So, again, think about the terminology. So, I started work here in Oxford. Richard Doll — in 1969, Richard Doll publishes in the BMJ something which is not what we would now regard as our typical meta-analysis review, but it is what we increasingly are seeing as a review that brings together the evidence in a systematic way. Richard Doll is talking about bringing together evidence on adverse effects systematically. Again, you can see what I'm trying to illustrate: this concept comes at us from many different angles. Jump back a little bit — again on terminology, but also to illustrate how the world maybe needs to reflect on the history and move itself forward. People were saying things 40, 50 years ago that maybe people have only paid real, serious attention to over the last five or ten years. Shaikh, 1976: 28 reports, 29 studies. And look at the timing of that — where are we in my story? And look at how far back they've gone: studies from 1922 to 1970, because the procedure had been around for a while. Now, how many reviews do we see where they've only gone back, you know, 40 or 50 years, because of the convenience of searching the electronic literature?
249 00:24:58,510 --> 00:25:02,980 They went back to 1922. That was published in Pediatrics. 250 00:25:03,640 --> 00:25:08,709 The purpose of the study was to review the English language literature, and we might now look at that as a bit of bias. 251 00:25:08,710 --> 00:25:13,510 They're just looking at English. Yeah, Shaikh, you should have looked at some other languages as well. 252 00:25:14,110 --> 00:25:17,950 Particular emphasis was on the assessment of the scientific merit of the studies. 253 00:25:19,030 --> 00:25:22,930 So not necessarily on the meta-analysis, but: are the studies good enough? 254 00:25:23,080 --> 00:25:26,470 And there is some sort of risk of bias, quality, 255 00:25:26,620 --> 00:25:30,639 critical appraisal tool. And you can see again the point: they're ranking the studies, 256 00:25:30,640 --> 00:25:37,750 from the top one in 1922 down to Roydhouse in 1970, with a measure of how good was the study design, 257 00:25:37,840 --> 00:25:44,860 how good is the sample, how good is the description of the illness. Critical appraisal, risk of bias, going on. 258 00:25:44,860 --> 00:25:52,690 And one of the fascinating things that they found when they did that is that the quality and the conclusions weren't necessarily related. 259 00:25:53,200 --> 00:26:00,490 What was well related was: are you the sort of person that does the operation, or the sort of person that does not do the operation? 260 00:26:00,820 --> 00:26:06,520 And if you're the sort of person who does the operation, you conclude in your study that the operation is effective. 261 00:26:07,030 --> 00:26:12,370 So the surgeons tended to conclude it worked. The public health people tended to conclude it didn't work. 262 00:26:12,880 --> 00:26:20,350 And again, now suddenly we think, well, that's what we're confronted with now to some extent: the bias of the researcher. 263 00:26:22,570 --> 00:26:28,720 So 1989. This is Effective Care in Pregnancy and Childbirth.
264 00:26:28,870 --> 00:26:34,270 A two-volume work, big hefty books, full of systematic reviews. 265 00:26:34,510 --> 00:26:38,919 And there's the foreword from Archie Cochrane, where he's sort of welcoming this. 266 00:26:38,920 --> 00:26:43,420 The systematic review of the randomised trials which is in this book is a new achievement, 267 00:26:43,420 --> 00:26:50,410 it's a real milestone. And that book contains chapter after chapter, and they're tiny chapters, 268 00:26:51,200 --> 00:26:59,080 68 individual reviews, nowhere near the many, many pages that we now feel we have to write as a systematic review. 269 00:26:59,320 --> 00:27:04,810 One or two pages: quick bit of methods, quick bit of what studies did we find, 270 00:27:05,230 --> 00:27:13,720 quick analysis, answer. Bringing together tons and tons of material to help people working in maternity care. 271 00:27:14,920 --> 00:27:17,590 So we then move to another element of the history. 272 00:27:17,590 --> 00:27:22,330 And again, these are important little things, because certain things clicked and certain things have stayed with us. 273 00:27:23,170 --> 00:27:30,550 Forest plots: that picture that you'll be familiar with if you look at reviews, the meta-analysis plot. In 78, 274 00:27:30,790 --> 00:27:36,540 we see this thing over on the right here, and it's not big because it's a screengrab from JAMA. 275 00:27:36,550 --> 00:27:41,020 If you make it bigger, it just pixelates. You can't see anything. You can see it better when it's smaller. 276 00:27:41,440 --> 00:27:48,670 It's not a traditional forest plot, because it's not pooling the results, but it's showing the result of each study and its confidence interval.
277 00:27:49,000 --> 00:27:52,990 And that's because they were looking at negative trials, so-called negative trials, 278 00:27:53,260 --> 00:27:57,490 partly to illustrate that some trials are called negative just because they're not statistically significant. 279 00:27:58,360 --> 00:28:06,579 Lewis and Ellis produced something similar with a meta-analysis in 82, and this was their thing. 280 00:28:06,580 --> 00:28:16,510 And this picture is from an article I was pleased to write with Steph Lewis, who is the daughter of the Lewis who is the first author of that. 281 00:28:16,960 --> 00:28:20,620 We just reproduced it there. And that's what it looked like. 282 00:28:20,860 --> 00:28:24,219 So it doesn't look like a traditional forest plot, but it did the job. 283 00:28:24,220 --> 00:28:30,100 It's one of the first examples where, down the bottom, is the pooled result: all beta blockers pooled. 284 00:28:30,220 --> 00:28:32,380 That's the average of all of the studies. 285 00:28:32,530 --> 00:28:40,090 So it's a pictorial way of showing it. Up to then you might have a table; up to then you might simply have the average result. 286 00:28:40,360 --> 00:28:45,700 This is showing the contribution of each of the studies. And this we haven't nailed down. 287 00:28:45,940 --> 00:28:51,190 At the time Steph and I wrote this, we wrote to various people who may have been involved early on. 288 00:28:51,190 --> 00:28:58,360 Stephen Evans, a statistician some of you may know, said, well, actually, I think I was the one who suggested that we replace it. 289 00:28:59,080 --> 00:29:07,570 That we replace that Lewis and Ellis plot with something that would have a mark for the central point and then a line going through it. 290 00:29:07,990 --> 00:29:11,770 The Lewis and Ellis plot is a rectangle with a mark for the point estimate.
291 00:29:12,340 --> 00:29:20,510 The traditional forest plot now has that square, its size proportional to the amount of evidence, on a horizontal confidence interval. 292 00:29:20,710 --> 00:29:27,490 So Stephen Evans suggested that this would be what the Lewis and Ellis plot looks like as a regular forest plot. 293 00:29:27,700 --> 00:29:34,690 And so our figure two in the BMJ article was to say, well, this is what it would look like now, and it looks different. 294 00:29:37,840 --> 00:29:44,200 1988 is probably the first time that what we would now fully recognise as a forest plot hits the literature. 295 00:29:44,410 --> 00:29:51,910 So Steph drew that retrospectively for her father's work; her father hadn't drawn a forest plot that looked like that. 296 00:29:52,360 --> 00:29:55,480 But in 1988, the Antiplatelet Trialists' Collaboration, 297 00:29:55,630 --> 00:30:03,460 looking at antiplatelet therapies, drugs, published one, and that potentially is the first forest plot in the literature. 298 00:30:04,060 --> 00:30:08,200 And that is not very different to what we would still see now, 299 00:30:08,620 --> 00:30:17,679 30 years on. Things have changed slightly, but still the basic principles have been there a while: the square being proportional, 300 00:30:17,680 --> 00:30:22,989 the horizontal line, the line of no effect, sorry, line of no difference, going up the middle, the diamond 301 00:30:22,990 --> 00:30:26,050 to illustrate the average, the dotted line to illustrate 302 00:30:26,280 --> 00:30:30,060 the average point all the way up. 303 00:30:32,010 --> 00:30:35,819 So are there other examples? And again, I'm just going to illustrate a couple of examples.
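The pooling the speaker describes, each study's square sized by the amount of evidence and a diamond at the bottom for the average, is at heart an inverse-variance weighted mean. A minimal sketch of that idea, with made-up effect estimates (log odds ratios and standard errors), not the actual beta-blocker or antiplatelet data:

```python
import math

# Illustrative (made-up) per-study results: (log odds ratio, standard error).
studies = [(-0.40, 0.25), (-0.10, 0.15), (-0.30, 0.20)]

# Fixed-effect (inverse-variance) pooling: weight each study by 1/SE^2,
# which is the weight that drives the square's size on a forest plot.
weights = [1.0 / se**2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# 95% confidence interval for the pooled log odds ratio (the diamond's width).
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled OR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f} to {math.exp(hi):.2f})")
```

Bigger studies get smaller standard errors, hence larger weights, which is why the pooled result can be decisive even when every individual trial is "negative" in the statistical-significance sense.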
304 00:30:35,820 --> 00:30:41,160 Again, just to say that, when people say, oh, it's all a feature of the last couple of decades, well, 305 00:30:41,610 --> 00:30:49,410 Stjernswärd in 74 did an extremely important review of radiation therapy for women with breast cancer, 306 00:30:49,740 --> 00:30:58,650 concluding that actually it might not work, it might be dangerous, it may not improve survival. 307 00:30:59,040 --> 00:31:03,330 It may have an effect on the cancer coming back around the breast. 308 00:31:03,600 --> 00:31:09,030 But that's not why you're using this therapy. You're using this therapy to try and keep the woman alive. 309 00:31:09,030 --> 00:31:13,020 And maybe this therapy is actually lethal in the long term. 310 00:31:13,440 --> 00:31:18,899 So Stjernswärd did that in 74, and subsequent reviews of radiation confirmed it. 311 00:31:18,900 --> 00:31:27,150 But now radiation therapy is better, and those harmful effects of irradiating the heart are being minimised. 312 00:31:27,600 --> 00:31:31,920 So now it has a survival advantage. But that's now, not 1974, 75. 313 00:31:31,950 --> 00:31:43,290 Tom Chalmers, not related to Ian, pools ascorbic acid for the common cold, concluding there's not a lot going on. Shaikh, which I've already 314 00:31:43,290 --> 00:31:52,349 illustrated, not just looking at the quality of the evidence, pools it and makes a statement about cost effectiveness: in view of the costs, 315 00:31:52,350 --> 00:31:55,680 financial and human, as well as the lack of evidence to support it, 316 00:31:57,060 --> 00:32:01,530 either get on and do a proper trial or maybe stop it. 317 00:32:02,430 --> 00:32:05,610 And again, how common are those sorts of things now? 318 00:32:05,610 --> 00:32:12,450 We need to think about this, because we need to think about why people are doing these things now, 319 00:32:12,660 --> 00:32:21,870 and why they were doing them in the past.
The International Anticoagulant Review, 1970: the sort of review I'm proud to still work on in Oxford, individual participant data. 320 00:32:21,870 --> 00:32:27,270 They gathered the data by building a collaboration. 321 00:32:28,380 --> 00:32:36,110 This type of review is still very rare, where people actually share the patient-level data, but it's not new. 322 00:32:36,300 --> 00:32:39,629 In 1970, the evidence was pooled. 323 00:32:39,630 --> 00:32:43,590 People joined together in a collaboration to do it, early on. 324 00:32:43,920 --> 00:32:52,140 This is the review I'm proud to still work on here, with the CTSU, who do this now up the hill at the Old Road Campus. In 84, before I joined, 325 00:32:52,860 --> 00:33:01,560 when I was still finishing my degree, they got together at Heathrow and formed the Early Breast Cancer Trialists' Collaborative Group. 326 00:33:01,980 --> 00:33:07,050 That group met in Oxford, in the Saïd Business School, in June of this year. 327 00:33:07,500 --> 00:33:10,739 It is still rolling, every few years. It's still rolling: 328 00:33:10,740 --> 00:33:18,120 an update on the data, new trials coming along, the idea being that this is a disease that is very common. 329 00:33:18,600 --> 00:33:24,570 The treatments for it might not be great, but if we could pool the evidence, we could keep women alive. 330 00:33:25,380 --> 00:33:34,200 They could survive their breast cancer. And it led to this, which we've always viewed as an iconic figure: 1998, 331 00:33:34,710 --> 00:33:42,060 this forest plot in The Lancet, all of the trials of a drug called tamoxifen versus no tamoxifen. On the left is recurrence: 332 00:33:42,060 --> 00:33:48,300 does it stop the disease coming back? On the right is death: does it stop the disease from killing the woman? 333 00:33:48,990 --> 00:33:54,710 Definitive meta-analysis, published in The Lancet in 1998. 334 00:33:54,840 --> 00:34:04,110 I still remember it very well.
Partly because, well, actually, I'll tell you. My senior colleagues, some of whom you would know well, 335 00:34:04,110 --> 00:34:08,990 colleagues like Richard Peto here in Oxford, they were the leads on the meta-analysis. 336 00:34:09,000 --> 00:34:14,579 I was one of the juniors. I was responsible for going up to Radio Oxford in Summertown and doing 337 00:34:14,580 --> 00:34:18,990 the radio interviews, up at 6 a.m. in the morning to do the first one. 338 00:34:19,350 --> 00:34:24,630 And what was the first headline on that day? Frank Sinatra has just died. 339 00:34:25,950 --> 00:34:28,680 Who knows who Frank Sinatra is in the room? Hands up. 340 00:34:28,830 --> 00:34:35,969 Because sometimes I say that and people go, who? A very famous person. And, very sadly for us, on the 1:00 BBC News on the 341 00:34:35,970 --> 00:34:42,060 television, Richard Peto was supposed to be the first live interview, and they rang up and said, sorry Richard, 342 00:34:42,240 --> 00:34:47,700 we're not dealing with you, we've got someone from the Frank Sinatra Appreciation Society. 343 00:34:47,880 --> 00:34:53,820 They were the lead-off news on the BBC One news at 1:00 on that Friday. 344 00:34:53,820 --> 00:35:00,270 I remember it well. It is a big story, and it's a big story until Frank Sinatra dies, and then we move on. 345 00:35:00,270 --> 00:35:05,790 But it's the sort of thing, you feel sad when you're standing up in Radio Oxford at 6:00 in the morning and they don't want to talk to you anymore. 346 00:35:06,540 --> 00:35:12,990 But then you think, well, actually, our legacy, you know, whether we will outlive Frank Sinatra's legacy, we don't know. 347 00:35:14,250 --> 00:35:21,479 But certainly on that Friday in May 1998, the collaboration was working together. 348 00:35:21,480 --> 00:35:25,650 And again, that's the Early Breast Cancer Trialists' group: people shared their data. 349 00:35:26,260 --> 00:35:29,890 Five or six hundred trial groups around the world shared the data.
350 00:35:30,210 --> 00:35:32,690 Again, it's unusual, but collaboration exists. 351 00:35:33,010 --> 00:35:39,520 The ECPC, as it's called, the two-volume work that Cochrane introduced, that was a bunch of people collaborating. 352 00:35:39,640 --> 00:35:44,530 Dan Fox in 2011, who's very interested in that policymaking side of things, 353 00:35:44,760 --> 00:35:52,240 credits it with having built something, with getting something moving. 354 00:35:52,540 --> 00:35:59,050 So we have to think about collaboration. Which brings us, finally, to Cochrane, the Cochrane Collaboration. 355 00:35:59,190 --> 00:36:05,740 That's the logo. The logo is of steroids, showing the evidence on steroids to keep babies alive. 356 00:36:06,460 --> 00:36:14,350 And, you know, part of the rationale is that until the evidence was pooled, people didn't know how effective these things were. Once it was pooled, 357 00:36:14,470 --> 00:36:17,470 it still took a while for people to take notice and start acting on it. 358 00:36:18,040 --> 00:36:24,850 But there will be human beings alive today because that showed that the mums should have steroids. 359 00:36:25,780 --> 00:36:31,659 So reviews are keeping people alive, but we've got to think about why we do them. 360 00:36:31,660 --> 00:36:35,860 So, Cochrane. There's a picture of Archie Cochrane, and in 79 he is criticising the profession. 361 00:36:36,100 --> 00:36:41,740 And this is really interesting for us to look back on, because he says you should bring the results together. 362 00:36:42,400 --> 00:36:50,950 Ten years on, and it's got nothing to do with Archie Cochrane, we get electronic publishing. And electronic publishing was not the Internet. 363 00:36:51,130 --> 00:36:51,459 Again, 364 00:36:51,460 --> 00:37:01,990 for those of you who are the younger ones amongst us: electronic publishing was, you put it on a disc and the disc arrives through your letterbox.
365 00:37:02,500 --> 00:37:06,910 That's how pregnancy and childbirth was. And that's how the Cochrane Library was originally: 366 00:37:06,940 --> 00:37:10,690 it's on a CD, and the CD arrives in your letterbox and you put the CD in your computer. 367 00:37:10,990 --> 00:37:15,129 Now, how many of you have got a CD player that you can connect to a computer? 368 00:37:15,130 --> 00:37:19,960 And how many have got a floppy disk drive? The pregnancy and childbirth database was on the three and a half inch discs. 369 00:37:20,410 --> 00:37:26,110 That's how it came. The virtue being that, next time we issue the disc, we can update some of its content. 370 00:37:26,740 --> 00:37:33,520 Because when your BMJ arrives through your letterbox, there's not a big articulated lorry outside with all of the past issues of the 371 00:37:33,520 --> 00:37:38,230 BMJ, scribbled in to update them because new evidence has become available. 372 00:37:39,550 --> 00:37:46,330 But electronic publishing made that possible. And then, a decade on, the Internet, and that transforms it. The early Internet 373 00:37:46,600 --> 00:37:50,530 just means that we can send you the stuff more easily: 374 00:37:50,560 --> 00:37:53,770 it's a good dissemination tool. But then the Internet, and this is, again, 375 00:37:53,820 --> 00:37:59,200 the other thing that needs to be looked at in history, has transformed how we do reviews. From people 376 00:37:59,200 --> 00:38:05,980 like me going from the Radcliffe Infirmary to the Radcliffe Science Library, to climb a ladder, 377 00:38:06,100 --> 00:38:10,000 to take a volume, to bring it to the photocopier, to fill in a photocopier form. 378 00:38:10,120 --> 00:38:18,380 See, people smile, because they remember filling in those pink forms at the Radcliffe Science Library. Photocopy it, bring it back. To now, you sitting there: 379 00:38:18,490 --> 00:38:22,510 click, click, click. Oh, got the PDF. Job done. Go and have a cup of coffee.
380 00:38:22,870 --> 00:38:28,719 You know, remember us walking across to the Radcliffe Science Library, remember filling in the forms for an interlibrary loan, 381 00:38:28,720 --> 00:38:32,560 and six weeks later we might get an article with a big instruction on the back: 382 00:38:32,560 --> 00:38:37,750 do not photocopy this article, you will breach copyright. And you'd just go and get a copy, 383 00:38:37,870 --> 00:38:41,110 keep it spare. I got a spare because I'm going to scribble all over it. So 384 00:38:41,200 --> 00:38:44,920 that made a massive difference. Cochrane itself: 385 00:38:45,250 --> 00:38:49,389 Cochrane set out to regularly update, with software to publish, the pregnancy and childbirth 386 00:38:49,390 --> 00:38:55,450 collection; in 95 the Cochrane Database, the first CDSR, comes out; in 98, the Cochrane Library on the Internet. 387 00:38:55,720 --> 00:39:00,220 2013 was the 20th anniversary. We're now at the 25th anniversary. 388 00:39:00,550 --> 00:39:04,780 How has it grown? Well, our first Cochrane Library in 95 got 50 reviews on it. 389 00:39:05,740 --> 00:39:14,760 You can see there was very steady, steady growth; now it's almost steady state, you know: 3000, 4000, 5000, 6000, 8000. 390 00:39:14,770 --> 00:39:17,210 It's about 8000 now. But is that it? 391 00:39:17,680 --> 00:39:27,070 Is that all the systematic reviews? No. This is an article by Hilda Bastian, 2010, and it's got a title about, 392 00:39:27,220 --> 00:39:30,430 you know, some 75 trials a day, 11 reviews a day. 393 00:39:30,430 --> 00:39:34,990 That was the output then. This is the graph, the top one. 394 00:39:35,020 --> 00:39:42,850 There are various different ways you can search for systematic reviews, but the top one is estimating that, from 1990, not a lot, not a lot, 395 00:39:43,090 --> 00:39:46,360 and then it goes up to maybe 6000. The bottom one, 396 00:39:46,360 --> 00:39:54,670 the very bottom line, that's Cochrane.
So early on, yeah, it's a fair proportion until about 2000, and then the world takes off. 397 00:39:55,270 --> 00:40:02,530 That was Hilda's estimate in 2010. Earlier this year, that graph was in an article. 398 00:40:03,520 --> 00:40:12,070 This is across all disciplines. Their estimate is that we've passed the 200,000 mark now on systematic reviews across all disciplines. 399 00:40:12,250 --> 00:40:17,680 From the very early examples onto this graph. They have a very nice timeline in this paper, 400 00:40:18,040 --> 00:40:25,300 but on what is such a long view, a few of those early examples aren't even making the graph, and then 401 00:40:26,480 --> 00:40:30,590 it just takes off. And that's what we're going to have to think about: what is going on here? 402 00:40:31,430 --> 00:40:35,480 This is a total shift. This is not gradual growth. 403 00:40:35,720 --> 00:40:40,980 This is a wow, it's taking off. We have a register of systematic reviews called PROSPERO. 404 00:40:41,450 --> 00:40:49,430 It now has more than 40,000 things in it. It began only five or six years ago with a mindset that we might have one or two thousand a year. 405 00:40:49,730 --> 00:40:57,980 It's now running at 12 or 13 or 14,000 a year being registered as new reviews, and that's only a sample of them. 406 00:40:58,160 --> 00:41:01,070 So something is happening, and I can't tell you what. 407 00:41:01,160 --> 00:41:07,460 But one of the really fascinating points from a graph like that is, when Cochrane began in 1993, 408 00:41:07,880 --> 00:41:13,370 1994, one of the first things that needed to be done was to find the trials. 409 00:41:14,810 --> 00:41:20,030 At that time in Medline you could find 20,000 randomised trials easily. 410 00:41:22,070 --> 00:41:25,190 We need to sort this out. There's a lot of trials. We need reviews. 411 00:41:25,370 --> 00:41:30,980 Now in Medline you can find about 120,000 systematic reviews.
412 00:41:31,820 --> 00:41:38,330 So what's our next step going to be? Because we needed to sort it out because there were thousands, tens of thousands, of trials. 413 00:41:38,480 --> 00:41:46,070 We've now got more reviews than there were trials we could find easily when a lot of this activity began. 414 00:41:46,310 --> 00:41:53,330 And we just don't know where that graph's growth is heading. And we've got big issues about how high that graph should go. 415 00:41:53,810 --> 00:41:59,690 Are people just doing reviews because it's easier to do a review than it is to do a new prospective study? 416 00:42:00,080 --> 00:42:05,000 Are people just doing reviews to boost their CVs? Is that what's going on? 417 00:42:05,000 --> 00:42:14,030 Are these reviews legitimately needed? How many more reviews do we need of PET imaging for lung cancer? These are some of the challenges. 418 00:42:14,030 --> 00:42:15,170 And there are these challenges 419 00:42:15,350 --> 00:42:22,610 because you have agencies across Europe saying, why do we need to do another review to fit our guideline when there's already a review out there? 420 00:42:22,640 --> 00:42:27,770 Can we please get together and have things more organised? 421 00:42:28,310 --> 00:42:37,550 So I'm going to finish with a quote. Back in 76, from the journal in which Gene Glass coined the phrase meta-analysis: 422 00:42:39,440 --> 00:42:44,360 meta-analysis will without doubt be abused as well as used with positive benefit. 423 00:42:44,870 --> 00:42:49,490 And I think there is an important point there for us to think about. This is not all happy stuff. 424 00:42:50,420 --> 00:42:57,140 Reviews are abused, and now we can see bias in reviews. We can see the prejudice that James Lind was trying to root out. 425 00:42:57,500 --> 00:43:00,650 We can see that in some reviews.
We can see people doing reviews, 426 00:43:00,830 --> 00:43:06,650 doing things that are not systematic reviews and labelling them as systematic reviews, because the label may now mean something. 427 00:43:07,460 --> 00:43:13,730 But then, are we going to have a problem a few years from now where people say, reviews? I don't know what they're showing. 428 00:43:13,730 --> 00:43:21,830 There's just too many of them. What I'd rather have is to go back to the days of the very experienced practitioner: 429 00:43:22,520 --> 00:43:25,730 you've been treating this for a long time, what do you think I should do? 430 00:43:25,880 --> 00:43:36,560 Because I'm overwhelmed. Maybe I can talk to my GP, but can I read 15 reviews of the same thing, even though I am an experienced reviewer? 431 00:43:37,010 --> 00:43:40,400 What are we expecting people to do as these reviews proliferate? 432 00:43:41,240 --> 00:43:47,959 So, just to conclude. Sometimes this talk is in one of the modules on the evidence-based health care programme, 433 00:43:47,960 --> 00:43:55,100 the history and philosophy module that Jeremy Howick looked after, and it is to make them think: if you were going to study this history, 434 00:43:55,220 --> 00:43:59,570 how would you do it, and how would you do it systematically? So, yes: 435 00:43:59,690 --> 00:44:06,380 this is a non-systematic history of systematic reviews, because I haven't searched everywhere they might be. 436 00:44:06,740 --> 00:44:11,210 I've searched conveniently: the BMJ, fantastic that they digitised it, 437 00:44:11,720 --> 00:44:17,360 so I can look back at them. But I bet there'll be examples in other places that use the terms. 438 00:44:18,620 --> 00:44:25,669 But what I'd love to see at some point is someone asking: why did they explode the way they did?
439 00:44:25,670 --> 00:44:32,960 And that's probably going to be the anthropologists or sociologists who are going to look at it and say, Why have they taken off? 440 00:44:34,710 --> 00:44:35,850 So thank you.