Okay. Research empowered. Have you heard the term research impact? Yeah. Who hasn't? You've never heard the term? Thank you for being so honest; that's really good. And how many of you are from the UK? Most of you. And then what have we got? The States? Yeah. Anywhere else? Canada, okay. Africa. "Africa" could be a bit more specific. Right, which part of Africa? Kenya, okay. I don't know what's happening about research impact in Kenya, but certainly I'm in the middle of finishing a big systematic literature review, which I was hoping to be able to show you today; in fact it hasn't been signed off yet, so I can't, you know, all that sort of stuff. And actually the Canadian Institutes of Health Research have got a really great chunk of work on research impact, which is slightly different from the mainstream stuff in the UK, but very interesting. Certainly research impact is big in Canada, it's big in Australia, and it's big here, for reasons I'm going to tell you about.

So, what are my qualifications for telling you about research impact? Well, like Sharon, I've been interested in knowledge translation for years, so this impact stuff I've got an ambivalent relationship with. But two things are relevant, apart from the fact that research on research is an interest of mine, and research on translation. Firstly, my previous job at Queen Mary was dean for research impact, among other things. In other words, there was a dean for teaching and a dean for research, and Queen Mary was rather forward looking and appointed a dean for research impact as well. My job was to scurry around and help people who were doing research to try and generate evidence of impact, which was quite fun, and of course to liaise with external stakeholders, if you like, to improve that impact. A lot of higher education institutions in the UK are now making such appointments, but whether they make senior academic appointments or junior administrative appointments around research impact is something that someone really ought to do some research about. You know, is impact the province of the press office, or is it, like research and teaching, another seriously important piece of activity that universities these days have got to get into?
I should say also that my academic background in critical social science means that there's a bit of me that is a bit cynical about the impact agenda, and I'm going to share that with you towards the end of this talk. But the other reason why impact has been a really big part of my life for the last two or three years is this: who's heard of the Research Excellence Framework, the REF? Probably most of the UK people. Put your hands up if you've heard of the REF. Okay, about half of you. The REF, the Research Excellence Framework: someone described it as the universities' Olympics in the UK, although Australia has got a REF and most high-income countries are developing REF equivalents. What it is: every five, six, seven years, all the universities get together, compete, and rank each other in terms of whose research comes top. Until last year you'd rank universities just in terms of publications, how many pages you had published in The Lancet or whatever, and those are the scores that go into the ranking. But in this REF they introduced an impact category: 20% of the ranking, which means 20% of the funding the government is going to give universities annually for the next six years or so, will come from research impact. Now, that was terrifying, because nobody had ever had their research impact assessed before. The way the REF was run was that people could apply for posts on a big committee called a panel, and the main panel for medicine spawned six sub-panels: one in clinical medicine, one in primary care and public health, one in psychiatry and neuroscience, et cetera. So there was a main panel and six sub-panels, all assessing lots and lots of submissions, including academic papers and impact case studies, little stories about impact. Now, the whole panel was chaired by a chap called Stephen Holgate, but the deputy chair was me. Stephen tended to take the lead on the research bits, and because of my interest in knowledge translation he said: maybe you can sort out the impact. In fact I didn't sort out the impact, but I had a lot to do with the benchmarking and quality control of the assessment of the impact case studies; how many, I don't know, several hundred, which all the universities had submitted. So that was an exhausting exercise in the assessment of research and its impact, and that's why I came to research impact.
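To make that 20% concrete: here is a minimal Python sketch of how a weighted, REF-style overall score might be combined. The 65/20/15 split matches the published REF 2014 weightings for outputs, impact and environment, but the quality profile numbers are invented for illustration; this is my sketch, not an official calculation.

```python
# Minimal sketch: combining REF-style sub-profiles into an overall score.
# The 65/20/15 weighting matches REF 2014 (outputs/impact/environment);
# the example quality profile below is invented for illustration.

WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

def overall_score(profiles: dict[str, float]) -> float:
    """Weighted average of the sub-profile scores (0-4 quality scale)."""
    return sum(WEIGHTS[part] * score for part, score in profiles.items())

# A hypothetical submission: strong papers, weaker evidence of impact.
example = {"outputs": 3.2, "impact": 2.4, "environment": 3.0}
print(overall_score(example))  # 0.65*3.2 + 0.20*2.4 + 0.15*3.0 = 3.01
```

Notice how a weak impact profile drags the whole score, and therefore the funding, down: that is the mechanism behind the panic I have just described.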
And this talk is an adapted version of one that began life as me explaining all this from the point of view of the REF. Now, I don't want to bore you too much with the REF, but it is quite a good thing to start with, because it really, you know, existed. The results of the REF are helping to pay my salary and the salaries of other academics in this room. So there's a sense in which it's quite real, even if you don't agree with it.

Normally I would define my terms at the beginning of a lecture, but the problem with impact is that it's defined differently in different usages of the term, and that is highly problematic, because it sounds very clear what impact is, but I'm going to show you that it isn't clear. Who's on Twitter? You've got to follow Lego Academics on Twitter; they're absolutely brilliant. These academics are Lego people, and they tweet; of course, they're not real. But actually, one of the problems with this policy interest in research impact is that, as Sharon said, it's all "we've got to use Twitter, we've got to use social media". Absolutely: we've got to do things that get our research out there to the people who are not sitting in the ivory towers; there's this enormous push. But there comes a point with a lot of these accounts of impact when you think: hang on, what did we do? We sent a press release to The Daily Telegraph. Is that really impact?

Okay, so why all the fuss about impact? I've told you about the UK Research Excellence Framework, which happened over the last couple of years; it released its results the week before Christmas, I think, so some vice-chancellors had a lousy Christmas. The UK research councils, the Medical Research Council, the Economic and Social Research Council, the Arts and Humanities Research Council: for all grant applications that you put in to those, you now have to write a Pathways to Impact section. As well as saying "this is the research I'm going to do", you have to submit a little statement saying "this is how I'm going to achieve impact from my research". And I know, because I sit on some of the panels that give the money: you may have a brilliant idea for your research, but if you haven't got a good Pathways to Impact submission, you won't get funded.

In Europe there's the European framework, Horizon 2020. You need to know about this, because a lot of your taxes are going into European research. Horizon 2020, a big programme of research that started being funded last year and runs through to 2020, is prioritising what they call societal impact.
We need to make society better, in whatever way, monetised or not. And again, you won't get a European grant unless you address impact. World university rankings, if you care about that: depending on the metrics, research impact outside the university sector is now contributing increasing amounts to the world rankings. But push all that aside and just look at the bottom. There's also something a bit more moral, a bit more fundamental: do we want to just be sitting here and talking, or do we want to be useful to society? Whose grave is that? Karl Marx. What he's saying is that the philosophers, which means the academics, have only interpreted the world; you can sit there and talk all day. But actually, as academics... you know, as a professor, I don't get a lot of satisfaction unless I can follow through and say: right, the stuff I've done has stopped babies dying, or whatever it is. So although some of us are quite cynical about this "demonstrate your impact" business, in point of fact there is something real in there too. In addition, in my previous job one of the things I had to do was deal with and counsel and support sad academics who didn't feel they'd got very far in life; surely you don't want to spend 20 or 30 years and then think: well, what have I done? And also, of course, your impact, because the university is judged on it, is coming up at your annual appraisal. You might be performance managed on it, and you could be getting the little letter that says you're fired, because all you've done is publish clever papers.

Okay, so now I'm going to give you some definitions. This is REF impact, and it is a bit dry, but there's not much of it; it does get more interesting. All you have to read is the yellow bits. Impact was very, very broadly defined, and some of this stuff I was involved in writing, so we all know it's tedious, hopefully. The idea is that we weren't prescriptive about impact. We said: look, for example, it could be this, or this, or this, and, like you write on a student marking scheme, "any other reasonable answer". But within the REF you were not allowed to count impact within universities; you had to have impact outside universities, because the Treasury wants to know: if we give this much to the higher education sector, what does society at large get back? So what were these impact case studies that I spent quite a lot of the last two years looking at, and quite a lot of time writing some of? I'll tell you about them.
So this is the storyline of an impact case study. There was some kind of a problem, preferably a big one. Research at this university aimed to solve the problem. The problem was solved: that means the impact was significant. And the benefits spread far and wide. And you know, from your knowledge translation work, those of you studying this module, that it is actually fairly easy to make a change here, and much more difficult to roll it out even to the next ward, let alone the hospital down the road. So the idea is that to get good marks on your impact case study, you had to have both significance and reach.

Here's an example of an impact case study that I put together with Nick Wald, who was, and still is, an epidemiologist at Queen Mary. The argument was this: before 1993, most babies born with Down's syndrome were a surprise. People didn't know they were about to have those babies, and there was no sort of intervention. What Wald's research did was produce a series of tests that made prediction of Down's syndrome during early pregnancy much more accurate. Now, you have to be careful with this one, because you don't want to say "hey, we've had so many thousand more abortions", because maybe people don't support that. So the way we framed it rhetorically was that these days, most babies with Down's syndrome are born out of choice. Either the mother decides not to have the test; or, having had the test and it comes out positive, she decides she would like to have the baby anyway, but she's had the choice; or she's chosen to have a termination. So that's how we framed it. And I also looked up which other countries used the tests that were developed at what was then my university; of course, if they're used in China, that means enormous reach, because there are so many million people in China.

So the sorts of things we were looking at with these impact case studies were significance, reach, and attribution; in other words, did this research cause this impact? There's a lot of argument in the systematic review literature about how you attribute. With this one it was easy, because he was really the only guy doing it. But with something like smoking cessation interventions you could say: well, we produced this, and ten years later fewer people were smoking. Was that because of the intervention you developed? Was it because of someone else's work? What proportion of the change can we attribute to your research? And also timescale: we had a fixed timescale, but you could have impact immediately, or it might take 20 years for the whole thing to get into policy.
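Purely as an aide-memoire for those four dimensions (significance, reach, attribution, timescale), here is a hypothetical sketch of how you might record them for a case study. The field names, types and example values are mine, invented for illustration; they are not the REF's actual schema or scores.

```python
from dataclasses import dataclass

@dataclass
class ImpactCaseStudy:
    """Illustrative record of the four dimensions discussed above.

    The REF scored case studies on significance and reach; attribution
    and time lag were things assessors had to weigh up. Field names and
    values here are invented, not the REF's schema.
    """
    title: str
    significance: int    # how much difference the impact made (say 1-4)
    reach: int           # how widely the benefit spread (say 1-4)
    attribution: str     # the argument that THIS research caused it
    years_to_impact: int # lag from research to claimed impact

downs_screening = ImpactCaseStudy(
    title="Antenatal screening for Down's syndrome",
    significance=4,
    reach=4,  # the tests were used internationally, including China
    attribution="Wald's group developed the tests; few rival sources",
    years_to_impact=10,  # invented figure, for illustration only
)
```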
Okay, so that's the impact case study. Oh, the second aspect of the submission was what they called the impact template, which is a stupid name, but it was just your future plans: what is your university doing to improve impact? In other words, what kind of knowledge translation infrastructure have you got that will improve your research impact over the next five to ten years? For example, appointing a dean for impact, one would hope, would improve the chance that everybody else in the university would have it on their agenda; putting decent amounts of money into communications; external linkages with industry and with policymakers; patient and public involvement; that kind of thing. The trouble with these (I analysed lots of them, actually 30 or 40 of these impact templates, for something else I'm doing) is that everybody managed to score pretty much top marks, because it was obviously the sort of thing you had to put in place, and so before they wrote the impact template the university could say: well, we know we've got this, we've got staff training, we've got blah, blah. So they weren't very discriminating.

And this is a straw poll of the 23 impact case studies that I was involved in. Again, this might be quite interesting for those of you who are thinking about knowledge translation: what kinds of impacts does research have? These are just the 23 that went in from the medical school where I was working until the end of December. You can see the three big things are changing morbidity, getting into a guideline, and getting into policy. On morbidity: I was very reluctant to let anything through that hadn't actually at least changed people's level of sickness, if not more; actually, very few changed mortality. I've now analysed about 160 of these for a paper that I'm writing, and "guideline" comes out enormously ahead of everything else. So of the things that were submitted, certainly in health services research, primary care and public health, the commonest impact people were claiming was "we got into a guideline". I think that's a bit of an artefact of wanting something measurable, something crunchy. Very few people said "we changed the paradigm" or "we changed public attitudes", because that's much harder to measure, whereas: we changed a guideline, and here is our bit on page 63.
So those are the kinds of impacts. As for the mechanisms of impact in my little sample of 23 (and they're actually very similar across the larger sample of impact case studies), the commonest, and this is quite key and hopefully will resonate with the stuff that Sharon and her team have been teaching you, is clinician pull: in other words, research that was itself informed by experiences at the clinical front line. I'll give you an example; it's all in the public domain now. There was a guy at Queen Mary, a consultant cardiologist, who 20 years ago set up rapid access chest pain clinics, initially locally in his own hospital. They seemed to work, and the GPs liked them: someone comes in with chest pain, right, there's a clinic tomorrow you can send them to. He then rolled the model out, and (I can't remember the study design) he did some research on it, bringing his fellow cardiologists in other hospitals on board. And guess what: a huge impact on the diagnosis of things like unstable angina, and an impact on mortality. Now, you could say that the research was never going to make it into the New England Journal of Medicine, because it used pretty basic techniques and relatively small samples, but the impact was huge, and now almost every hospital in this country still has a rapid access chest pain clinic. So that's an example of clinician pull: the person doing the research was themselves the clinician, and did the research in order to improve his own clinical practice.

Industry pull is very obvious: it's contract research for industry. We had a dental school, and guess what, industry (I think it was Boots) paid one of our chaps to develop a special tooth-whitening toothpaste. I've still got a tube of it on my desk, actually, because he brought it along. It's selling like hot cakes and has made a lot of money for whoever paid for it.

Policy pull is contract research from, say, the Department of Health. Again, a very good link: if the policymaker pays for the research and you get the answer they want, it's highly likely that it will then get into policy. Now, look at this example, just this bottom line: there were no examples of studies funded by the Medical Research Council, or the NIH, or indeed any other major funder, that had impact by the researchers doing the research and then going and knocking on the door of the policymakers saying "would you like these findings?". Because that doesn't happen.
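As a toy illustration of the kind of straw poll I've just described, here is a minimal sketch of tallying the claimed mechanism for each case study. It uses the five categories from this part of the talk (the last two, co-production and curiosity-driven work, are described just below); the counts themselves are invented, not the real figures from my 23 submissions.

```python
from collections import Counter

# Hypothetical main mechanism claimed by each case study. The category
# labels come from the talk; the counts are invented for illustration.
mechanisms = [
    "clinician pull", "clinician pull", "clinician pull",
    "policy pull", "policy pull",
    "industry pull",
    "co-production",
    "curiosity driven",
]

for mechanism, n in Counter(mechanisms).most_common():
    print(f"{mechanism}: {n}")
# Clinician pull comes out commonest, mirroring the pattern described.
```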
Co-production is things like the CLAHRCs: multi-stakeholder collaborations where you've got clinicians, researchers, possibly industry, possibly the third sector, local charities and faith groups, whatever, getting together and often pulling money in from different sources. I think we're going to see a lot more of that. And then there was one piece of curiosity-driven work: a woman who did a fantastic study, just followed her own interest, put it up on the internet as a resource, and loads of people are using it. So there is still space for curiosity-driven research, but it's not very common. Much more common is work driven from the clinical front line.

Now, I've said the REF didn't count academic impact, in other words impact within universities. But Research Councils UK does count academic impact as impact. So this is a terminology thing: of course the REF assessed academic work, it just didn't call it impact. You can see here, this is MRC, ESRC: global economic performance, the effectiveness of public services and policy. The emphasis is on economic bang per buck, because the Treasury needs to make sure that higher education is helping balance the books.

You don't need to read all of that. Has anyone seen this before? This is the Pathways to Impact slide, and every time I looked at it over a couple of years I thought: oh, this is just so awful, the worst kind of spider diagram. I haven't shown you the grim bit yet, but this is a beautiful drawing from Research Councils UK. This is what you've got to use to inform your Pathways to Impact submission, and the green bit, economic and societal impacts, is the impacts outside the university sector. You can do anything you like, including enhancing cultural enrichment; it's very broad. Just about anything counts as impact if you can write a plausible enough narrative. And it's the same in Horizon 2020. Ex post impact is after you've done the research, but what they're really interested in is all these ex ante impacts: when you apply for the money, what have you got in place that makes it look like you are likely to have impact from your research later? The sorts of things are: the track record of the researchers; have you got well-constructed dissemination plans; is your project embedded in existing stakeholder networks (this is absolutely key, and I hope you've already learned this); and the involvement of policymakers, which is kind of the same thing. What have I left off, knowledge-to-action people? What have they left off? Understanding of the local context. Yeah. Actually, yes.
So, engagement with the local context? Yeah, yeah, totally agree. This one was from the Canadian Health Services Research Foundation, which I do some refereeing for: all relevant decision makers are part of the research team, as investigators or with a significant advisory role. This is to try and shift you, if there's anyone in the room who still feels that you do the research and then, as you're just finishing the research, you start your dissemination. It doesn't work like that; it's got to be the other way round. Before you apply for the money, build your networks, build your relationships, get people materials, and get people hungry for what you're doing. You will also find that if you involve people at the beginning, they will give you feedback that will change your research questions in a positive direction, making your findings a lot more applicable.

Right, put your hand in the air if you like that picture. No? I don't like it either. I can say this with great confidence, because it's coming out in a systematic review: this is the most widely used framework for assessing research impact. It informed the design of the impact case study. I'll make it easier for you, because I don't know why they wrote it in such small print; it's just bad design. All it's saying is: first of all, before you do research, you need to prioritise what the most important topics to research are. Those of you who were at, I think, the last lecture I gave: we had a case where there was a whole load of drugs the patients were on, and someone put their hand up and said, well, what about non-drug treatments? That's actually quite a good example of shifting the frame: hang on a minute, are we prioritising the right kind of research? Then you do your research, then you publish it, at some point you implement the findings, and then you get benefits. But there are these two rings, which took me a long time to draw in PowerPoint, and you can see they're really key. The first red ring is the interface with the funders of research, the people who commission research in this country: how much dialogue are you having with the commissioners of research, or the Department of Health, whatever? And the second interface is the link between you, the person doing the research, and those people out there who are not researchers but who might want to implement your findings. These two interfaces with people outside the academic world are absolutely key. The rest of this ghastly diagram is just feedback loops.
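If it helps to see the skeleton without the spider diagram, here is a minimal sketch of the staged model I've just described (it's the payback framework, as I'll explain in a moment): five stages in sequence, plus the two "red ring" interfaces where researchers talk to non-researchers. The structure and naming are mine, purely for illustration.

```python
# The five stages of the staged model described above, in order.
STAGES = [
    "topic prioritisation",
    "research",
    "publication",
    "implementation",
    "benefits",
]

# The two key interfaces sit between a stage and a group of
# non-researchers; these labels paraphrase the talk.
INTERFACES = {
    "topic prioritisation": "funders / commissioners of research",
    "implementation": "practitioners, policymakers and patients",
}

for stage in STAGES:
    extra = f"  <-> {INTERFACES[stage]}" if stage in INTERFACES else ""
    print(stage + extra)
```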
Of course, once you're implementing, you also want to feed back and influence the setting of the next set of priorities, so the idea is that this is a very dynamic model. In fact, I don't like it nearly as much as Buxton and Hanney like it, but, you know, whatever. And of course these phrases, significance, reach, attribution, timescale, actually came from the payback framework. It's a good framework; it's not a bad framework. It's just that I think it's got limitations, because I think it's too linear, and it's a bit dry. Actually, Steve Hanney is a good friend, and we're doing a systematic review together along with some other people. What the payback framework says is that there are loads of different categories of impact, the same kind of thing: contributing to the knowledge base; inspiring future research, including training new researchers; developing policy and products; improving health; and broader economic benefits. You know, the welfare bill: if we treat low back pain properly, people will have fewer sick days, therefore blah, blah, blah, that kind of thing. Okay, so that's payback.

Has anyone ever tried to fill this in? Researchfish. Okay. If you hold a grant, or if you've ever held a grant as principal investigator, you get an email once a year for about ten years after you've held that grant, and you have to fill in, you can see, engagement activities, influence on policy; it's based on the payback framework. The idea is they just send the principal investigator an email: have you given any more talks, have you changed any more guidelines, have any of your staff gone on to get a job somewhere else? It's a complete nightmare, because I get 24 of these every year and then I have to work out: is this lecture linked to any previous grant that I've held? It drives me nuts. So you tend not to fill it in properly, and then the question is garbage in, garbage out. But sadly, this is what is going to be used to produce big data on the impact of research produced in UK higher education.

Oh, let's go away from that. Let's come back to what it was all like before it became policy. Who's that? I'll buy you a drink if you can tell me. Paul Ehrlich, doing his microbiology research. The idea is the evidence pipeline. The assumed evidence pipeline is that you have laboratory research, clever chaps like this doing their research and making a discovery.
And then you have the public health people and, you know, the clinicians, GPs like me, turning those discoveries into applied health research, and then this sort of downstream applied implementation work. And anybody who sits on any committees in a university knows that the upstream stuff is much more prestigious: these days it's genomics and lab stuff, wet science; whereas this downstream stuff some people don't even count as research. But the idea is that there's a pipeline.

You guys know this: getting the message to the practitioner and the community organisation, and, as Sharon says, you've got to get to know your local context. Does that resonate with the sorts of things you're being told? I know you're learning it in a lot more detail, but that's, broadly speaking, the science of knowledge translation.

There's a problem with it, though, and this is why I'm worried about the payback framework, Researchfish, the REF and all the rest of it. Abraham Flexner. You read that: science, like the Mississippi, begins in a tiny rivulet in a distant forest; gradually other streams swell its volume, and the roaring river that bursts the dikes is formed from countless sources. (That's not the Mississippi, by the way, in case any Americans want to comment.) In other words, it simply is not the case in science that my little piece of research here goes dum-dum-dum down the line and has impact at the end. Science is messier, it's more complex, and the stuff that I do is influenced by the stuff that other people have done, et cetera. Now, nearly everyone would agree with that; the question is to what extent it invalidates these linear models of research into practice. My view, I guess because I work in the social sciences, is that this really matters.

Okay, here's why. I'm going to talk you through this. Start off in the top left-hand corner. You apply for a grant to do a pilot study, just on a few patients, blah, blah, blah. And you say: if the pilot is successful, we're going to do a much bigger study; you anticipate a study two. But somehow that doesn't happen, and so the impact that you put in your Pathways to Impact application isn't ever going to happen, because study two didn't happen. What happened instead was: you thought about it, you went to a conference, you met another team, you exchanged ideas, you produced a new collaboration, you did all sorts of different things, and then finally you had some impact down here. And so when you write your account of the impact, of course it's linear; of course you can trace it back. But you then leave out all the meanderings.
And so you get this account which appears a lot more linear than it actually was, which is why I think the impact case study can only be written retrospectively, and why, once written, it makes the impact seem linear. And that's why, when we wrote up the report on the REF, we said it was a fantastic success: because we'd got these lovely stories. I firmly believe that patient and clinician input right at the beginning of study one would actually reduce wasted effort in the research process. That is why knowing your local context, engaging local clinicians and engaging patient organisations is so important: they will stop this kind of thing, or, if not stop it, at least reduce it. And there's quite a lot of evidence that that is the case.

Remember that "basic, applied, development, technology" pipeline? It just doesn't happen like that. These diagrams are ubiquitous: if you put "research impact pipeline" or something similar into Google Images, you get loads of them. And it doesn't happen.

I'm much more interested in this. Who knows about Mode 2 knowledge production? You must know about it, surely, Sharon. Is it on the syllabus for this module, by this name? Okay, good. Helga Nowotny went on to be president of the European Research Council, a very important woman, but this was her stuff from years ago; she's a social scientist, and she studies scientists. Michael Gibbons: these are all what you'd call critical social scientists. And what they said was: forget the study of knowledge translation, it's a hiding to nothing; it just isn't going to work. Let's study knowledge production instead. So stop studying "here is this thing called a set of research findings", and study how knowledge gets produced. Mode 1 is the traditional university-based research that then goes down the pipeline. Mode 2 is much more interesting: it's collaborative; it's transdisciplinary, not just because it links medicine, social science and everything else, but because it links sectors (clinicians with researchers, with policymakers, with industry); and it's often quite pragmatic. So Mode 2 is quite a good model. What they say is that Mode 2 is actually how a lot of knowledge gets produced in this society, and a lot of professors will come along and say: well, Mode 2 is obviously some lesser kind of knowledge, because Mode 1 is the really robust, rigorous, objective stuff. Well, that may be the case, but maybe Mode 2 is going to have more impact.
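As a crude aide-memoire for that distinction, here is a small sketch contrasting the two modes. The attribute lists simply paraphrase the description above; the structure is mine, for illustration only.

```python
# Mode 1 versus Mode 2 knowledge production, paraphrasing the talk.
KNOWLEDGE_PRODUCTION = {
    "Mode 1": [
        "traditional, university-based research",
        "discipline-bound and peer-oriented",
        "assumed to travel down a pipeline into practice",
    ],
    "Mode 2": [
        "collaborative",
        "transdisciplinary: links disciplines AND sectors",
        "links clinicians, researchers, policymakers, industry",
        "pragmatic, produced close to the context of application",
    ],
}

for mode, traits in KNOWLEDGE_PRODUCTION.items():
    print(mode)
    for trait in traits:
        print(f"  - {trait}")
```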
Now, Carol Weiss was a social scientist who studied the impact of social science on policy, and she developed a taxonomy of seven different types of research impact. The first is knowledge-driven: you go along to the policymakers and say "here's my research", and they say "what a brilliant paper, we must immediately implement it". She demonstrated that this is very rare, if it ever happens; don't even think about counting on it. The next one is policy pull, the problem-solving model: the policymakers come to you and say, we've got this problem, what are we going to do about the obesity epidemic, what can we do about alcohol, whatever it might be, and they commission you to do a piece of research. That's also incredibly rare, and I'm going to tell you a little bit in a minute about the Rothschild experiment. Much commoner is the interactive mode, where you make friends with your policymakers, locally or nationally, and have long chats with them; it's what comes out of all those breakfast meetings. Actually, whenever someone asks me to go and sit on a committee, I say yes, because that is the only way anyone is ever going to take notice of my work. You sort of muddle through to a decision together. It's not that they actually invite you to bring your research; it's much subtler than that. But the idea is that you've got to have interaction; you've got to get to know each other, to know where the other side is coming from. Then enlightenment: if you interact often enough and intensively enough with the policymakers and senior clinicians, whoever, they will start to take on board some of the arguments. It's not that you go in there and say "by the way, I've got this randomised controlled trial"; it doesn't work like that. But maybe, after you've got to know them, after three or four years, the policymakers will start to think it would be a good idea to see if there are any randomised trials. Do you see? They weren't up for that before. And then you've got the two political ones: political-instrumental is when they know what they're going to do and they just look around for some research that supports it, and political-tactical is when they commission you to do a study not because they want your findings but because they want to delay something else. And finally, research as a public good: they give you money to do research just because it's a really good thing to do. The critics say that that's increasingly rare in the current climate.
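For quick reference, here is a minimal sketch encoding those seven mechanisms. The enum names are my own shorthand for Weiss's categories, and the one-line glosses paraphrase the talk; none of this is her original wording.

```python
from enum import Enum

class WeissMechanism(Enum):
    """Carol Weiss's seven types of research impact, as described above.
    Names are my shorthand; the glosses paraphrase the talk."""
    KNOWLEDGE_DRIVEN = "findings adopted on their merits alone (very rare)"
    PROBLEM_SOLVING = "policymakers commission research to solve a problem"
    INTERACTIVE = "sustained dialogue; muddling through to a decision"
    ENLIGHTENMENT = "ideas seep into policy thinking over years"
    POLITICAL_INSTRUMENTAL = "research sought to back a decision already made"
    POLITICAL_TACTICAL = "research commissioned to delay something else"
    PUBLIC_GOOD = "research funded simply because it is worth doing"

for mechanism in WeissMechanism:
    print(f"{mechanism.name}: {mechanism.value}")
```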
I should tell you briefly about Rothschild. There's an amazing book about it, written by Kogan and Henkel more than 30 years ago, 35 years ago now. What Rothschild did (he was a very rational guy, Victor Rothschild, and he described it as thinking the unthinkable) was to propose that a quarter of the Medical Research Council budget should simply be allocated to the government, and the government would commission university academics to do the research that the government needed. You can imagine what the MRC thought. You can imagine what the scientists thought: uproar about academic freedom and all the rest of it. But surely, from a knowledge translation point of view, wouldn't this have been perfect? The government would say "these are the big problems we've got to solve", and the scientists would be contractors to the government. So they introduced it, and it went badly wrong; it's just the best book about how government-science relations went totally belly up. The two sides did interact, and (I know at least one person in this room has spent a long time in government, and she can put her hand up) they are two cultures, two different worlds. It was not easy to prioritise research topics. The research commissioning cycle happens in years; the policy cycle happens in days, weeks or months. So when the researchers finally produced the results, the policymakers were saying: well, we're not interested any more, we're on to something else. And when science did influence policy, because it did, it happened a bit more obliquely, through the interpersonal relationships that had been built, and not directly through the research that had been done. So this is a really important failed experiment.

Fast forward to last year, or the year before last. This is the new stuff, and you can sort of see where insights from the failed Rothschild experiment have influenced it. I'm going to talk you through this. What you've got here is the development of a research idea, the production of knowledge (in other words, the doing of the research), and then your implementation over here; so it's still a linear model. But it uses something called actor-network theory, so the idea is that it's really important to think about the system as a network of interlinked stakeholders, and also to think about what they're calling an actor scenario. An actor scenario is sort of like a thing someone does: I'm a GP, so I go along and I see patients.
Yeah, that's an actor scenario. Maybe they want to change that actor scenario; for example, they want the GP to, I don't know, go out and do public health or something like that. So what you've got here is the investigators, which are the researchers, and linked actors, and then you have what Jack Spaapen calls productive interactions. It's not linear; it's all non-linear. But at some point you get some knowledge products, and you evolve the actor scenario as a result of the productive interactions. It would be good for someone to make this into a dynamic little film, turn it into a cartoon. And then, of course, it extends through into actors that were never linked at the beginning. So this is the way the thinking on research impact is moving. The trouble is that it's being published in specialist social science journals, and although I think it's got a huge amount of theoretical resonance, it's opaque to most people, who are still stuck in that linear mode and actually quite like the payback framework.

Alright, so how do you maximise impact? Well, number one, make it a strategic priority, not just for you as a researcher but for the organisation that employs you: develop partnerships and networks as well as your infrastructure, and take an ex ante approach. In other words, before you start doing the research, think about the relationship building and the network building you've got to do. Secondly (sorry about my numbering, it doesn't seem to have come through well), supplement what might be called high-church research with clinically led studies from clinical academics, contract research (if you get the contract right, that's, I think, perfectly fine), co-produced research, and curiosity-driven work. So yes, have your very clever MRC-funded study, but that may actually be less likely to have impact than some of these messier forms, in which the researchers are already on the ground, down and dirty with non-researchers.

Just a brief note about the critique of the whole impact agenda. There is an argument that impact itself is a neoliberal conspiracy: that the very terms "useful" and "impactful" are ideological constructs, produced by a particular set of politically powerful people who, for example, are very keen on the term "knowledge economy". Why do we have to link health and wealth? There are two good examples of this. One is Mark Learmonth's work in the field of critical management studies.
He says that as the impact agenda has got going, more and more studies in business schools are about corporate finance, and fewer and fewer are about, for example, hidden power, which is the kind of thing he studies. So he's pretty cross about that. The other is a paper that we published in the BMJ last year, in which we had a section about how evidence-based medicine had been appropriated by drug company interests, who were now describing their studies as "evidence based". So this kind of stuff means there is a sort of insidious element to this very trendy impact agenda. And there's some lovely alternative stuff in the literature, all about the ethics and the value of higher education and research, you know, research as a public good. Someone has proposed a People's Research Council to sit alongside the Medical and Economic and Social Research Councils, asking from a much more explicitly ethical perspective: what kind of science do we want? What kind of science is morally worthwhile? If we suspend our focus on these short-term returns on investment (did you get into a guideline, did people's blood pressure change?) and commit to science in a less competitive way, maybe we'll have more impact in the future. But it'll be hard to measure, perhaps impossible, because what we're looking at there is the stuff that is very difficult to measure.

So let me summarise briefly. The definition of impact depends very much on your perspective and your terms of reference. Mostly, the reason we're all excited about research impact is the pressure, particularly from government, and on government, to show return on investment. Within the REF, impact was very broadly defined, and it was assessed using a case study approach; the results are out now, and, you know, people did well. Funders are very interested in pathways to impact, the ex ante approach. I've given you a little bit of a flavour of the theories, for example Carol Weiss's seven mechanisms of impact. I've given you some evidence from Rothschild that the assumed linear link between evidence and policy doesn't happen. I've offered you some new framings of impact based on networked models, and I've given you, very briefly, a critique of impact as a neoliberal conspiracy. That's a lot to pack into one talk, and we still have 12 minutes for you to ask questions or disagree with me.