So, a welcome again to everybody in the room, and welcome to everybody online. We've had an amazing response to this lecture, around 200 to 250 people, which is really just a marker of how digital the world is becoming, but also of how many questions there are around why digital interventions fail. So the focus of this session has really struck a chord: why digital health and care interventions fail, and, the part you're probably all waiting for, what we can do about it. I won't promise that we're about to be given very clear problems and very clear solutions, but we're in Catherine's hands today, and I'm really delighted to welcome Professor Catherine Creswell. Catherine is another social scientist, and a colleague we'd never actually met in person until today; there are a few of us in the room. Catherine brings extensive experience evaluating digitally enabled change and improvement programmes across health and care. She has numerous titles, gongs and publications, which I'm not going to read out, but if you do a quick search you can find all the detail. She's amazing, basically, and we're very lucky to have her with us today.

Catherine's research examines the use of a variety of health information technologies in context, and I think that's really key, over many, many years, often bringing together stakeholders with varying perspectives and voices, which I'm sure she'll touch on today. We've invited Catherine to speak as part of the MSc programme that we run here on Translational Health Science. My name is Sarah Shaw; I'm the module lead for the first module on that programme, which is about the social science of innovation, but also about the methods that go along with that. We're training a bunch of new social scientists, if you like. We focus, not only but partly, on how digital technologies are adopted, implemented, used, evolved, adapted, spread and scaled, and that's, I think, what we're particularly interested in today. So I'm going to shut up and let us hear more from Catherine about why digital health interventions fail. Over to you, Catherine.

Thanks for the lovely introduction and thanks for inviting me; I'm delighted to be here. I hope I'm not disappointing you with a snazzy title and a boring presentation, so I'll do my best. I hear I'm scheduled in between the reception and dinner, so it's a nice slot to have, I have to say. So: I'm from the University of Edinburgh.
I'm Professor of Digital Innovations in Health and Care. I've mainly looked at health care, because the opportunities to study care are only just coming up, really, and if there are any that you're involved in, I'm very keen to get more into that area. I'm a social scientist, as Sarah has already said, based in the medical school; I've been there for 20 years, working with medics and with other professions that are interested in digital systems and look at them from different angles.

My biggest underlying assumption as a social scientist, and I'll start with the second bullet point, is that technological systems have social consequences, always. You can't look at technology and social dimensions in isolation; we always need to look at them together. So how do technological systems have social consequences? Well, they pretty much always change the way people work, and they also change the way organisations function, and I'll get into that a little bit. But, similarly, social systems also have technological consequences. An abundance of technologies fail, as we like to say, although I don't think it's that simple to frame these things as successes and failures, because a success to somebody is a failure to somebody else, so it's slightly more nuanced than that. Social systems can also lead to workarounds, and I'm going to talk a bit about that. So if there's one thing you take away from this: technological systems have social consequences, and social systems have technological consequences.

My expertise is in formative evaluation, and I will explain what that is later on. The biggest things I've looked into so far have been the National Programme for IT, which went down in history as the biggest public sector IT failure ever, and which was very interesting to research. We looked at the national procurement of electronic health records as part of that; I did my PhD on that, a very interesting piece of work, and I'll talk about it a little later. Then I co-led the evaluation of the Global Digital Exemplar programme, which was an attempt by NHS England to create a learning health ecosystem, to pass on implementation knowledge from hospital to hospital. Actually, that was quite successful, but the lessons weren't learned.
And again, that wasn't due to the technology; it was due to social aspects around the technology. I can point you to papers where we look at that, but the social aspects around the technology are pretty much always the same. At the moment I'm leading the independent evaluation of the NHS AI Lab, which I can't talk about, unfortunately, because it's only halfway through, but our results will come out in April next year. Again, it's a national programme, and there are very important lessons that we need to be learning.

Okay. So, just for context, and you know all this, we're all in health care: we are getting fatter and older and sicker for longer, unfortunately. I've looked at some stats, and basically, 25 years from now, 1 in 4 of us will be over 60 and 1 in 2 will be overweight, so it doesn't look too great. Health care expenditure is increasing, and we basically need to achieve more with less. Now, the solution, as policymakers like to tell you, which might be true but I think is often an overstated assumption, is the potential of health and care IT. In some instances that potential is actually well evidenced; in others, not so much. So there's a lot of hype around some aspects of technology, as I mentioned earlier.

The assumption is that we can deliver more systematic and higher-quality care through decision support for clinicians, who will then make better decisions, and more proactive and targeted care through data analytics. Precision medicine is one thing I'm really interested in at the moment; it exists and everybody is really excited about it, but it doesn't work in practice as yet, due to some of the issues I'm going to talk about. Better coordination of care, so integration of primary, secondary and tertiary care. Oh, I know, it's the technology; one of my points demonstrated in vivo, there you go. Coordination of care; improved access to specialist expertise, which we can now do via telehealth, so people don't have to travel anymore; greater patient engagement, as in the announcement by Wes Streeting that people will have access to the NHS app. It all sounds great.
People having access to their own records and having more involvement in their own care; improved resource management, such as using analytics to improve patient flow, let's say organisational aspects of change; and the holy grail of creating a learning health system, with data playing a huge part in that.

But we need to think about what these changes bring with them. In order to achieve them, we need to develop new competencies. I know that in England there are huge skills gaps in procuring technologies: people who procure technologies often don't know much about technology at all, so that's one thing. We also need to understand how we can best exploit the huge amounts of data that we create, through data analytics and at the clinical interface; we need to understand how we use data and what data we use. The biggest problem now is not getting data, but getting good-quality data, and one of the biggest questions is how we make data actionable: what data is actionable, and what data results in what actions?

These systems also change the way decisions are made. As I said, we're looking into precision medicine, so we are increasingly creating predictive inputs that say things about the future. It's a different level of evidence that clinicians are going to work with, so skills have to change. As clinicians, we also need to develop new ways of interacting with patients, from face to face to remote and at a distance, through telehealth, let's say. That might have adverse impacts on relationships, and it might also lead to digital exclusion. I have simplified this whole area, it's so complex, but these are just examples; I'm trying to make this digestible. I think it was Lewis Carroll who said that if you have a map at the scale of the Earth, then what's the use of having a map? So I'm completely simplifying this whole map, but it will give you some idea of how to navigate. So: new competencies, new ways of interacting with patients, and changing organisational and professional boundaries through increasingly complex healthcare information infrastructures. We're building systems upon systems, and the new systems depend on the old systems; sometimes the old systems just exist to maintain the new systems.
And professional boundaries: people in parts of the care system who weren't used to working together suddenly have to work together, because they share common pathways and common data flows.

So, the problem. This is an estimate, but the number cited is that 79% of digital health start-ups fail in the first five years; I think that was Ipsos MORI. And the biggest reason was no market need, which could have been resolved relatively easily for most of them, actually. So the problem is that developing and implementing technology in health and care is not an easy thing. If it was easy, then everybody would do it, right? Because we all want to improve safety, quality and efficiency, and it should be relatively straightforward for some people to make money off this, which is completely fine, and I think has to be acknowledged. But we need to align different stakeholder motivations and incentives, and it's not a technical problem with an easy fix; there's no recipe for success. I like this quote from Harvard Business Review: "As a result, executives leading difficult change initiatives are often blissfully ignorant of an approaching threat until it's too late to respond." So people understand the problem, but only when it's too late.

I'm taking a health system perspective here: this is for the good of the health system. This slide should say "and/or", but what do we want? What are we trying to do? At the most simple level, we're trying to develop technology, and not just develop it but implement it and get it used as well, that improves the safety, quality and efficiency of care. If it doesn't, we should stop developing it. It seems quite simple, but a lot of technologies are developed that don't actually fulfil these basic criteria. Then we want people to use the technology. Developers focus on developing the technology and think that once it's developed, it has succeeded. No, it hasn't. There are so many technologies that exist and that seem like a great idea, again, to strategic decision makers who don't actually know very much about technology or the social aspects of technology. We need people to use it; otherwise it has failed.
And then we also want technology to embed within existing work practices and within organisational practices, because if it doesn't, there's duplication of work, and I'll give an example of that later. Ultimately, the aim is to sustain technology use over time, so that it doesn't have to be re-implemented, which costs money and creates disruption, but keeps being used, refined and optimised, and across space too: we want technologies to talk to each other in different settings, for example, because otherwise we have technological silos, multiple redundant systems and limited interoperability. I'll come back to that. So that's the basis of my argument.

These points should come in one by one, but they didn't, I don't know why, so let's start with the technology. These four factors you can apply to any technology; they map onto pretty much anything, really. But if you think through them, it will really help you to make things work. We've built a whole framework around this, and I'll just give you some examples of the dimensions, but there's a more holistic framework that I'm very happy to share.

So let's start with the technology. A technology might perform very well in the lab, like I said earlier, might be super, but then when it's implemented, people don't use it. One example was a sleep-monitoring app: it worked well in the lab, it was rolled over the fence into the real world, and it basically failed because people didn't want to record their sleep patterns, because they weren't worried about them. So that's the technology dimension, and the dimensions overlap as well, which makes it even more complicated. Adoption is another dimension: issues surrounding the usability and usefulness of a technology always increase the risk of abandonment. If something isn't useful to the person using it, then why would they use it? One of the biggest examples: in 2011, I think, IBM Watson beat some humans at the game show Jeopardy!, and then, on a huge high, they had the great idea to put it into oncology settings and develop an oncology system.
Well, people didn't trust the system, and it was of no benefit to them, so it was abandoned, unfortunately. Then there are organisational issues: organisations might have different priorities, which our National Programme experience illustrates; there's support for this in our paper here. The government basically procured electronic health record systems for NHS hospitals, and even though hospitals got them for free, they didn't want them. They didn't want an American system that was based on billing, because it didn't fit with them, and sometimes it's just the process of feeling empowered to make the decision, which didn't happen here, and then the whole programme fell down. And what we're seeing increasingly with AI technology, and I have some examples, is that it often doesn't align with existing business models. So organisations are important stakeholders. And lastly, and we never think about this enough, there's the health system dimension. A technology being a national strategic priority, flavour of the month, is good in a way, because it can help adoption: leaders get interested, and an interested leader will push the organisation towards it. Whether it's in line with existing health system structures and cultures is really important. And in some instances systems are too complex: they need to sit on an existing health information infrastructure, and if that doesn't exist, then advanced systems don't actually bring major benefits. An example here is Babylon. It was again hailed as a system that could triage patients in primary care; it was implemented in London, it didn't fit with existing payment structures, basically, and it went bankrupt. They should have thought about that earlier. Basically, I'm trying to tell you to think about these things before you develop a technology. Unfortunately, all these things are interrelated and it becomes really complex; I acknowledge that.

So what can we do about it? There are two types of evaluation. One is summative: you basically cook a meal and you taste it afterwards, and you say, well, this tastes great, or it's not great.
That's basically what summative evaluation is: assessing whether something has worked, usually through quantitative means, such as impact on the quality and safety of care, or economic impact. And that's important; we need to know whether something works, especially for making business cases and investment decisions. But what I focus on, and ideally you do both, is formative evaluation. That basically means you taste the dish early on and you add some salt, or some pepper, and make it taste good: you work on it along the way. It seeks to answer the question: how can we make this work? I think the industry term might be something like a pre-mortem: imagining that something has gone wrong and then mitigating against that with certain methods. Those methods are designed to understand processes: why did something work or not work, how can we learn from it, and how can we apply that learning to other settings? I usually draw on social theory. I mentioned health information infrastructures earlier, and I like to look into the science and technology studies literature, because I find it quite intuitive, so I highly recommend you look at it. And we use qualitative methods: we actually talk to people. That's pretty much common sense, but when you're excited about a technology, you forget that other people might have different views, and aligning those different views becomes secondary to developing the system you always dreamt about.

So formative evaluation is important to account for this complexity. These things are not simple: we have different stakeholders, as I've mentioned; we have evolving interventions, because technologies change, needs change, and the technologies that technologies depend on change; and we have different contexts, so a system might work really well in one context and completely fail in another. Those are the underlying reasons for this complexity. But another really important thing that I encounter again and again is that we need to identify risks and potential unintended consequences
early. And unintended consequences are not only negative. The introduction of new threats, we know, is a risk, but unintended consequences can also be positive: you can look at them as a source of innovation, turn it on its head. So they don't necessarily just have to be negative. And the key is that we want to learn. We don't want to repeat the same mistakes, and we do, a lot of the time; that's why we talk about the same things again and again across different implementations, and it's quite frustrating, because we don't learn. So we're trying our best to change that. We want future implementations to benefit: if one context has learned something, another organisation doesn't have to go through all those bad experiences, and patients often suffer from those bad experiences. And it also works across technologies. I've written papers on AI; yes, AI is slightly different, but it's also a technology, and the same dimensions apply. We need to look backwards at the evidence when we have a new system, because there are lessons that apply across technologies.

So this is my... is that clock right? Yes? Oh, okay, I can slow down, that's good, excellent. So this is my framework, and I will take you through it; this is where I think we need to look at what we can do about avoiding failure. Technological development is the first part: we need to actively involve different stakeholders in system design and in the development of the overall system, to ensure that the technology meets the needs of adopters. Then we need to do implementation research, to ensure that the technology embeds within the context of use. Then we need to move to optimisation: once the technology is in, we want to optimise it to the context of use, and we can do that again through adjusting implementation strategies, let's say training. Training is the one managers always mention, but if the software is unusable, you can't solve that with training. So it's a kind of continuous optimisation over time, and it will never end.
And then we need to think about how we can scale and sustain technology: how we can transfer it across contexts, i.e. to different groups of adopters. That is something about pathways as well: how information flows from one organisation to another, and how people contribute to and draw on that information across different settings, but also across different organisations and different health systems. Like I said, some dimensions are quite similar and relevant across these. And also sustainability over time: we don't want to re-implement constantly, because that's a waste of money once we've optimised the system to work well, within the constraints of the system, of course. So it's about sustainability over time and making systems work.

If we start with the first one, technology development, the risk here is that the technology is not needed and not used, and that's the most basic one. So before you do anything, you need to figure out how a technology could meet the needs of a lot of its users. One study we've done is around artificial-intelligence-based decision support. There are many, many early-stage AI innovations that you will be aware of, with huge visions, huge expectations and huge enthusiasm around them. This particular system didn't actually exist when we researched it, which is excellent; that doesn't happen very often. Usually people have an idea and you get called in too late, when things already exist and can't be changed anymore. With this one, the vision was of a system to transform care by predicting atrial fibrillation from ECGs. We did a qualitative study, speaking to people: we showed them the system and asked, this is what it does, what do you think about it? And you've got to talk to the junior people a lot. We also need to speak to the senior people, but the junior people are really important because they often take on a lot of the work, so they're the ones to really focus on: the powerless, but important. Consultants don't do data input; they transfer that to their junior doctors, so junior doctors are the ones I would want to speak to first,
always. So we asked them: what would you do with the system? How would it change the way you work? And they said, oh yeah, AI is great. And then we asked, but how would it change what you do? And the answer was: it wouldn't. So we thought, okay, then we need to rethink the system if it doesn't do anything, and it went back to the drawing board. And then we left the field, which is another problem if you do formative evaluation, so I don't actually know what's happening; I think it's being rethought at the moment, but at least the developers hadn't yet invested a lot of effort. So that was an easy lesson to learn, right at the beginning.

Something that needs to be done early on, to understand these needs and ideally align them, is what we call co-creation. You actively involve people in system design, and potentially also in implementation strategies, and you ensure that their requirements are met. That's not always easy; as you see in the last point, requirements will conflict. Sometimes it's a least-worst option and a discussion to have, but involving people actively, and being able to justify why certain design decisions have been made, is really, really important. You then use this information to make certain changes, and, like I say, always think about system design and implementation strategy, the two together. The problem during that process, like I said, is aligning different requirements. For example, we see a lot that organisations introduce systems that are good for the organisation but create more work for the clinician. That's a problem if there's no immediate benefit to the person putting in the work, so that circle needs to be squared somehow. This stage is also very well suited to rethinking processes. It's that Henry Ford statement: if I had asked people what they wanted, they would have told me faster horses. So you don't want to present people with a final solution, because you don't want to impose your vision of a system onto them and stifle their creativity, but you also don't want to just build whatever they ask for. It's a balance.
And, of course, we are dependent on developers, especially with the big systems, what we call megasystems. It's always a problem: change requests land in massive queues, well, you know this better than me, and some of them are never actioned. If a lot of people make a big fuss, there might be action, but that's a problem, especially later on, when the system is more established.

So co-creation is one way we can avoid failure. The other is implementation research, and again it overlaps with the other stages. We want technology to embed within contexts of use, because if it doesn't, we duplicate work, and we see that again and again. One of the studies we did here was with electronic health records; that was the national programme, actually, and that was my PhD. Clinicians were doing more data entry and had more work, which we see commonly with electronic health records. And that's okay, I think, as long as the system isn't sold on the basis that it's going to change your life and make everything easier, which is what organisational strategic decision makers tend to do. I think we just need to be upfront with people and say: look, this is going to be worth it, it's going to improve patient safety, but you're likely to have some more work in some areas. And actually, I find that more work applies across all complex systems: there's always more work for somebody somewhere; you just have to find that person when you do the social science.

We also see a lot, and I saw a lot, that people develop workarounds. If a system doesn't fit in with what they do, with their workflows, they will create the most amazing ways to get around fitting into the system. And I think it's really, really important to understand some of the creative ways people do that, because they can really help you streamline processes. We saw people taking notes on paper and then putting them into the system at the end of the day, and that has implications for data quality, because people who were accessing the electronic system weren't working with the most up-to-date information.
That was electronic health records. The second study, done very recently, looked at an AI-based decision support system in radiology, and the radiologists loved it. They said it makes my life easier; there were clear drivers for the system, because we don't have enough radiologists. So they loved it. But then we spoke to the organisations, who had got the system for free as part of the trial we were evaluating, and they said: well, I don't know how to justify buying this system going forward. I have different priorities, and it addresses one little thing, one little aspect of detecting cancer. And what about increased referrals? It might create work somewhere further down the line; do I even want to know about this? So there were challenges in establishing a business model for this very specific system we were looking at.

So, implications for research, which is the "what can we do about it" part. I always say some evaluation is better than none: even if you just think about these things without doing the research, that already helps. Ideally, we need to do implementation research; this is where you need somebody like me coming in and helping you to make sure that systems embed within the context of use. That involves exploring implementation and adoption processes, which we do qualitatively, through observations and interviews, but also documentary analysis. Ultimately, we need to understand how these systems work and how they transform and change the way people operate; I mentioned that earlier. What always helps me, and again this is something to take away, and it's far from a full realist evaluation, is the question: what works, for whom, and under what circumstances? Always ask yourself that. There will always be different views; there's not one single answer, and those different views of what works for whom and under what circumstances help you have the right conversations.

Now, these two, in fact three, things, optimisation, scaling and sustainability, are very, very difficult. Co-creating something,
asking users what they want, making changes to system design, then putting a system into processes and planning how it changes organisations, is relatively easy when you compare it to these next stages. But these stages only become important, and that's probably why we don't know much about them, when a system is actually used in one context and has been shown to embed relatively successfully. They are also often only considered when things go wrong. Then people call in the social scientists and ask how we can make people use the system, but then it's like that Harvard Business Review quote: you're too late, because people are probably disillusioned already. So I'm basically telling you that you need to think about these things as early on as possible. And because people don't focus on all these stages, there's also relatively little research on them, because we just don't get funded to study these things; most of the empirical research focuses on early development and implementation. This quote is just to show that a simple solution is appealing, but it's not always the right thing to do; it can make things very simplistic.

But we have looked at ways to address that problem, and I mentioned optimisation earlier. The risk here is that the technology is not sufficiently exploited and can't keep up with changing technologies and changing needs. So we need to take that iterative refinement of systems and strategies forward across time; it's never-ending, and it becomes continuous improvement. I've stolen that from my partner in crime, Robin Williams, from Science and Technology Studies. We've done a study on this in electronic prescribing in hospitals, and we're now doing some work adjusting that to look at optimisation in electronic health records. Optimisation is system refinement based on different user needs, and the problem is that some of those needs only emerge in use. You can, and it's important to, plan for optimisation in advance; people don't believe it, but they put the system in and only then say, oh, we could use this or that data. If you have some kind of strategy, it really helps.
So there is optimisation activity that needs to be done at the pre-implementation stage, but other activity can only be done once the system is being used, and it also changes over time. It's a never-ending journey, just like digital maturity, which I have another lecture about; it's not something that we can ever fully achieve, and that's okay, we just need to acknowledge it. In the e-prescribing work we found that organisations employed different strategies to optimise: resolving misalignments between the technology and clinical practices; enhancing the systems by putting in different features; streamlining some of the activities and pathways; and skills, basically training, and also how people could best use the data coming out of the system. We were in the field for four to five years, five years actually, although the first year always goes on ethical approval, which takes a year; unfortunately that's the difficulty with this kind of research, so we need different methods here. The optimisation continued, and then we stopped, because we weren't funded to study it anymore. That was annoying, because figuring out the processes around optimisation is really important. So now I'm hopeful that this new project will shed some light, and it has already, actually; watch out for the paper that's coming on what the optimisation activities are and which optimisation activities lead to perceived improvements.

Optimisation also comes back to how we study systems, and actually the idea goes beyond optimisation as well. Ideally we want to look at systems before implementation; that "before" dimension is crucial, because we want good baselines. Otherwise, if we just focus on the implementation, we don't know what difference the system has made, and that is a huge problem in digital health: we just don't have proper baselines, quantitative baselines and also qualitative baselines. By qualitative baselines I mean understanding what the existing processes are: process maps, for example, or flow maps.
But any kind of idea of what is going on beforehand is really, really important, to plan the changes and then to figure out whether the system has had the intended impact. It can help you plan, as I said, and it can also help you tailor the system. So the period before implementation is super important and always neglected; you can't go back once you've implemented. Then during implementation, and most of my work has been in electronic health records, so complex systems where the whole organisation is in huge upheaval trying to make the thing work, that's an important point to look at how people make these things work, what tensions come up and how they are resolved; that's a huge source of learning. That's also when anticipated and unanticipated consequences come up, and we can look at those so they can be addressed further down the line. And also workarounds: people tend to keep workarounds secret, but the top tip is, please get people to share their workarounds. They're really important for the organisation to know about, because they can really mess up anything that comes out on the other side of the system. So: before implementation, during implementation, and after implementation, when people in the organisation feel that they can't do without the system anymore, or when it has been abandoned, in which case you don't need to go back, but ideally when they've established new routines and the system has become embedded, as we call it.

So that's the optimal design. In terms of timeframe, "after implementation" would usually be, we always say, between 6 and 12 months after implementation; that's this project. But if we want to study optimisation activities, it needs to be much longer, and actually it probably shouldn't ever stop. So some idea of how things are developing over time, the longitudinal dimension, is really important.

Now, scalability and sustainability. This is a difficult one, because I make this claim and I've been thinking about it a lot, actually, walking the dog. The risk is that we create technology silos, especially now that we are integrating care across increasingly large locales and different organisations.
We want a degree of interoperability. Now, we don't want completely seamless interoperability, because that makes it quite difficult for organisations to have systems suited to their individual needs and circumstances, but I would argue that if you want a sustainable and scalable system, you need a degree of interoperability. The problem is that these strategic dimensions around scaling and sustainability, which are in the interests of the healthcare system and of strategic decision makers, aren't really studied unless you do a big national programme evaluation, which we have done, so we have some insights into that. But I think policymakers are currently still trying to figure out how we scale and sustain systems, and because there's no research it's quite difficult; these things have to be tried out in the real world. Again, that's also part of the AI Lab work we're doing, because they have tried, or started to try, to scale these systems across the whole NHS.

One example from the past is the National Programme for IT, where we had a top-down, politically driven programme. That's important, because you can align stakeholders, and we can do that in the NHS; that's one of the benefits here. You have one strategic vision, and that was good because it helped to secure leadership and support. But the problem is that you limit local involvement in decision making, and you risk disengagement of stakeholders and eventually abandonment, which is actually what happened. This is from Lorenzo, which is the system I studied; it again sounded like a great idea, but they're still working on decommissioning it in the NHS now, 15 years on or something like that.

So there's this tension between nationally and locally led initiatives. You can have a perfectly working local system that suits the local organisation, but if you can't share information with other systems, that compromises large-scale interoperability. So there has been a push here in the NHS to implement megasystems, the big Cerner and Epic systems, because if you have one instance of Cerner, which they have in Queensland, I think across 16 different hospitals, information can flow completely seamlessly. So that's something that strategic decision makers want.
474 00:44:10,980 --> 00:44:15,780 But organisations sometimes find it quite difficult, because it doesn't align with their local needs. 475 00:44:16,050 --> 00:44:23,310 So somehow, strategically, what we need to do is align organisational drivers with national strategy. 476 00:44:23,370 --> 00:44:27,559 And there will always be some kind of trade-off. That's just how it is. 477 00:44:27,560 --> 00:44:35,290 And there's a risk of what I mentioned earlier: just as there is a tension between the clinician doing the data entry and the organisation reaping the benefit, 478 00:44:35,300 --> 00:44:44,420 the same thing applies between organisations and national health system benefits. 479 00:44:44,420 --> 00:44:52,249 Organisations do the work; the health system reaps the benefits. So we have this Federated Data Platform being implemented at the moment, 480 00:44:52,250 --> 00:44:56,809 which you might have heard about, and I think that's really showing that tension: 481 00:44:56,810 --> 00:45:02,000 organisations find it quite difficult to sign up to, because they can't see the immediate benefits for them, 482 00:45:02,000 --> 00:45:06,050 but they somehow need to be aligned for these systems to work. 483 00:45:06,860 --> 00:45:10,910 So, studying scaling. What can we do? 484 00:45:11,080 --> 00:45:14,870 What have we done and what has worked? We do case studies. 485 00:45:14,870 --> 00:45:21,920 And that means looking at one organisation implementing a system and 486 00:45:21,950 --> 00:45:27,230 comparing it to other organisations implementing the same system. 487 00:45:27,410 --> 00:45:33,230 So that. Yeah. All right. This is from then over our quarter two. 488 00:45:33,240 --> 00:45:37,260 We start up here. Perfect. Okay, good. You know, technology was a good one. 489 00:45:39,690 --> 00:45:43,259 So, yeah. I'll be finished in two slides. 490 00:45:43,260 --> 00:45:48,450 We need to study different contexts, 491 00:45:48,450 --> 00:45:52,890 and we can do that by studying different organisations; we usually conceptualise these as case studies. 492 00:45:53,340 --> 00:45:59,190 And I think it's crucially important that we study both the macro environment, 493 00:45:59,190 --> 00:46:03,959 and by that I mean the health system dimensions, and also the micro factors, 494 00:46:03,960 --> 00:46:10,860 so organisational dimensions, but also the adoption dimensions of the individual interacting with the technology. 495 00:46:11,280 --> 00:46:14,339 And we've seen, yeah, 496 00:46:14,340 --> 00:46:17,579 that they're interrelated. The problem is that all these things are interrelated, 497 00:46:17,580 --> 00:46:20,700 and I've mentioned that a few times, and that makes it very, very complex. 498 00:46:20,700 --> 00:46:23,280 But this is where you need to look if you find any problems: 499 00:46:23,790 --> 00:46:30,029 the tensions between these dimensions are usually the things that need to be resolved for things to work. 500 00:46:30,030 --> 00:46:33,509 I hope that makes sense. And ideally we want longitudinal components. 501 00:46:33,510 --> 00:46:39,810 So we want to look at short- and medium-term, and that should say medium- and long-term, transformations that will never end. 502 00:46:40,290 --> 00:46:43,410 And I think, yeah.
503 00:46:44,040 --> 00:46:48,809 And we need to plan how systems will integrate within wider health and care delivery infrastructures. 504 00:46:48,810 --> 00:46:54,180 Pathways are really, really important, because none of these things are going to exist in isolation anymore. 505 00:46:54,720 --> 00:47:03,360 And another issue to look at is that, to some degree, an intervention needs to be generalisable to work in different settings, 506 00:47:03,720 --> 00:47:08,940 but it also needs to address local needs. 507 00:47:08,940 --> 00:47:14,030 So, I mean, of course this is what I do; I'm flying the flag for formative evaluation. 508 00:47:14,350 --> 00:47:20,219 I think it's really, really important, because it can help us to avert risks. 509 00:47:20,220 --> 00:47:28,080 But it's never done; we get called in too late. People like to celebrate successes, 510 00:47:28,080 --> 00:47:32,520 and I think that's really important, but you can learn most from failures, 511 00:47:32,520 --> 00:47:36,569 really. And we just need to be slightly more open about them. It's easy for me to say, isn't it, 512 00:47:36,570 --> 00:47:40,469 because I just jet in and criticise people. 513 00:47:40,470 --> 00:47:47,850 But actually, I acknowledge that it's really, really difficult to do these things, and the decisions that have to be made are not easy. 514 00:47:48,420 --> 00:47:52,649 And we need to think about technological, adoption, 515 00:47:52,650 --> 00:47:57,810 organisational and health system factors as early as possible and throughout the technology lifecycle. 516 00:47:58,260 --> 00:48:02,549 The problem is that efforts are often invested at the front end, during development, 517 00:48:02,550 --> 00:48:08,040 and we are not thinking early enough about integration, optimisation and scaling. 518 00:48:08,730 --> 00:48:15,660 And I think that's it. I made this with... I thought, um, that's me. 519 00:48:16,740 --> 00:48:21,270 Yeah. So if you have any questions or anything, please email me. 520 00:48:21,270 --> 00:48:26,250 You know, I like my job, so I'm very happy to get back to you.