1 00:00:00,300 --> 00:00:10,020 [Auto-generated transcript. Edits may have been applied for clarity.] Welcome. Um, this is a guest lecture for a module on the Master's in Translational Health Sciences, 2 00:00:10,020 --> 00:00:15,750 which is run jointly through continuing education and the Department of Primary Care Health Sciences. 3 00:00:16,110 --> 00:00:17,970 My name is Trish Greenhalgh. 4 00:00:18,120 --> 00:00:26,550 I'm program director, and I want to extend a really warm welcome not just to our own Master's students, most of whom are in the room here, 5 00:00:26,730 --> 00:00:33,480 but also lots of people who've joined online and, uh, also some who are in the room who aren't Master's students. 6 00:00:33,630 --> 00:00:38,010 So I think we've got a hugely diverse group here. 7 00:00:38,310 --> 00:00:42,660 And now I'm going to introduce Carolyn, and I'm so pleased you're here. 8 00:00:43,170 --> 00:00:51,329 Um, you are on a visiting fellowship at THIS Institute in Cambridge, which is run by Mary Dixon-Woods. 9 00:00:51,330 --> 00:00:57,510 And they do wacky things with implementation science, pushing back on some of the assumptions. 10 00:00:57,900 --> 00:01:02,700 And I hear from Karen, you have the most amazing interdisciplinary background, 11 00:01:02,700 --> 00:01:10,829 which started with an undergraduate joint honours in kinesiology, look it up if you don't know it, it's fun, and sociology. 12 00:01:10,830 --> 00:01:15,630 And then you went on to do various management and organisational things, 13 00:01:16,080 --> 00:01:25,680 um, which equipped you for hanging out with the Institute of Health Policy, Management and Evaluation at the fabulous University of Toronto. 14 00:01:26,160 --> 00:01:32,310 Um, but the reason why I'm so excited is, you know, this barriers and enablers thing. 15 00:01:32,520 --> 00:01:38,250 What did I say to you guys at the beginning of the week? You give me barriers and enablers, you might pass.
16 00:01:38,460 --> 00:01:42,860 You challenge that barriers and enablers framing, you'll probably get a distinction. 17 00:01:42,870 --> 00:01:45,960 Well, you're going to tell us how it's done. Thank you. 18 00:01:46,110 --> 00:01:52,650 Thank you. And I understand it's going to be one of those back and forth sessions where there's going to be lots of interactivity. 19 00:01:53,100 --> 00:01:56,489 Uh, questions are going to pop up. I think you're okay with that. 20 00:01:56,490 --> 00:02:01,020 Put your hands up, all that stuff. I'm just going to go and sit in the corner. 21 00:02:03,080 --> 00:02:06,739 I think. I think I'm on camera now for the online people. Okay, I'll try and stand here, 22 00:02:06,740 --> 00:02:10,370 not wander. Thanks so much, and so happy to be here. 23 00:02:10,370 --> 00:02:14,929 Um, really. Thank you, uh, to Caroline and Trish for the invitation to chat with the module. 24 00:02:14,930 --> 00:02:18,409 I had a chance to read a little bit more about what you guys have been learning all week. 25 00:02:18,410 --> 00:02:25,610 And, uh, what an exciting opportunity, um, for you guys and for me to be able to engage with you in what you've been learning this week. 26 00:02:26,240 --> 00:02:32,750 Um, so as Trish says, I'm from the University of Toronto. In addition to my associate professor role and senior investigator role, 27 00:02:32,750 --> 00:02:37,070 I hold what's called a Canada Research Chair in implementing digital health innovation. 28 00:02:37,430 --> 00:02:44,450 So in Canada, we have these kind of nationally recognised chairs where we get to present an issue we think really matters, 29 00:02:44,780 --> 00:02:48,439 and the government gets to decide if we get to go and do it or not. 30 00:02:48,440 --> 00:02:54,350 And so I'm one of those, and I pitched that implementation matters a lot and digital health matters. 31 00:02:54,950 --> 00:02:56,899 And that's what I've been working on for the last five years.
32 00:02:56,900 --> 00:03:01,820 And, uh, we'll be renewing that for the next five, uh, because it's a journey that continues. 33 00:03:02,450 --> 00:03:05,899 And one of the reasons I really like implementation science, um, 34 00:03:05,900 --> 00:03:10,280 is because it kind of helps us with this problem that you guys have probably talked a lot about, 35 00:03:10,400 --> 00:03:15,620 like that notorious 17-year gap between discovery and actual adoption. 36 00:03:15,920 --> 00:03:20,330 And it's a really sticky one; this paper was out in 2023 in JAMA. 37 00:03:20,570 --> 00:03:28,010 And I've seen this title in a lot of papers over a lot of years, and it just doesn't seem to be budging very much. 38 00:03:28,640 --> 00:03:34,940 Um, and I'm going to talk a little bit about how we might think about using implementation science theory in particular, 39 00:03:34,940 --> 00:03:39,379 and theories that underpin implementation science, to maybe help shrink this a 40 00:03:39,380 --> 00:03:43,250 little bit more than they've been able to do over the last couple of decades. 41 00:03:43,880 --> 00:03:47,630 So I'm already going to put you guys to work. I'm gonna put you to work a couple of times. 42 00:03:47,810 --> 00:03:50,960 Maybe you have a piece of paper in front of you. Maybe you have your phone, your laptop. 43 00:03:51,500 --> 00:03:56,360 Um, I'd like us to think a little bit about what we've actually been measuring when we're doing change. 44 00:03:57,080 --> 00:04:01,340 So on your piece of paper or your note on the phone, or even better, find a new piece of paper. 45 00:04:01,340 --> 00:04:04,430 I'll show you why, because drawing pictures is a fun part of this work. 46 00:04:04,880 --> 00:04:10,400 I want you guys to think about maybe a reform or an innovation that you've been a part of. 47 00:04:10,850 --> 00:04:16,970 Maybe you've studied it as a researcher, or maybe you've implemented it as an organisational leader.
48 00:04:17,240 --> 00:04:24,680 Maybe you designed it as an innovator. Um, and think about what, um, your experience of doing that was. 49 00:04:24,680 --> 00:04:29,540 So what you're going to do is, on the left side of your piece of paper or on one part of your phone, 50 00:04:29,960 --> 00:04:34,580 you're going to write down what the innovation is, or what Curran, in 2020, likes to call "the thing". 51 00:04:34,700 --> 00:04:38,870 What is your thing, your innovation, the thing that you were going to put in place? 52 00:04:38,870 --> 00:04:43,880 What's your thing? And on the right side, what did you hope the thing was going to do for you? 53 00:04:43,940 --> 00:04:48,319 What was the outcome that you were trying to get to? So have a thought on that. 54 00:04:48,320 --> 00:04:52,830 And then in the middle, write down all the things you kind of cared about or measured. 55 00:04:52,850 --> 00:04:58,970 So if you were researching it or if you were trying to drive it, what were your KPIs, your key performance indicators? 56 00:04:59,210 --> 00:05:03,890 What were the things you kind of measured between the thing and the outcome? 57 00:05:04,190 --> 00:05:08,210 I'll give you just like a couple of minutes to think about that while I take a sip of this lovely coffee. 58 00:05:09,430 --> 00:05:18,300 I'd like you to take a look at your list. Um, did anybody write down explicitly something about how you went from the thing to the outcome? 59 00:05:18,780 --> 00:05:23,339 Yeah. And we often do that, right? What's the thing that we're doing? What were the outcomes that we measured? 60 00:05:23,340 --> 00:05:31,829 And implementation folks will do that as well. But there's all this stuff that's happening in the middle, right? We know what the thing is, 61 00:05:31,830 --> 00:05:35,700 and then we measure this outcome, and then we think, okay, now we're going to make it work. 62 00:05:35,700 --> 00:05:39,540 And then we try to hand it to people.
And then it doesn't really take the way we think it should. 63 00:05:40,170 --> 00:05:43,680 And that's where what you guys have been learning about should be able to help with that. 64 00:05:43,680 --> 00:05:54,509 And in fact, when we go back to that, um, paper from 2023, it suggests to us that the "burgeoning" field of implementation science, written in 2023, 65 00:05:54,510 --> 00:05:58,980 when I feel like the field was at least a couple of decades old at that point, should be helping us out. 66 00:05:59,900 --> 00:06:05,460 But not all implementation science is kind of the same, and it may or may not be helping us on that journey. 67 00:06:05,480 --> 00:06:08,870 So we'll talk about that a bit today. So hold on to your piece of paper. 68 00:06:08,870 --> 00:06:12,680 Think about your thing and your outcome, and let's start working on filling in the middle of it. 69 00:06:13,610 --> 00:06:21,020 Um, but what often happens for us, as we really notice, is that there are determinants in the literature; we really like determinants. 70 00:06:21,230 --> 00:06:29,660 We did a really interesting, um, study, led by, uh, Geneviève Rouleau and, uh, published just in 2024, 71 00:06:29,660 --> 00:06:33,980 where we looked at studies of the implementation of digital health. 72 00:06:34,280 --> 00:06:39,590 And we wanted to look at the theories, models and frameworks that were being used in those studies over a 20-year period. 73 00:06:40,100 --> 00:06:43,820 Uh, we found 156 studies that were published in that time. 74 00:06:44,150 --> 00:06:48,260 68 different theories, models and frameworks were used. 75 00:06:48,830 --> 00:06:54,469 The vast majority were determinant frameworks, and really surprisingly, 76 00:06:54,470 --> 00:07:02,330 about 20% of those determinant frameworks were just CFIR; that's a lot of Consolidated Framework for Implementation Research that's getting used. 77 00:07:02,690 --> 00:07:11,510 We can talk a lot about why that's the case.
But overall, the field, in digital health and elsewhere, really loves determinant studies. 78 00:07:11,960 --> 00:07:16,580 We love it. Last year, uh, for a talk, you know, 79 00:07:16,580 --> 00:07:24,680 just for fun, I wanted to see how many studies of determinants in digital health were out there in just 2025, and there were over 100. 80 00:07:25,130 --> 00:07:28,430 This is not a systematic review. This is just me playing around with Google Scholar. 81 00:07:28,700 --> 00:07:34,210 But that's an awful lot of papers on just determinants. So as a field, we just really love determinants. 82 00:07:35,330 --> 00:07:41,840 Um, and I was really encouraged to hear that you guys talked about this at the beginning, that maybe we need to move beyond, 83 00:07:42,260 --> 00:07:49,070 um, because there's a concern growing amongst implementation scientists and other people that as the science has matured, 84 00:07:49,160 --> 00:07:58,309 uh, it's arguably becoming structurally comfortable: we're really good at cataloguing determinants and strategies, 85 00:07:58,310 --> 00:08:06,680 but we're really under-theorising mechanisms, which is really the key thing that happens between the thing and the outcome. 86 00:08:07,280 --> 00:08:17,239 Um, and as I wrote in 2023 with, uh, colleagues, theoretical products describing "what" have generally outpaced explanations of "how" in 87 00:08:17,240 --> 00:08:24,470 implementation science, as evidenced by those numerous determinant frameworks but fewer explanatory theories. 88 00:08:25,010 --> 00:08:28,399 So we saw that in our scoping review, and others who are also working in the field, 89 00:08:28,400 --> 00:08:31,700 doing similar scoping reviews outside of just digital health, 90 00:08:32,120 --> 00:08:38,029 are overwhelmingly seeing this to be the case. So it's really important to remember that determinants matter.
91 00:08:38,030 --> 00:08:43,400 They're important for some key things, but they're not really explanations all the time. 92 00:08:43,940 --> 00:08:47,780 Strategies that we also publish aren't necessarily causal processes. 93 00:08:48,110 --> 00:08:51,440 And outcomes don't necessarily mean we've understood what happened. 94 00:08:51,710 --> 00:08:55,190 Right. We see this in trials all the time. We saw an outcome, and that's awesome. 95 00:08:55,190 --> 00:09:01,850 And then we try to pick the thing up and put it in another place, and we can't for the life of us figure out why it didn't happen in the other place. 96 00:09:02,450 --> 00:09:06,230 And that's all the middle part between our thing and our outcome. 97 00:09:06,890 --> 00:09:12,080 So mechanistic thinking, or mechanisms, are a way forward in this. 98 00:09:12,080 --> 00:09:19,100 And this is not a brand new space. A lot of implementation scientists and other theorists have been grappling with mechanisms. 99 00:09:19,100 --> 00:09:22,580 Realist thinking has been grappling with mechanisms for a long time. 100 00:09:23,010 --> 00:09:27,589 Um, so I'm going to spend today talking about how we can think differently about mechanisms, um, 101 00:09:27,590 --> 00:09:32,450 the different theoretical foundations that help us uncover where they are, and some of the challenges 102 00:09:32,450 --> 00:09:37,550 we might face if we're really going in and trying to understand this component of implementation. 103 00:09:39,140 --> 00:09:47,360 So there are lots of different definitions and understandings, and generally not one agreed-upon, um, definition that is being used. 104 00:09:47,750 --> 00:09:51,350 Our realist friends will suggest mechanisms are kind of these, uh, 105 00:09:51,350 --> 00:09:57,259 resources and reasoning components that are triggered within a particular context to lead to an outcome.
106 00:09:57,260 --> 00:10:01,250 Those are the context-mechanism-outcome groupings that you tend to see. 107 00:10:01,760 --> 00:10:07,999 Um, Dalkin expanded that to suggest mechanisms consist of resources and reasoning, um, as well. 108 00:10:08,000 --> 00:10:15,110 So we can kind of think about them in that way. Uh, Cara Lewis and a group of colleagues out of the States have been working on 109 00:10:15,110 --> 00:10:20,840 mechanisms for quite some time, and they've been trying to catalogue them to the best of their ability for a long time. 110 00:10:21,080 --> 00:10:26,750 And they define them as processes through which implementation strategies exert effects. 111 00:10:27,290 --> 00:10:34,460 Um, so very much focussed on the implementation component: how you connect your strategy for doing the thing to the outcome you're expecting. 112 00:10:34,940 --> 00:10:40,099 And then, uh, more in the kind of theoretical, sociological, middle-range theory spaces, 113 00:10:40,100 --> 00:10:45,950 we have folks like Carl May and Tracy Finch doing normalisation process theory thinking, where they suggest mechanisms 114 00:10:45,950 --> 00:10:53,569 are these generative social processes, uh, that explain how a practice becomes normalised, at least in the example of that particular theory. 115 00:10:53,570 --> 00:10:57,260 But there are other middle-range theories that effectively say that same thing: 116 00:10:57,260 --> 00:11:03,680 there's a causal thing that's happening under the surface that's leading from the thing to the outcome or the action. 117 00:11:04,280 --> 00:11:12,200 But generally, they're all agreeing that mechanisms are about explaining how and why a change happens, 118 00:11:12,890 --> 00:11:18,770 um, the how and why being critical; determinants sort of tell you what, not necessarily how or why things are functioning.
119 00:11:20,110 --> 00:11:25,350 So another way I like to share this idea with people is using a light switch analogy. 120 00:11:25,360 --> 00:11:31,420 I used to use a trail running analogy, which only worked for a very small pocket of people who knew what trail running was. 121 00:11:32,380 --> 00:11:35,590 So I was strongly encouraged not to use that analogy. 122 00:11:35,840 --> 00:11:39,150 So now I'm on a light switch and a light bulb, which makes a lot more sense. 123 00:11:39,160 --> 00:11:43,840 So we're standing in this lovely room, and we would like to light up the room a little bit. 124 00:11:43,840 --> 00:11:48,760 So I see some lights here, and I go over to the wall and I turn the light switch on, and the light went on. 125 00:11:49,030 --> 00:11:57,040 Magic. However, for that light to turn on because that light switch was switched, there's all this stuff happening in the walls of the building. 126 00:11:57,400 --> 00:12:02,590 There's all this stuff happening in the environment around us to create power that goes to that light switch, 127 00:12:02,890 --> 00:12:08,770 to make sure that that light went on because that light switch was turned on, and not a fan was turned on, or not 128 00:12:08,770 --> 00:12:16,870 a light in another room was turned on. And the key with mechanisms is they tend to exist in places where we don't always obviously see them. 129 00:12:17,440 --> 00:12:20,620 They're kind of in the walls. They're kind of in our minds. 130 00:12:20,620 --> 00:12:24,549 They're kind of in the social norms that exist in our environments around us. 131 00:12:24,550 --> 00:12:31,240 So they're not readily available for measurement, and they're not readily available for us to see or understand, 132 00:12:31,240 --> 00:12:37,930 which is part of why we don't go in that direction when we go to evaluate or measure or understand what we see, 133 00:12:38,050 --> 00:12:45,790 because it's not necessarily something we can see.
So why do they matter? Imagine something like this: 134 00:12:45,790 --> 00:12:51,910 here's a large ocean liner, our thing, chugging along, and we're like, this is definitely going to work. 135 00:12:52,060 --> 00:13:00,310 But then what happens is we face a resistance iceberg. And often at the top of that iceberg are all those determinants that we can see and measure. 136 00:13:00,880 --> 00:13:06,040 But the mechanisms that are important are the ones that actually underpin the things that we see and measure. 137 00:13:06,040 --> 00:13:11,409 The determinants don't exist separately from the mechanisms; 138 00:13:11,410 --> 00:13:14,860 the mechanisms actually are part of why the determinants are the determinants in the first place. 139 00:13:15,310 --> 00:13:23,590 The determinants rest on, and are influenced by, these mechanisms, things like affect and beliefs and norms and values, that lead to things like, 140 00:13:23,920 --> 00:13:30,010 um, general resistance or workflow problems or resource allocation issues. 141 00:13:30,310 --> 00:13:35,770 They are the underlying pieces of why the things we see at the surface are actually happening at all. 142 00:13:36,100 --> 00:13:43,060 So we need a way to kind of dive deeper into what the determinants are in order to understand why they matter to us in the first place. 143 00:13:43,360 --> 00:13:46,240 And that's where our action probably needs to happen. 144 00:13:46,870 --> 00:13:52,509 Um, that's why, when you go and put something into practice and you fix your workflow in one place, you've got a great fixed workflow, 145 00:13:52,510 --> 00:13:56,110 and then you go and do it over here, and that workflow fix just isn't working. 146 00:13:56,410 --> 00:13:59,950 It's because something else is happening underneath the surface that you need to uncover.
147 00:14:01,430 --> 00:14:08,090 Um, now, there are also some exciting new studies that have shown, particularly this one from Williams and company, 148 00:14:08,360 --> 00:14:13,160 that mechanisms can have actual, like, causal explanations for implementation success. 149 00:14:13,430 --> 00:14:18,290 This one came out just a month ago, which I'm super psyched about; it's in all my talks over the next few months. 150 00:14:18,710 --> 00:14:24,560 Um, and certainly Carl May and Tracy Finch, who designed normalisation process theory, are also super pumped about this. 151 00:14:24,890 --> 00:14:32,530 Uh, because they did a secondary analysis of data out of a hybrid type 3 trial implementing a digital, um, tool. 152 00:14:32,540 --> 00:14:39,800 This was a clinician decision support tool, or, sorry, this was a, uh, tool used in outpatient mental health clinics for youth. 153 00:14:40,250 --> 00:14:48,469 Uh, they were able to use the NoMAD tool, which measures mechanisms of coherence and collective action and cognitive participation, 154 00:14:48,470 --> 00:14:51,170 all the mechanisms underlying normalisation process theory. 155 00:14:51,560 --> 00:14:56,719 And they were able to see, like, causal reasons for implementation working when those mechanisms were turned on, 156 00:14:56,720 --> 00:15:01,320 when that, like, dependent outcome occurred. Um, which was very exciting. 157 00:15:01,340 --> 00:15:07,400 Um, and then they saw that there was sustainment of the implementation over a six-month period when those mechanisms were turned on. 158 00:15:07,700 --> 00:15:14,600 So not only important when you're putting anything in place, but maybe also important to sustain the new thing over time. 159 00:15:14,610 --> 00:15:20,329 So really encouraging data coming out. So that's one way of finding out what your mechanisms are: 160 00:15:20,330 --> 00:15:27,950 you can go out and actually study them and figure out causal pathways.
But, uh, Kazdin kind of suggests there are lots of other ways we can find, um, 161 00:15:27,950 --> 00:15:34,280 mechanisms in the world. We can look for things like strong association, which happened with the Williams paper. 162 00:15:34,490 --> 00:15:38,059 You can dive in and do some specificity work or consistency work, 163 00:15:38,060 --> 00:15:44,330 which is actually getting into being able to quantify what a mechanism is and test it empirically over time. 164 00:15:44,870 --> 00:15:53,060 Um, you can do experimental manipulation, or look at things occurring over time and look for patterns over time. 165 00:15:53,630 --> 00:16:00,020 Um, all of these things are quite complex and require a good amount of measurement, so that you can, one, understand the mechanism is there, 166 00:16:00,020 --> 00:16:02,630 and, two, be able to measure it consistently over time. 167 00:16:03,200 --> 00:16:12,799 And because those things are hard to do when you're trying to measure something like social norms, we often land on the plausibility or coherence, 168 00:16:12,800 --> 00:16:15,530 um, requirement for identifying mechanisms. 169 00:16:15,920 --> 00:16:24,440 So Cara Lewis and that group, um, back in 2020, had done a review of the literature to see how mechanisms were being unveiled in the literature. 170 00:16:24,620 --> 00:16:32,690 And overwhelmingly, plausibility or coherence was the main way that we were uncovering them, because we were still really in a qualitative space, 171 00:16:32,990 --> 00:16:37,400 working in smaller environments and not being able to do these bigger replication pieces of work.
172 00:16:37,760 --> 00:16:42,200 So the Williams paper is very encouraging and shows us ways that we can embed 173 00:16:42,200 --> 00:16:45,620 mechanistic measurement into bigger trials with bigger numbers, 174 00:16:45,620 --> 00:16:51,139 and start getting into those higher-order measurement opportunities that Kazdin suggests for 175 00:16:51,140 --> 00:16:57,470 us. What I will say is that what I'm going to present next, as we go through some other mechanisms, 176 00:16:57,830 --> 00:17:05,300 um, tends to rest on plausibility or coherence, um, as the way to understand that these are mechanisms available to us, 177 00:17:05,570 --> 00:17:11,960 because that's kind of a lot of the data that we have available to us. So I think it's still conversational and still something that needs to be tested. 178 00:17:12,380 --> 00:17:17,630 Um, I'm going to talk about why theory helps us when we're in the plausibility or coherence space. 179 00:17:18,920 --> 00:17:21,829 Another way we've uncovered this is in some of the work that we've done, 180 00:17:21,830 --> 00:17:28,309 and that's one of these really fun pieces of work I'm doing with a colleague, uh, Jay Shaw at, uh, the University of Toronto. 181 00:17:28,310 --> 00:17:31,490 And I think a number of people here know Jay; it was like, hey, do you know Jay? Yeah. 182 00:17:31,550 --> 00:17:40,370 Uh, so we're trying to uncover, um, how to change practice for community health workers, um, and understand what their needs are. 183 00:17:40,370 --> 00:17:48,229 And we're doing a bunch of co-design work in terms of understanding their roles, uh, in community health work in different environments across Canada. 184 00:17:48,230 --> 00:17:55,070 So we have sites in Ontario, where I'm from. We have sites in B.C., um, as well, where Nancy Clark is leading that part of the work.
185 00:17:55,190 --> 00:18:01,280 And so we were trying to understand what they were facing in terms of the challenges of doing their work as community health workers. 186 00:18:01,760 --> 00:18:07,370 And in order for us to dive deeper into understanding root causes, or mechanisms, leading 187 00:18:07,370 --> 00:18:11,570 to those barriers and enablers that they were experiencing as community health workers, 188 00:18:11,900 --> 00:18:18,710 we just kept asking why. So this is an image from one of our mural boards that we used in our co-production work. 189 00:18:19,130 --> 00:18:22,470 So we'd say, okay, what's the challenge you're experiencing in mental health? 190 00:18:22,490 --> 00:18:26,120 They'd say, oh, this. Why do you think that is? And they'd explain. Okay, well, why do you think that is? 191 00:18:26,120 --> 00:18:27,200 And then they'd explain to us again. 192 00:18:27,500 --> 00:18:34,250 And so you only see two layers of this here, but it just keeps going and going and going until you get down to "I can't say why anymore". 193 00:18:34,670 --> 00:18:40,850 And that's where you kind of get to, like, values and norms and beliefs, when there isn't really another why. 194 00:18:40,850 --> 00:18:45,410 It's just what's deeply rooted in people, in systems, in our environments. 195 00:18:45,890 --> 00:18:48,590 Um, I hear your lecturers love a complexity theory, 196 00:18:48,590 --> 00:18:55,340 so think about all those factors in complexity theory tying into those root reasons why things are happening or not happening. 197 00:18:55,610 --> 00:18:59,240 And that's when some of that, like, magical mechanistic stuff starts happening. 198 00:18:59,660 --> 00:19:06,020 Um, and that's where you can start uncovering it and understanding it, at least from a coherence and plausibility, um, perspective. 199 00:19:07,760 --> 00:19:11,980 I'm going to make you go to work again. Go back to your pieces of paper or your phones.
200 00:19:12,310 --> 00:19:15,580 You have your thing on the left. You have your outcome on the right. 201 00:19:16,090 --> 00:19:18,250 We've talked a little bit about what the middle could be. 202 00:19:18,850 --> 00:19:24,770 If you wrote some things down in the middle already, stuff you measured, can you think about that a little deeper? 203 00:19:24,790 --> 00:19:28,989 If you saw something on the surface that occurred that maybe got in the way or helped, what 204 00:19:28,990 --> 00:19:35,080 might be underlying that thing that got in the way of, or helped, your thing get to its outcome? 205 00:19:35,320 --> 00:19:37,060 And I'll give you a moment with that. 206 00:19:37,450 --> 00:19:44,680 Um, I'm curious if anyone wants to share with us what you were thinking about, or possibly how hard that might have been. 207 00:19:45,430 --> 00:19:51,970 Um, the risk one is a really interesting one, right? That, um, if you present a new idea and it feels less risky, 208 00:19:52,300 --> 00:20:00,010 then you're kind of driving down into an individual kind of behaviour mechanism: that our tolerance for risk really shapes how we behave or don't behave. 209 00:20:00,400 --> 00:20:06,850 Um, excellent example of how you can think about that, moving into mechanistic terrain. "Where is the mechanism?" is a great question to ask. 210 00:20:06,850 --> 00:20:11,500 I'm going to present some stuff that's going to make that answer really hard to figure out. 211 00:20:11,950 --> 00:20:19,950 Um, but, uh, I think the piece you clicked in on, the midwife introducing something, gets you there, right? 212 00:20:19,960 --> 00:20:24,580 So where you would start is: the midwife introduced something; that's, like, the action we did. 213 00:20:25,030 --> 00:20:30,970 And then you've got to ask, well, why does that work? Why is that helping a person actually use a technology?
214 00:20:31,240 --> 00:20:36,340 And there's lots of really interesting literature coming out around how we can transfer 215 00:20:36,340 --> 00:20:42,610 trust from an individual into trust in a technology: if someone presents one to us, 216 00:20:42,610 --> 00:20:47,769 and we have a trusting, good relationship with them, that trust is transferred into the technology space. 217 00:20:47,770 --> 00:20:51,219 And it also comes from, um, trust transfer in teams. 218 00:20:51,220 --> 00:20:56,680 So that's been in the teams literature for, um, like, a long time, where people can work within a team, 219 00:20:56,680 --> 00:21:01,870 even if there's a new person, because they trust the team, and you could potentially transfer that to technology. 220 00:21:01,870 --> 00:21:05,290 There are some newer papers coming out on that. Very cool. 221 00:21:05,290 --> 00:21:10,200 So as you start getting into the next level of why, that will help you. 222 00:21:10,270 --> 00:21:14,560 I'm going to show you some things about, um, some theory that can help you. 223 00:21:15,400 --> 00:21:25,090 So I did a little bit of a deep dive with our example here about how we can try to use theory to inform understanding why a mechanism works or not. 224 00:21:25,870 --> 00:21:30,279 And, you know, Trish did a nice job of talking about my very multidisciplinary background. 225 00:21:30,280 --> 00:21:31,870 There's political science in there, too. 226 00:21:32,230 --> 00:21:38,830 So I had, you know, 4 or 5 different disciplinary backgrounds, and I think that's probably why I like implementation science, 227 00:21:39,280 --> 00:21:43,390 because implementation science is like the science for people who don't want to pick things, 228 00:21:43,990 --> 00:21:47,530 because it requires understanding cognitive and social psychology. 229 00:21:47,710 --> 00:21:52,190 We need to understand why people do things and don't do things. You need to understand sociology.
230 00:21:52,190 --> 00:21:55,569 You need to understand why societies shift or move. Or, in anthropology, 231 00:21:55,570 --> 00:21:59,650 you have to understand why cultures shift or move, and how culture-sharing groups behave. 232 00:22:00,040 --> 00:22:07,750 And all of these disciplinary traditions have been around for a long time and have been doing the work of understanding change. 233 00:22:07,960 --> 00:22:12,580 Right. We don't need to reinvent a lot of that. And some of this is, like, deeply empirically based. 234 00:22:12,940 --> 00:22:16,270 Um, they show it in labs, they show it consistently in the real world: 235 00:22:16,600 --> 00:22:23,440 why people, teams, organisations, policies, systems and cultures change over time. 236 00:22:24,130 --> 00:22:27,910 We don't have to reinvent this. There are excellent theories we can go to. 237 00:22:28,270 --> 00:22:33,820 And it's those why questions that these theories have been grappling with that help us understand mechanisms. 238 00:22:34,210 --> 00:22:38,950 Those are the causal links that we can go to and actually test and look for in our implementation work. 239 00:22:39,640 --> 00:22:48,100 So I'm a big fan of going back to disciplinary traditions. I teach a graduate course kind of on this, around how you pick the right theory, 240 00:22:48,100 --> 00:22:54,760 model or framework for a particular challenge by recognising what your big lever of change is in the implementation you're doing. 241 00:22:55,000 --> 00:23:02,740 Sometimes we know, sometimes we don't. Um, in the example of, kind of, pharmacists taking on new, um, uh, new prescribing behaviours, 242 00:23:03,100 --> 00:23:06,180 there are, like, a couple of big levers we can think about, right? 243 00:23:06,190 --> 00:23:09,009 There are political levers, which you noted. Uh, 244 00:23:09,010 --> 00:23:15,160 being able to share that role is another big lever, that being clinicians ceding power to another clinician.
245 00:23:15,400 --> 00:23:21,010 That's a big lever. And each of those levers kind of hooks into these different traditions of understanding change. 246 00:23:21,490 --> 00:23:26,319 And that's how you would go ahead and pick a theory to help you understand the change that's 247 00:23:26,320 --> 00:23:30,670 happening, and something you can actually test and use and maybe appropriately rebuild over time. 248 00:23:31,480 --> 00:23:33,550 So I'm going to go through a few of them; we won't go through all of them. 249 00:23:33,760 --> 00:23:39,520 I'll go through a few that we've been able to look at and get some coherence-level understanding of mechanisms in, 250 00:23:39,520 --> 00:23:45,639 um, some of the work that we've done. And hopefully this sparks some, some ideas, in the example you've been working on, um, 251 00:23:45,640 --> 00:23:50,950 for mechanisms that might be underlying the stuff that you're seeing, or in any projects that you have coming up. 252 00:23:52,230 --> 00:23:56,970 So let's start with cognitive mechanisms. This is often where we think about the brain, because decisions are made by humans. 253 00:23:57,810 --> 00:24:04,620 Um, and these can be things like habit formation, mental models, identity from an individual standpoint. 254 00:24:04,620 --> 00:24:13,590 We'll talk about identity from a group standpoint later. Um, and there are different theoretical approaches that try to understand cognitive behaviour, 255 00:24:13,800 --> 00:24:19,800 things that we use in implementation science. These can be, um, COM-B, capability, 256 00:24:19,800 --> 00:24:26,459 opportunity, motivation, behaviour; related theories like the theory of planned behaviour; and, um, the behaviour change 257 00:24:26,460 --> 00:24:28,410 wheel. A lot of those behaviour change theories, 258 00:24:28,680 --> 00:24:35,700 if you go into their, like, foundational papers and how they were first written, they'll tell you the assumptions that they're built on.
259 00:24:35,700 --> 00:24:38,730 They'll tell you the causal mechanisms that they're resting on. 260 00:24:39,000 --> 00:24:45,510 And that really helps you understand why they believe these things are important to a shift in behaviour over time. 261 00:24:45,870 --> 00:24:49,710 I often encourage people to try to find, like, the first paper that talked about the theory, 262 00:24:49,980 --> 00:24:54,330 because they don't always carry the assumptions of the theory forward as they continue to write about it. 263 00:24:54,330 --> 00:24:59,280 And this becomes really important for us, to understand the assumptions that are underpinning our thinking. 264 00:25:00,120 --> 00:25:03,120 Um, so one example I am very happy to share. 265 00:25:04,080 --> 00:25:09,270 And this one's a little sad. Um, Mike actually passed just two months ago, very tragically. 266 00:25:09,600 --> 00:25:18,450 Uh, and this is one of the projects we got to do together. Mike was a brilliant clinician, um, a family doc, and worked in the ICU at our hospital. 267 00:25:18,810 --> 00:25:26,850 And he'd been working in rural, remote parts of Ontario for a long time, and he really believed in GLP-1s for diabetic patients. 268 00:25:26,850 --> 00:25:32,309 He'd seen so much evidence for their effectiveness. He saw that they were helping his patients. 269 00:25:32,310 --> 00:25:37,590 He was deep into the literature, but his colleagues weren't using it and they weren't prescribing it. 270 00:25:37,800 --> 00:25:42,120 Certainly in the North, where diabetes was a significant problem for the patient population there, 271 00:25:42,120 --> 00:25:45,300 particularly our First Nations and Indigenous populations. 272 00:25:45,870 --> 00:25:50,810 And so he came to me, and he's like, you know, I know you do a lot of co-design work. Let's co-design a new training platform. 273 00:25:51,210 --> 00:25:54,450 Let's do this nice digital training platform, and that's going to solve this problem.
274 00:25:54,990 --> 00:26:02,070 And I said, are you sure it's a training problem? Like, is it that they don't know what it is and they don't know how to prescribe it? 275 00:26:02,670 --> 00:26:05,880 And then he kind of thought about that and said, no, well, maybe, maybe it's not. 276 00:26:05,970 --> 00:26:09,060 So we took a step back and used some funding 277 00:26:09,060 --> 00:26:14,610 he got through one of his many career awards that he was getting at the time to do a deep dive. 278 00:26:14,610 --> 00:26:19,890 It was qualitative work, and we did some interviews with the clinicians in his environment, 279 00:26:20,190 --> 00:26:25,770 and we applied COM-B to it to understand what was shifting or not shifting their prescribing behaviour. 280 00:26:26,280 --> 00:26:30,270 And some really interesting things came out of that. Things like their clinical identity, 281 00:26:30,360 --> 00:26:36,930 who they understood themselves to be and what their job was as a physician, was changing whether they were prescribing things or not. 282 00:26:37,560 --> 00:26:46,379 Um, their beliefs in GLP-1s and the social construction around Ozempic as a concept were shifting their behaviour, 283 00:26:46,380 --> 00:26:52,170 whether there was clinical evidence or not. That social influence was really important and meaningful to them; 284 00:26:52,620 --> 00:26:56,879 their past experiences and their risk assessments, all of those things mattered a lot. 285 00:26:56,880 --> 00:27:01,050 And if we'd gone ahead and just built a new technology-enabled training platform, sure, 286 00:27:01,050 --> 00:27:05,640 it would have had accessibility for them, but it wouldn't have hit on these really key things. 287 00:27:06,180 --> 00:27:10,440 And what that meant was different kinds of relationship building, different kinds of sense making.
288 00:27:10,830 --> 00:27:15,900 And although we won't get to do it with Mike, we will try and carry this forward, um, 289 00:27:16,260 --> 00:27:20,310 to do some interventions with those clinicians so we can kind of see his vision forward. 290 00:27:22,620 --> 00:27:27,060 So those are some cognitive mechanisms. There are lots, though. You can think about lots of behaviour change stuff. 291 00:27:27,390 --> 00:27:30,200 Let's talk a little bit about sociological mechanisms. 292 00:27:30,210 --> 00:27:38,670 These are a little trickier because they exist both within individuals and within our social systems. Think of our complexity conversation, 293 00:27:38,670 --> 00:27:44,579 right? All of these things are like a big blanket of interwoven fibres, where our individual 294 00:27:44,580 --> 00:27:50,250 behaviour is shifted and shaped and moved by our social systems and fabrics. 295 00:27:50,640 --> 00:27:55,690 Right. That individual clinician isn't necessarily just working as an individual clinician in a vacuum. 296 00:27:55,710 --> 00:28:05,940 There are all these forces acting on us. Things like norms, things like power, networks, and then identity in a shared sense, 297 00:28:06,360 --> 00:28:10,940 things about not just who am I as a person, but who am I as a physician? 298 00:28:10,950 --> 00:28:15,030 Who am I as a parent? Who am I as a child? 299 00:28:15,270 --> 00:28:20,500 All of these kinds of social constructions within shared groups shape our behaviour as well. 300 00:28:20,520 --> 00:28:24,030 So identity is a sticky one that lands in a couple of places. 301 00:28:24,660 --> 00:28:28,380 And then the theoretical approaches we want at this level are a little different. 302 00:28:28,500 --> 00:28:34,800 They're not necessarily about individual cognition. They're about, like, how the individual interacts with their social environment. 303 00:28:35,130 --> 00:28:38,640 So then you get into some classic social interactionist theory.
304 00:28:38,820 --> 00:28:42,830 I'm always a big fan of Goffman and his, like, front stage backstage kind of stuff. 305 00:28:42,840 --> 00:28:48,390 I think that really resonates well. You can get into critical theory and think about power and, um, 306 00:28:48,570 --> 00:28:53,820 kind of structures of power environments and how that shifts our ability to move or not. 307 00:28:54,300 --> 00:28:59,820 Um, and then socialisation processes; if you're a fan of Foucault, this one is always really fun. 308 00:28:59,820 --> 00:29:07,230 And you can think a lot about it in terms of professional identity creation: as people go through the training of their profession, 309 00:29:07,500 --> 00:29:14,459 they're socialised into that identity, and that shifts their behaviour and what they think is the right or wrong way of doing things, 310 00:29:14,460 --> 00:29:24,480 and it also shifts the environments around them. So mechanisms that might emerge at this level might be things like identity and normative threat. 311 00:29:24,780 --> 00:29:28,020 So what is threatening my power as a clinician? 312 00:29:28,380 --> 00:29:32,670 What might be threatening my role as a, um, a caregiver? 313 00:29:33,480 --> 00:29:42,360 Um, there might also be legitimacy construction. So what is a legitimate or illegitimate form of action within what is understood within my society? 314 00:29:42,960 --> 00:29:46,350 Um, and there's also things like network contagion. That always sounds kind of fun, 315 00:29:46,350 --> 00:29:53,310 but it's basically that when something happens in one part of a network, it can trickle and move towards other parts of a close-knit network. 316 00:29:54,030 --> 00:29:59,460 Um, we've looked at this a little bit in the shared identity space of professional identity. 317 00:29:59,850 --> 00:30:03,030 Um, and this is work by my lovely colleague, uh, Jennifer Schuldiner, 318 00:30:03,210 --> 00:30:06,870 um, in 2021. And then we did some work in 2024.
Um, 319 00:30:06,870 --> 00:30:16,049 and all the years leading into it, that found, in virtual care environments, that things like professional identity were really important to use or non-use 320 00:30:16,050 --> 00:30:22,540 of virtual care. Her work looked at doctors and nurses in virtual emergency departments. 321 00:30:22,560 --> 00:30:28,380 Uh, during Covid they were setting these up all over Canada, definitely in Toronto, and in her hospital in Toronto. 322 00:30:28,890 --> 00:30:32,250 And they were finding some clinicians loved it and some clinicians hated it. 323 00:30:32,250 --> 00:30:33,980 And they were trying to find out why. 324 00:30:33,990 --> 00:30:40,770 And they found that the clinicians who really hated it said things like, this isn't what it means to be an emergency room doctor. 325 00:30:41,640 --> 00:30:46,170 And then we did some work on social representation and virtual care in primary care. 326 00:30:46,170 --> 00:30:51,090 We did some deep ethnographic work: talked to patients, talked to their caregivers, talked to providers, 327 00:30:51,090 --> 00:30:57,120 after watching an interaction in a virtual environment. And lots of clinicians used it and lots of clinicians didn't. 328 00:30:57,120 --> 00:31:00,930 And the ones who didn't use it said, this is not what it means to be a family doctor. 329 00:31:01,410 --> 00:31:05,520 This is not how I was trained. This isn't how I understand what my role is. 330 00:31:05,940 --> 00:31:12,780 And so virtual care was a very immediate and real threat to what it meant to them to be a good doctor, 331 00:31:13,140 --> 00:31:18,120 which for them was being in a room with another person. But that wasn't everybody.
332 00:31:18,150 --> 00:31:23,670 There were lots of clinicians in the study for whom it resonated strongly with their identity as a clinician, 333 00:31:24,150 --> 00:31:30,060 and it meant that they could do a better job of connecting, particularly some of the social workers and, um, 334 00:31:30,840 --> 00:31:36,450 um, nursing staff who were doing much more of the mental health and social support pieces, 335 00:31:36,840 --> 00:31:42,420 probably because the patients could be more vulnerable in a virtual environment than they could be in person. 336 00:31:42,660 --> 00:31:48,600 So that allowed them to tap into their professional identity of being a good social worker that could 337 00:31:48,600 --> 00:31:53,940 connect with somebody emotionally, because there was now availability to connect to them emotionally. 338 00:31:54,240 --> 00:31:59,880 And a lot of that was wrapped in the construction of how they were trained and what their expectations were over time. 339 00:32:01,450 --> 00:32:07,430 So those are very different kinds of mechanisms to think about. I'll talk about how that changes strategy in just a moment. 340 00:32:08,870 --> 00:32:13,240 There are also organisational and team mechanisms, so important when we think about these ones. 341 00:32:13,250 --> 00:32:20,900 Yes, the individual is still acting, but individuals can also act on behalf of their team or on behalf of their organisation. 342 00:32:21,290 --> 00:32:29,770 So for example, when leaders or directors are acting, they're not necessarily making decisions because of what they, as an individual, want to. 343 00:32:29,780 --> 00:32:33,469 They're making them, in a kind of way, for their organisation, 344 00:32:33,470 --> 00:32:37,370 and they're kind of embedding what their organisation wants to do and what their priorities are, 345 00:32:37,370 --> 00:32:41,719 and all of those things get, you know, constructed in. Um, so that's how they're acting.
346 00:32:41,720 --> 00:32:48,590 So we need a different kind of theoretical lens to understand that kind of change that's happening. Things we can look at: 347 00:32:48,590 --> 00:32:55,640 now back to normalisation process theory, which talks of teams changing behaviour over time, and broadly team science; 348 00:32:55,820 --> 00:33:03,620 lots of organisational behaviour theory, like ecology and institutionalisation; lots of things that talk about why an organisation changes or not. 349 00:33:03,650 --> 00:33:08,090 Tons of literature there. And then you can get into some shared mental model stuff. 350 00:33:08,090 --> 00:33:11,000 So mental models are like how an individual understands the world. 351 00:33:11,450 --> 00:33:18,170 Um, when we share that amongst a group, that becomes something within teams or organisations, um, that you can work with. 352 00:33:18,830 --> 00:33:25,159 Mechanisms we might see here are things like collective sense making, role renegotiation, trust, 353 00:33:25,160 --> 00:33:30,050 collaboration: things that are kind of happening at a team, interactional, larger group level. 354 00:33:31,130 --> 00:33:35,600 Um, I actually ended up adding this slide this morning because I forgot this study was there, um, 355 00:33:35,990 --> 00:33:41,809 and this was actually done by a postdoc of mine, and she moved so quickly into her scientist role, I forgot how fun this was. 356 00:33:41,810 --> 00:33:50,660 And this actually came out last year, where we did a realist synthesis to understand context-mechanism-outcomes for learning, um, organisations. 357 00:33:50,660 --> 00:33:54,440 She wants to go and figure out what makes a learning health system a learning health system, 358 00:33:54,770 --> 00:33:58,669 and what are the mechanisms that drive that learning health system to exist 359 00:33:58,670 --> 00:34:02,569 well. So she's building on this work, but this one came out, um, just last year.
360 00:34:02,570 --> 00:34:04,700 And this is our foundational work. There's Marissa there. 361 00:34:05,270 --> 00:34:10,639 Um, and we wound up finding, you know, 60 different CMO groupings across the literature, 362 00:34:10,640 --> 00:34:13,940 but we were able to distil them down into six big areas. 363 00:34:14,300 --> 00:34:18,290 Um, and they kind of look like this. As an example, 364 00:34:18,290 --> 00:34:21,800 this is one of our six: embracing risk and failure. 365 00:34:22,070 --> 00:34:28,820 Uh, a CMO in this space looks like this: when evaluation is promoted as a learning activity within an organisation, 366 00:34:28,820 --> 00:34:37,850 so a learning health system or learning organisation, um, that tolerates risk and does not shame and blame staff for evaluation failures, 367 00:34:37,850 --> 00:34:44,660 that's your context. Uh, staff involved in the evaluation are more apt to engage with and learn from that evaluation; 368 00:34:44,750 --> 00:34:53,840 there's your outcome. And it's because they do not fear disciplinary, punitive measures associated with evaluation failure; there's your mechanism. 369 00:34:54,290 --> 00:34:58,069 So we have our context, our environment; the outcome we're trying to get to; 370 00:34:58,070 --> 00:35:02,420 and then our mechanism in the middle. So realist thinking really does help us unpack what the mechanism is. 371 00:35:03,050 --> 00:35:06,350 Uh, we were able to distil the 60 into these six things, 372 00:35:06,350 --> 00:35:12,079 and she's in the process of testing them now. What is really fun about this paper, um, 373 00:35:12,080 --> 00:35:21,740 that I reread, and that Marissa and I talked about so much, is that each of those six kind of big buckets is really at this underlying level. 374 00:35:22,220 --> 00:35:26,209 That first one, the mechanism is about fear. That second one, 375 00:35:26,210 --> 00:35:32,150 if you go and read the paper, is going to be all about provenance.
That third one's about a sense of ownership and values. 376 00:35:32,480 --> 00:35:35,150 The fourth one is about meaningfulness of data. 377 00:35:35,270 --> 00:35:40,520 I love this one because we've been doing this in digital health for a long time, and meaning in data is so crucial. 378 00:35:41,090 --> 00:35:46,850 The fifth one is about motivation, and the last one is about kind of a contributory role. 379 00:35:46,860 --> 00:35:49,910 So we're back to, like, a shared role and shared identity in that last one. 380 00:35:50,150 --> 00:35:54,470 So all six of them are operating at this lower level. 381 00:35:54,530 --> 00:35:56,299 Yeah? Yeah, just a quick question. 382 00:35:56,300 --> 00:36:05,240 Um, so I guess several, but, um, I'm looking at the learning system and I'm wondering, where does the data come from? 383 00:36:05,420 --> 00:36:12,110 And with regards to the evaluation piece, have there been any conversations about, um, 384 00:36:12,110 --> 00:36:18,559 what form that takes? Or are there specific teams that sort of do that? 385 00:36:18,560 --> 00:36:24,340 Does everybody do it together? Is it individuals having that reflective spirit within a, uh, team? 386 00:36:24,350 --> 00:36:26,720 I'm really curious about that part of the question. 387 00:36:26,810 --> 00:36:31,639 Just to clarify, are you asking about, like, data that informed building this framework, or, like, data used in a learning health system? 388 00:36:31,640 --> 00:36:34,100 No, no, the data in the learning health system. In the learning health system. 389 00:36:34,110 --> 00:36:40,489 And so, um, Marissa and that team, actually over at Health Partners, their entire system is a learning health system. 390 00:36:40,490 --> 00:36:43,610 So they have a really nice framework that talks about different sources of data.
391 00:36:44,120 --> 00:36:50,360 Um, it helps if it's coming from something the organisation's already collecting anyway, 392 00:36:50,810 --> 00:36:54,140 um, because trying to give them new stuff to do is very hard. 393 00:36:54,440 --> 00:37:02,000 But what you end up doing is shifting, um, their understanding of the data and presenting it in a different way in order to adapt these things. 394 00:37:02,450 --> 00:37:08,270 So, for example, you can get to a sense of ownership and value if you sit down with the clinicians 395 00:37:08,380 --> 00:37:12,940 who you'd be asking to do something different and say, how do you think about this data? 396 00:37:13,000 --> 00:37:16,270 What do you think we should be collecting? How should it be presented to you? 397 00:37:16,420 --> 00:37:20,010 And suddenly they're part of the conversation. It's making sense to them. 398 00:37:20,020 --> 00:37:27,099 So you're going to hook into the data-is-meaningful piece. And then they feel like it's not something that's been thrust upon them. 399 00:37:27,100 --> 00:37:33,460 It's something they actually went and asked for. Um, so often, doing it that way is helpful. 400 00:37:33,770 --> 00:37:41,020 Um, my other wonderful colleague, Harry Carlucci, does a ton in patient experience, and she took the patient experience data they were collecting already 401 00:37:41,020 --> 00:37:45,580 and tweaked it to improve it. And Hannah has now expanded that into caregiver experience. 402 00:37:45,970 --> 00:37:52,450 And she worked with clinicians and the kind of key decision makers to present data in real time, 403 00:37:52,450 --> 00:37:56,710 back to clinicians, on tablets. And it just exists in their environment. 404 00:37:57,160 --> 00:38:02,590 And because they're kind of competitive people, they like to see how they're doing and they like to see how their neighbours are doing.
405 00:38:02,590 --> 00:38:11,140 So it's presented unit by unit, but they also extracted really meaningful, um, quotes from the experience surveys, 406 00:38:11,530 --> 00:38:18,220 which hooked into the motivation thing and the confidence thing really nicely, because suddenly they're like, 407 00:38:18,520 --> 00:38:24,250 wow, I really made a difference to this person. I'm motivated to do more things and I'm confident I'm doing a good job. 408 00:38:24,640 --> 00:38:30,880 So they were able to pull on a couple of different mechanisms, um, in order to present data that was meaningful and useful to them. 409 00:38:31,870 --> 00:38:35,599 At the end of the day, when it gets into kind of like meaningfulness of data and usefulness 410 00:38:35,600 --> 00:38:42,040 and a lot of these components, co-production of the process and understanding, in that collective sense-making thing I kind of talked about 411 00:38:42,040 --> 00:38:45,309 (I have a whole other lecture on sense-making I didn't get to; 412 00:38:45,310 --> 00:38:54,130 I didn't bring it in here, but we can talk about it), um, is really helpful to overcome things like threat and fear of data and new stuff, 413 00:38:54,820 --> 00:38:58,840 because you're working together. You're often having to change with a lot of different people. 414 00:38:59,140 --> 00:39:03,280 And if you all understand that differently, it can feel very threatening and very scary. 415 00:39:03,700 --> 00:39:05,250 Same thing with presenting data. 416 00:39:05,320 --> 00:39:10,780 It can feel punitive if you're not having a conversation about what it means and why it's valuable early on. 417 00:39:12,130 --> 00:39:15,390 Does that help? Yeah. Yeah. 418 00:39:16,370 --> 00:39:19,970 Really quickly. And thanks. This is really fascinating. 419 00:39:19,990 --> 00:39:28,309 I'm wondering about the organisational preconditions, however. To what extent, and with what level of success, can you have
420 00:39:28,310 --> 00:39:36,230 this framework placed atop an organisation that has not yet figured out the organisational preconditions? 421 00:39:36,950 --> 00:39:41,059 Yeah, I mean, it's such a wonderful question, 422 00:39:41,060 --> 00:39:47,690 and I think one we're still grappling with, because we don't have enough learning health systems right now to do comparative work. 423 00:39:48,110 --> 00:39:54,560 And the ones who are doing it already have the preconditions in place: for example, valuing data and valuing research, 424 00:39:54,890 --> 00:40:00,620 and, um, having a different view, like having that safety kind of culture and, 425 00:40:00,620 --> 00:40:03,919 um, kind of what Amy Edmondson talks about, being like a fearless organisation. 426 00:40:03,920 --> 00:40:12,950 They already have that. Um, I have yet to see an example of an organisation go from being top-down punitive into this model. 427 00:40:13,370 --> 00:40:21,349 Um, I think she's keen to find it. So one of the things she's been doing with some colleagues at McMaster is building a network of organisations 428 00:40:21,350 --> 00:40:26,660 that are either already identified as a learning health system or aspiring to be a learning health system, 429 00:40:27,020 --> 00:40:31,429 and they're trying to unpack that next piece. But you're right. 430 00:40:31,430 --> 00:40:35,390 Some of these, like the context pieces and some of the preconditions, um, 431 00:40:35,720 --> 00:40:44,120 might rest on organisational cultures that are very hard to move and sometimes very driven by particular types of leaders. 432 00:40:44,780 --> 00:40:48,680 Um, so they're also trying to unpack that piece as well. And it's tricky. 433 00:40:50,450 --> 00:40:54,570 Not to mention there's also the political, uh... Um, so very cool work. 434 00:40:54,590 --> 00:41:04,040 I highly encourage you to follow them.
Um, another piece of work we did, this one older, is more of a team-level, um, mechanism. 435 00:41:04,040 --> 00:41:09,739 And we actually did a trial of a technology that we had built and co-produced with patients, 436 00:41:09,740 --> 00:41:14,389 their families and primary care teams to do goal-oriented care, 437 00:41:14,390 --> 00:41:18,920 shifting a model of care to asking patients what matters to them, 438 00:41:19,190 --> 00:41:23,149 and then wrapping care plans around them and monitoring them against goals that they cared about. 439 00:41:23,150 --> 00:41:28,700 So that's goals like, I would like to walk my dog every day, not, I need a lower HbA1c number. 440 00:41:29,090 --> 00:41:32,989 Um, and so we went through many, many iterations of this. 441 00:41:32,990 --> 00:41:36,980 And at the end of the trial, we had lots of people using it and lots of people not using it, 442 00:41:37,430 --> 00:41:46,850 and the people who really used it, who really changed behaviour, both the clinicians and the patients, had this very strong alignment of coherence. 443 00:41:47,570 --> 00:41:54,290 So that became a mechanism we care a lot about, and we have been building it into kind of trials of our next rounds of technology. 444 00:41:54,890 --> 00:41:58,310 Um, because we often build a technology for, like, one user group, 445 00:41:58,850 --> 00:42:02,210 but a technology might actually be interactional, asking things of multiple people. 446 00:42:03,110 --> 00:42:11,419 And when we saw the highest amount of use from both the physicians and clinical teams and patients, like the people we called super users, 447 00:42:11,420 --> 00:42:14,390 who were using this thing every day whether they really needed to or not, 448 00:42:14,780 --> 00:42:20,910 they had this really strong alignment between their sense of coherence, both at the individual and team level. 449 00:42:20,930 --> 00:42:22,579 It made a lot of sense to them.
450 00:42:22,580 --> 00:42:29,090 The tool did, and it made a lot of sense to their relationships together, and that changed their behaviour really meaningfully. 451 00:42:29,510 --> 00:42:36,620 So we've kind of reimagined normalisation process theory into this framework, from a huge narrative analysis of a ton of data, 452 00:42:36,620 --> 00:42:38,690 to actually get to something that looks pretty simple. 453 00:42:39,020 --> 00:42:43,850 But at the end of the day, that's moved us to think differently about technology, around what's meaningful to people. 454 00:42:44,120 --> 00:42:50,000 And that's a mechanism we kind of chase now. Uh, we try to build it into the way we build the innovation. 455 00:42:50,330 --> 00:42:55,250 We build it into the way we do our implementation work as well. Meaningfulness really matters, 456 00:42:55,250 --> 00:42:58,250 even in technology. Technologies are not inert. They matter. They're things. 457 00:42:58,250 --> 00:43:03,170 They're real social artefacts. People have lots of thoughts and feelings about our technologies. 458 00:43:04,430 --> 00:43:08,930 Okay. The last one we'll talk about is kind of diffusion and adoption mechanisms. 459 00:43:08,930 --> 00:43:15,020 So this is where things get, like, really fun and sticky, where things are just dispersing out whether you can measure it or not. 460 00:43:15,020 --> 00:43:21,440 And, um, there are so many fun studies of complex adaptive systems theory, um, where you can measure the first couple of stages, 461 00:43:21,440 --> 00:43:24,950 and as it gets further and further into embedding, it gets really, really hard to measure, 462 00:43:25,430 --> 00:43:30,650 um, which is the fun and challenging thing of complex adaptive systems theory. 463 00:43:31,160 --> 00:43:35,629 But this is getting into bigger stuff like tipping points and embedding and emergence, social proof. 464 00:43:35,630 --> 00:43:38,870 And so diffusion of innovation is helpful for us here.
465 00:43:39,170 --> 00:43:45,079 The dynamic sustainability framework is helpful for us here. And of course, complex adaptive systems theory is helpful for us here. 466 00:43:45,080 --> 00:43:48,950 All of them suggest why things go from one place to the other. 467 00:43:49,460 --> 00:43:57,680 So mechanisms here are social contagion, norm cascades, institutionalisation of change that occurs over time. 468 00:43:57,680 --> 00:44:01,549 So this is when our change is getting bigger and bigger. This is actually work from Scarbrough 469 00:44:01,550 --> 00:44:05,690 and, uh, Kyratsis, uh, from 2022. 470 00:44:05,690 --> 00:44:10,219 Joe, do you know what this is? I know Harry. Yeah. Oh, yeah. It's a really nice little piece of work. 471 00:44:10,220 --> 00:44:14,930 Yeah, well, it's a big piece of work, uh, that connected micro and macro level phenomena. 472 00:44:15,260 --> 00:44:22,790 Um, so they were looking at mechanisms that were occurring locally that cascaded into mechanisms that were happening more broadly within the system, 473 00:44:23,180 --> 00:44:26,540 um, which was super cool. It's a great paper. I highly recommend reading it. 474 00:44:26,840 --> 00:44:29,899 And they found things like learning, adapting 475 00:44:29,900 --> 00:44:34,879 and institutionalising that occurred in a local space would also cascade into much 476 00:44:34,880 --> 00:44:38,300 bigger change at the social level over time when they were looking at the literature. 477 00:44:38,840 --> 00:44:45,290 Um, and they embed, you know, Rogers' diffusion of innovation theory and a couple of other theories into the 478 00:44:45,410 --> 00:44:50,150 structure they build, bringing in cognitive and normative and other, 479 00:44:50,180 --> 00:44:56,240 uh, theoretical frameworks to, um, to show how those kind of key mechanisms cascade forward into something bigger.
480 00:44:57,050 --> 00:45:03,170 Um, I have another wonderful student, um, Marsha Gupta, who, um, finished a PhD some years ago. 481 00:45:03,800 --> 00:45:06,710 She ended up writing like five papers out of her PhD, and here's two of them. 482 00:45:07,130 --> 00:45:11,870 She looked at kind of the emergence of family medicine in India for her work. 483 00:45:12,230 --> 00:45:17,150 So fascinating. Um, she used diffusion of innovation theory, 484 00:45:17,150 --> 00:45:22,490 and I was her theory nerd on her thesis, because she was mostly supported by other family docs. 485 00:45:22,940 --> 00:45:26,059 Um, and she did some qualitative work. These are her qualitative 486 00:45:26,060 --> 00:45:32,360 descriptive and survey studies, where she looked at the emergence of family medicine over a 30-year period and was able to find 487 00:45:32,360 --> 00:45:39,469 this link between the work of innovators and early adopters to do institutionalisation work that allowed family medicine to, 488 00:45:39,470 --> 00:45:46,640 like, take root in the entire country. Um, so she didn't go at it from a mechanistic lens, 489 00:45:46,640 --> 00:45:52,549 but you can kind of apply some of that coherence, um, criteria, um, to understand these as mechanisms: 490 00:45:52,550 --> 00:45:58,100 things like having sufficient training programs that were widely dispersed was really critical; 491 00:45:58,100 --> 00:46:05,240 engaging a multidisciplinary model, because of the very different disciplinary approaches to medicine in India, 492 00:46:06,080 --> 00:46:10,910 was also absolutely important to kind of embed into the norms of the society there; 493 00:46:11,180 --> 00:46:17,239 and then having increasing postgraduate training was a way of institutionalising it into the entire system.
494 00:46:17,240 --> 00:46:18,649 So over this 30 year period, 495 00:46:18,650 --> 00:46:26,390 they went from not knowing what family medicine was to having a burgeoning group of clinicians who were delivering care all over India. 496 00:46:26,750 --> 00:46:32,150 A really nice piece of work that shows how mechanisms can occur in these much higher order ways. 497 00:46:33,720 --> 00:46:39,720 So these things all matter and they're really cool. But they also shift how we think about our strategy. 498 00:46:40,260 --> 00:46:42,670 So it's not only about understanding why something happened. 499 00:46:43,140 --> 00:46:48,360 Um, as I like to tell our leaders, mechanisms also tell us which strategies you should target, 500 00:46:48,660 --> 00:46:50,970 because that's going to pull that lever of change for you. 501 00:46:51,390 --> 00:46:55,890 So a lot of leaders kind of intuit the different strategies you might use to activate a mechanism. 502 00:46:56,220 --> 00:47:01,320 But the difference between using theory and not is, if you don't, you might be trying eight different things, 503 00:47:01,320 --> 00:47:05,480 and if you do, you might only have to try two in order to actually get something going. 504 00:47:05,490 --> 00:47:09,040 So I have an example from that technology project. 505 00:47:09,040 --> 00:47:18,000 I'll give you the two ways we were thinking about this. So when we were doing this at the primary care level, with our goal indicator tool, 506 00:47:18,420 --> 00:47:22,290 a really important enabler for us was getting clinician champions. 507 00:47:22,290 --> 00:47:25,500 I'm sure you've heard this before: clinician champions matter a lot. 508 00:47:25,890 --> 00:47:33,270 And our primary care teams in Ontario kind of function somewhat independently, but they're very much cohesive teams.
509 00:47:33,660 --> 00:47:39,750 There's usually a person, not even with a title that's like a leader, but just a person in the team that everyone looks to for guidance. 510 00:47:40,050 --> 00:47:45,500 In some of our family health teams it was the social worker; in some of our family health teams it was the admin. 511 00:47:45,510 --> 00:47:48,630 Sometimes it was a particular family doctor. 512 00:47:49,290 --> 00:47:52,970 Um, so we were getting them on board with this technology. 513 00:47:52,980 --> 00:47:58,049 They kind of came in and championed it, and then they got the whole unit on board so we could run the trial. 514 00:47:58,050 --> 00:48:04,350 So that was a really important mechanism for us to pull on. So, you know, fast forward five, six years later, 515 00:48:04,470 --> 00:48:09,620 and we're integrating this technology into a care planning tool in two 516 00:48:09,630 --> 00:48:14,010 hospitals to support discharge from hospital to home for the same kinds of patients. 517 00:48:14,010 --> 00:48:19,140 Very complex. This is work we did with Trillium, and we're doing it at my hospital, Sinai Health. 518 00:48:19,680 --> 00:48:22,290 So we've gone through all of this co-production work, 519 00:48:22,290 --> 00:48:28,409 worked very closely with our hospital teams as well as our primary care teams, and we are ready to get this thing going. 520 00:48:28,410 --> 00:48:32,160 So we went back to the same strategy. We said, okay, here are our clinician champions. 521 00:48:32,160 --> 00:48:36,030 We identified them on the units across all of these different hospital spaces. 522 00:48:36,420 --> 00:48:40,350 And the champions were great champions, and they continue to be great champions. 523 00:48:40,350 --> 00:48:46,410 They were rah rah, rah rah for this thing the whole time. But people really weren't signing up for the trial.
524 00:48:46,920 --> 00:48:51,479 And we were at this for months, and the champions did everything they could to talk to everybody. 525 00:48:51,480 --> 00:48:57,510 They invited us for lunch and learns. They invited my team to be on the units all the time, but nothing was really sticking. 526 00:48:58,320 --> 00:49:05,580 And then one day the hospital decided to do a communication about the digital bridge project, and they highlighted us. 527 00:49:05,610 --> 00:49:10,200 They highlighted the wonderful students that were kind of all over the place, working hard to recruit. 528 00:49:10,530 --> 00:49:18,540 They talked about its connection to Sinai Health and what was important to Sinai, and advancing compassionate care and connection to the community. 529 00:49:18,840 --> 00:49:26,790 And suddenly 20 people signed up for the trial. Yes. So that tells us a very different lever of change. In primary care teams, 530 00:49:26,810 --> 00:49:33,830 it was about the person that they knew and their connection to that person, that made that behaviour change happen for others. 531 00:49:34,340 --> 00:49:35,299 For this environment, 532 00:49:35,300 --> 00:49:43,100 it was their connection to their hospital and their sense of being a team player with their hospital that hooked them in to change their behaviour. 533 00:49:43,580 --> 00:49:46,990 So, you know, we didn't know that before, but now we know that 534 00:49:47,000 --> 00:49:54,979 maybe before you go out and set up a strategy, an adoption strategy, you ask people: what's the most meaningful thing to you? 535 00:49:54,980 --> 00:50:00,440 Who do you go to for help? What makes you feel like you're connected to this role and to this organisation? 536 00:50:00,920 --> 00:50:08,010 And if you ask those deeper questions about what drives people and what makes them want to do new things, then it changes your strategy.
537 00:50:08,030 --> 00:50:10,940 And we would have saved like eight months if we had talked to someone, 538 00:50:11,570 --> 00:50:16,370 if we had just asked the communications team to write a little blurb about us much earlier. 539 00:50:16,670 --> 00:50:19,790 So a question has just popped in that I think is 540 00:50:19,940 --> 00:50:27,440 relevant here, saying: when you're doing implementation work, it can be that multiple mechanisms are at play. 541 00:50:27,800 --> 00:50:34,850 So when that's happening, how do you navigate appropriate models, or do you incorporate multiple models and frameworks in this case? 542 00:50:35,120 --> 00:50:38,790 In other words, how do you work through the complexity of real world, messy implementation? 543 00:50:38,810 --> 00:50:42,110 Oh, we were just talking about this. What a wonderful question. 544 00:50:42,290 --> 00:50:50,419 I think I have a slide coming up on this. I do have a slide coming up on this, but, uh, I can't remember the names right now. 545 00:50:50,420 --> 00:50:55,909 Anyway, the names are on the slide. You can embed lots of different theoretical frameworks, 546 00:50:55,910 --> 00:51:00,560 but they do need to be epistemologically aligned. Um, that becomes really important, 547 00:51:00,570 --> 00:51:07,790 whether you're doing research or you're trying to guide mechanistic thinking, because some theories rest on some assumptions, 548 00:51:07,790 --> 00:51:12,890 and theories can rest on conflicting assumptions, so it's important to see that they conceptually map together. 549 00:51:13,400 --> 00:51:17,450 Um, but you can kind of work them together if they're 550 00:51:17,450 --> 00:51:23,850 not in conflict and they're each meeting a gap that the other isn't hitting. 551 00:51:23,870 --> 00:51:29,900 So if you want to, like, layer cognitive and social theories together and they're not really in conflict, that can help.
552 00:51:30,200 --> 00:51:37,009 And you see that happening in research, right? You'll see COM-B being linked with CFIR, because COM-B is not doing a good job of the environment, 553 00:51:37,010 --> 00:51:39,440 so CFIR gets at the environment stuff. 554 00:51:40,040 --> 00:51:49,520 The challenge, and I certainly haven't unpacked it and I haven't seen it unpacked yet, is what you prioritise. 555 00:51:50,210 --> 00:51:56,360 So if you have many mechanisms at play, which one do you pick first? 556 00:51:56,430 --> 00:52:02,720 There's evidence that they actually can chain and cascade, and some mechanisms need to happen first and others happen later. 557 00:52:03,080 --> 00:52:09,200 But it's still kind of a messy literature, unless somebody has read something that I haven't read yet. 558 00:52:09,470 --> 00:52:18,230 So we tend to go with, um, I like to go with talking to people and asking them what matters to them the most. 559 00:52:18,620 --> 00:52:24,350 And then I prioritise that: what they're thinking about in that moment, what the organisation cares about in that moment. 560 00:52:24,680 --> 00:52:29,989 And then I'll try to cascade strategies that link to those mechanisms first, 561 00:52:29,990 --> 00:52:33,920 wherever that can be, because ultimately we're just trying to get people to do something different. 562 00:52:34,340 --> 00:52:38,480 So you need to hook into the people as much as possible. 563 00:52:38,840 --> 00:52:42,260 I do encourage going back to assumptions so we're not in conflict. 564 00:52:44,410 --> 00:52:50,170 All right, I'm gonna make you do more work. Okay, so here's a drawing of how you could potentially draw mechanisms. 565 00:52:50,470 --> 00:52:58,450 This is, again, Cara Lewis's work. They're pretty intense, but if you're a visual person, they can actually be very helpful.
566 00:52:58,480 --> 00:53:03,580 You don't need to know all the components. So you had your, um, your list of things: 567 00:53:03,910 --> 00:53:09,070 your thing, the stuff in the middle, and your outcome. You can actually logic model these guys out. 568 00:53:09,320 --> 00:53:13,910 You can see how they're connecting. I'm not going to describe all the different components, 569 00:53:13,960 --> 00:53:19,840 so you might just want to pick one. So here you might have either your intervention or your strategy, whatever you're putting in place. 570 00:53:20,970 --> 00:53:27,270 This is maybe something like preconditions, like the organisational preconditions we were sort of talking about from Maurice's work. 571 00:53:27,270 --> 00:53:31,650 So what actually has to be there; maybe you might just need the funding to do the thing at all. 572 00:53:32,040 --> 00:53:34,140 Um, without it, nothing happens anyway. 573 00:53:34,650 --> 00:53:41,700 Uh, then you have your mechanism, which we were just chatting about, and important things that we need to think about with mechanisms. 574 00:53:41,700 --> 00:53:49,320 So there are these moderators and mediators that Cara Lewis and company and other theorists in the space like to talk about. 575 00:53:49,860 --> 00:53:57,420 These are factors that can turn up a mechanism, or turn down a mechanism, or shut off the mechanism entirely. 576 00:53:57,900 --> 00:54:05,969 So sometimes, for example, something might be very meaningful to a person, but it's not an organisational priority, 577 00:54:05,970 --> 00:54:09,750 so they're not given any time. So that shuts down that mechanism entirely. 578 00:54:10,560 --> 00:54:13,830 Um, or sometimes there's no political will to do the thing, 579 00:54:13,830 --> 00:54:21,910 so it's shut down entirely.
These are things that can come in and just sweep it away, because they have more power or more 580 00:54:21,990 --> 00:54:27,990 ability to either take away a precondition or make the mechanism moot for that particular moment in time. 581 00:54:30,140 --> 00:54:34,700 Let's do some drawing. Super fun. 582 00:54:35,420 --> 00:54:38,570 I won't do it for you. Um, pick one. Just pick one mechanism. 583 00:54:38,780 --> 00:54:43,790 Chat with your friend again. Those online, please go ahead and find a piece of paper and give it a go. 584 00:54:44,270 --> 00:54:47,719 Try to pick one mechanism. It can be at any of the levels we talked about: 585 00:54:47,720 --> 00:54:51,890 could be a cognitive one, could be a team one, an organisation one, a social system one. 586 00:54:52,340 --> 00:54:57,410 Um, the cognitive ones are sometimes a little easier to start with because you can feel them in your body. 587 00:54:58,070 --> 00:55:00,800 For those of you I can't see, don't worry. 588 00:55:01,130 --> 00:55:09,650 You know, you can put a determinant in there, but I think we want to focus on these areas: outcomes, moderators, mechanisms, and your strategy. 589 00:55:10,670 --> 00:55:15,380 So does anyone want to share where they got to? There's no right or wrong here. 590 00:55:16,380 --> 00:55:19,440 So I'm going to do a little bit on what they offer us, and then I'm going to go to our wrenches. 591 00:55:20,310 --> 00:55:26,310 So remember, I've already talked about this. Remember this study from before? Because we know this now, we can do different strategies. 592 00:55:26,320 --> 00:55:30,479 I gave an example of that with our technology. But say we're doing virtual care. 593 00:55:30,480 --> 00:55:33,600 If I know that virtual care hinges on professional identity, 594 00:55:33,600 --> 00:55:39,210 and they told me that they were trained to be physicians in a particular way, well, then I need to train physicians in a different way.
595 00:55:39,300 --> 00:55:42,450 I need their identities to be not 'I'm a doctor in a room,' 596 00:55:42,450 --> 00:55:45,780 but 'I am a doctor that gives care in multiple modalities.' 597 00:55:46,200 --> 00:55:51,420 That's a different way of socialising people, and that's a different kind of strategy that you would have to think about in the earlier days. 598 00:55:51,870 --> 00:55:55,050 Um, so mechanisms can help us tailor our strategies. 599 00:55:56,040 --> 00:56:01,199 Um, they can also help us understand failure. So this is a fun exercise 600 00:56:01,200 --> 00:56:07,470 to do, in all your humility, when something goes wrong; so many of our trials do not land the way we'd like. 601 00:56:07,620 --> 00:56:12,660 We will have to do this for my trial of our digital bridge project, because there was a lot of implementation failure. 602 00:56:13,020 --> 00:56:19,970 Um, and this was one for a clinical decision support tool where they did all the right kinds of strategies. 603 00:56:19,980 --> 00:56:23,760 They did their clinical champion, like the example I gave; they did their training. 604 00:56:24,180 --> 00:56:29,030 Um, and yet it failed. And they were able to uncover mechanisms of failure. 605 00:56:29,040 --> 00:56:32,369 And in these cases, it was a lack of confidence in the tool, 606 00:56:32,370 --> 00:56:37,739 so similar to Michelle's example. That can lead you down the thrive-or-survive path I'll 607 00:56:37,740 --> 00:56:41,480 talk about in a second. Um, and competing clinical priorities. 608 00:56:41,490 --> 00:56:44,490 Both of those things got in the way. Um, this is a great paper 609 00:56:44,490 --> 00:56:50,190 if you're keen on failure; it came out just earlier this year. 610 00:56:50,520 --> 00:56:53,540 What you'll see in these mechanisms papers as you get into them,
611 00:56:53,550 --> 00:56:57,990 and I think some of the conversations were getting at this, is that mechanisms 612 00:56:57,990 --> 00:57:02,190 can get chained, and there can be many of them within a particular intervention. 613 00:57:02,370 --> 00:57:08,999 So you can actually have five, 6 or 7 of these all in a row. Does anyone have kids? 614 00:57:09,000 --> 00:57:13,920 I said '6 or 7', and I can hear my daughter; it's her whole life 615 00:57:13,920 --> 00:57:18,209 right now. She's ten. Um, mechanisms can also cascade. 616 00:57:18,210 --> 00:57:22,650 So we have that example from before. Um, an academic group 617 00:57:22,650 --> 00:57:28,170 did a lovely scoping review of employee driven innovation a little while ago, and 618 00:57:28,170 --> 00:57:36,210 I was very kindly invited to write a commentary on it, where I looked at the literature a little bit and showed how 619 00:57:36,480 --> 00:57:43,080 mechanisms happening within an employee driven innovation space can cascade into the same things that drive large system change: 620 00:57:43,350 --> 00:57:48,870 things like distributed leadership, achieving better fit between innovation and local context, and alignment 621 00:57:48,870 --> 00:57:56,680 around kind of what is perceived to be meaningful; if you do that small, it can cascade into bigger and bigger places. 622 00:57:56,710 --> 00:58:00,630 Those are mechanisms that can potentially transfer into other spaces. 623 00:58:02,180 --> 00:58:07,820 This is a mechanism I'm a little bit geeked about. This is Alex Honnold climbing Taipei 101 earlier this year. 624 00:58:07,850 --> 00:58:13,690 Anybody watch that while sweating? I was on my bike trainer at the time, which was not a good idea. 625 00:58:13,700 --> 00:58:20,200 My heart rate was too high. So this is one I've been working on a little bit, and here's a drawing I've done for you.
626 00:58:20,210 --> 00:58:29,260 Sorry, online folks, I'm off camera. Um, it shows how threat that exists at different levels can lead to different kinds of responses. 627 00:58:29,260 --> 00:58:31,219 So some people have theorised this, 628 00:58:31,220 --> 00:58:39,380 again from a cognitive psychology space, where they've shown, particularly in accelerated change environments, 629 00:58:39,560 --> 00:58:44,450 that we can elicit this threat response, which can lead people into survive mode. 630 00:58:44,810 --> 00:58:53,300 So that's like when you present a new idea in a room and either you or somebody gets angry, or shuts down, or looks at their phone, or leaves. 631 00:58:53,540 --> 00:58:59,810 That's like your fight, flight or freeze example of when something new is presented and it feels threatening. 632 00:59:00,200 --> 00:59:06,049 Or you can go into a thrive space, where you embrace the fear and climb this crazy building that looks horrible. 633 00:59:06,050 --> 00:59:12,830 In implementation science, it's looking like things like resistance or anxiety or loss or perceived risks, 634 00:59:13,280 --> 00:59:16,630 but they're kind of sitting in that space of threat a lot. 635 00:59:16,640 --> 00:59:23,750 And the reason I'm interested in this is that it seems to be happening in technology environments: when you present something new, 636 00:59:23,750 --> 00:59:32,659 especially when somebody has gone through a really bad technology implementation before, it's a total shutdown, 637 00:59:32,660 --> 00:59:36,560 like an actual threat response to something that they don't want to go through again. 638 00:59:37,040 --> 00:59:44,269 And so it's a mechanism I think is under-theorised, and I'm working a little bit on the nature of threat in terms of where it sits, 639 00:59:44,270 --> 00:59:49,280 not just at the individual level but at multiple levels, and how it can look like different things.
640 00:59:49,280 --> 00:59:54,830 It can look like identity, it can look like norms, it could look like survival at an organisational level. 641 00:59:55,130 --> 00:59:59,060 And so if I were drawing a mechanism diagram, this is one I've been working on for months; 642 00:59:59,060 --> 01:00:03,890 it's one you can draw. So say someone's going through tool training. 643 01:00:04,460 --> 01:00:12,110 Maybe they're a radiologist. And that is leading to some role uncertainty, whether actual or perceived, uncertainty being a precursor to threat. 644 01:00:12,500 --> 01:00:17,720 And then they feel some professional identity threat. And now we're either going to go into survival mode or thrive mode. 645 01:00:18,290 --> 01:00:22,610 So our moderators in this example would be their past experience: 646 01:00:22,910 --> 01:00:26,150 how have they seen something like this go before, and what are they likely to expect? 647 01:00:26,690 --> 01:00:32,420 And then how meaningful this thing is to them in relation to their perceived identity and threat. 648 01:00:32,840 --> 01:00:37,310 And so when I've seen meaningfulness turned up, that's a dial that can go up, 649 01:00:38,420 --> 01:00:44,030 thrive happens, because they're willing to tolerate the risk and the uncertainty because of the outcome. 650 01:00:44,540 --> 01:00:50,180 When their past experience suggests this is going to go terribly, they seem to be more likely to end up over here. 651 01:00:50,540 --> 01:00:54,650 So this is a lot of what I'm lecturing on in the next couple of months at different places, 652 01:00:54,890 --> 01:00:59,930 and I wanted to share it with you guys because it's a learning process, and I'm also keen on what you guys think about it. 653 01:01:00,650 --> 01:01:03,690 Comments? Yes, please. 654 01:01:05,520 --> 01:01:08,670 I like it. Um, I think.
655 01:01:08,850 --> 01:01:15,629 I think we all can think of an example where somebody has gone into that freeze mode because of a 656 01:01:15,630 --> 01:01:21,110 bad experience with a technology that was introduced that then kind of completely wrecked their job, 657 01:01:21,450 --> 01:01:24,690 and then they were doing things they didn't like. 658 01:01:24,960 --> 01:01:30,150 But also, you put effort into the technology and then the whole thing fails. 659 01:01:30,570 --> 01:01:35,500 Um, I'm not sure that meaningfulness is the only thing in there, though. 660 01:01:36,580 --> 01:01:43,000 I remember we were doing a study years ago, which was a seven site 661 01:01:43,000 --> 01:01:51,040 randomised controlled trial of telehealth in heart failure. It was going well in six sites, and then there was site seven, which was a plane ride away. 662 01:01:51,520 --> 01:01:58,840 Nobody was being recruited. So Joan and I got on a plane and went off to this place where they weren't recruiting anyone. 663 01:01:59,110 --> 01:02:02,560 We'd heard all these kinds of unusual things, like, what? You're not recruiting? 664 01:02:03,040 --> 01:02:07,629 And then we picked up the stories, and I think we got the stories in a taxi from someone, you know, 665 01:02:07,630 --> 01:02:15,550 some of those informal things: the story of something that had been introduced four years ago that was just a disaster. 666 01:02:15,790 --> 01:02:21,880 And then everyone got blamed and punished, and people were still in that freeze mode. 667 01:02:22,240 --> 01:02:31,060 Now the challenge for us was not just to make this meaningful, but also, at a technical level, 668 01:02:31,360 --> 01:02:40,240 to reassure them that the technology that we were introducing was not going to go as badly wrong as the technology that was in Chicago.
669 01:02:40,390 --> 01:02:47,080 And actually, one of the things was that the technology had moved on; it was a much, much better technology than the one 670 01:02:47,080 --> 01:02:50,229 that had been introduced before. So it wasn't just meaningfulness; 671 01:02:50,230 --> 01:02:54,070 it was like, is this thing going to work? Because it didn't work last time. 672 01:02:55,660 --> 01:02:58,870 And value. Yeah, exactly. 673 01:02:58,870 --> 01:03:03,579 That's great, I love that. Um, there are a whole lot of things in there I like. 674 01:03:03,580 --> 01:03:06,820 Narrative approaches are really important, because stories matter. 675 01:03:07,210 --> 01:03:10,390 Um, I love that. I'm going to ask you more about that. 676 01:03:11,590 --> 01:03:15,160 Thank you. I'm collecting stories right now, so I'll come back to that. 677 01:03:15,760 --> 01:03:20,420 Yes, because that's looking back into past experience, and that's changing how I value meaningfulness. 678 01:03:20,440 --> 01:03:26,349 There are some challenges for our field that we need to keep working on, and I want to elevate these because you're probably going to experience them 679 01:03:26,350 --> 01:03:31,390 if you're going out and trying to study mechanisms. One is the definitional clarity issue; 680 01:03:31,630 --> 01:03:37,720 we're still missing it. So if you're going out, just pick a definition that resonates with the environment that you're working in. 681 01:03:38,080 --> 01:03:39,909 If you're in implementation science, hang out with Lewis; 682 01:03:39,910 --> 01:03:45,820 if you're a realist, hang out with Pawson and Tilley, and recognise that this is still an evolving space. 683 01:03:46,630 --> 01:03:53,050 Dalkin also suggests to us that mechanisms don't just get turned on and off like light switches; they're actually probably more like dimmer switches.
684 01:03:53,470 --> 01:03:59,140 There's actually a continuum of mechanism; they can exist in small ways and big ways, 685 01:03:59,140 --> 01:04:05,740 and the context matters in those instances, and conditions play into all of it. 686 01:04:06,310 --> 01:04:12,850 It's a dial. There's other work I did with Jay some years ago; this is back when we were doing postdoctoral work. 687 01:04:13,000 --> 01:04:18,210 He and I were very, very close colleagues; 688 01:04:18,220 --> 01:04:25,600 we worked together a long time. We did some work understanding mechanisms, or key milestones as we called them, advancing integrated care. 689 01:04:25,990 --> 01:04:27,760 And because we looked at it over time, 690 01:04:28,060 --> 01:04:34,030 what we were finding was that something that was a mechanism at one point in time actually became a context at another point in time. 691 01:04:34,540 --> 01:04:43,029 So mechanisms actually have like a time limit, or they shift over time; the way they act shifts over time. 692 01:04:43,030 --> 01:04:50,139 So you have to consider the time component. Here's the study I was mentioning before about how you could layer on different 693 01:04:50,140 --> 01:04:54,190 kinds of mechanisms that are cognitive and institutional and process oriented. 694 01:04:54,670 --> 01:05:01,220 This is, of course, another paper; this is them looking at the spread of an intervention. 695 01:05:01,240 --> 01:05:05,350 They looked at nutrition programs, just seven that spread. 696 01:05:05,590 --> 01:05:09,670 What you see in the middle is what they tested as important for scale and spread, 697 01:05:10,210 --> 01:05:14,080 and then these are all the mechanisms around those seven things that mattered 698 01:05:14,440 --> 01:05:18,580 in order to make scale and spread happen. You can go a little deeper.
699 01:05:18,610 --> 01:05:21,400 It's a very cool paper. It's a lot, 700 01:05:21,730 --> 01:05:28,300 but it does tell you how these things relate to each other and how they come to some of those bigger order changes. 701 01:05:28,660 --> 01:05:33,940 Often our guidance sits at this really high abstract level, and you don't know why it matters. 702 01:05:34,210 --> 01:05:39,670 This is a nice paper that tells you why it matters, and they use multiple theoretical lenses to help us unpack that. 703 01:05:41,500 --> 01:05:48,910 Um, a last piece is that mechanisms are often invisible and consequently very hard to measure. 704 01:05:49,210 --> 01:05:52,480 This is a nice paper by Hull and company, 705 01:05:52,810 --> 01:05:57,010 where they were looking at mental models in implementation science. 706 01:05:57,310 --> 01:06:03,990 Mental models are very hard to uncover. We did some shared mental models work, and it's a lot to uncover; they sit very deep. 707 01:06:04,570 --> 01:06:08,580 But they were finding that it's really critical in implementation science, 708 01:06:08,580 --> 01:06:12,430 so we need to start thinking differently about how we unpack mental models. 709 01:06:12,440 --> 01:06:19,600 Well. So to summarise: our determinants do help describe terrain, 710 01:06:19,850 --> 01:06:24,440 our strategies help describe action, but our mechanisms explain movement, 711 01:06:24,500 --> 01:06:30,959 like my kids running around the beach in Belgium. So as we connect the three, the things you can think about going forward are: 712 01:06:30,960 --> 01:06:35,280 implementation science might be at an inflection point, where we need to pay more attention to this; 713 01:06:35,910 --> 01:06:40,110 mechanisms inform strategy design, which can be essential for system level reform;
714 01:06:40,110 --> 01:06:46,710 and finally, studying mechanisms requires theoretical pluralism and possibly methodological innovation. 715 01:06:47,750 --> 01:06:49,100 Thank you.