So. I'm very honoured to be here to share a lot of what we've done over the last decade, and my ideas and thoughts about the path forward. I think we have about an hour. I've put quite an ambitious amount of information in this talk, and hopefully we'll be able to end relatively early. If I do breeze through a little, it's because I want to leave time for questions and discussion; I'd really love to learn from you through your enquiries. Here are some standard disclosures around research funding and salary support.

So, communication. We hear about it all the time; it's hammered home constantly during medical training, surgical training, et cetera. What's the current state of communication in surgery? We'll touch a little on my views, and some of this will be very clinical-vignette based: you'll recognise these situations and understand that you've been in them before. Failure to rescue, as Peter described, is something I have worked on for the last decade, but I describe it as my straw man for understanding communication and quality in health care, and in surgery in particular. Then we'll talk a little about where we can go in the future. There's an excellent quote from George Bernard Shaw.
The single biggest problem in communication is the illusion that it has taken place, and we recognise this on a day-in, day-out basis. Think about the sheer volume of information that you're communicating across providers, clinicians, et cetera, within the hospital. We say something and we assume everybody understands it. We write something in the electronic medical record or the paper chart, and we assume everybody who needs to know knows. Truly, the goal here is to close that loop and to understand that communication is a lot more than just written or verbal.

There's a patient in an ICU in Michigan I'll tell you a little vignette about, and we'll come back to this patient toward the end of the talk. A 71-year-old patient who underwent a subtotal gastrectomy for cancer with a Roux-en-Y reconstruction ended up developing a leak. He was in the ICU, doing fairly OK, but required a procedure in our interventional radiology suite, and because of his tenuous state he was going to require an anaesthesia transport, as we describe it; a higher level of transport. The anaesthesiologist comes up to the surgical intensive care unit, oftentimes intubates the patient, takes them down for the procedure, and then brings them back up. Sounds very routine, right? We'll get to a lot of the flaws in this case, which actually led to this patient dying, as we move forward.
But just remember something as simple as: we're going to go up to the ICU, do the very routine thing, take him down for the procedure, and come back. Remember this patient.

So, I found this very interesting. I study communication mostly in the post-operative setting, but I've taken on a new role in the last year, understanding and improving quality, culture, efficiency, et cetera, within the operating rooms. And this is an eye-opening study about communication within the operating room. You all do timeouts in the operating room, right? Those checklists have been rammed down our throats for a while now, and it seems as though they would make sense, that they would do good. But part of the problem is they become rote, and people don't pay as much attention to the quality or content of the timeout. One of the pieces, if yours is similar to ours, is that the very first thing you do is introduce everybody in the operating room, right? This was a very interesting study where they polled everybody in the OR at the end of the case to see who knew other people's names. I would draw your attention, if for nothing else, to this box right here: you would think the surgical attending would know his or her surgical resident's name. They got it right only sixty-eight percent of the time. "Pathetic" is maybe a little overstatement, but that's pretty sad. And this is the current state of communication in health care.
We don't pay as much attention as we need to, and some places have really led this effort with scrub caps with people's names on the front. I don't know if you do that here; we are actually doing that at Michigan now because of this. I really do think that a lot of communication, subtle or otherwise, is withheld, is held back: I'm embarrassed because I don't remember your name; I don't want to call you "Hey, you"; I don't want to say "Hey, anaesthesia." So if it's not super important, I'll let it slide. But you can imagine death by a thousand cuts in the operating room if that keeps happening: we're going to have balls dropped.

So here's the ideal situation, right? We have nurses, doctors, physios, everybody coming together, talking to each other. This is really the ideal state in health care, and I would describe it as a unicorn: you don't see it anywhere. Multidisciplinary conferences don't count; I'm talking about day-to-day management of patients. I challenge you to find a ward anywhere where everybody comes together, sits down, and really puts their heads together as they're trying to come up with a diagnosis or treat their patients. So I really feel this is a unicorn. What's usually happening nowadays is this: you've got a care team in the background, a doctor at the back trying to figure out what's going on, and then you've got
other people distracted by their smartphones, or there to enter the vital-sign data. This is really the idea of a separation of powers on the units. The hierarchy we're trying to flatten is one thing, but the communication isn't necessarily occurring the way it should, because, again, a lot of what we do now... they're holding paper charts there, which is pretty archaic. I shouldn't say that; you have paper charts here. OK, sorry, I had a picture. I also teach a class: How to Put Your Foot in Your Mouth 101. But we think about even the electronic medical record as being the panacea of communication in health care, and as we know from our experience, it's a lot of cutting and pasting and very minimal important content. Not to mention, you wouldn't text the things you write in the medical record; there's a formality in the medical record that loses some of the nuance of communication. Then, when some communication does occur, we have physicians fighting and yelling, and we see this mostly between departments. We've got consultants, and consulting for me is something different. So if you've got consulting physicians... if I'm a surgical attending and I've got infectious disease and gastroenterology involved, and we all have different views on how the patient should be managed, unfortunately some of this has devolved into conflict.
It's become even more difficult when the communication occurs through passive-aggressive notes in the electronic medical record, as opposed to face to face; I think that arguing it out is actually probably somewhat therapeutic, to just get it all out there. But again, this can contribute to even worse communication. So I'm painting a relatively negative picture, because I really do think that communication in health care is not where it should be, and these are just some concepts that I hope resonate with many of you.

So let's get some data out there. The Joint Commission is basically the accrediting body for hospitals in the United States; all sentinel events need to be reported to the Joint Commission, and they undergo a root cause analysis. You see here that communication is in the top three every year as a root cause of sentinel events in health care. Other things you have all heard about from Peter's group and others, around human factors, leadership, et cetera, are up there too. But we know that communication is really the linchpin to improving, or preventing, a lot of these events.

So what solutions have we come up with? Well, that's easy, right? We've got checklists; of course, that's like the gold standard. For communication in nursing in the United States, we use something called SBAR; it's basically a structured communication tool.
ICUs now have these daily goal sheets that are supposed to carry clear communication, so standardisation makes some sense. For team behaviours, many of you have heard of TeamSTEPPS; the Agency for Healthcare Research and Quality put together this comprehensive plan for how you can improve culture and team building. So that's a decent solution. Technology: we talked about the EMR. What about smartphones? Do you all carry one-way pagers, or how do you communicate; a paging system or anything of that sort? Or does it go to your smartphones? No? So we still have one-way pagers too, which is pretty funny, because some of our patients of the new generation see us with those and thought they were extinct. Because why would you want just unidirectional communication? There is no way of closing the loop. In a lot of adverse events, I get the page and then that's it; I'm off the hook. We do have technology out there; a lot of hospitals in the US are now using proprietary software that is privacy-compliant and allows for two-way communication using smartphones. So what's the problem with these approaches? If these solutions, which all sound reasonable, worked, we would have perfect communication. The problem is that they take a one-size-fits-all approach.
They don't take into consideration the context within which we are delivering care: the approach is supposed to be the same for the medical physicians as for the surgeons, the same on the ward as in the ICU, the same inpatient as outpatient. Clearly, this one-size-fits-all approach is not going to work. So to better understand these nuances of communication, we need a case study, and that's where failure to rescue comes in. It's something you can describe to your grandmother or grandfather and it totally makes sense: the patient has a complication (I'll get into the description in a second) and we fail to rescue them. Why?

So here's an overly simplistic view of a surgical episode. A patient has an operation; they may develop a seminal complication. It could be something as simple as a urinary tract infection or a DVT. If it's not detected, it can lead to domino complications. We've all seen those patients: it starts with one little thing, then the dominoes continue to fall, and the patient may succumb to that. For probably the latter part of the '90s and the early 2000s, the thinking was that if we were going to reduce mortality after surgery, we had to prevent complications: a lot of initiatives around making sure antibiotics were given on time, DVT prophylaxis was administered appropriately, et cetera. And we did a good job of that.
I think we got compliance rates in the United States upwards of 90 percent for a lot of these process measures, but we weren't fully realising the benefits in mortality reduction. Rescue flips that on its head a little and says: OK, things are going to happen; adverse outcomes are going to happen. You have the 70-year-old obese diabetic male who comes in for an operation, and he's going to have a really high likelihood of developing a superficial surgical site infection. The question is: do you detect it and treat it, or do you allow it to become an organ-space infection because you haven't treated it? This idea of rescue is really about preventing these seminal complications from setting off a domino effect.

So how did we get this on the map? Failure to rescue was initially described in 1992 by a paediatric anaesthesiologist, but as a term and a quality measure it remained obscure in the literature, and a lot of people did not know about it, especially in surgery. Part of the critique was that it used administrative data, so the complication data were not reliable. So in 2009, with two of my colleagues, Drs. Birkmeyer and Dimick, we gained access to the NSQIP data in the US. This was early in that programme's experience, when only about 100 hospitals participated; now over five or six hundred hospitals participate. And the beauty of this data was that it was clinical, nurse-abstracted data.
So the complications, everything, were very reliable. There was no longer the argument that "your data is unreliable, so you can't make any judgements or conclusions about how failure to rescue, mortality, and complications are related." Using this high-quality data, we took hospitals and divided them into quintiles, five equal groups, ranked by their risk-adjusted mortality. You could see, even in this select cohort of large academic medical centres in the United States, a two-fold variation in mortality following major surgery. This is interesting in and of itself, but we know variation exists everywhere. The key was understanding the relationship between failure to rescue and complications. This had been described by Jeff Silber and others in administrative data, but this was the first time clinical data was used to describe it. What we found was that the quintiles of mortality had very similar rates of both all complications and major or significant complications. What really distinguished them as far as mortality was this concept of rescue, and rescue is defined mathematically as the case fatality rate among patients with major complications. This seems very obvious now, but at the time, in 2009, it really caught the attention of people in the United States, and especially of surgical departments, which suddenly began to pay attention to this measure.
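The distinction between complication rates and rescue can be sketched with toy numbers (all values below are invented for illustration; none come from the study):

```python
# Illustrative sketch of the failure-to-rescue (FTR) arithmetic described
# above. All numbers are made up for illustration, not from the study.

def failure_to_rescue_rate(deaths_after_major_complication: int,
                           major_complications: int) -> float:
    """FTR: case fatality rate among patients with a major complication."""
    return deaths_after_major_complication / major_complications

patients = 1000
major_complications = 200        # same 20% complication rate at both hospitals

hospital_a_deaths = 20           # rescues most of its complicated patients
hospital_b_deaths = 40           # fails to rescue twice as often

ftr_a = failure_to_rescue_rate(hospital_a_deaths, major_complications)   # 0.10
ftr_b = failure_to_rescue_rate(hospital_b_deaths, major_complications)   # 0.20

# A two-fold gap in overall mortality emerges even though the
# complication rates are identical:
mortality_a = hospital_a_deaths / patients   # 0.02
mortality_b = hospital_b_deaths / patients   # 0.04
```

The point of the sketch is the one the talk makes: identical complication rates, very different mortality, with the whole difference carried by rescue.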
Now again, the measure had been around: the Agency for Healthcare Research and Quality adopted it as a quality measure in 2003 and reported it. After our study, Medicare's national Hospital Compare programme began to report this on its website, and you can actually log on now, look up any hospital (here's the University of Michigan health system) and see whether it is above, below, or at the national rate for rescue. The National Quality Forum, which truly drives a lot of the quality agendas in the United States, had endorsed this measure and recently withdrew its endorsement, mainly because payers like Medicare were using administrative data to report it, and they felt there was a reliability issue. They endorsed the concept, but not the way it was being measured. Nonetheless, a lot of people have begun to pay attention to this, and it has sparked interest in how we reduce it. Whenever you measure something and it's reported, how do you improve it? Well, this is where some interesting work on organisational systems and design really took off.

I was a research fellow in 2009 (I forgot to mention that) when I wrote that paper, and in the ensuing four to five years we basically tried to understand every macro-system-level characteristic associated with failure to rescue, trying to unpack and understand it using secondary data and large datasets.
We looked at things like hospital volume, teaching status, hospital technology levels, nurse-to-patient ratios, et cetera. We found some interesting associations (not solutions, excuse me), but there were no solutions there. Many of the things I just described are fixed. Nurse-to-patient ratios may be a little movable, but ask any hospital leader: it's not as though there's an infinite amount of money to keep hiring more and more nurses. Yes, it would be great if every patient had one-to-one nursing, and that's just not going to happen.

So one of my friends at the time (she's still my friend, but she didn't tell me she was going to write this; she's a trauma surgeon in Houston) called me out. We had written yet another paper, with one of my research fellows and with Dr. Birkmeyer; the fellow, now a vascular surgeon, was interested in cardiovascular surgery, and nobody had really described failure to rescue there, so we said, great, do this paper; for him it was a chance to learn how to write and analyse data. But Lillian called me out nationally, publicly, on this and said: Look, enough is enough. You've written umpteen papers about this. What are you going to do about it? So you try to take inspiration from that.
And so we did. We took this very simplistic view and began to add pieces around it, building a conceptual model of what aspects we could study in a rigorous way. We applied for funding through the National Institute on Aging and received funding to understand two big pieces: micro-system-level resources that may affect the ability to recognise and respond to complications, and this concept of teamwork, or safety attitudes, within health systems.

Here's what we did. We went to thirty-two hospitals in the state of Michigan that participate in a large quality collaborative with clinically abstracted data, and that's where we garnered the outcome and mortality data. Within these 32 hospitals, very similar to the national population, there was a two-fold difference in failure to rescue; we picked a set of higher-risk operations. So this finding is very consistent, from the worst to the best performers. We then began to look at these micro-system-level factors. ICU staffing is the one I'll talk about. At the time, we knew about closed ICUs; Peter Pronovost had really pioneered a lot of the writing about them, and in the US, a closed
ICU essentially means you have a board-certified intensivist who manages the day-to-day events for patients, writes the orders, and takes care of all of that, while the surgical team is really consulting on the patient, providing input from a surgical perspective and on pathophysiology, feeding, et cetera. At the time in the state of Michigan, there was overall about a forty-five percent rate of closed ICUs, but it held true that hospitals that were better at rescue had more closed ICUs. Made sense, right? And this is actually a relatively easy switch to flip: you hire an intensivist. It's not that difficult, as opposed to other things like trying to hire more nurses or trying to increase your hospital technology, et cetera.

So this sparked another interesting question. It dawned on me the other day that the fellow who led this next piece of work was a medical student here in the UK, who came to spend a year with us doing research and was interested in unpacking this idea of intensivist staffing and how it affects mortality within hospitals. There's a little bit of nuance here as far as the statistical methodology we employed.
There had been innumerable papers written about the benefits of closed ICUs or intensivists, but the flaw was that a lot of them were pre-post evaluations, meaning they looked, in aggregate, at before there were intensivists and after there were intensivists, and they saw: yes, of course, places with intensivists did better. That was undeniable. The question was: if you, as an individual hospital, look at your outcomes over time and you hire an intensivist, do you see an improvement in your specific outcomes? Sure, some people had seen that; but in aggregate, was it true? So we used an econometric technique called difference-in-differences. With this technique, we are able to remove what we describe as secular trends. What does that mean? We know health care in general is getting safer, so we know mortality rates after major surgery thirty years ago were higher than mortality rates currently. If we just look at a point thirty years ago, when there were no intensivists, and look at now: oh yes, of course, the intensivists led to improved outcomes. But when you remove that trend of improvement over time, you begin to unpack the different changes. In this difference-in-differences technique, you need a time zero; that's typically a policy change. We use this technique a lot in national policy evaluations.
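As a rough illustration of the logic (a minimal sketch with invented mortality figures; none of these numbers come from the actual Medicare analysis):

```python
# Toy sketch of the difference-in-differences logic described above.
# All mortality figures are invented for illustration, not from the
# Medicare analysis.

# 30-day mortality (%) before and after one hospital hires an intensivist,
# alongside a comparison hospital that never hires one.
treated_before, treated_after = 6.0, 4.5   # hospital that hired an intensivist
control_before, control_after = 6.2, 4.8   # hospital that did not

# A naive pre-post comparison credits the whole change to the hire:
naive_effect = treated_after - treated_before      # -1.5 points

# Difference-in-differences subtracts the secular trend, i.e. the
# improvement the control hospital experienced over the same period:
secular_trend = control_after - control_before     # about -1.4 points
did_effect = naive_effect - secular_trend          # about -0.1 points

# Nearly all of the naive "improvement" was background trend; the
# intensivist-specific effect is close to zero, mirroring the null
# finding described in the talk.
```

In other words, the hire gets credit only for improvement beyond what the comparison hospitals achieved anyway over the same period.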
But in this situation, we were able to use, as time zero, the point at which intensivists were hired at specific hospitals around the country, and we were using Medicare data for this. We looked at both medical and surgical patients. If the hiring of an intensivist were to lead to improved outcomes, then at that moment on the mortality trend you would expect a sharp inflexion, or at least some inflexion, showing that the hiring led to improvement. We saw almost the opposite trend in surgical patients, or really just no significant change. This was eye-opening, and it really helped shape my conceptual model around rescue: there is no easy fix; there is no light switch you can flip to improve quality.

And I applied my own experience to this data. When I arrived at Michigan as a trainee, we had just closed the ICUs, and as an intern you are very intimately involved with patient care, with patients going back and forth from the ICU. Our chair had hired arguably the best intensivist in the country, Lena Napolitano, to come in, close the ICUs, and develop a much more structured critical care environment. Well, you can imagine she was met with a lot of resistance from many of the senior surgeons who had, for many years, managed their own patients in the ICU.
And so you can see that in a slightly different environment, or with a different individual, that hired intensivist could be overpowered, or simply made irrelevant, by the surgical services. But in our hospital, our chair resolved any and all conflicts between Dr. Napolitano and the surgeons by saying: Surgeons, you're wrong; the intensivist is right. And that was extremely powerful. That is an example of leadership, of culture. So you can imagine that a lot of places hiring intensivists to meet some external benchmark or measure may not be fully invested in providing that individual, or those individuals, with the recipe for success. That was huge.

We then looked at safety attitudes more broadly in the state of Michigan. We took the same thirty-two hospitals and applied the Safety Attitudes Questionnaire to them, in aggregate, on surgical units. The Safety Attitudes Questionnaire, if you're not familiar with it, has multiple domains that can be reported out of it; I think it's around 50 or 60 questions. We focused on the top two, teamwork climate and safety climate. Interestingly, and I was surprised by this (I thought there would be a nice association between safety culture and outcomes), across those same three groups of hospitals, low, middle, and high failure to rescue, we found no difference in their teamwork climate or safety climate.
242 00:24:13,930 --> 00:24:21,320 But one thing that's interesting in this data is that the nurses consistently rated 243 00:24:21,320 --> 00:24:26,330 the safety and teamwork climate as worse than their surgeon counterparts did. 244 00:24:26,330 --> 00:24:31,370 So this is interesting, and this holds true if you look at any safety culture data. 245 00:24:31,370 --> 00:24:35,600 This holds true consistently, and it gets to the concept of what a 246 00:24:35,600 --> 00:24:41,150 frontline provider is actually experiencing versus what the view from the top is. 247 00:24:41,150 --> 00:24:46,100 And you know, we did the whole salami slicing. We looked at every different question. 248 00:24:46,100 --> 00:24:50,780 And again, there was no difference across any and all of these questions. 249 00:24:50,780 --> 00:25:02,900 It was very consistent across these groups. So I was frustrated at this point, and I told Lillian, Listen, I don't know what to do. 250 00:25:02,900 --> 00:25:06,710 I tried. I failed. There's clearly no easy button to improve rescue, 251 00:25:06,710 --> 00:25:10,940 nothing that I can — no toolkit that I can provide. 252 00:25:10,940 --> 00:25:20,640 But at the same time, I began to look toward the future and had a fortuitous interaction with somebody at our institution. 253 00:25:20,640 --> 00:25:26,270 I mean, this is bad timing, but this is where I put in a joke to just kind of lighten the mood a little bit at the halfway point. 254 00:25:26,270 --> 00:25:32,930 I love this cartoon. You may have seen it before. In case you can't see it there in the back, it says, "And that is why we lift on three." 255 00:25:32,930 --> 00:25:36,800 They've dropped the patient they were holding. So. All right. 256 00:25:36,800 --> 00:25:46,550 Tough crowd. So. So this is what I describe in my mind as the inflexion point of understanding rescue.
257 00:25:46,550 --> 00:25:52,100 So Karl Weick and Kathy Sutcliffe really kind of pioneered the concept of high reliability organisations. 258 00:25:52,100 --> 00:25:54,320 And I'm sure many of you here are familiar with it. 259 00:25:54,320 --> 00:26:03,190 It's become one of the buzzwords in health care, and I didn't realise that they were Michigan faculty, 260 00:26:03,190 --> 00:26:05,710 and that Karl Weick had retired, 261 00:26:05,710 --> 00:26:13,300 but Kathy Sutcliffe was alive and well and still continuing to do this research over in the University of Michigan Business School. 262 00:26:13,300 --> 00:26:18,670 And so this really sparked a relationship between Kathy and me to understand and apply a lot 263 00:26:18,670 --> 00:26:23,200 of the principles of high reliability to what we were trying to do in failure to rescue. 264 00:26:23,200 --> 00:26:26,950 And so for those of you who aren't as familiar with high reliability organisations, 265 00:26:26,950 --> 00:26:31,960 the concept is that the capacity for an organisation — 266 00:26:31,960 --> 00:26:38,320 the capacity for organisational resilience — is really based on the understanding that the unexpected is going to happen. 267 00:26:38,320 --> 00:26:44,740 This gets to that complication idea, right? We can't prevent all complications; we know they're inevitable, 268 00:26:44,740 --> 00:26:53,230 but we have to build in systems and plan and anticipate what we're going to do once they occur. 269 00:26:53,230 --> 00:26:58,840 And you do a lot of this work here, so I'm kind of preaching to the choir. 270 00:26:58,840 --> 00:27:04,540 So what are some examples? I mean, everyone always uses nuclear power as one. 271 00:27:04,540 --> 00:27:08,710 I think this is my favourite: aircraft carriers.
272 00:27:08,710 --> 00:27:17,500 I've read a lot about aircraft carriers in the course of my work on understanding rescue. And what you're seeing here — 273 00:27:17,500 --> 00:27:23,440 what's interesting — they describe this as something called UNREP, but it's basically replenishment of supplies 274 00:27:23,440 --> 00:27:28,450 occurring between the supply boat and the carrier. If you've ever played Battleship, that game: here's the supply boat 275 00:27:28,450 --> 00:27:33,280 and here's the battleship — or the carrier, there. 276 00:27:33,280 --> 00:27:38,140 You can see these cables attached between the two boats. 277 00:27:38,140 --> 00:27:43,270 They're not standing still. As you can see, they're moving at about 12 knots. 278 00:27:43,270 --> 00:27:46,420 Don't ask me what that means. I just know they're moving at 12 knots. 279 00:27:46,420 --> 00:27:54,160 And what they have to do is this coordinated effort to move supplies across. Now, seems easy enough, 280 00:27:54,160 --> 00:28:02,890 OK? What if it's choppy water? What if the captain of this ship falls asleep and starts veering off to the left? Again, 281 00:28:02,890 --> 00:28:10,720 slight differences can result in catastrophic consequences, because if these cords — there are people, you can see here, on the other end of them — 282 00:28:10,720 --> 00:28:16,150 if they snap because the ships move apart, you don't just lose supplies in the water, you lose people. 283 00:28:16,150 --> 00:28:20,680 Those cords will snap back and kill sailors on both sides. 284 00:28:20,680 --> 00:28:24,940 So this is a very coordinated effort that is practised quite a bit, 285 00:28:24,940 --> 00:28:29,220 you can imagine. Imagine how this is like an operation. Right. 286 00:28:29,220 --> 00:28:40,040 And the complication is that this aircraft carrier starts to slow down a little bit because of some incident. That incident,
287 00:28:40,040 --> 00:28:48,210 if it is not well communicated to the captain of this ship, means catastrophic consequences for innumerable people. 288 00:28:48,210 --> 00:28:50,820 Just think of that as surgery. Right? 289 00:28:50,820 --> 00:29:00,480 Except this is your patient, and in the operation something veers off course; if we don't communicate and recognise it, 290 00:29:00,480 --> 00:29:08,470 catastrophic consequences can occur: a patient's death. So here's a good description from a senior officer on an aircraft carrier. 291 00:29:08,470 --> 00:29:15,270 I'll just read this quickly. So you want to understand an aircraft carrier? Well, just imagine that it's a busy day, and you shrink San Francisco airport 292 00:29:15,270 --> 00:29:21,540 to only one short runway and one ramp and one gate. Make planes take off and land at the same time, 293 00:29:21,540 --> 00:29:29,280 at half the present time interval. Rock the runway from side to side, and require that everyone who leaves in the morning returns that same day. 294 00:29:29,280 --> 00:29:33,210 Make sure the equipment is so close to the edge of the envelope that it's fragile, 295 00:29:33,210 --> 00:29:38,250 then turn off the radar to avoid detection, impose strict controls on radios, 296 00:29:38,250 --> 00:29:45,420 fuel the aircraft in place with their engines running, put an enemy in the air, and scatter live bombs and rockets around. 297 00:29:45,420 --> 00:29:53,640 Now wet the whole thing down with seawater and oil, and man it with 20-year-olds, half of whom have never seen an aeroplane up close. 298 00:29:53,640 --> 00:29:59,250 Oh, and by the way, try not to kill anyone. That's like surgical wards, right? 299 00:29:59,250 --> 00:30:06,330 I mean, to some degree. So we think we have it tough, because, I mean, you know, we all suffer from the NIH syndrome — not invented here, right? 300 00:30:06,330 --> 00:30:09,720 So the concept is: this wasn't invented in health care,
301 00:30:09,720 --> 00:30:14,260 so we're different, we're special, our environment is just completely different. 302 00:30:14,260 --> 00:30:17,940 We're not an aircraft carrier; we're dealing with human beings. Well, these are human beings, too. 303 00:30:17,940 --> 00:30:23,340 And I would argue this is a much more complex environment, with complex decision making that's occurring. 304 00:30:23,340 --> 00:30:28,690 So we don't get to just throw our hands up in the air and say we don't know what to do. 305 00:30:28,690 --> 00:30:33,730 So. You've heard about high reliability organisations. 306 00:30:33,730 --> 00:30:40,180 But what Kathy taught me is that it should be more of a verb: it's high reliability organising. 307 00:30:40,180 --> 00:30:45,070 You don't achieve it — because I hear this; people want to achieve high reliability. 308 00:30:45,070 --> 00:30:53,260 You don't achieve being a high reliability organisation. You continue high reliability organising. 309 00:30:53,260 --> 00:30:58,600 And what does that mean? There are these five pillars of high reliability organising: preoccupation with failure, 310 00:30:58,600 --> 00:31:04,150 reluctance to simplify, sensitivity to operations, deference to expertise and commitment to resilience. 311 00:31:04,150 --> 00:31:13,300 I put these up here like this intentionally. The top row you can describe as being about problem detection — complication detection. 312 00:31:13,300 --> 00:31:18,430 The bottom row is really problem management, or complication management, right? 313 00:31:18,430 --> 00:31:27,130 We have to build these principles into our every day, not just say that we achieved it once and now we're good. 314 00:31:27,130 --> 00:31:33,010 And so, something that we did: we tried to apply this idea of high reliability organising to surgery.
315 00:31:33,010 --> 00:31:36,130 And along with Kathy, my post-doc at the time, and Peter, 316 00:31:36,130 --> 00:31:40,360 we wrote this paper — really more of an opinion piece that has some 317 00:31:40,360 --> 00:31:44,290 data behind it — on understanding the three waves of innovation in patient safety. 318 00:31:44,290 --> 00:31:52,570 Surgery was really the venue, or the backdrop, with which I was writing this. 319 00:31:52,570 --> 00:31:54,790 And we all know that we had technical advancements, right? 320 00:31:54,790 --> 00:31:59,650 So we've improved the technology around surgery and we've gained some significant improvements. 321 00:31:59,650 --> 00:32:04,110 But I believe we've reached a kind of flattening of the curve now. We're just kind of fiddling with robots, right? 322 00:32:04,110 --> 00:32:09,940 So I point to the urologists. You know, there are certain indications for it. 323 00:32:09,940 --> 00:32:13,510 But now we're pushing it, and people are doing inguinal hernia repairs with robots. 324 00:32:13,510 --> 00:32:18,670 Is there anybody here who does that? OK. I find that crazy. 325 00:32:18,670 --> 00:32:22,690 And then we've reached kind of a plateau with standardising procedures. We talked about this. 326 00:32:22,690 --> 00:32:28,840 We achieved great strides, I believe, in standardisation with checklists and things of that sort — process compliance. 327 00:32:28,840 --> 00:32:32,710 But again, we've started to flatten the curve. And really the third wave — my belief 328 00:32:32,710 --> 00:32:39,850 in the third wave — is this concept of high reliability organising, something that other industries have done for years and decades, 329 00:32:39,850 --> 00:32:44,420 and that we are now just starting to understand and dive into. 330 00:32:44,420 --> 00:32:49,480 So I put this up here. I would encourage you to read it.
If you want a copy of it, I'm happy to send it to you. 331 00:32:49,480 --> 00:32:55,090 Not because I wrote it, but because I really believe in the principles we laid forth. 332 00:32:55,090 --> 00:33:01,510 So I've got about 18 or 20 minutes to the top of the hour, and I've got to try to cut it short. 333 00:33:01,510 --> 00:33:10,330 I'm going to tell you a little bit about what we're doing now to improve rescue and really improve communication. 334 00:33:10,330 --> 00:33:14,200 We developed this failure to rescue patient safety learning lab that has two big 335 00:33:14,200 --> 00:33:20,500 components to it: technology and human behaviour. I used to say human factors on the right, 336 00:33:20,500 --> 00:33:24,670 but I appreciate that I am not an expert in human factors, 337 00:33:24,670 --> 00:33:29,260 so I would be a charlatan if I wrote human factors up there. "Behaviour" is my — 338 00:33:29,260 --> 00:33:34,660 you can tell me later if that's OK — is my kind of compromise with this. 339 00:33:34,660 --> 00:33:40,930 But I'll tell you a little bit about the technology piece and then a little bit about the human behaviour piece. 340 00:33:40,930 --> 00:33:49,150 So we know that technology's all around us: in the preoperative setting through visits, risk prediction, patient education. 341 00:33:49,150 --> 00:33:55,510 We know in the operating room there's decision support going on all the time, within the OR for our anaesthesiology colleagues, 342 00:33:55,510 --> 00:34:01,600 for perfusion, et cetera. It helps us decide where to put patients after surgery. And then in the patient room, 343 00:34:01,600 --> 00:34:09,850 it's just becoming this smorgasbord of wireless and Bluetooth devices that help us with surveillance, that can help us with deterioration 344 00:34:09,850 --> 00:34:17,770 risk, escalation of care, et cetera.
I'm going to hone in on just one piece that we've begun to think about and work on in the post-operative setting. 345 00:34:17,770 --> 00:34:20,020 What's interesting is the way we collect vital signs. 346 00:34:20,020 --> 00:34:27,280 I heard a lot about this yesterday from some of the folks in Peter's group about how we collect vitals. 347 00:34:27,280 --> 00:34:34,960 We have highly trained nurses going around — and this is just a schematic from a company, by the way; 348 00:34:34,960 --> 00:34:39,970 I have no interest in this company, they just have a really good figure — 349 00:34:39,970 --> 00:34:46,750 where the nurse or tech or somebody has to go find a pole that has all the equipment on it for blood pressure. 350 00:34:46,750 --> 00:34:51,820 So they go to the patient's room, they generally wake the patient up in the middle of the night, 351 00:34:51,820 --> 00:34:54,760 they slap this stuff on them, they write it down. 352 00:34:54,760 --> 00:34:59,800 You probably have the same government-issued napkins that everybody writes the vital signs on, right? 353 00:34:59,800 --> 00:35:04,750 They write it on a little napkin and then they have to leave the room 354 00:35:04,750 --> 00:35:10,660 and, in theory, clean that equipment. But if your hospital's like mine, they don't always do that. 355 00:35:10,660 --> 00:35:13,840 And then they have to enter the data somewhere, either write it on a chart or enter it. 356 00:35:13,840 --> 00:35:18,670 And this is the way we've been doing it from the days of Florence Nightingale, right? 357 00:35:18,670 --> 00:35:23,500 This has not changed for hundreds of years. This is the way we collect them. 358 00:35:23,500 --> 00:35:32,170 Why do we not employ the most simple technology that's available? Like, you know, 359 00:35:32,170 --> 00:35:36,250 my Apple Watch here is monitoring my heart rate this entire talk, right?
360 00:35:36,250 --> 00:35:45,070 Soon it will be able to monitor my EKG. We have the technology out there to have this captured in real time and then fed 361 00:35:45,070 --> 00:35:51,040 into systems that can either present it to us in a more efficient manner or begin to 362 00:35:51,040 --> 00:35:55,030 run certain algorithms on it, so that we go from this linear, traditional health care 363 00:35:55,030 --> 00:36:00,940 model to a model where we have analytics and all different sorts of patient input. 364 00:36:00,940 --> 00:36:07,300 Why not have the algorithm use the patient's Fitbit or Apple Watch that had been collecting vital signs 365 00:36:07,300 --> 00:36:12,520 data before we operated on them, and incorporate that into what should be happening to them afterwards? 366 00:36:12,520 --> 00:36:17,830 These are not revolutionary or pie-in-the-sky ideas. 367 00:36:17,830 --> 00:36:21,970 Apple is doing this. They're just not telling us about it yet. 368 00:36:21,970 --> 00:36:25,930 So we need to be on the forefront of understanding this and incorporating it into our workflow. 369 00:36:25,930 --> 00:36:30,520 And I put machine learning up there — I can't talk about technology in health care anymore without talking about machine learning and AI. 370 00:36:30,520 --> 00:36:35,110 And those are, again, two buzzwords. I will not say that I know much about AI; 371 00:36:35,110 --> 00:36:38,290 I collaborate with individuals who use this. 372 00:36:38,290 --> 00:36:44,950 And one of the major limitations for us in health care is that it's easy to build algorithms using one hospital's data, 373 00:36:44,950 --> 00:36:47,440 but it's only helpful for that one hospital.
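The "run certain algorithms on it" idea above can be sketched very simply. Here is a toy, hypothetical early-warning rule over a continuous heart-rate stream — the function name, smoothing constant and thresholds are all invented for illustration, not clinical guidance or any deployed system.

```python
# Illustrative only: flag sustained drift of a smoothed heart rate above a
# patient's own preoperative baseline (the Fitbit/Apple Watch idea).
def ewma_alerts(heart_rates, baseline, alpha=0.2, rel_rise=0.25, sustain=3):
    """Return sample indices where the exponentially weighted moving average
    has stayed more than rel_rise above baseline for `sustain` samples."""
    smoothed, streak, alerts = baseline, 0, []
    for i, hr in enumerate(heart_rates):
        smoothed = alpha * hr + (1 - alpha) * smoothed  # EWMA update
        streak = streak + 1 if smoothed > baseline * (1 + rel_rise) else 0
        if streak == sustain:  # sustained deviation, not a one-off blip
            alerts.append(i)
    return alerts

stable = ewma_alerts([72, 75, 71, 74, 73, 76], baseline=72)
deteriorating = ewma_alerts([74, 80, 95, 110, 118, 122, 125], baseline=72)
print(stable, deteriorating)  # prints: [] [6]
```

The design point is the same one made in the talk: continuous data lets you look for trends against the patient's own baseline, rather than a single nighttime spot check against a population norm.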
374 00:36:47,440 --> 00:36:54,670 And the only way we're going to achieve nirvana, so to speak, with machine learning is when we have giant data repositories of vital signs 375 00:36:54,670 --> 00:36:58,780 data and whatnot coming in from people all over the country and all over the world. 376 00:36:58,780 --> 00:37:04,660 And this is kind of the Big Brother piece that Apple is doing in the background. Because I know they're collecting — 377 00:37:04,660 --> 00:37:08,710 they're probably wondering why this heartbeat is like 500 beats a minute right now — 378 00:37:08,710 --> 00:37:17,470 but they're collecting this data in the background, and hopefully we'll have some ability to apply that to clinical care in the future. 379 00:37:17,470 --> 00:37:20,650 So let's go to the human behaviour piece. This is actually my favourite part, 380 00:37:20,650 --> 00:37:28,770 and of course I didn't leave myself a ton of time to talk about it. But I told you about a lot of secondary data analysis that we had done. 381 00:37:28,770 --> 00:37:35,400 And one of the things that we tried to understand was the relative contribution of all of these different factors to understanding rescue. 382 00:37:35,400 --> 00:37:41,490 And what we found was, if you take into consideration all seven of these factors, 383 00:37:41,490 --> 00:37:45,270 it only accounts for about 30 percent of the variation in failure to rescue. 384 00:37:45,270 --> 00:37:51,030 So that means there's 70 percent unobserved variation, and we're not sure why that 70 percent is occurring. 385 00:37:51,030 --> 00:37:58,380 And I believe it has to do, at least in part, with human behaviour — the stuff that you can't really measure in a secondary data analysis. 386 00:37:58,380 --> 00:38:04,230 So what we employed — and I would love your insights on this — is swim lane analysis, right? 387 00:38:04,230 --> 00:38:09,840 So have you seen these before?
Yeah. This is literally just one unit — 388 00:38:09,840 --> 00:38:18,900 the interactions between all of these different individuals who provide care for the patient, and how they all interact. 389 00:38:18,900 --> 00:38:23,580 All the different individuals they interact with: how do they monitor, how do they detect complications, 390 00:38:23,580 --> 00:38:30,030 how do they assess patients? When we did this, I was like, oh my gosh, I'm never going to figure this out, right? 391 00:38:30,030 --> 00:38:37,680 This is how complicated health care is. But we know that there are ways for us to begin to hone in on at least little nuggets 392 00:38:37,680 --> 00:38:44,730 of these interactions to improve how we rescue patients — and ergo, the PERFECT study. 393 00:38:44,730 --> 00:38:53,100 Right. It's not perfect. My project manager tells me that I chose this acronym because I'm a surgeon and surgeons think they're perfect. 394 00:38:53,100 --> 00:38:57,570 But I said, no — the PERFECT ideal is perfect care. 395 00:38:57,570 --> 00:39:05,790 And our goal was really to enhance three areas that we felt were going to be important — engagement, 396 00:39:05,790 --> 00:39:13,650 communication and teamwork — in improving the rescue process. And to better understand whether these are the right constructs, 397 00:39:13,650 --> 00:39:19,140 we visited five different hospitals in the state of Michigan: two that were low 398 00:39:19,140 --> 00:39:24,060 failure-to-rescue outliers — our high-quality hospitals — and three that were high failure-to-rescue hospitals, 399 00:39:24,060 --> 00:39:29,250 the high outliers. And we conducted semi-structured interviews with about 50 stakeholders, 400 00:39:29,250 --> 00:39:34,380 anywhere from surgeons to nurses to anaesthetists to respiratory therapists, 401 00:39:34,380 --> 00:39:42,900 hospital administrators, et cetera.
And we conducted a kind of thematic analysis using consensus coding and came up with seven 402 00:39:42,900 --> 00:39:51,600 themes, or areas, that our interviewees felt mattered the most in rescue. 403 00:39:51,600 --> 00:39:58,650 And here they are. By the way, this type of research — unlike all those other things my fellows could do on their laptops — 404 00:39:58,650 --> 00:40:02,430 this took two and a half years to do, which was exhausting. 405 00:40:02,430 --> 00:40:06,690 And then all I get is this one stinkin' slide out of it. Mm hmm. 406 00:40:06,690 --> 00:40:11,460 So what matters most? The factors on the left 407 00:40:11,460 --> 00:40:20,400 we actually found were positive across hospitals: there has been some sort of cultural and societal 408 00:40:20,400 --> 00:40:26,280 shift where there is really no fear in speaking up, because of psychological safety. 409 00:40:26,280 --> 00:40:35,760 Teamwork, actually, to a certain degree happens fairly well: once the crisis has been identified and action taken, people rally. 410 00:40:35,760 --> 00:40:43,620 We found that providers felt that if there's a patient crashing, everybody rallies to that patient's care. 411 00:40:43,620 --> 00:40:47,550 Nobody's really going to turn a blind eye to that, and swift action really was taken. 412 00:40:47,550 --> 00:40:52,890 The stuff on the right — these are areas to shore up: communication processes, access to tools and recognition. 413 00:40:52,890 --> 00:40:59,010 Rapport kind of goes without saying, but we'll talk a little bit about each of these in a second. 414 00:40:59,010 --> 00:41:05,400 So here's some representative quotations from people. I think this was interesting: 415 00:41:05,400 --> 00:41:12,780 "I'm very comfortable asking. I've never thought twice about calling." This was a nurse. 416 00:41:12,780 --> 00:41:16,640 A lot of young nurses —
I mean, this is like the new generation: 417 00:41:16,640 --> 00:41:20,390 "I'll call you. I'll wake you up. I don't care, 418 00:41:20,390 --> 00:41:26,220 if I have a patient who's not doing well." That fear factor has dissipated a little bit. 419 00:41:26,220 --> 00:41:31,650 And I think, again, that's a cultural and societal shift that we're benefiting from. The negatives — 420 00:41:31,650 --> 00:41:37,740 the areas that really did need some improvement — communication, recognition. So this really resonated: 421 00:41:37,740 --> 00:41:44,460 "He kind of blew me off when I think, no, there's really something going on." Now, that makes it sound like people aren't listening to them. 422 00:41:44,460 --> 00:41:54,020 But the question here is: are they blowing you off because maybe you didn't communicate the right information? And we'll get to that in a second. 423 00:41:54,020 --> 00:42:00,520 You know, identification could be better. Consistently, people said once we knew the problem, it was kind of obvious, right? 424 00:42:00,520 --> 00:42:04,170 Like, OK, so the patient is in septic shock; we know what to do. The question was: could you 425 00:42:04,170 --> 00:42:09,630 identify that sepsis before it led to shock? And that's where identification could be better. 426 00:42:09,630 --> 00:42:15,960 And so we revised this a little bit, and we recognised that it's about recognition and 427 00:42:15,960 --> 00:42:20,790 effectively communicating early signs of deterioration that are going to lead to a crisis. 428 00:42:20,790 --> 00:42:25,970 Again, some of this seems almost like common sense, but it's 429 00:42:25,970 --> 00:42:28,910 critical to hear this from the horse's mouth, so to speak. 430 00:42:28,910 --> 00:42:35,390 And so what this led us to begin to talk about is: how do you know when to escalate, and how do you escalate?
431 00:42:35,390 --> 00:42:42,680 We don't want people to be the boy who cried wolf, right — to constantly escalate care unnecessarily, 432 00:42:42,680 --> 00:42:46,070 because at the end of the day, no one's going to listen to you. 433 00:42:46,070 --> 00:42:52,460 And in talking to our frontline providers, what we learnt is that the majority are novices, right? 434 00:42:52,460 --> 00:42:56,210 So I see a lot of young faces on this side. 435 00:42:56,210 --> 00:43:01,730 Are you all house officers, trainees, students? Yeah, great. 436 00:43:01,730 --> 00:43:06,680 So, you — 437 00:43:06,680 --> 00:43:12,710 I think we heard this yesterday. One day you wake up: the night before, you were a student; 438 00:43:12,710 --> 00:43:17,090 the next day you're a full-fledged doctor who can prescribe whatever you want. 439 00:43:17,090 --> 00:43:24,320 Nothing changes while you're asleep; you're the same person with the same inexperience. 440 00:43:24,320 --> 00:43:28,160 But yet one day you wake up and you're managing some of the most critically ill patients in our hospital. 441 00:43:28,160 --> 00:43:35,420 Kind of scary, right? Engineers cringe at that. So what can we do to improve or shore up your experience? 442 00:43:35,420 --> 00:43:43,700 And I think it gets to what I call the three Cs of communication here. So competence is key — and that doesn't mean you're incompetent. 443 00:43:43,700 --> 00:43:48,500 It just means that your competence level isn't up to the level of someone who's been in practise 444 00:43:48,500 --> 00:43:53,420 for 20 years and cared for thousands of patients and had this kind of immersive experience. 445 00:43:53,420 --> 00:43:58,220 And so how do we improve and speed that process up? 446 00:43:58,220 --> 00:44:05,300 Competence leads to confidence, right?
You know, here's a great quote from Jim Rohn: effective communication is 20 percent what you know 447 00:44:05,300 --> 00:44:14,090 and 80 percent how you feel about what you know. Right? So that quotation earlier, where they said, "I call and they kind of blew me off" — 448 00:44:14,090 --> 00:44:19,460 well, it depends how — I don't want to say forceful — but how convincing you were that there's a problem. 449 00:44:19,460 --> 00:44:26,540 If it was just a hunch, then that may not have resulted in the response that you want. 450 00:44:26,540 --> 00:44:32,150 All right. I saw you look at your watch; I'm going to speed up in a second or so. 451 00:44:32,150 --> 00:44:35,960 So what we did is we went back to our frontline providers. We held this rescue innovation event. 452 00:44:35,960 --> 00:44:41,870 I would encourage you to try one of these if you haven't — I think you may have done stuff like this — where we got all the stakeholders together, 453 00:44:41,870 --> 00:44:47,000 we presented them with a lot of the data I just showed you, and we let them come up with the solutions. 454 00:44:47,000 --> 00:44:50,540 And here's what's insane: they came up with this early communication system, 455 00:44:50,540 --> 00:44:57,710 this four-pronged approach, that actually fits into human factors thinking without them even being human factors 456 00:44:57,710 --> 00:45:05,870 experts. And the four prongs are clinical pathways, trend recognition, video-based complication training and simulation-based practise. 457 00:45:05,870 --> 00:45:12,590 And the idea here is that these build on one another. Clinical pathways establish an understanding of normal. 458 00:45:12,590 --> 00:45:18,380 When you hit the wards, you may have seen a couple of patients who had a Whipple operation, right? 459 00:45:18,380 --> 00:45:23,780 Whipple patients are not normal post-op. This is a major physiologic insult.
460 00:45:23,780 --> 00:45:27,350 A normal Whipple patient is not the same as a patient with pneumonia. 461 00:45:27,350 --> 00:45:33,050 For you to understand what normal looks like, you need to have some guidance — some sort of expected post-op course. 462 00:45:33,050 --> 00:45:40,970 And we've developed these for multiple operations now, with the input of our surgeons and nurses, to kind of say: what are some general milestones? 463 00:45:40,970 --> 00:45:45,710 This is not an order set. This is not something that you just kind of check boxes on. 464 00:45:45,710 --> 00:45:52,100 What this does — and our trainees are using this now, and our nurses find it helpful — is it 465 00:45:52,100 --> 00:45:57,200 allows them to have a more meaningful conversation with the attending surgeon or the care team. 466 00:45:57,200 --> 00:46:03,120 Hey, you know what? It's day three and they've still got an NG tube in. What's going on? 467 00:46:03,120 --> 00:46:07,080 It's just a simple question. Oh, well, you know, they've got gastroparesis and we're doing that. 468 00:46:07,080 --> 00:46:12,750 Oh, is there anything I should be worried about? Yeah, you should definitely make sure the head of the bed is up, because they're an aspiration risk. And, you know, 469 00:46:12,750 --> 00:46:20,730 it sparks a conversation that would not have happened otherwise. So again, those are the expected post-op courses. Trend recognition: 470 00:46:20,730 --> 00:46:26,940 this is my favourite right now. This is the one we're working on — beginning to understand what abnormal looks like and what you can do about it. 471 00:46:26,940 --> 00:46:32,280 And what we've done to teach this is this concept of rescue improvement conferences. 472 00:46:32,280 --> 00:46:39,900 We do this once a month. They're a modification of our typical M&M conference, which I think is really an untapped resource. 473 00:46:39,900 --> 00:46:47,010 Do you have weekly or monthly M&Ms here?
Yeah. I think we do it the same way we did it a hundred years ago. 474 00:46:47,010 --> 00:46:53,350 And what we've done is try to flip it on its head. And this is the patient I was describing to you at the very beginning of the talk. 475 00:46:53,350 --> 00:46:58,780 This is what happened. So the anaesthesia team comes in to intubate the patient and gives their normal induction doses. 476 00:46:58,780 --> 00:47:03,490 The patient goes into cardiac arrest, and they do two hours of chest compressions to get the patient back, 477 00:47:03,490 --> 00:47:09,790 but he ultimately succumbed the day after. Again, they were going in to do a routine procedure. 478 00:47:09,790 --> 00:47:17,710 But what went wrong? So in a typical M&M, we would talk about, oh, you know, a frail old man who had a complication, 479 00:47:17,710 --> 00:47:22,570 and they'd ask the attending, "Would you do anything differently?" They always say, "No, I would have done the same thing," right? 480 00:47:22,570 --> 00:47:25,480 Sound familiar? Yeah. So you're not allowed to say that in this conference. 481 00:47:25,480 --> 00:47:29,890 You can never say "I would do the same thing," because then I say that patient is going to die every time if you keep doing the same thing. 482 00:47:29,890 --> 00:47:34,630 So we have this case analysis checklist that we've modified from the Ottawa model. 483 00:47:34,630 --> 00:47:36,400 And what we do is 484 00:47:36,400 --> 00:47:42,850 we have our house officers be the ones who present, and they basically go through and identify all the different issues with that case, 485 00:47:42,850 --> 00:47:47,620 thinking about it from a more systems and cognitive perspective. And what do they do with this? 486 00:47:47,620 --> 00:47:54,160 They then have to come up with key concepts of rescue improvement: what were the critical failures in each of these arenas?
487 00:47:54,160 --> 00:48:00,820 And in this case, there was overconfidence on the part of the anaesthetic team that came up to intubate. 488 00:48:00,820 --> 00:48:08,080 There was a skill set error, and this led to an institutional change from this conference about resuscitating before you intubate, 489 00:48:08,080 --> 00:48:11,140 even in an elective setting in the afternoon in the hospital. 490 00:48:11,140 --> 00:48:17,200 And there was a critical teamwork failure: when the anaesthesia team came up to intubate, and I'll tell you this, 491 00:48:17,200 --> 00:48:21,130 the critical care team was rounding on the opposite end of the intensive care unit. 492 00:48:21,130 --> 00:48:26,320 There was zero discussion between those two sets of providers on this very critically ill patient. 493 00:48:26,320 --> 00:48:32,200 Had they asked that team for corroboration, they would have heard, Oh, he's been kind of dwindling, 494 00:48:32,200 --> 00:48:38,020 you know, he's a lot sicker than he looks on paper, which is what we talked about. 495 00:48:38,020 --> 00:48:42,310 And so I'm going to skip over that. This became policy. 496 00:48:42,310 --> 00:48:49,240 This became a new policy in our hospital for how we intubate patients in the hospital. 497 00:48:49,240 --> 00:48:55,840 This came out of a simple M&M. When was the last time an M&M changed policy in the hospital? At least in our 498 00:48:55,840 --> 00:49:00,970 M&Ms, never; I could easily say never. And here are the systems issues, which I'm going to skip over. 499 00:49:00,970 --> 00:49:06,610 The last couple of pieces will go quickly because we only have four minutes left. Video-based complication training. 500 00:49:06,610 --> 00:49:10,720 This is at an early stage of development. I liken this to kind of your choose-your-own-adventure. 501 00:49:10,720 --> 00:49:13,120 It's going to be fun. You guys are going to love it. 
502 00:49:13,120 --> 00:49:20,500 It's basically kind of interactive videos, like whiteboard-type videos, where you're presented with a case like that patient. 503 00:49:20,500 --> 00:49:25,420 And there's no wrong answer, but you pick, what do you want to do next? And it takes you down a path. 504 00:49:25,420 --> 00:49:29,920 And what this really helps you do is take the perspective of other providers. 505 00:49:29,920 --> 00:49:34,000 So, hey, you're going to be the nurse in this situation. You're going to have to make these critical decisions. 506 00:49:34,000 --> 00:49:40,750 Here's what happens when you make that decision. The patient might just die, it might lead to a prolonged hospital course, et cetera. 507 00:49:40,750 --> 00:49:44,980 I took this straight out of aviation. They do this for all their credentialing. 508 00:49:44,980 --> 00:49:48,550 They do these choose-your-own-adventure simulations, which are amazing. 509 00:49:48,550 --> 00:49:53,770 And simulation-based practise, I think this is, again, probably a little bit of a unicorn. 510 00:49:53,770 --> 00:49:56,280 But I think that we are going to work toward it. 511 00:49:56,280 --> 00:50:01,690 The hardest thing here is really identifying time and space to do this, but this is about building muscle memory, 512 00:50:01,690 --> 00:50:08,350 having the same people around you, the same environment, the same sights, the same smells when you're trying to manage patients. 513 00:50:08,350 --> 00:50:13,480 And I think it'll be very important as one of the ultimate steps. 514 00:50:13,480 --> 00:50:17,200 So I hope I encourage you to think more about communication. 515 00:50:17,200 --> 00:50:25,660 It is not as simple as a unidirectional idea, and I will close with the last couple of slides, which are just kind of fun, right? 516 00:50:25,660 --> 00:50:26,500 So I'm from Michigan. 
517 00:50:26,500 --> 00:50:33,700 If you haven't heard about "the team, the team, the team," that's a mantra from Bo Schembechler. I'd encourage you to Google that and listen to him 518 00:50:33,700 --> 00:50:40,060 give this speech to the football team many years ago, emphasising that it's not about one person, 519 00:50:40,060 --> 00:50:45,160 it's about the team, the team, the team, and we live by this at Michigan. 520 00:50:45,160 --> 00:50:49,480 This is at the tunnel going down to the football stadium. 521 00:50:49,480 --> 00:50:58,760 This is my team, team, team: folks across three institutions who have really been integral. Again, we're very varied in our professional group. 522 00:50:58,760 --> 00:51:00,550 We've got folks in the business school. 523 00:51:00,550 --> 00:51:07,810 We've got anaesthesiologists, nursing directors, nurses, surgeons, project managers, and residents. 524 00:51:07,810 --> 00:51:13,590 We've got people from across the spectrum, and these are systems engineers. So this is my other team. 525 00:51:13,590 --> 00:51:20,040 They're the ones who let me come here. Sorry, Dr. Henry, I'm almost done, I swear. 526 00:51:20,040 --> 00:51:26,040 I used to incorporate them much more in my talk, but I was told to stop talking about my family. 527 00:51:26,040 --> 00:51:31,770 So. But these are my four children: Cameron, Ryan, Sean and Lyla. 528 00:51:31,770 --> 00:51:39,480 That's my original son, Dodger, who is still alive and well at 13, and I would argue one of the most beautiful dogs on the planet. 529 00:51:39,480 --> 00:51:47,430 This is the matriarch of the family, who is also a full-time practising physician and really kind of holds the group together. 530 00:51:47,430 --> 00:51:52,510 And again, none of this work would be possible without this group. 531 00:51:52,510 --> 00:52:00,768 That's it.