I am thrilled to be here with you and to share NoForGo and what we've been doing with it over the last four, maybe five, years in the hospital setting. I'm going to focus first on the hospital setting, because that's where the rubber meets the road and that's where the decision making really comes to a crux. Then we'll talk about how this can be applied regionally, and even from the government perspective, which we're working on in Ontario.

I'm from Canada, from Ontario, from the City of London, where we have a Piccadilly Street and a Thames River, and our university looks a little bit like Oxford in areas. So it's quite nice to be there. We're two hours from Toronto and two hours from Detroit.

NoForGo is a tool that we've developed over the last five years to improve decision making, and it's a passion of mine. I took some time off about four years ago to do a master's in health technology assessment and thought I would find some answers there on how to make decision making better, and was somewhat disappointed that it didn't get me there. So I developed this thing called NoForGo, and our senior administration in the hospital were very kind, very excited about it when they heard about it, and have implemented it across the hospital. Now there's interest from the region as well as from our Ministry of Health, and I'll tell you why.

Now, I don't want to speak too highly about NoForGo and give you the false impression that it's perfect, because it's far from that. And I really want to stress that if you have any feedback or thoughts about how it could be better, I'm more than willing and open to hear that. What we'll do first is a presentation; I think in this format it will work best, and then we'll have some time at the end for reflection, comments, feedback, rebuttals, whatever you have. I always learn a lot from my audiences and have presented this in a variety of places in Canada and around the world, and because of that it's been refined based on the differing perspectives and experiences that people have kindly shared.

I do need to thank the people at my hospital and the people at CIHR, the Canadian Institutes of Health Research, as well as our national agency for health technology assessment, called CADTH, for their support; they have funded part of this. Thanks also to our High Impact Technology Evaluation Committee in the hospital and our EPiCOR team, which is our Evidence-based Perioperative Clinical Outcomes Research Group.

All right. So what are we going to do in the next few minutes? Well, we're going to talk again from the perspective of the hospital.
I'm going to talk about the importance of evidence-based health technology assessment in the hospital setting. I'll start by setting the context: what are the traditional challenges around decision making and why it isn't working in general; then how we tried to answer those particular challenges using NoForGo as a solution; what our experience has been; and then future directions, where we'd like to go from here.

Hospitals in Canada are costly. I know they are here too. Hospitals account for about 30% of our overall health expenditures, and drugs beyond that account for another roughly 17% of our total health care costs.

A lot of people ask me, why would you do this? Why would you start this type of project in the hospital setting? Well, the answer is that hospitals in Canada, in Ontario, are really where a lot of the action is, and really where the rubber meets the road. Technological innovations are often introduced first in the hospital setting; the hospital level is often the back door through which new technologies enter. And these technological innovations are restructuring health care in profound and somewhat unsettling ways. The demand for innovative and new technologies, devices, drugs and procedures has clearly outpaced our capacity to address them in a systematic way. This is all concentrated very much at the hospital setting, and to me it created the perfect petri dish in which to implement a new tool, a new process for decision making, one that tries to bring in evidence-based decision making, that tries to bring in health technology assessment formally, but that is also accountable to the resources available.

In Canada we have a multilayered set of programs in health technology assessment, and I think it's quite similar to the UK in that we consider drugs, devices and machines more commonly than we consider procedures, which may or may not involve devices, drugs or machines. We have a national agency for health technology assessment, CADTH, which does assessments, and they make recommendations, but these are in no way binding. Same at our provincial level: we have OHTAC, a health technology assessment committee, which makes recommendations, but again, in no way are those recommendations binding. So what happens in the hospital is that decisions are made, perhaps informed by the health technology assessment that has gone on and the recommendations, but perhaps not. So we needed something that addressed these gaps and the ability for things to walk in the back door.
I consult part time for the Canadian agency, as well as our provincial agencies for drug assessment and non-drug technology assessment. But when it really comes down to it, decision making at the hospital level still needs a lot of help, despite these external supports. When a health technology assessment is completed, we get a report that outlines the evidence as well as the expanded considerations such as ethics, feasibility and so on. But what we found in the hospital setting is that that really wasn't enough to make the decision for us. External agencies produce beautiful health technology assessment reports, but in the end those reports do not tell us how to prioritise. These reports need to be localised with considerations of infrastructure, existing technologies that we've already got in practice, our patient population, our internal needs, our external needs in the region, the health professional skills on hand, our learning curves, competing priorities, and so on.

I really like health technology assessment in the hospital setting because it's fast paced; it has to be real time. There's not a lot of time to wait for assessments to become available. Sometimes the train will already have left the station, or the horse will be out of the barn, if we wait for an assessment to be completed. And in the hospital setting we have our end users, the very targets of the recommendations, with us when we're putting together recommendations, and we can contextualise them to the local health setting. It also makes us entirely, extremely accountable to whatever predictions we put into our recommendations or our models about what the resource requirement will be and what benefits will likely accrue from implementing this new device or drug or other technology.

And every decision that we make in the hospital setting is felt poignantly. It really is, because we're working within a box. We have four walls around us, we have a budget, and as soon as we make a decision for something, we feel whatever we've given up. I'll tell you a little bit more about that. So I call it HTA in a box, and it becomes something that we have to do truly, madly and very deeply. It does go very deep, because you're faced with the decision makers at the time that you're making decisions. It's not always comfortable, that's for sure. And yes, it is where the rubber meets the road. It's very easy to do health technology assessment when you're not the one who has to make the trade-off decisions. These decisions become moral dilemmas. No matter how we make the decisions, somebody's going to get hurt: every decision to do something is a decision not to do something else, and you are going to hurt some part of the population. The hope is that you can maximise the good that you've done for your population with the range of decisions you've made.

In order to tell you what we're doing today, I need to tell you what we did ten years ago. We started a program called the Evidence-based Prescribing Initiative in the hospital setting. Senior leadership in the hospital recruited me to come in and start a program that tried to get evidence into practice in the area of drug prescribing. And that was a lot of fun. At that time, we tried to get the evidence straight, but we had a lot of pushback and rebuttals, such as: the evidence is biased, full stop, I don't believe you. Numbers are tortured and statistics lie, so leave the evidence aside and let's just get on with it. I don't have time to wait for your assessment; it works and I want it; there's no time for assessments here. And of course the ubiquitous blindness to the evidence: that's not the way I see it. And the inherent presumption that new is always better, and if it's better, it must always be worth it; or that new must always be better, therefore it's worth it.

What we started to do over time was catalogue these particular rebuttals, because they really were showstoppers to getting evidence into practice. We called them symptoms of compromised decision making, and the most common categories of rebuttals were: the evidence doesn't apply to my patient; my patient is just exceptional or unique; you can't say that that particular evidence applies to my patient. What we found was that, according to the prescribing doctor or the treating care provider, the exceptions were becoming the norm. Nobody wanted to admit that their patients are actually normal. Obviously the evidence is wrong, because everyone else is still doing it, was the other common rebuttal, and we have examples here around which some of these rebuttals were launched. And again, commonly: this place is archaic; it's only about the bottom line, about cost, the dirty word cost; don't bring that into my decision making when I'm responsible for a patient here, these are lives. And of course we have this key challenge of being too fast or too slow to adopt things in our hospital, and I don't think that we're unique compared to the UK. We have this thing called gizmo idolatry.
110 00:11:17,930 --> 00:11:24,400 Technology is fun, it's shiny, it's new, comes with purported advantages compared to the old. 111 00:11:24,970 --> 00:11:28,960 And they just want it. And they feel that sooner is always better than later. 112 00:11:29,860 --> 00:11:35,409 And in the teaching centre attached to the universities, we have the responsibility to be leading edge. 113 00:11:35,410 --> 00:11:39,100 So please just let us try everything out and be the leaders and such. 114 00:11:39,100 --> 00:11:46,060 And I need to be able to teach my residents, my fellows, my guys about everything, about all the options out there. 115 00:11:47,140 --> 00:11:52,990 And this leads to what we call the technology hype cycle, where we overshoot implementation of new things. 116 00:11:53,170 --> 00:12:00,520 So we have this peak of inflated expectations. But then when we try it out and we see that it doesn't always work or in fact it caused some harms, 117 00:12:00,760 --> 00:12:05,050 or we're not adept enough to actually take it up into practice, 118 00:12:05,350 --> 00:12:13,960 then we fall into this trough of disillusionment before we figure out what the evidence really says about where it's true place in practice should be. 119 00:12:14,710 --> 00:12:18,760 But the problem is here, if we have this peak of inflated expectations, 120 00:12:18,760 --> 00:12:23,980 followed by a trough of disillusionment, we've got two major areas of inefficiency here. 121 00:12:24,130 --> 00:12:29,050 If we've over taken up or under implemented any particular technology. 122 00:12:29,650 --> 00:12:37,600 So what we wanted to do was come up with something that smooths this particular curve in order to get good things into practice sooner or earlier, 123 00:12:37,600 --> 00:12:46,270 smoother without overshooting, but not have to be in this constant cycle of trying to always implement the new when it's not proven. 124 00:12:47,830 --> 00:12:53,850 And we have multiple, multiple opposing agendas in our hospital setting, just as you do when you're setting high. 125 00:12:54,490 --> 00:12:57,969 We have skills deficits, we have people that are territorial. 126 00:12:57,970 --> 00:13:01,210 They want just to be able to practice their particular pet surgeries. 127 00:13:02,260 --> 00:13:03,880 I don't care what the evidence says. 128 00:13:04,090 --> 00:13:12,070 And we have a lot of different agendas that are covered politically in our governments as well as in our frontline practitioners. 129 00:13:13,690 --> 00:13:17,740 And of course, the average man's judgement is so poor he runs a risk every time he uses it. 130 00:13:18,520 --> 00:13:22,660 This this is true, but most of us are not aware of that. 131 00:13:23,050 --> 00:13:28,240 So what can we do? We got a lot of evidence out there. We've got some good health economics out there. 132 00:13:28,360 --> 00:13:35,259 How do we pull this all together? So when we tried to do this evidence based prescribing initiative, we really kind of felt deflated. 133 00:13:35,260 --> 00:13:42,520 We went in with absolute excitement, passionate about getting evidence into practice, and we thought everyone would come to the party. 134 00:13:44,230 --> 00:13:53,200 Well, they did, but only to try to poke holes in the whole process and to dig in their heels and to say, I'm sorry, but my patients are exceptional. 135 00:13:53,410 --> 00:13:59,590 The exception is the norm. So we kind of felt like we're here on that curve in getting evidence into practice. 
136 00:14:00,430 --> 00:14:08,680 But rather than just throw out, throw in the towel and collectively throw up our hands and give up, we decided to have another go. 137 00:14:10,150 --> 00:14:14,049 What we were doing at that time was pretty good. 138 00:14:14,050 --> 00:14:18,400 I think considering it was ten years ago. I'm but not entirely unique. 139 00:14:18,550 --> 00:14:22,330 What we did was synthesised systematically through systematic review, 140 00:14:22,960 --> 00:14:32,020 the benefits versus risks for a new drug or health technology assessment or we used what others had done, if it was recent enough. 141 00:14:32,440 --> 00:14:38,769 And then we combine that with our local resource considerations, synthesised it through formal economic analysis. 142 00:14:38,770 --> 00:14:41,290 And you've been exposed to that today, I hear. 143 00:14:42,550 --> 00:14:50,440 And in that synthesis, we made recommendations for our local setting, often through made analysis and then from a cost effectiveness analysis. 144 00:14:50,950 --> 00:14:59,080 But after doing all of that and for many years, we found that systematic review does not equal the decision. 145 00:15:00,010 --> 00:15:03,430 Economic analysis does not equal a decision. 146 00:15:04,450 --> 00:15:09,310 Evidence as own is one consideration, but not the only consideration for decision making. 147 00:15:10,600 --> 00:15:19,360 This was startling to us because the promise of evidence based medicine, as the title almost infers, is that evidence should get you there. 148 00:15:19,360 --> 00:15:23,979 It should be fairly robust until you want to do an economic analysis. 149 00:15:23,980 --> 00:15:30,400 I mean, that's, you know, even further along the path, it should tell you what to do, but it didn't quite get us there. 150 00:15:31,420 --> 00:15:37,660 And we found that in all of our decisions that we were informing systematically by the evidence and through economic analysis, 151 00:15:38,230 --> 00:15:45,160 we found that yes was always perceived to be better than a no, and there was this propensity to say yes. 152 00:15:45,160 --> 00:15:53,440 And there's pressure for us as the community to say yes to everything, and that there should almost never be the opportunity to say no. 153 00:15:53,860 --> 00:15:58,750 And it's unfair when you do. So they said. And so we moved on. 154 00:15:58,990 --> 00:16:02,350 We said, okay, evidence is essential, but we need a better way. 155 00:16:02,360 --> 00:16:05,640 Something's not working out here. So now we say evidence is this. 156 00:16:05,720 --> 00:16:09,650 Unsure, but insufficient. And we developed this thing called AlphaGo. 157 00:16:10,520 --> 00:16:17,330 And this was based on the premise that the application of evidence to decisions was too technical in and of itself. 158 00:16:17,780 --> 00:16:23,810 It was too linear. It was too blunt in the decisions that it was trying to inform or make. 159 00:16:24,980 --> 00:16:28,640 Evidence didn't come in the shape necessarily of our particular hospital. 160 00:16:29,720 --> 00:16:37,670 So we said evidence from expanded domains of influence also needs to be brought into the each of the decisions that we make. 161 00:16:38,330 --> 00:16:46,760 Evidence from these expanded domains of influence include social, legal, ethical, environmental, etc., types of considerations. 162 00:16:47,270 --> 00:16:53,330 And we said we really need to somehow enumerate the upper opportunity costs. 
We need to make explicit what the opportunity cost is before decisions can be made comfortably. And in doing so, we decided we have to be able to go where the evidence does not. We also need to bring in the concept of SLEEPERS. We created this particular acronym because it's a good one: it's memorable when we're sitting around the table trying to assess the expanded domains of influence on a particular decision. The SLEEPERS are Social, Legal, Ethical (or Equity), Environmental, Political, Entrepreneurial (or Research) and Stickiness factors. All of these expanded domains of influence, we said, have to be explicitly considered before we can make the decision; otherwise these very factors in and of themselves may trump the evidence prematurely. A political factor may come in and just make the decision for you if you're not systematic about getting these factors onto the decision-making table.

I think most of the categories speak for themselves. We look at social considerations that are unique to our particular environment; we look at legal considerations; we look at ethical considerations and at equity across our hospital and our region; we look at environmental and organisational factors, which are very important when you're talking about taking up a new surgical technique or a new device; we look at political factors as they relate to our hospital or our region; and we also look at what entrepreneurial and research opportunities need to be considered. We'll talk more about that later; because we're a teaching centre, there are some opportunities in research and in developing information that are worth putting more resources into. Perhaps this last factor, the stickiness factor, confuses a lot of people, and it's not intuitive when you first see it. But the concept here is that whenever you have something already taken up into practice, it's a lot harder to take it back out of practice than it would have been to stop it from getting into practice in the first place. A lot of our clinicians in our hospital, and even our senior leadership team, will say: well, why don't we just give this new technology or this new drug a try in our hospital, see what it's like, and then we'll decide in a year. The trouble is, if we don't explicitly enumerate the stickiness factor, we run into the problem of presuming that we could just as easily undo that practice once it's already in place.

This very concept was enumerated very well by the late and great Bernie O'Brien in his paper "Is there a kink in the curve?" Bernie wrote about this, I think, very eloquently: the willingness-to-pay curve is kinked, in that people are willing to pay more to keep something they've already had and enjoyed than they would have been willing to pay for it in the first place, before they even had ownership of it. It's the sense of ownership: the sense of something being torn away from you once you've had it makes it more valuable to you once you've already enjoyed it.

So I need to tell you and show you what we do. How do we go from here, and how do we go where the evidence dares not, to better inform our decisions and smooth that technology hype curve? Well, we first need to understand what we're talking about when we talk about true cost. Has anyone seen this definition before? The true cost is the sacrifice of the consequences of the next best alternative use of resources. That's the real cost. It's not the dollar value of the drug or the device or the surgery that we're considering; it's actually the value of the next best alternative use of those resources. This concept is very well espoused and is related to Newton's third law: every decision to do one thing is a decision not to do another, just as every action causes an equal and opposite reaction. We use this as an analogy for the fact that we need to consider the benefits, the risks and the costs, but we also have to somehow enumerate what is foregone, if we're going to determine the true cost of taking up this new technology instead of all the other options that we had on the table. And we call this NoForGo, because you have to know when to go for the decision, and you have to know whether it's a no-go. These decisions have to be based on four factors: systematically reviewed, objective evidence of the benefits versus risks; the costs; those expanded domains of influence, the SLEEPERS; and what is foregone.

In doing this, what we do in our hospital is work around a trade-off table. We bring the stakeholders for the decision around the table after we've looked at the benefits, risks, costs and SLEEPERS, and we work with them to further enumerate or calibrate the SLEEPERS and figure out what the foregone benefit is. It's not an easy task, but what we have done is create the trade-off table to be like a pool table or a billiards table. Essentially it's built on the cost-effectiveness quadrant, where we have the benefit index along the x axis and the budget impact on the y axis. To get us started, what are we plotting?
For the budget impact, we plot the incremental cost of taking up the new technology, drug, device or procedure per patient in our real-world hospital setting, times the number of eligible patients we would anticipate. Then we look at the benefit index. We're talking about the trade-off table, and what we plot is something that's very practical but very much localised. So we've got our budget impact there, and then the benefit index, which is the number of eligible patients divided by the number needed to treat (NNT) to benefit. It's not rocket science, but it is localised, and it fully engages the best available evidence about benefits, risks and costs. It also brings in the foregone opportunities, which I'll show you in a moment.

When we're sitting around this trade-off table, what we do is draw in our go/no-go line. We'll come back to how we define that. But to orient yourself: now that you're familiar with the cost-effectiveness curve, you know that for increasing levels of benefit you're willing to pay increasing numbers of dollars, or devote increasing amounts of resources; that makes common sense. The corollary is that for infinitesimally smaller benefits, we're willing to pay much less, if anything at all. So anything coming in below the line means we're likely willing to put out that amount of resource for that amount of predicted benefit.

And so if we have, let's say, five different decisions to plot on this trade-off table: a decision right at the intersection of the two axes would be like the green ball, where we have really no budget impact, but really no known benefit either. The blue ball has loads of resource impact but really no proven benefit or risk. The yellow ball, out along the horizontal axis, has lots and lots of benefit but really no known resource impact; maybe there are even some savings that come along with implementation. But most commonly we have the pink and the orange balls, whereby there's some increment in benefit for some increment in cost, and we've got to make some pretty difficult choices.

So over time we've been plotting decisions on this trade-off table and making real decisions, and we now have over 40 decisions plotted, with that number still growing.
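To make the two axes just described concrete, here is a minimal sketch in Python. The function names and example numbers are hypothetical, and the straight-line-through-the-origin form of the go/no-go threshold is my assumption for illustration; the talk does not specify how the line is defined.

```python
# Minimal sketch of the two coordinates plotted for each candidate technology,
# per the definitions above. All numbers and names are hypothetical.

def budget_impact(incremental_cost_per_patient: float, eligible_patients: int) -> float:
    """Incremental cost per patient in our setting x anticipated eligible patients."""
    return incremental_cost_per_patient * eligible_patients

def benefit_index(eligible_patients: int, nnt: float) -> float:
    """Eligible patients / number needed to treat = patients expected to benefit."""
    return eligible_patients / nnt

def below_go_line(benefit: float, cost: float, willingness_per_benefit: float) -> bool:
    """Assumes the go/no-go line is a straight line through the origin whose
    slope is the resource outlay we are willing to make per patient benefited."""
    return cost <= willingness_per_benefit * benefit

# Hypothetical candidate: 120 eligible patients/year, NNT of 8, incremental cost
# of $2,500 per patient, and an assumed slope of $25,000 per patient benefited.
b = benefit_index(eligible_patients=120, nnt=8)                               # 15 patients
c = budget_impact(incremental_cost_per_patient=2500, eligible_patients=120)   # $300,000
print(f"benefit index: {b:.0f}, budget impact: ${c:,.0f}, go: {below_go_line(b, c, 25000)}")
```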
I say 40 because those are the 40 formal ones; we've done a number of informal ones too, but if I put them all on here it gets quite busy. What we plot is simply the benefit versus risk, that benefit index: how many of our patient population would we expect to benefit for the amount of resource outlay we have to put out in our hospital? The width of each ball is prorated based on the confidence intervals, or the amount of uncertainty. And the colour of the balls: if they are actually coloured, it means the decision has already been made. If a ball is grey, as with B and C up near the vertical axis at the top, it means those are decisions that we're considering today. If a ball is white, as with A, D and E, it means those are decisions that have not been taken up into practice, but which sit below the line, for which we should have said go, but for some reason we haven't yet gone and said yes.

So you can imagine, in this particular case, that we are sitting down and considering whether or not we should take up B and C. We'll call B inhaled nitric oxide for patients post cardiac surgery, and we'll call C a special enzyme replacement therapy for a patient with a very rare disorder, for whom we want to perhaps bridge over until they can get a bone marrow transplant, perhaps as definitive therapy for their very rare disease. You can see that C and B have no proven overall benefit — we have no proven benefit for either of these — and huge budget impact. This graph has been crunched down so that it fits on a page, but the budget impact is in the range of half a million dollars for both of these. Now, when we're sitting down and making these particular decisions, it's not always easy. There are a lot of uncertainties, and you can see that C, the enzyme replacement therapy, carries a lot of uncertainty, because the evidence consisted of only one randomised controlled trial with very large confidence intervals around the one benefit that was measured, which wasn't even really clinically relevant. So in this particular case, what we would do then is make a table of the different options that we have on the table today.
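As a rough illustration of the billiards-table graph just described, here is a hedged matplotlib sketch: balls under consideration grey, not-yet-adopted balls white, with ball size prorated to the confidence-interval width. The coordinates, uncertainties and line slope are all invented for illustration, not the hospital's actual data.

```python
# Hedged sketch of the trade-off ("billiards") table: benefit index on x,
# budget impact on y, ball size scaled by uncertainty. Data are invented.
import matplotlib.pyplot as plt

# (label, benefit index, budget impact $, CI half-width on benefit, status colour)
decisions = [
    ("A", 40, 150_000, 5, "white"),   # proven benefit, below line, not yet adopted
    ("B", 0, 500_000, 2, "grey"),     # e.g. inhaled nitric oxide: no proven benefit
    ("C", 1, 500_000, 20, "grey"),    # e.g. enzyme replacement: one small RCT, wide CI
    ("D", 25, 60_000, 4, "white"),
    ("E", 30, -20_000, 6, "white"),   # negative budget impact = savings
]

fig, ax = plt.subplots()
for label, benefit, cost, ci, colour in decisions:
    face = "lightgrey" if colour == "grey" else colour
    ax.scatter(benefit, cost, s=200 + 80 * ci, facecolors=face,
               edgecolors="black", zorder=3)
    ax.annotate(label, (benefit, cost))

# Assumed straight go/no-go line: $25,000 of budget impact per patient benefited.
xs = range(0, 50)
ax.plot(xs, [25_000 * x for x in xs], "k--", label="go/no-go line (assumed slope)")
ax.set_xlabel("Benefit index (eligible patients / NNT)")
ax.set_ylabel("Budget impact ($/year)")
ax.legend()
plt.show()
```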
If we are considering inhaled nitric oxide, for example, for a variety of different indications in this case, and the enzyme replacement therapy, and we have a few other things labelled here just to orient yourselves — in particular the white balls, the things we haven't taken up yet which have proven benefit and sit under that go/no-go line — then what we would do is consider all of these decisions at once. What should we take up? There was a lot of pressure to take up the enzyme replacement therapy, and there was a lot of human pressure to take up inhaled nitric oxide, especially for post cardiac surgery and post heart transplant surgery.

Once we have these balls graphed on the trade-off table, we start to talk about the evidence generally, as well as the budget impact generally, and then we move into the expanded domains of influence. I haven't shown you yet how we bring in the SLEEPERS, and we have to decide which of these we are going to say go, or no-go, to. So to bring in the SLEEPERS now — again, reminding you what those SLEEPERS are — we actually have to do surveys of the relevant stakeholders: what are the social, legal, ethical, environmental and political considerations, and the research and innovation opportunities, with each of the options before us? For those white balls we had already done this in the past; sitting at the table with these live decisions, we now have to do it for the B and C decisions, the grey ones at the top. So we survey our stakeholders using radial plots, with the SLEEPERS on each of the axes of a star plot. We survey the physicians, we survey the patient groups, we survey our senior leadership team, and we survey others in the hospital, other practitioners who are not necessarily experts in, or even proponents of, the particular technology in question.

Then we come back and we have data on those SLEEPERS, and on what the purported weight of those SLEEPERS on the decision is. And we put a price tag on the ball if there's a significant sleeper issue. In this particular case, with B, there were some significant issues that the physicians working in the area felt should trump the evidence, or the lack of evidence in this case. Those issues were things such as last-ditch rescue efforts: we hope the inhaled nitric oxide will save somebody who can't get off the pump when they're coming out of surgery, or will get their pulmonary pressures down enough for us to bridge over at least a day until we can get them onto something oral once they're extubated. And how could you say no to that, despite the fact that there's no proven benefit? So that was the significant sleeper there. And these graphs are hyperlinked, so we can actually get back in and find out what the significant sleepers are.

For C, you can imagine what the sleeper issues could be for an enzyme replacement therapy for an exceedingly rare disease for which there's only one randomised controlled trial. The sleepers ended up being things such as: we think it would be unethical to withhold this, despite the fact that the single RCT showed no benefit; we think it was underpowered to have been able to show that particular difference; and what's $500,000 for a shot at a child's life? The other underlying issues were political, and it's interesting: the decision that was made in this particular case was that we would actually say yes to the enzyme replacement therapy. And that was despite the fact that we presented the evidence and the lack of proven benefit and the huge impact on resources. The yes was based entirely on a political issue.

Because we had done this SLEEPERS analysis, that particular issue became explicit, and there was some feedback that went across the hospital and the region. Because this was made explicit, and because this was early in our NoForGo process, it really helped to shape our future decisions. The political issue was ramped up because a number of us got telephone calls, personally, to say that your face will be on the newspaper tomorrow if you don't say yes to this particular technology. The patient in this case was the child of very influential parents in our community, and our hospital was going through a very sensitive political opportunity at the time, whereby we were undergoing examination to become the centre for innovation around a particular topic. I can't really give all the details, because I don't want to put anybody in jeopardy, but it gives you a flavour of the underlying things that occur. In this case, at least, it was made explicit, and with our future decisions we found ways to better address these political trump cards over the evidence. With B, the decision was to say no to inhaled nitric oxide outside of the research setting.
Now, how does this help us actually enumerate the foregone benefit? Well, in this particular case we have an equation, which is essentially written out in words, with the numbers available for those who want them. The equation tells them: if we take up B, or if we take up C, or if we take up both, then those sleepers, which are marked by the price tags on B plus C, had better be worth more to you than the benefits that A plus D plus E could have brought us. Decisions A plus D plus E were for technologies with which we know we could have saved some lives, prevented the need for some surgery, reduced length of stay and, in some cases, improved quality of life. In this case, our senior leadership decided that the enzyme replacement therapy was worth more than A plus D plus E, but it doesn't sit comfortably with all of us.

The beauty of this is that A plus D plus E are always staring us in the face. Whenever we bring another decision to the table, those opportunity costs are always there, and we have to weigh new decisions, plus the sleepers' price tags, against the opportunities that we're foregoing, which are now enumerated. We have to decide explicitly whether or not those sleeper issues really should trump A plus D plus E, which are lives saved, surgeries prevented, quality of life improved and days of stay prevented.

So that's how we do what's called NoForGo. Again, we have to bring these four dimensions of influence into the decision making explicitly: the evidence for benefit versus risk, our local resource considerations, the SLEEPERS, and then the foregone benefit.

The feedback from our institution has been very positive. It has really taken us forward in terms of our level of decision making. It's made us a lot more comfortable in our decision making, a lot more comfortable to say no, or to change something that would have been a sleeper trump card into something that is no longer rated highly at all compared with our opportunities foregone. It works. It tends to resonate very well across the levels of decision making in our hospital, from senior leadership down to the frontline practitioners. It's multidimensional and nuanced, in that it brings in the SLEEPERS systematically, and it also brings in our very real local estimates of benefit and cost.
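Written as an inequality, the trade-off enumerated above (B and C versus the foregone A, D and E) might be expressed as follows. The notation is mine, not the talk's: V(·) stands for the value placed on a decision's sleeper price tags, and B(·) for its enumerated benefit.

```latex
% Take up B and C only if the value of their sleeper price tags exceeds
% the benefit foregone from the not-yet-funded options A, D and E:
\[
  V\big(\text{sleepers}(B)\big) + V\big(\text{sleepers}(C)\big)
  \;>\;
  B(A) + B(D) + B(E)
\]
% where B(A), B(D), B(E) are the enumerated foregone benefits
% (lives saved, surgeries prevented, bed-days avoided, quality of life gained).
```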
So, what is it that we're really buying if we take up this new technology? We think this approach is honest and accountable, because we use, of course, objective evidence-based review and systematic reviews, but we also have to be accountable to our own particular limits in resources, as well as our own particular social and organisational contexts. It also keeps us accountable because, every time we sit down at the decision-making table, those decisions are staring us in the face. If you remember our graph: when we sit down at the table next time, let's say we're now considering ball number 8 or 28 or 6, and in that case those balls would be grey. Again we'd have to ask: are these three worth more than A plus D plus E, where A and D would cost less and E would even save us money? The overall resource impact would be almost zero if we took up A plus D plus E. Are they really worth more than that?

Quite often, once you make it this explicit for decision makers, even though they were up in arms advocating for a new technology or procedure, they tend to lay down their weapons after they've been through the process, and they say: oh my gosh, I had no idea that we were not yet implementing A plus D plus E, or I had no idea that A plus D plus E would give this hospital that much tangible benefit. I'm willing to say no. I'm willing to set down my armour, set down my badges, and forego this particular opportunity, because I see the value very tangibly.

It also embeds the decision in the context of the past, the present and the future. What do I mean by that? Well, back to that graph. When we're sitting down at this trade-off table, the billiards table, our past decisions are plotted there. They're the coloured balls, but they're not permanently set in stone. If we're now deciding on, let's say, ball number 10, whereby we'd get some benefit but save some money, perhaps we should forego something that was already implemented, such as 12 plus 15. So it embeds our current decisions in the decisions that we've made in the past. This really helps orient us contextually in terms of past decisions, and hopefully helps us get our house more and more in order into the future, because we can't do it all at once. It also means that for future decisions, things that we know are coming down the line, we'll plot them in the background to show people what's coming up.

And it means that decisions, once made, are always live; they're not just a single event. Once a decision is made, if it is a no, it doesn't mean a forever no. It just means that for now that ball is placed on the graph as a no, and in the future, if there's something better to trade off against, we'll make it a yes. And that is where the extra comfort has come in for the decision makers.

So, the feedback. You remember some of the rebuttals that I brought to you at the beginning of the presentation. Now, the feedback has been huge sighs of relief from decision makers — especially frontline decision makers, but also our senior leadership team, our VPs — who say: oh, now I understand; I didn't quite get the cost-effectiveness thing before, it seemed ephemeral to me, not something very tangible; I didn't quite get the uncertainty around the evidence before, but now you've helped me bring it all together; it makes sense; I can live with that. One particular physician, a surgeon, said about a year ago: I'm not happy with the decision, but I will accept it, knowing what I've seen now. And things like: finally, you've shown me explicitly what I've tried to understand for years, and we need more of this. Now the hospital wants to put the whole capital planning process through this newfangled tool as well, which gives me a bit of the shudders, but we'll see how it goes.

So we think the tool might be dexterous. We started out in the world of drug therapies, because that's where my expertise originally lies, and we've now also increasingly applied it to devices and surgical procedures. There have been some asks around how we could use this to evaluate programs, which have a number of technologies and opportunities embedded within them. I think there are opportunities there as well, though we haven't yet done it for programs.

What are some of the surprising findings, as we close up? Well, I think you'll find this interesting. There have been a lot of surprising findings, including, from my own perspective, that before, we always thought more technology is better: more drug options on the formulary are better, and with more devices, at least having the option makes us more able to meet the individual needs of patients. But it's interesting: more and more we've come to experience that less is more, and that the more options we have out there, the more complex and confusing it becomes, the more errors occur, and the more likely practitioners and patients are to be unsettled, because as soon as they choose one option, they feel that they might be missing out on the other options that they had before them.
So that's been a surprising finding. We've also found that even though in the past we considered decisions to be more or less dichotomous — a yes or a no, or a yes with conditions (in some patients but not all, for example) — we now consider that there is also the opportunity to say yes, but only in the context of a bona fide research trial, often a randomised controlled trial in our own setting. And we've also found that our clinicians are more and more accepting that yes is not better than no; a yes is also a no.

Here's the issue about moral hazard that we talked about just a few minutes ago: whenever we say yes to something, we're trading it off against other opportunities that we could have spent our money on. And because of that, a yes to one thing is a no to something else, so we'd better be very comfortable that we've made the right yes. And again, decisions always being live has really made clinicians more comfortable to say no today; we'll see in the future, maybe the go/no-go line will change. It causes us to be accountable to the evidence, to our own resource requirements, and to the SLEEPERS, systematically so. And it causes us to enumerate the foregone benefit.

A lot of people think that we're about the bottom line when we talk about money in the context of health care decision making. But to me, money is not the bottom line; opportunities are. Money is just the proxy for opportunities. Whenever we make a decision that's going to save us money, we immediately put that money into another opportunity; it's not that we just have more money in the bank. An opportunity may present itself more than once, but you can only spend the same dollar once, so you'd better make sure you take up the right opportunity.

The other surprising finding was that we really need to subtract before we add. Meaning: if a clinician or a group is asking for something new, and it's going to be an increment in cost, then we have to ask them, well, what are you going to give up? Because we can't implement something without extra dollars in our bank, and we truly don't have extra dollars in our bank. So we actually have to get them to subtract something before we add another new thing.

As for future efforts, we'd like to further develop NoForGo in other settings and other contexts, because we've learned so much more about its capabilities when doing so, and it gets further refined when other people are using it.
We'd like to further develop it for considering research opportunities, and our research wing is very interested in doing so, because a number of decisions really should be a yes, but only in the context of further research, if the value of ascertaining more information is worth it once we graph it on the map. We'd also like to get better at our decommissioning opportunities. For a number of the coloured balls that you saw on our graph, you can see that they are not lying below the go/no-go line, because they were decisions that preceded our ability to address decisions systematically. So now we need to look back at those and get better at disinvesting. And the SLEEPERS become a huge issue there, particularly the S at the end, the stickiness factor: you have to be very careful to enumerate what it's going to take to undo a practice, and you have to be very honest about that.

Our region is interested in using it, as is our Ministry, and other provinces' ministries have contacted us about using it as well. We'll see how that goes. And now we're moving more and more into the area of less tangible technologies, such as procedures in and of themselves, which bring a whole new raft of sleepers with them.

So, in summary, I hope I've shown you that hospital HTA is a good petri dish for trying something like this out before we try it in a broader range of contexts. Traditional decision making has failed us in the past, so we moved on. Traditional evidence-based decision making can be very linear and doesn't necessarily take on the nuances of the decision at hand systematically, so we created this to do just that, to answer that gap. And we found that if we're to engage rather than alienate our decision makers, whether the VPs in our hospital or frontline practitioners, we need to make HTA and evidence-based medicine relevant, explicit and tangible. That's why we draw the graph. The trade-off table provides a framework to explicate the evidence and its uncertainty, by toggling the width of the balls, while acknowledging the SLEEPERS and explicitly enumerating the foregone benefit. We found that if we don't make this explicit, then the sleepers often trump very important decisions; they often trump the evidence, even when there's very clear evidence that something does not work. Sometimes the politics just takes over. Making the SLEEPERS explicit, and the foregone explicit, has been our saving grace in terms of comfort in decision making.
And you'll notice that there are a lot of things that NoForGo espouses and incorporates and tries to bring together at the decision-making table. Yes, evidence-based decision making is there. Health economics is there. And then there's some flavour of game theory, of program budgeting and marginal analysis (PBMA), and of accountability for reasonableness, which you may want to read more about. All of this makes us accountable and robust in our decision making, makes us live within our means, and makes us smoother on that technology hype cycle. So we have a way to address the over-hype and the under-hype explicitly.

And with that, we'll close. We're drowning in information while starving for wisdom; we hope that this process has gotten us closer to wisdom. But again, it is not perfect. Many thanks for your attention, and for your patience with my voice. Now I'd like to take some time for feedback and discussion and to hear your thoughts. Your comments and feedback are most welcome here, and by email as well, if you'd like to do that over time. Thank you very much indeed.