According to my watch, it is 5:31 in the hall, so that tells me a lot. I can't think of anyone better to give a guest lecture on knowledge, and he actually thinks an awful lot about knowledge and interaction. I'm going to resist telling all sorts of anecdotes about the medical doctor; his work applies right across the board, and he's a real think-outside-the-box person. I collect people who think outside the box, and that's the main reason I brought him in: not just because he's a very clever professor, which he is, but because he is going to encourage us to think outside the box tonight. You can fill in the gaps I haven't covered about anything relevant you think we need to know before you get into this amazing lecture, which I've seen twice before. I'm looking forward to seeing it again.

Yeah, thanks. I think I've got a microphone on here. Okay, can you hear me? Yeah, great. Okay. Well, thank you for the invitation. I'm going to run through quite a lot of stuff in about an hour. Perhaps I'll just stand at this side, at least for the moment. I'm not going to fill in any more gaps about me, because this is more about how we go about making decisions.

So what would be really great would be if you could dig out a piece of paper and a pen. If anybody needs any paper, here's some paper; pass it backwards. This isn't going to be hard stuff. I know some of you have been working incredibly hard on this course this week, and it's day four and 5:30 at night. So this is going to be more towards the entertainment end of things than the education end of things. But I hope you'll find it interesting, and I hope there's a few things to think about. It's just more fun if you join in, so be prepared for a bit of interactivity rather than just PowerPoint and me talking.

So let's start with familiar territory. Here's the Sackett definition of evidence based medicine from 1996. I think it's heading for 30,000 citations, so there must be something in it. Now, I came here in 1994, when Dave Sackett was still here. We had a party at Martin Dawes' house with a swimming pool; I remember that was the greatest thing. Paul Glasziou was here, and it was his first time as a group chief, so it was that long ago. The point about this definition is that evidence based medicine started out all about making decisions for individual patients. The emphasis in red is mine.
But this word: the task set was better decision making for individuals, interestingly enough. Did we answer that question? Well, three bits of prologue. We were trying to drop the research evidence in alongside traditional clinical expertise and what the patient would consent to, even if we didn't do a tremendous amount at that time about getting values and preferences in explicitly, in the way that we might try to do nowadays. The task was to do that better, to have better consultations. That was the challenge from that BMJ editorial of 1996.

Did we answer it? Well, I'll steal this analogy from Don Berwick. This is a bridge in Honduras. Honduras is that little country in Central America. They get a lot of rain in Honduras; there are lots of rivers. And in the 1930s they needed to build bridges, because canoes weren't really cutting it for the economy. They built this bridge, and the bridge was fantastic. It stood up to all sorts of weather. The traffic went backwards and forwards every day, until in 1998 they got a Category 5 hurricane smack on the nose of Honduras, one of the worst hurricanes there has ever been on the planet. And this is the serious bit: it was absolutely awful. Tens of thousands of people died, hundreds of thousands were made homeless, and the economy was decimated for a decade. But the bridge stood. The civil engineers had done a fantastic job; there wasn't any malpractice in the construction. The bridge stood up to Hurricane Mitch. The only problem was that the river moved.

So the bridge answered the first problem. Now we've got a different problem, and we need to look again at what the challenge is. One of the first things the evidence based medicine movement did was start collecting all this research together (some of it was fantastic and some of it wasn't very good) into evidence summaries, some of which were called guidelines. Some were done very well, some were done okay, and some weren't done very well at all. But instead of answering the complex question of how we do consultations which incorporate those three elements, what we did was answer a simpler question: how do we collect the evidence together into evidence summaries and guidelines? And we did that quite a lot. There are now over 230 guidelines at NICE, where I used to work. And we've developed a different problem now.
We have guideline overload and information overload, and all of it is based upon what's best for the population, not what works best for the individual sitting in front of us at this moment in time, with that clinical problem, today. So that's prologue bit number one.

Prologue bit number two features the famous American ice skater Dorothy Hamill. Dorothy Hamill is advertising a drug called rofecoxib. Rofecoxib was a sort of designer non-steroidal anti-inflammatory drug, produced in the hope of reducing one of the major side effects of the non-steroidals, which is gastric ulceration and bleeding. And indeed, the original research that was published showed a reduction in the rate of GI complications with rofecoxib compared with its comparators. There was also a signal in the major randomised controlled trial that it might increase cardiovascular events by a small absolute amount. The advertising budget in the States, where they have direct-to-consumer advertising, was bigger than Budweiser's or Pepsi's: 160 million, largely on television. And people were going to their doctors and saying, can I have this new drug called rofecoxib? But they stopped doing that when the makers withdrew rofecoxib from the market, because it increased the rate of cardiovascular events.

Anybody have a guess at how many premature or avoidable cardiovascular events were estimated to have occurred in the United States associated with rofecoxib? What do you think? A hundred thousand: higher or lower? Higher, about 140,000 to 160,000, with a 40% mortality rate. This was jumbo jets crashing; a phenomenal thing. The total in the UK is estimated to be much less than that, but still a couple of thousand. In my time at the prescribing centre we wrote a summary which said use these things cautiously, there might be a bit of a signal here. I'm not claiming that we did anything special, but that's the context.

And of course with that, people started to look at the more traditional NSAIDs, particularly one called diclofenac. Diclofenac has a molecular structure very similar to rofecoxib; it even has a COX-2 selectivity identical to rofecoxib. And this is the prescribing of diclofenac. As this work was being analysed, almost half of all the prescriptions written by every GP in England for an NSAID were for diclofenac. This is 2005-06 here, and this is 2013.
Over here, this line, one, is the time when the systematic review was published in the British Medical Journal saying diclofenac increases cardiovascular events by three per thousand per year: you might want to think about your prescribing. The evidence suggests the same is true of the other NSAIDs, apart from ibuprofen and naproxen; ibuprofen at standard doses and naproxen at about 1,000 milligrams a day don't appear to increase the cardiovascular event rate. Note the dramatic effect, friends, on the prescribing of diclofenac once that signal is published in one of the world's leading scientific journals. And line two is where every prescriber gets a letter from the MHRA, the UK drugs regulator, saying you might want to think about your prescribing of non-steroidal anti-inflammatory drugs and think about using ibuprofen and naproxen rather than things like diclofenac. This is the point at which me and my prescribing advisers went about talking to people in GP practices and in local groups.

Note how long it takes for the use of diclofenac to start dwindling downwards, and note that the North East bit of England gets the message quite early and leads the pack; the West Midlands thinks about it for a bit and comes down quite nicely; South East Coast does almost nothing for about three years and then starts to get hold of it. What do you think happened at point four? A lawsuit? No. Actually, we started having QOF points for prescribing, so GPs got money for doing the right thing. And at point five, the money got taken away.

So you look at that sort of data and you think, these people aren't very bright, are they? Isn't that the natural reaction? These people are pretty uncaring, they're not committed, they're not working hard enough at doing the right thing for their patients. You look at that prescribing data and you think these people need a good slap. But of course (I'll do it one more time, it's great, isn't it), but of course that's not the case. It's just a bit more complicated than telling people what to do. Changing what you do is particularly difficult. In the last twelve months we've had the new NICE guidelines for atrial fibrillation, which say don't use aspirin on its own just for stroke prevention, and of course everybody's brain says that can't possibly be true, because I've used aspirin for years for that indication.

So we might want to think about prologue number three, which is Joan. How many people here see patients? Hands up. Okay.
So the rest of you will have to imagine that you see patients. Joan is 84. She's had a couple of heart attacks. She's got well-controlled heart failure; she's at the far end of the survival curve, a survivor with heart failure. She's had heart failure for about eight years now and is still pretty much symptom free. We think she's got osteoporosis, because she's had a couple of low-impact fractures: she's fractured a hip and she's broken her wrist. She gets bouts of gout occasionally. Her renal function isn't great, but then nobody's renal function is great at 84; her eGFR runs in the mid-twenties. There's a list of medicines here, and she now has a new problem: she's developed atrial fibrillation. Her risk of stroke is certainly getting towards the top end. For the technical people here, her CHA2DS2-VASc score, which I've calculated for you, is five, and her bleeding risk score is three. So what do you think we should do for Joan, those of you who do this? What's the thing you'd want to cover in this consultation?

It's the stroke thing, isn't it, really? Atrial fibrillation increases your risk of having a stroke phenomenally. If you have a stroke due to atrial fibrillation, a bit of clot forms in the heart and flies through the bloodstream to the brain. It's a really serious, horrible thing to happen, and potentially preventable by anticoagulation. So I do this in workshops of clinicians and I say, okay, you might want to discuss these sorts of numbers with Joan. If there's a thousand people like Joan, how many will have a stroke in the next year? If there's a thousand people like Joan, how many will have a major bleed in the next year? Because if we anticoagulate her, her risk of having a complication (bleeding from the gut, or perhaps into the back of the eye, or somewhere else) goes up, because we've in effect thinned the blood. And how do those numbers change if Joan is anticoagulated?

Anybody want to have a guess? If there's a thousand people like Joan, how many will have a stroke in the next year, people with a CHA2DS2-VASc score of five? Fifty? Yeah. Anybody else? I've done this in workshops with loads more clinicians; the numbers people estimate range from 1 to 100. I've done it with the staff from a regional stroke unit, and their numbers were as high as 200 to 300. Interestingly, if you see lots of strokes, you think the risk is much higher than if you don't see lots of people with strokes.
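Joan's score of five, quoted above, can be reproduced with the standard CHA2DS2-VASc items (heart failure, hypertension, age 75 or over, diabetes, prior stroke or TIA, vascular disease, age 65 to 74, female sex). What follows is a minimal sketch, not anything shown in the lecture: the function and argument names are made up, and because hypertension and diabetes aren't mentioned for Joan they are assumed absent here.

```python
# Standard CHA2DS2-VASc weights: heart failure 1, hypertension 1, age >= 75 2,
# diabetes 1, prior stroke/TIA 2, vascular disease (e.g. previous MI) 1,
# age 65-74 1, female sex 1.

def cha2ds2_vasc(age, female, heart_failure, hypertension, diabetes,
                 prior_stroke_tia, vascular_disease):
    score = 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 1 if female else 0
    score += 1 if heart_failure else 0
    score += 1 if hypertension else 0
    score += 1 if diabetes else 0
    score += 2 if prior_stroke_tia else 0
    score += 1 if vascular_disease else 0
    return score

# Joan: 84, female, heart failure, two previous heart attacks (vascular disease).
# Hypertension and diabetes are assumed absent purely for illustration.
joan_score = cha2ds2_vasc(age=84, female=True, heart_failure=True,
                          hypertension=False, diabetes=False,
                          prior_stroke_tia=False, vascular_disease=True)
print(joan_score)  # 5
```

On those assumptions, two points for age over 75 plus one each for female sex, heart failure and vascular disease give the five quoted above.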
But it would be good to actually know those numbers, and to be able to have a conversation with her about the risks and benefits and the options. It would be good if we knew pretty closely what those numbers were, but we don't, really. So that's three things. We have to work out what the evidence is, find it, and apply it to people like Joan, not just curate it into guidelines. And then we have to work out why we find it so hard to change clinical practice, to make different decisions from the ones we're used to making. That's setting it up, everybody. Okay.

So this is where you need your piece of paper and a pen. Everybody ready? A list of words follows. Look at them once, but don't reread them. When you've read the list, close your eyes. Nothing horrible happens; there's no trick coming. Closing the eyes is just a signal for me, so I know we can move on to the next bit. Everybody okay? When you've read the list, close your eyes.

Pretty much everybody there. Write down as many words as you can remember. Come on, we've got some of the most intelligent lifeforms on the planet here. Come on. Anybody need more time to copy from their neighbour? So, who's got 20 words? No? Okay, four-ish? Three, four, five? Yeah, that's the norm. And anybody think they're doing quite well compared to the peer group, seven or eight? Yeah, one or two. Okay.

Just take a look at the list again, everybody, and see if any of you have misremembered a word. Hands up if you've misremembered a word. Yeah, quite a few. Now, that's interesting, isn't it: how easy it is to mix things up. Second thing: how many people are As and Bs, people predominantly remembering words from the start of the list? How many people are Ds and Es, people predominantly remembering words from the back end of the list? And how many people are all over the place, with no true pattern? If you have a big enough sample, it's about a third, a third and a third. So that's interesting as well: you have an in-built propensity for remembering things from the start of something, or from the end of something, or scattered all over. But the whole point of this is that nobody can remember everything. Your cognitive ability is bounded. It's not infinite. And if you think that's a bit harsh, those of you who do clinical work...
The list of potential drugs that interact with warfarin runs to four slides. I've not put it up, and I'm not going to ask you to write it down. But somehow clinicians find their way through this, and we don't get as many complications from this sort of stuff as you would think, although warfarin sits up there as one of the top causes of iatrogenic admission to hospital: admission caused by taking a medicine, or by something that's been done to the patient. So that's the first thing.

Who's bought a car in the last three years? Okay, since you're close to me, tell us what you did to help you make a decision about buying a car. Well, actually, it was my husband who bought the car. Okay. He didn't do his homework anyway, near enough. So what I did, after he'd paid for the car (evidently he bought it while I was overseas), I went on to all the other car websites: how many kilometres they had, how old they were. Okay, so this would be a summary sort of website, a What Car website or a consumer one? Yes, same sort of thing; I went to Choice, and the RACV one, in my country. Okay. And then I looked at the insurance: how much would you pay for that? Okay. I've not had that one before.

Somebody else: who else wants to tell us what they did about buying a car? Go for it; let's go over here. We don't want to spend all night, because we want to get to other stuff as well, but please tell us. It's on. I used Auto Trader, and the filters that let you say what kind of car you want, what type of engine, what type of transmission. Just let me interrupt you for a minute, okay? How did you decide what variables you were going to put in? So you were looking at price and safety; how did you decide on those things? Because you obviously made choices around those things. My experience with my previous two cars: I was looking for something similar. So, yeah, it ended up actually being the same car. Yeah, okay. Anything else? Did you do anything else in terms of helping you make a choice? Oh, no, I pretty much set it all on the filters, the choices, looked at the results, and if one's not too far away, I'll go and look at that one. Yeah, exactly.
Brilliant, thank you very much. That's pretty typical. We select a few things that are important to us: size, cost, colour, various bits, safety; fuel consumption is a common one. I've never had anybody say they went to the Society of Automotive Engineers website and read the systematic review on engine testing theory, now in its third edition at 700 pages. You made a perfectly good decision without ploughing through the technical engineering jargon. People don't go for laminated glass design considerations for vehicle door systems, let alone the 40-minute PowerPoint presentation of it. We truncate stuff; we make quicker decisions. We don't go to these detailed things. And arguably it wouldn't have helped you to make a better, quicker, easier, safer decision about which car to buy if you had read 700 pages of mechanical engineering about car engines. You see my point: we can get by without the 700 pages.

So Herbert Simon won a Nobel Prize for describing the fact that your rationality is bounded: that it is simply not possible for you to read, hear and see everything, learn it, store it away and recall it in context just at the point at which you might need it. Most of the time you make a good enough decision. You collect sufficient information to reduce your uncertainty to the point at which you're able to make a decision. That's an important phrase: you're looking to reduce uncertainty to the point at which you are able to make a decision, and once you reach that point, you can stop and decide. It's called satisficing. And if you do this work with general practitioners, they can get through the week satisficing for every problem that they face, barring one or two. And then they do the alternative: they temporise, or do some investigations, and they'll say we'll give it a few days, and then they'll go away and do some more information collecting. That's called maximising. It may be from other people; it might be from the internet these days; in times past it might have been from books. So satisficing is the normal way we go about making decisions, and maximising is a rare approach. And this Herbert Simon work is counter to the notion that all we need to do to make better decisions is to have all of the information. That's palpably not true. It has implications for fields like artificial intelligence, which was part of Herbert Simon's work, but we don't need to go there now.
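As an illustration of the stopping rule just described, and only an illustration rather than anything from Simon's own formalism, here is a minimal sketch in which a satisficer consults information sources only until uncertainty drops below a "good enough" threshold, while a maximiser consults everything available. The source names and the numbers are made up.

```python
# Illustrative only: each "source" reduces our uncertainty about the decision
# by some assumed amount between 0 and 1.

def satisfice(sources, initial_uncertainty, good_enough=0.2):
    """Stop gathering information once uncertainty is low enough to decide."""
    uncertainty = initial_uncertainty
    consulted = []
    for name, reduction in sources:
        if uncertainty <= good_enough:
            break                      # good enough: decide now
        uncertainty -= reduction
        consulted.append(name)
    return consulted, uncertainty

def maximise(sources, initial_uncertainty):
    """Consult every available source before deciding."""
    remaining = initial_uncertainty - sum(r for _, r in sources)
    return [name for name, _ in sources], max(remaining, 0.0)

sources = [("what-car style website", 0.5), ("insurance quote", 0.3),
           ("owner reviews", 0.1), ("700-page engineering textbook", 0.1)]

print(satisfice(sources, initial_uncertainty=1.0))  # stops after the first two sources
print(maximise(sources, initial_uncertainty=1.0))   # reads everything, including the textbook
```

The point of the sketch is only that the satisficer's answer is reached with a fraction of the information, which is exactly what the car-buying stories above describe.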
If you look at how this blends into clinical practice, this is a quick bit of work published in The Lancet by a couple of junior doctors, who looked at the patients they'd admitted under their care in the previous 24 hours on call. It was 18 patients with 44 diagnoses, so lots of them were like Joan, with more than one thing wrong with them. They calculated that if they were to set out to read the relevant guidelines pertinent to those 44 diagnoses, published only in the last three years in the UK, just by the relevant Royal Colleges and NICE, they would have to read 3,679 pages; and at two minutes' reading per page, the physician on call would need to read for 122 hours.

So there's a different way of getting through this. It is not simply that more knowledge equals easier decision making. There's clearly something going on, as you saw with the diclofenac stuff. Which brings us to John Gabbay and Andrée le May's mindlines work; those of you who've been on the course this week will have heard about mindlines already, I think, from a systematic review. This is the quote I like to put up: clinicians rarely accessed, appraised and used explicit evidence directly from research or other formal sources. The rare exceptions are where they need to go and maximise; Gabbay and le May didn't write that bit, but that's what they meant. Instead, they rely on what we call mindlines. So, in contrast to guidelines, mindlines are patterns, pictures in our heads: collectively reinforced, internalised, tacit (so not necessarily written down anywhere) guidelines, informed by brief reading (think about where you got your information for your cars: brief reading), but mainly by interactions with each other and with opinion leaders, patients, pharmaceutical reps and other sources of largely tacit knowledge that build on early training. So patterns are ingrained, and then we use the patterns.

And that leads me on to the fact that simply constructing guidelines isn't enough. There's clearly a community of practice involved in mindlines, before we even get to the consultation with Joan and the individual bits and the complexity of all of that. Three different translations, and they all require different methodologies. We've worked on that first one a lot, not so much on the other two.

So, second theory. We've done bounded rationality. It's great being a professor of evidence-informed decision making: there's only two things you need to know. One is bounded rationality; the second is dual process theory. Who knows the story of Noah in the Bible? Come on, don't be shy.
I saw you twitch. Just prepare the brain now: tell us briefly the story of Noah in the Bible. Well, as I remember it, something to do with people misbehaving generally. Yeah, and Noah had been well behaved. And therefore God said to him, you'll be safe. Yeah. And the raven went out with a thing in his mouth some time later. Yeah, 40 days and 40 nights; the world would be flooded; and then he sent out some birds, and then a dove, and the dove comes back with, what was it, an olive branch? Yeah. And that was the signal that the flood was receding, and everybody's happy. Everybody remembers the story of Noah now. And it was done by Russell Crowe. Yeah, Russell Crowe, and Jim Carrey; do you remember the Jim Carrey film? Yeah, two of every kind.

Okay. Imagine you're working as a doctor in a remote village. It's the weekend; there are no other health care professionals around. But you do have a new piece of technology called the Marvel Tron. The Marvel Tron will save the life of any patient you're treating, but you have to answer correctly the question the Marvel Tron asks of the attending doctor before it works its magic. A young child is brought to you, seriously ill, and it appears she'll die imminently. You switch on the Marvel Tron and await the question. You must write down your answer immediately the question is asked, or the child will die. Pen and paper ready? Importantly, you'll get no blame if you get the answer to this question wrong; you'll only be blamed if you don't try to save the life. This is the Good Samaritan as well as the Noah story. So, are you ready? Here it comes. According to the Bible, how many giraffes did Noah take into the ark? Answer quickly, write it down, the child is dying.

Answered? Let's cut to the chase, because time's getting on. How many people wrote down two? Yeah, pretty much everybody wrote down two. So where did you get that evidence from? A Bible story; somebody told it to you once, a long time ago. Anybody else, where did you get it from? Probably just assumed it; it wasn't something you looked up when you were asked. Yeah, okay. So we just know it; it's in there somehow. We know the animals go in two by two. There's even a Disney song.
Let's check with the guideline: "Of every clean beast thou shalt take to thee by sevens, the male and his female: and of beasts that are not clean by two." So, are giraffes clean beasts? Clean beasts have cloven hooves and chew the cud. Here is the cloven hoof of a giraffe; here is the cloven hoof print on the desert floor; here is the YouTube video of giraffes chewing the cud. The correct answer is 14, not two. Fourteen, not two. And at this point somebody is going to get their iPhone out and start Googling Bible.com, because your brain is saying that can't possibly be true: I know the animals go in two by two; the pattern is imprinted in you. I promise you the answer is 14. I've even had an Orthodox Jewish pharmacist get out his iPhone and Google the Torah in Hebrew, and he teaches this in the synagogue. After about five minutes he said, gosh, Neal, you're right: even in the Torah it's 14, and I've been telling everybody it's two by two.

Okay. The point is, once we know something, once we've had it imprinted several times, it sticks with us, and it's really hard to do something different. Pattern recognition is actually part of our DNA. We're all descended from about 5,000 ancestors living in Eastern Africa millions of years ago, and their pattern recognition had a survival advantage. If you could recognise a berry that you could eat from one that would poison you, a friend from a foe, or an animal that you could chase and catch and eat from one that would chase and catch and eat you, there was a survival advantage. And as I'll demonstrate shortly, this pattern recognition is so strong in us that we can't actually turn it off. So once we've started prescribing diclofenac, it's really difficult to stop. Once we have a pattern of behaviour in our minds, it's terribly easy just to go along with it, and it's actually an advantage. I was in Amsterdam last week, staying at a hotel I'd never stayed at before, and I didn't know how to find my way through the transport system. Very difficult. Once I'd done it once, the second time was easy. A classic example of how easily we acquire patterns.

And this has been pulled together into what's called dual process theory. So in medicine, if a patient presents, the pattern-processing function that we have (it's not an anatomical site within the brain, it's just the way the brain works) kicks in. If we've seen it before and we recognise what it is, this is fast, intuitive; we can't switch it off, we don't have to think about it, it just happens, and we can come up with the diagnosis based upon pattern recognition.
If you're in undergraduate training or your GP training, or this is an unusual condition that you've never seen before, then you don't recognise what it is. This Type 2 decision-making process requires effort. It's uncomfortable, it takes time, and it's a struggle. Some days every surgery runs like this and it feels like a dream; some days every surgery runs like this and it's like walking through treacle. Sometimes new information comes in and it rationally overrides the first impression we have. And sometimes, despite new information being available, the previous patterns that were imprinted irrationally override the new information that comes in. Think of the MHRA warning about NSAIDs. It's not that people are lazy; it's that they happen to be human beings, and they find it difficult to shift patterns.

Okay, so we'll do a bit of this in a second. That's dual process theory. In an emergency situation, a trained clinician recognises what the emergency is and provides the lifesaving treatment straight away: fast, quick, lifesaving treatment. And don't think that System 2 is always going to be better than System 1. Optimal decision making is actually pattern recognition with a few stop-and-start rules, some alternatives to just going with the pattern that's there. Gerd Gigerenzer has done lots of work on this, from Berlin, but I'm skating over the academia. So let's do a bit of this together.

All together, big breath in. I want to hear you. Big breath in: what's the answer to this sum? That was useless. We'll do it again. Big breath in: what's the answer to this sum? Thank you, that's better. Okay, so when I put that up the first time, you didn't have to count on your fingers; I didn't see anybody getting their socks off. It was known to you. The answer was four before you came in here, and it's still four now; I promise you, it hasn't changed. Two and two is still four. All together, big breath in: what's the answer to this sum? Okay. So that one requires System 2 processing, of a fairly simple nature. But there are some rules of thumb that would help you make a more rapid decision. If we said that was about 80 and that was about 60, then 80 times 60 is about 4,800. So if I was trying to tell you that the answer was 10,000, you'd be able to tell me fairly quickly that I was wrong, providing you knew the rules of thumb, which in this language are called heuristics.
So if you know the tricks, you can get there. That's what I mean about stop-start rules of thumb. Okay, so what's this? Make the diagnosis. Big breath in. Everybody recognises Margaret Thatcher. So if you're a clinician and Margaret Thatcher walked into your consulting room, you'd be able to recognise Margaret Thatcher. Yeah. So how many people who recognised her have actually met Margaret Thatcher? So this is pattern recognition, isn't it. Of course you have the pattern recognition; it's imprinted on you. You've only seen pictures in papers, in magazines, on the television, and most of those pictures were a long time ago. The funeral was a few years ago, but she was prime minister; 1990 was when she left office. And yet our pattern recognition as human beings is so brilliant that there is no problem in recognising Margaret Thatcher. We take it for granted, actually, that we can do that. It's an amazing skill. If you don't see a distant relative for ten years but you happen to come across them at a wedding or a funeral or a christening, you straight away recognise Aunty Minnie, probably from just the way she walks rather than the fact that you've seen her face. It's a fantastic bit of human skill, that pattern recognition.

So if I put this up in front of trained clinicians, they'll say what it is right away. You don't have to go away and work it out; you know that this is shingles. Put it up in front of first-year medical students, nursing students, anybody, and they really struggle to get to grips with it: no idea what it is, unless they've seen it before in a relative. And emergency treatment: there's a seriously ill adolescent. What does she need, clinicians? Okay, so that's dual process theory: you've just potentially saved a life. This is meningococcal septicaemia. Antibiotics into this seriously ill child as soon as you can, please, and lots of other stuff as well. You don't need to go to the books and start working out what this is, or Googling what it might be. We know it straight away, and it's extremely useful pattern recognition.

What's this? Okay, I'll take you back to medical school. I'll become the professor of cowology. I've spent 20 years recognising, treating, investigating and researching that rare but serious condition where a human being has turned into a cow. Note carefully, students: the brow, the eyes, the jawline, the nostrils, the ears.
And we'd do lots of stuff in lecture theatres and in small group learning, and I'd set some tasks to go and do self-directed learning. And then I'd take you on to the ward at the John Radcliffe, where a patient had been admitted the previous night under my care, and you'd say, what a great teacher that prof is, because the patient in the bed looks like this, just like the one we learnt about over at the medical school. Note carefully, students, I would say: the brow, the eyes, the jawline, the nostrils, the ears. And we'd do a bit of additional work, and we might see two or three patients, and I might set the odd test. And then we'd come to your final examination, and I'd set you the task of differential diagnosis, and provided you could correctly identify the patient that's turned into a cow, despite it being a slightly unusual setting, I might let you go off and be a doctor.

And there'd be some point to this, because a couple of years later you'd be working in the A&E department and a patient comes in looking a bit like this. You'd have a look at the patient in the cubicle in the accident and emergency department, and you'd sort of squint your eyes at that, and something might pop up into your brain if you're lucky. And you might go and talk to the registrar and the consultant and say, that chap over there in that cubicle, I think he might have turned into a cow. And you'd be right. And your self-confidence would go up, your esteem amongst your colleagues would increase, and you'd progress through your career. And then a couple of years later, somebody comes in looking like this.

And gradually, if you keep looking at this person in the cubicle, you might spot it: that's an ear, and that's an eye, and that's a jawline, and nostrils, another ear. Now I've got it, and now you can't un-see the cow on the screen. Okay. That's how fast the brain can start to recognise patterns. Importantly, look at the screen and close your eyes. Now open them again. Now try not to see the cow, because now it's imprinted. So if I wanted to get you to write out a prescription for a new drug, I'd get you to try it out two or three times. When you've written it out three times, you'd write it out for the fourth patient, the fifth patient, no problem whatsoever. The imprinting happens that quickly.

So this is how it works in terms of education, since we're in an educational centre.
If I asked you to drive me home to the north-west of England tonight and you'd never driven a car before, I'd hope you'd say no, because you're consciously incompetent: you know you can't drive a car, because you've never had a driving lesson. Please say no. But if you'd had some driving lessons, it would be hard. You'd have to put some effort in; it would be System 2; it would be slow. But by the time you got to your driving test, provided you thought about what you were doing whilst you were doing it, you might just be consciously competent. Now, after lots of practice at driving, System 2 has gone. It's System 1, with repetition, and you can drive a car whilst working out the shopping list and thinking about what's happening with the kids and what's going on at work today and all of those sorts of things. I wasn't thinking, coming here today, now it's time to change gear, now I need to press the clutch, and so on. The problem is that sometimes things change. Somebody steps out in front of you, or the lights alter, or the road layout changes. You're using your System 1, and at that point you're unconsciously incompetent: you think you're okay, but you're not. And if this is a situation that needs re-learning, you have to recognise that, become aware of it, and that requires effort. Just recognising that you need to change requires effort. And then you have to go back and learn again, to write ibuprofen or naproxen instead of diclofenac. And that's hard, because it requires effort.

What goes wrong with this pattern recognition? Steve is shy and withdrawn, invariably helpful but with little interest in people; he has a need for order and structure and a passion for detail. Which of these is it most likely that Steve is? What do you think? Librarian? Pharmacist? Yeah, librarian or pharmacist is what most people are saying. I know quite a lot of librarians; I know even more pharmacists; I agree with you. How did you reach that conclusion, though? Yeah: you created a picture from the text up here. You don't need a picture shown to you; you can create pictures that easily, just by reading some text. And then you apply that picture to the alternatives that are down here and see which one matches best. This is a cognitive bias. It's called base rate neglect, because in most countries, in most populations, the number of farmers is much greater than any of these other options.
447 00:46:38,000 --> 00:46:46,970 So even if a small proportion of farmers had these introverted traits, it's still more likely on the numbers that Steve's a farmer than any of these. 448 00:46:48,650 --> 00:46:55,340 The pattern recognition led you astray, and there's more than 100 of these. 449 00:46:56,600 --> 00:47:07,700 So anchoring sorry, anchoring bias, an early salient feature in a consultation, is something that a clinician might fix on. 450 00:47:07,940 --> 00:47:16,700 And despite continuing queues emerging throughout that consultation, they stick with the first impression that comes to mind. 451 00:47:18,110 --> 00:47:21,469 Ascertainment bias thinking shaped by prior expectations. 452 00:47:21,470 --> 00:47:25,250 So I'm a GP in Scarborough. Years ago I was on call. 453 00:47:25,250 --> 00:47:29,569 It's 3:00 in the morning on a Sunday morning. The night clubs are turning out. 454 00:47:29,570 --> 00:47:31,940 That was quite, quite early for these days, isn't it. 455 00:47:31,960 --> 00:47:37,040 But but the night clubs were turning out at 3 a.m. and there's a guy staggers across the pavement, 456 00:47:38,450 --> 00:47:43,190 trips at the side of the gutter and falls in the road. And I very nearly run him over. 457 00:47:43,760 --> 00:47:48,380 What's the diagnosis? He's drunk. Yeah, he's actually had a stroke. 458 00:47:49,880 --> 00:47:58,400 But I thought because of where I was and where he was, time of day, I thought he was drunk, too. 459 00:47:59,990 --> 00:48:02,150 Recent experience dominates the evidence. 460 00:48:02,180 --> 00:48:09,950 So if you're a doctor and you've prescribed an evidence based treatment for a patient and it's not something you're familiar with, 461 00:48:09,950 --> 00:48:19,850 the patient develops a horrible side effect. How hard is it to then write out the prescription for the next patient for whom it's indicated? 462 00:48:20,600 --> 00:48:25,050 Yeah. Recent experience dominates what the evidence says. 463 00:48:26,150 --> 00:48:31,220 It's just pattern recognition all the time. Well, run, run through all of these. 464 00:48:31,970 --> 00:48:35,140 Let me go with this. This one, the gambler's fallacy. 465 00:48:35,150 --> 00:48:40,860 So I've seen three people recently with chest pain just in the last couple of weeks there. 466 00:48:41,630 --> 00:48:48,530 They've all had heart attacks. I would normally say, you know, two or three of these a year at most. 467 00:48:49,820 --> 00:48:53,870 Here's a fourth patient with crushing central chest pain going down their arm. 468 00:48:54,080 --> 00:49:01,700 They're sweating and they're feeling sick. My brain is saying this can't possibly be a fourth person with a heart attack in two weeks. 469 00:49:02,480 --> 00:49:07,700 That just can't be the case. Whereas, of course, if I stand back and think about it, 470 00:49:08,390 --> 00:49:16,250 the presentation of this patient now cannot possibly be connected with the three other patients who've had heart attacks in the last two weeks. 471 00:49:16,370 --> 00:49:19,920 There can be no connection whatsoever. It can't just be. 472 00:49:19,940 --> 00:49:25,700 Look, I search Satisficing. 473 00:49:25,700 --> 00:49:33,510 So the most commonly missed fracture on an x ray in the A&E department is Sky for now. 474 00:49:36,530 --> 00:49:42,410 Colleagues now note the most commonly missed fracture is the second fracture. 
Search satisficing. The most commonly missed fracture on an x-ray in the A&E department: the scaphoid? No, colleagues, the most commonly missed fracture is the second fracture. Because you look at the x-ray of whatever's hurting, you see a fracture, and you say, great, I've found out why this patient's hurting. And that can have serious consequences if they've got four fractures in one hand, because that needs the hand surgeon now, because that's an occupationally threatening set of injuries, whereas a crack in one metacarpal is probably a back slab and see you in fracture clinic on Monday. A completely different situation. So, lots of these cognitive biases, and they all have a basis in pattern recognition: I know what it is, and I recognise it.

Who's this? Jesus, straight away; it just looks like Jesus. Yeah, representativeness. Nobody's really sure what Jesus looked like, of course, but traditionally this could be Jesus. Okay: not Jesus. This is a piece of toast. But the pattern makes you think of Jesus. Jesus in the Kit Kat bar, oh wow. Be careful eating Kit Kat bars when you're driving a car. Now, I'm not sure this is a true story, I got it from the internet, but it fitted very nicely, so I put it in. Who is this? Not Hitler. But doesn't it make you think of Hitler? It's this pattern recognition thing: once it starts, you can't turn it off. Once you've got the pattern there, it's really hard; you have to make a conscious effort not to go along with saying, gosh, that looks like Hitler. And of course you can see the problem: we've got some repetition going on.

So, lots of cognitive biases. This slide is put in to remind me to mention affective biases. Airline pilots talk about being hungry, angry, late or tired; that's the connotation here. I don't necessarily read right across from the aviation industry to decision making in medicine, but I think there is something we could learn from the awareness pilots have, the self-awareness that they need to be particularly careful if they've got one of those going on. Certainly, as a GP for 20 years, I'd have had all four of those most days, and that certainly seems to be true of the way things are in clinical medicine at the moment. Affective biases: any one of them individually increases your chances of making an error by 30%. We don't make too many errors, fortunately, but a relative increase of 30% is worth thinking about.

So let's go back to Joan as we finish off. What would be the optimum consultation with Joan? We'd want to know something about the benefits and risks of anticoagulation.
506 00:52:51,760 --> 00:52:57,339 And when I was at NICE, I actually did some work with a colleague called Andy Hutchinson. 507 00:52:57,340 --> 00:53:03,159 And actually we ran the world championships of atrial fibrillation experts 508 00:53:03,160 --> 00:53:08,950 and shared decision making experts, and came up with this sort of graphic. 509 00:53:10,270 --> 00:53:20,620 So at the top, there's a thousand people like Joan, and over the next 12 months, 916 people won't have a stroke, but 84 people will. 510 00:53:21,610 --> 00:53:24,879 So you weren't too far away with 50. It's a ballpark figure. 511 00:53:24,880 --> 00:53:29,380 It's not too far away. That's just the point estimate. So it'll have confidence intervals around it. 512 00:53:29,380 --> 00:53:33,820 It's not necessarily absolutely accurate to four decimal places, 513 00:53:34,120 --> 00:53:40,300 but that's the sort of ballpark figure we could come up with from the best available evidence at the time. 514 00:53:40,660 --> 00:53:44,090 So how do you represent that to Joan? 515 00:53:44,110 --> 00:53:49,960 If you get into the conversation with her about her risks of stroke, do you say, great news, Joan? 516 00:53:50,560 --> 00:53:57,100 If there's a thousand people like you, over the next year 517 00:53:57,100 --> 00:54:03,130 916 people out of a thousand won't have a stroke related to the atrial fibrillation, 518 00:54:04,270 --> 00:54:09,969 but 84 people will. Or do you say, bad news, Joan? 519 00:54:09,970 --> 00:54:16,900 That atrial fibrillation: 84 people out of a thousand will have a stroke in the next year, 520 00:54:17,770 --> 00:54:28,090 but 916 won't. Because how you frame that news and those numbers will make a big difference to how she views it, and actually to the way you view it. 521 00:54:29,440 --> 00:54:36,970 And you have a propensity for one way round or the other, and you shouldn't, because you're trying to reach an unbiased decision. 522 00:54:37,240 --> 00:54:41,710 But you do, because you have a mindline about this. 523 00:54:42,340 --> 00:54:46,210 But Joan has a mindline about this too, and we don't know what Joan's is yet. 524 00:54:48,580 --> 00:54:53,860 If we add in the anticoagulation. Oh, sorry, go back. 525 00:54:54,220 --> 00:55:02,230 57 people are saved from having an atrial fibrillation related stroke. 526 00:55:02,230 --> 00:55:07,240 If we anticoagulate all the thousand, all the thousand have to take the anticoagulation. 527 00:55:07,540 --> 00:55:15,099 Only 57 are helped. And again, there's the framing problem with how much benefit people get as well. 528 00:55:15,100 --> 00:55:18,760 And then there's the problem that you don't have a crystal ball. 529 00:55:19,360 --> 00:55:23,650 You can only say to Joan what will happen in the population. 530 00:55:24,160 --> 00:55:33,460 You can't say what will happen for her. Phil Hammond said years ago, having a stethoscope is no substitute for a crystal ball. 531 00:55:33,490 --> 00:55:37,390 It would be great if you could say what would happen to her, but you can't. 532 00:55:39,270 --> 00:55:45,610 In terms of bleeding, there are nine people who have a bleed without anticoagulation. 533 00:55:45,630 --> 00:55:51,480 It goes up by another 15. Most people don't have bleeds, but nevertheless, that's a significant increase. 534 00:55:52,590 --> 00:55:59,430 So does that solve the problem for you? Now you know the evidence, does that help you to make the decision for Joan?
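To make that arithmetic and the framing problem explicit, here is a minimal sketch in Python using the point estimates quoted above: 84 AF-related strokes per 1,000 people like Joan over a year, 57 of them prevented by anticoagulation, and major bleeds rising from 9 to 24 per 1,000. The derived quantities (absolute risk reduction, number needed to treat, number needed to harm) are my arithmetic on those point estimates rather than figures given in the talk, and each would carry a confidence interval in practice.

```python
# Minimal sketch: natural frequencies for a cohort of 1,000 people like Joan
# over 12 months, using the point estimates quoted in the talk. The derived
# quantities (ARR, NNT, NNH) are simple arithmetic on those figures and are
# illustrative only; real decision aids would also show the uncertainty.

COHORT = 1000

# AF-related strokes per 1,000 over one year
strokes_untreated = 84                                   # quoted point estimate
strokes_prevented = 57                                   # quoted benefit of anticoagulation
strokes_treated = strokes_untreated - strokes_prevented  # 27 still occur

# Major bleeds per 1,000 over one year
bleeds_untreated = 9                                     # quoted baseline
bleeds_treated = bleeds_untreated + 15                   # "goes up by another 15"

arr_stroke = strokes_prevented / COHORT                  # absolute risk reduction
nnt = COHORT / strokes_prevented                         # number needed to treat
extra_bleeds = bleeds_treated - bleeds_untreated
ari_bleed = extra_bleeds / COHORT                        # absolute risk increase
nnh = COHORT / extra_bleeds                              # number needed to harm

print(f"Without anticoagulation: {strokes_untreated} strokes, "
      f"{COHORT - strokes_untreated} people stroke-free")
print(f"With anticoagulation:    {strokes_treated} strokes, "
      f"{strokes_prevented} prevented")
print(f"Absolute risk reduction: {arr_stroke:.1%}  (NNT about {nnt:.0f})")
print(f"Extra major bleeds:      {ari_bleed:.1%}  (NNH about {nnh:.0f})")

# The framing point: the same baseline risk, said two ways.
print(f"Positive frame: {COHORT - strokes_untreated} out of {COHORT} "
      f"won't have an AF-related stroke this year")
print(f"Negative frame: {strokes_untreated} out of {COHORT} will")
```

On these numbers, roughly 18 people like Joan would need a year of anticoagulation to prevent one AF-related stroke, at the cost of about one extra major bleed for every 67 treated, and the same facts can be framed positively or negatively. So, does knowing the evidence settle the decision for Joan?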
535 00:56:03,240 --> 00:56:08,610 I would argue not, because you can't tell what will happen to her, and you don't know 536 00:56:08,610 --> 00:56:13,800 how she feels about this, and you have no idea what else is going on in her head. 537 00:56:15,720 --> 00:56:19,650 So, Atul Gawande, of research fame and Being Mortal fame. 538 00:56:21,030 --> 00:56:27,470 I would recommend reading anything by Atul Gawande, but particularly Being Mortal. 539 00:56:27,480 --> 00:56:31,590 And he says there's probably four questions we need to ask a bit more often. 540 00:56:32,340 --> 00:56:37,890 What's your understanding of your illness? What are your fears and worries for the future? 541 00:56:37,920 --> 00:56:42,740 What outcomes will be unacceptable to you? And what are we trying to do here? 542 00:56:43,610 --> 00:56:48,050 What are we trying to do for you? In American speak, it's "what are the goals of care?" 543 00:56:48,060 --> 00:56:54,630 If time is short. I can't do accents, as you've just discovered. I don't think I'd do the "if time is short" bit, 544 00:56:54,870 --> 00:56:59,100 because most 84-year-olds know that time is short and don't need their noses rubbing in it. 545 00:57:00,570 --> 00:57:06,990 Yeah, but something along those lines would perhaps be worth exploring. 546 00:57:06,990 --> 00:57:08,219 There's an evidence base for this. 547 00:57:08,220 --> 00:57:16,500 This is a randomised controlled trial of using those sorts of questions with people who've got widespread lung cancer at diagnosis. 548 00:57:16,890 --> 00:57:21,600 So the randomisation is to having that sort of conversation versus not having that sort of conversation. 549 00:57:21,600 --> 00:57:30,390 The people who had that conversation had 20% fewer emergency admissions for hospice or hospital care. 550 00:57:31,560 --> 00:57:35,670 Their costs of care were 20 to 30% lower. 551 00:57:36,350 --> 00:57:39,660 They lived on average 25% longer. 552 00:57:40,830 --> 00:57:48,920 And interestingly, for those who died, they looked at the relatives of the people who died and whether they'd had that advance planning conversation. 553 00:57:49,200 --> 00:57:56,730 The relatives of those people were in much better shape six months after the death of their beloved relative. 554 00:57:57,480 --> 00:58:02,639 This is a randomised controlled trial of, at the end of a GP consultation, saying, is there anything else 555 00:58:02,640 --> 00:58:08,700 we can usefully talk about today, or, is there something else we could usefully talk about today? 556 00:58:09,930 --> 00:58:15,780 Anything or something. "Something" gets you 80% of the hidden agendas. 557 00:58:16,020 --> 00:58:21,990 It gets you to where Joan is with this stuff. If you say "anything", it seems to close it off. 558 00:58:22,500 --> 00:58:31,110 One word is incredibly powerful, and we've not explored how to do this fantastically well as individuals. 559 00:58:31,530 --> 00:58:35,280 So how did it go with Joan, do you think? 560 00:58:37,470 --> 00:58:43,440 Well, we started off with an anticoagulation, stroke-reduction type of conversation, but we didn't get too far. 561 00:58:44,910 --> 00:58:47,910 I don't want any more tablets. I feel ill all the time. 562 00:58:48,090 --> 00:58:52,920 I'm unsteady on my feet. I need help with shopping. I need help getting in and out of the shower in the morning. 563 00:58:53,100 --> 00:58:59,549 She's frightened of falling over and fracturing a hip again. She's having a fairly unpleasant experience at the moment.
564 00:58:59,550 --> 00:59:03,420 I can't stand to do my cooking. She was a great cook, really keen on cooking, as you can see. 565 00:59:04,170 --> 00:59:07,710 How am I going to get a meal? But we persisted. 566 00:59:10,770 --> 00:59:13,830 I don't want warfarin. My husband was on warfarin and it was awful. 567 00:59:14,100 --> 00:59:18,960 I'm not interested in your pictures of benefits and risks. I'm 84 now. 568 00:59:19,260 --> 00:59:22,420 Tell me which of my tablets are controlling my symptoms and let's stop the rest. 569 00:59:22,440 --> 00:59:23,820 I'll take my chances. 570 00:59:26,090 --> 00:59:32,120 Never mind that the guideline says anticoagulate people at risk of stroke because they've got atrial fibrillation, and we'll prevent lots of strokes. 571 00:59:32,660 --> 00:59:36,380 At the population level it works fantastically; at the individual level it may well not. 572 00:59:37,550 --> 00:59:42,140 And we've concentrated on the population, not the individual. So what happened? 573 00:59:44,760 --> 00:59:47,850 Joan did well. She didn't take her anticoagulant. 574 00:59:48,060 --> 00:59:52,890 Some of her medicines were stopped. She didn't have a stroke, because the odds were always in her favour: 575 00:59:53,160 --> 00:59:57,330 many more green faces than red faces. 576 00:59:57,570 --> 01:00:01,380 She got stronger. She was able to look after herself in her own flat with little support. 577 01:00:01,390 --> 01:00:06,450 Five more years of good quality of life, and she died peacefully, two months short of her 89th birthday. 578 01:00:06,480 --> 01:00:09,150 I'm well qualified to tell you Joan's story. 579 01:00:09,150 --> 01:00:18,390 Joan was actually my mum, and you can see the effect of the cardiac failure, actually a longstanding, end-stage heart disease. 580 01:00:18,420 --> 01:00:22,530 Her ankles are swollen, but this is quarter to one in the morning at a party. 581 01:00:25,140 --> 01:00:28,200 And I know, I'm saying, Mum, it's quarter to one. 582 01:00:28,350 --> 01:00:32,280 Shall I take you home? Are you going to be all right? And she said, you must be joking. 583 01:00:32,550 --> 01:00:36,480 How often do I get to come to a party like this? Take me over there. I haven't seen Mary for ages. 584 01:00:36,810 --> 01:00:46,950 So, the epilogue, very quickly, four slides. Patient-centredness, I think, is now an absolute given. 585 01:00:48,000 --> 01:00:53,670 In every consultation, the values and preferences of patients have got to be there. 586 01:00:54,360 --> 01:00:59,280 And whilst we've done some work on consultation skills, most specialists get nothing, 587 01:00:59,730 --> 01:01:05,950 GPs get some process-based stuff, and it's still nowhere near enough 588 01:01:05,970 --> 01:01:14,730 to get to that wonderful expertise that's required to do these sorts of preference-sensitive consultations really well. 589 01:01:16,410 --> 01:01:21,209 The bifocal vision, I think, is a really useful concept. 590 01:01:21,210 --> 01:01:25,620 So we're not ditching the evidence. We want to know what the evidence is for the population. 591 01:01:25,620 --> 01:01:33,360 It's essential, both for Joan and for her clinician, to know what that evidence is, in some circumstances at least. 592 01:01:34,560 --> 01:01:38,310 But sometimes people make decisions without knowing what the evidence says. 593 01:01:38,520 --> 01:01:47,610 They'll just go with a few variables and just make a choice based on "well, that was quite good, we'll do that again", as we did with the cars.
594 01:01:47,940 --> 01:01:52,020 So we have to have a focus on the individual and a focus on the population. 595 01:01:52,230 --> 01:02:01,230 Both of these, not predominantly this, which is what EBM has done more of for the last 20 years. 596 01:02:01,230 --> 01:02:07,710 The third thing: I think we're all agreed the evidence doesn't make the decision for you. 597 01:02:08,520 --> 01:02:12,950 Decision making is an entity on its own, and we have to know the evidence. 598 01:02:12,950 --> 01:02:15,899 But we have to be really good at making decisions too, 599 01:02:15,900 --> 01:02:21,299 and knowing a bit about what's going on in our heads, and knowing a bit about what might be going on in the patient's 600 01:02:21,300 --> 01:02:30,720 head, and exploring the two together, might be somewhat better than "everybody gets aspirin after a heart attack", 601 01:02:30,720 --> 01:02:36,299 let's say. So the final slide is the sort of place we're going to need to go with this. 602 01:02:36,300 --> 01:02:42,240 We need to do a bit more on the curriculum. We need to help people find the stuff that really matters. 603 01:02:42,750 --> 01:02:49,829 We need to understand ourselves. So we need to understand pattern recognition and bounded rationality and how 604 01:02:49,830 --> 01:02:54,600 those two things work together for us, and how we can employ those heuristics, 605 01:02:54,600 --> 01:03:00,810 those stop-start rules, because we will inevitably make most decisions using System 1 and pattern recognition. 606 01:03:02,070 --> 01:03:06,389 And then the final bit is, let's work a bit more at the consultation 607 01:03:06,390 --> 01:03:10,560 skills in order to have really great, meaningful consultations with people. 608 01:03:10,800 --> 01:03:15,990 That's the sort of curriculum that we need to be heading for, and it's a spiral curriculum. 609 01:03:15,990 --> 01:03:20,550 It's no good talking to medical students about decision making because they're still constructing that knowledge. 610 01:03:22,530 --> 01:03:25,980 Once you start having to make decisions, you need to know a bit about making decisions. 611 01:03:26,670 --> 01:03:28,620 But before that time, it's too soon. 612 01:03:29,790 --> 01:03:35,250 But an awareness that this is where you're heading, where we expect you to become an expert, might be really helpful. 613 01:03:35,850 --> 01:03:40,210 I hope that's been helpful. Thank you very much for the invitation.