This is joint work I've done with Justin Oakley at Monash. Many of you have met Justin; you may have read his many contributions to bioethics. So I want to acknowledge him as co-author of the paper.

This is a paper about the importance of hope in health care. Now, a lot of people will tell you this: they'll say it's really important that patients who are undergoing some kind of medical procedure hope for a successful outcome. I hear this a lot, from bioethicists above all, and I've never heard anyone flat out deny it; a lot of health care professionals will say it as well. So it's supposed to be really important that patients hope for a good outcome. But it's not clear why this is the case, why it's so important, if indeed it is, and we should be open-minded about this; maybe it's not as important as people think. What Justin and I do is propose a hitherto unrecognised explanation (or at least, as far as we're aware, no one else has come up with it, though it's always possible someone has, or will get it from us) for the importance of patients hoping for successful treatment, and it appeals to prospect theory. Prospect theory, for those of you who aren't familiar, is a descriptive theory about decision making in situations of risk and uncertainty.
It's probably the most influential descriptive theory of its kind. It's often contrasted with expected utility theory, but expected utility theory is usually understood normatively, as an account of what you ought to do, whereas prospect theory is usually just said to be descriptive of how people actually make decisions in situations of risk and uncertainty. So we're going to use this theory, and I'll go through the basics of it; those of you who are familiar with it already can tune out for a few minutes, if that's OK.

Now, this talk is about hope in health care, specifically about patients' hope for a successful outcome, so it's quite narrowly targeted. However, the analysis we're going to provide might well work in other contexts, and we're open-minded about that. There are certainly other contexts in which people go around stressing the importance of hope. Here are a couple. In the literature on military leadership it's often claimed that it's very important that military leaders hope they're going to be successful; clearly, if they're not hopeful of success, they're despairing, and intuitively that sounds like a very bad thing. Elite athletes often say it as well: you really have to hope you're going to win, because once someone ceases hoping for victory, once reality sets in that they're facing the world champion and their chances are very slim, things go wrong. And there might be other contexts as well where hope is important.
We're completely open-minded about that, but we're focusing on this one context where we think hope is important.

The other thing to say, as a sort of side remark, is that we're going to assume, and this is a really uncontroversial assumption, though when I've presented this paper elsewhere people kept bringing it up, that the point of health care is to restore people to good health. It isn't there for some other purpose; the health care system is set up to get people back to health. And note that the very people who keep saying that hope is important in health care are the same people who accept that health care is designed around this uncontroversial goal of restoring people to health, so the two claims sit together.

OK. If you start reading the literature on hope in philosophy, there is said to be a standard account, and that's going to be the jumping-off point here. Here's a statement of the standard account from Eve Garrard and a co-author: S hopes for a state of affairs P if and only if S desires P and believes that it is possible, though not certain, that P will come about. So if you're not desiring it, you're not hoping.
And if you think it's certain to come about, if it's in the bag, it's not something you hope for; it has to be something less than certain.

Now, straightaway this doesn't seem right to me. There's not enough going on, and what's missing, and I'm not the first person to point this out, is any mention of the affective dimension. Think about this thought experiment with two patients in the same situation. Patient one believes that there's a one per cent chance that a malignant tumour will be successfully treated, and this is something she desires. So she fits the standard definition pretty straightforwardly, and she's hopeful. She thinks, there's a one per cent chance, that's something to hold on to. She approaches the impending operation with a sense of hope of success in the face of great odds of failure. Patient two, in the exact same circumstances, believes there's a one per cent chance that the malignant tumour will be successfully treated, but she finds herself in a state of despair: because it's only a one per cent chance, she abandons hope. Now, intuitively, these patients are very different. The first is a genuine case of hope; the second is not in the same state. What's the difference? It's a different affective state, a different emotional engagement with the information being presented. They're presented with the exact same information, and they desire the exact same outcome.
But they have different emotional reactions, and intuitively the first one is hope, and the second one is a different thing.

Here's a definition which we think is much more plausible, and it comes from Adrienne Martin. She says that to hope for an outcome is to desire it, to be attracted to it, to assign it a probability somewhere between zero and one (I take it she means not including one, that is, short of certainty), and to judge that there are sufficient reasons to engage in certain feelings and attitudes. So the judgement is driving an emotional engagement. Martin contrasts her view with the standard view. She also contrasts it with the view that's dominant in academic psychology. If you read into this literature, there's someone called C. R. Snyder, who was the big hope figure in psychology, with reams of papers and books and a big school of hope theorists around him. Here's a definition from him and some of his co-authors: hope is a goal-oriented cognitive construct with affective and behavioural implications. Now, Adrienne Martin contrasts her view with this; she doesn't like it. But we think the two are very similar. She talks of hoping for an outcome; Snyder unfolds this as a goal-oriented cognitive construct, with the outcome as the goal. Pretty much the same thing. So we think she shouldn't contrast her view with his; she's right to contrast it with the standard account, but she shouldn't make such a big song and dance about the difference between her account and the one in psychology.
So we don't see what the fuss is about; we think you should side with the psychologists here. OK.

Now, you might have an objection to this, because not all uses of the word "hope" seem to involve goal-oriented behaviour. If I say, look, I hope the weather is good tomorrow, I hope you have a nice day, I'm saying that despite fully understanding that I have no ability whatsoever to influence the weather. So it can't possibly be a goal for me; it's not my goal to cause good weather tomorrow. That hope is not goal-oriented in any way. So as a definition this doesn't seem to us to do all the work that's needed. But we're not interested in giving an exceptionless definition of hope. What we're interested in is a particular form of goal-oriented behaviour: patients' hope for successful treatment. What we think is going on is that when patients hope for successful treatment, they're getting with the programme: they're engaging in a series of activities that form part of their prescribed treatment. They're consenting to the operation, going along with whatever the rehabilitation process is afterwards, doing a series of exercises, taking on some kind of restrictive diet or otherwise restricting their behaviour, making sure they keep taking the right forms of medication for long enough after the operation, and so on. That's the goal-oriented behaviour.
So we're interested in this subclass of hope, which we think the psychologists capture pretty well. We're on board with the psychologists, but we acknowledge that someone who was more concerned to pin down an exact definition, with necessary and sufficient conditions, would need something slightly different from this.

OK, now let's look at some of the reasons that are given in the literature for why hope matters. One thing people say is that there are psychological benefits of having hope. Why is it important for patients to have hope? Well, if they have hope, they'll get these psychological benefits. Now, we don't think this is a good response, and the problem is not that there may not be psychological benefits of hope; it's that it's very difficult to tease them out from the benefits that come along with optimism, since a hopeful person is almost invariably an optimistic person. There are lots of studies showing that optimism has health benefits in various ways, and as far as we can tell it's optimism that's probably doing the work of providing health benefits. It's possible that hope is adding something, but it's just very, very difficult to get clear about what separates the two. So we think there might be something to this explanation; it's just very difficult to establish.

OK. A second thing people say is that hope is important because it helps one avoid despair.
There's a recent paper which runs this line, by an author whose name I'm blanking on, and it draws on the work of Viktor Frankl, who survived a concentration camp. According to Frankl, the important thing for his survival was having hope. He points out in his account that there were many people in the concentration camp who fell into a state of despair about their miserable lives and prospects. And what happened, according to Frankl, is that when they fell into that state of despair, they simply ceased trying. The concentration camp guards would order them to do this, that or the other, and they just wouldn't do it. The guards would hit them with their rifles, and they'd fall on the ground and do nothing, basically sinking into despair, having concluded that there was no good future for them. And they didn't survive very long. So Frankl's story is that hope is very important because it helps you avoid despair. Now, we think that's true. It's just that we also think there is a neutral state between hope and despair, in which one neither hopes that treatment will be successful nor despairs that it will be unsuccessful. And what we would like is an explanation that picks out the hopeful state as better than the neutral state as well.
So we don't think that pointing out the badness of despair is sufficient to tell you about the goodness of hope, because we also want to know why hope is better than the neutral state.

A third thing people say, which we endorse, though we don't think it's the full story, is this: hope motivates patients to cooperate with health care professionals to achieve the goal of regaining their health. So it's about motivation, and we think this is clearly right. The patient who is hopeful is going to be motivated; they're going to try harder to participate in their rehabilitation, they'll take the regular exercise, they'll stick to the restrictive diet, they'll take the regular medication for longer periods of time. The patient who's in despair certainly won't do this. But what about the patient in the neutral state? Well, they might do some of it, if they think about the benefits, but they're not particularly motivated: they'll do less of the regular exercise, stick less closely to the diet, drink a bit more alcohol, whatever it is. We think you need hope; you need to be focused on the goal of restoring yourself to health. So we think this motivational explanation is right as far as it goes; we just don't think it's the whole story. OK, so what is the whole story? This is where prospect theory comes in. Let's start with the well-known Asian disease problem.
This is from Tversky and Kahneman; it's perhaps their best-known study. Imagine that the US is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programmes to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programmes are as follows. If Programme A is adopted, 200 people will be saved. If Programme B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved. That's your choice as a research subject. Now, it turns out that most people prefer Programme A given these choices. What's going on is that Programme A is the safe bet: you're recommending the option guaranteed to save 200. Programme B will save 200 on average, but it's in the risky category: there's a two-thirds chance that nobody is saved and all 600 die, and a one-third chance that everyone is saved. That's the riskier option, and most people don't take the risk.

We can now tell the story slightly differently. Same scenario, but this time: if Programme C is adopted, 400 people will die; if Programme D is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die. So you tell the story this way, in the same scenario, but instead of the wording "will be saved" you use the wording "will die".
Flip the wording around and people make very different choices: this time only 22 per cent of subjects preferred Programme C, the safe bet in which 400 die and 200 are saved, and 78 per cent took the risky Programme D. So what's going on is that this switch of wording has flipped most people from being risk averse to risk seeking.

Now the big question is, how does this happen? How does this little switch in language have this extraordinary effect? Here is the explanation, which is now the dominant view, and again it comes from Kahneman and Tversky; this is the key claim of prospect theory. We evaluate risk in comparison to a reference point. Outcomes that are understood as superior to the reference point are regarded as gains; outcomes that are understood as inferior to it are regarded as losses. And losses loom larger than gains: losing hurts us psychologically more than winning benefits us. Kahneman, in his book, cites the case of Jimmy Connors, who famously said, "I hate to lose more than I love to win." Apparently we're all like Jimmy Connors: we all feel the badness of losing more than the goodness of winning. I think academics can relate to this: a rejection from a journal makes you feel worse than an acceptance makes you feel good.
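The preference reversal is striking precisely because the two framings describe identical prospects. A quick sketch in Python (just restating the numbers from the study) makes the equivalence explicit:

```python
# Expected lives saved under each option in the Asian disease problem.
# 600 people are at risk; A and C are the certain options, B and D the gambles.
TOTAL = 600

program_a = 200                                   # "200 will be saved"
program_b = (1 / 3) * 600 + (2 / 3) * 0           # gamble over lives saved
program_c = TOTAL - 400                           # "400 will die" = 200 saved
program_d = (1 / 3) * (TOTAL - 0) + (2 / 3) * (TOTAL - 600)  # gamble over deaths

# All four options save 200 lives in expectation, yet most subjects
# choose A over B and D over C, purely because of the wording.
print(program_a, program_b, program_c, program_d)
```

Expected utility theory treats A/C and B/D as the same pair of choices; the puzzle prospect theory answers is why people don't.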
The other element is that we experience diminishing sensitivity to gains and losses relative to their distance from the reference point. If zero dollars is the reference point, the first ten dollars we might gain or lose has more psychological significance than the next ten dollars. Putting that all together, you get a chart that looks like this. The key things to understand are these: you've got your reference point, which is the point at which, in your head, you switch from gains to losses; gains are what's going on above the line, losses below the line; and the curve below the line is steeper, so losses harm you more than equivalent gains benefit you. You dislike losses more than you like gains. That's the idea. Now, according to Kahneman and Tversky, this is a kind of hard-wired feature of human psychology; it's not that just a few people are like this, it's all of us. So this is the guts of prospect theory. It's a descriptive theory of how we evaluate options and make decisions in situations of risk and uncertainty: we weigh things intuitively, we'd like to make some gains, but above all we try to avoid losses.

Now, usually your reference point is the status quo, whatever the current situation is. So suppose you're lucky enough to own one property. That's your reference point, and you're going to do a lot more to prevent losing it than to gain another.
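The curve just described can be written down directly. As an illustration only (using the exponent of about 0.88 and loss-aversion coefficient of about 2.25 that Tversky and Kahneman estimated in later work; the talk itself doesn't commit to particular numbers), here is a minimal sketch in Python:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value of outcome x measured from the reference point:
    concave above it (gains), steeper and convex below it (losses)."""
    if x >= 0:
        return x ** alpha            # diminishing sensitivity to gains
    return -lam * (-x) ** alpha      # losses loom larger, since lam > 1

# A $10 loss hurts more than a $10 gain helps:
print(value(10), value(-10))         # roughly 7.59 and -17.07

# Diminishing sensitivity: the second $10 adds less than the first $10 did.
print(value(20) - value(10) < value(10) - value(0))
```

The asymmetry around zero is the "kink" at the reference point visible in the chart the talk describes.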
You've got your one property, and say there's no debt on it; you might be able to take out a big mortgage to buy a second property. And you might do that if you think, well, I can lock in a low interest rate, it's very safe. But you're not going to do it if it looks too risky; you'll be risk averse about that decision. Now think about a threat to the status quo, to your ownership of that one property. Suppose the bank threatens to foreclose, something like that: you get the letter saying you're not making the payments, you have to do this or else. You're going to move heaven and earth to stop that, because that's your status quo, that one property. So the idea is that the status quo, particularly what you consider yourself to own, is typically a person's reference point, and relative to that point you become loss averse: you'll do a lot to prevent losses, which is one of the reasons we're risk averse about the gains we might make and risk seeking about the losses that threaten us.

But, as we've seen with the Asian disease case, the reference point can be shifted. What's going on there is a kind of trick of language. When you use the phrase "will die", this makes people risk seeking, because the phrase "will die" makes you think the reference point is that the people are all still alive, and the deaths are losses below that reference point.
You want to prevent those deaths, so you gamble. With the language "will be saved", however, what's the reference point? Intuitively, that these people are going to die if they're not saved. So the reference point is now that they're as good as dead, and the people saved count as gains above it. And you're not risk seeking with regard to securing gains: we're risk averse about gains and risk seeking about preventing losses, and this language shifts which side of the reference point you take yourself to be on. It does this without 99 out of 100 people realising it's going on, but it's going on. OK, so that's the Asian disease reframing; that's one way of shifting a reference point.

Another way in which a reference point can shift is through the acquisition of an expectation. Let's go back to owning that one property. Suppose your parents tell you: when we die, you're going to inherit a second one. You think, well, that's all very well, but my parents are healthy; it's not going to happen any time soon. It sounds nice, but you don't want to lose your parents, and you can't count on it, so it's not really doing any work in your head. But suppose now that you hear from a doctor that your parents are in intensive care, they've succumbed to a new strain of COVID, and they're not going to be around much longer. Well, you're obviously pretty upset. But another thing that's going on, a grim silver lining, is that you now start banking on that extra house.
You think, my reference point has moved. My reference point now, based on rational expectation, is that I have two properties, even though the second isn't quite yours; it's not actually in the bag. Now suppose an estranged cousin comes along and challenges the will. They say, look at that Steve Clarke, the ne'er-do-well; this provision about the property, they engineered it; and they mount a legal challenge. At that point, because the reference point has shifted, you consider this a loss: you've banked the property in your head as as good as yours. So you're going to move heaven and earth; you lawyer up, get the best lawyers, to see off the challenge, because this outcome now sits below your reference point. However, back when it was just a distant thought and you weren't banking on the property, the will didn't play this role. If you'd heard then that the cousin was schmoozing with your parents and you suspected something, you'd think, that's a bit suspicious, what are they up to, but you wouldn't really be too fussed about it; a flag might pop up, but nothing more. So expectations can change, and that's another thing that can change reference points.

A third thing that can shift reference points is goals. A good example of this is running. Suppose you're a keen distance runner. (I'm not; I've had operations on two of my knees, I don't do this; but it's a convenient example.)
Suppose you're a regular runner and you're gradually increasing the distance you run, a kilometre more every week or whatever, and now you want to run five Ks, which is a bit of a challenge for you. Well, you understand that in trying to reach this goal you might incur some kind of injury; there's a danger you'll strain a hamstring or stuff up your knee, whatever it is. You take these risks because you've set yourself a goal, and that goal now serves as your reference point. With the goal in your head that you will run five Ks, you will take risks if they help you reach the goal. However, suppose you meet your goal and you're still feeling pretty good about running. You think, actually, I could run a little bit further. You might do it, but you're not going to risk your health doing it, because you've already achieved the goal in the back of your head, which was to run five Ks. Another K? Sure, you might do it if you're feeling good and you think it's low risk, but you're not going to push on if it feels risky; you'll think, this isn't worth it, if anything hurts I'll just stop. So goal setting is another of the ways in which reference points can shift around, and that's very important for how you respond, because according to prospect theory your risk attitudes flip around the reference point. OK, now, is there evidence that goals can serve as reference points? Yes.
There's a well-known study by Pope and Schweitzer, and, believe it or not, they examined two and a half million golf putts using laser measurements. The reasoning was this. A professional golfer is expected to get par on a round of golf, so over 18 holes they're supposed to shoot par. For those of you who don't know the game, every hole has a set number of shots, three, four or five, within which you're supposed to get the ball into the hole; that's par for the hole. A bogey is one extra shot, and a double bogey, which duffers like me need most of the time, is two extra shots. But for a professional golfer, par is what's expected; that's the goal. Schmucks like me are another matter. Now, if that's right, if the professional golfer's goal is to get par on a given hole, they're going to take more risks to avoid a loss, that is, to avoid a bogey or worse, than they will take to do better than par: a birdie, one under par, or, if you know your golf, an eagle, two under. So they looked at these 2.5 million putts, and indeed this is what they found. Golfers hit their par putts harder, with more chance of overshooting the hole, when they were facing the threatened loss of going over par, than they hit equivalent putts to go under par. And the reason is that if you want to make sure you get the par score, you have to make sure your ball travels at least as far as the hole.
So you run the danger of overshooting; whereas if you were putting for under par, you might be happy just to lay the ball up near the hole, so that if it falls a couple of inches short you can just tap it in. Now, interestingly, some professional golfers are aware of this, and they defend it. So this is probably the only philosophy paper that cites Tiger Woods, but here is Tiger, who is surely as good an authority on golf as anyone: "Any time you make big par putts, I think it's more important to make those than birdie putts. I don't ever want to drop a shot. The psychological difference between dropping a shot and making a birdie, I think it's just bigger to make a par putt." So basically he's endorsing prospect theory, right? The goal is par, and: I really don't want to fall below par, I've got to avoid that; but missing a birdie, that's kind of OK. So he's aware of what's going on, and he's endorsing it. Right, let's bring this back to medicine, to hope in health care, and some evidence for this. By the way, I should have said there's quite a lot of evidence about prospect theory in health care; lots of people think prospect theory is important to health care. But the particular application we're making of it here is, as far as we can tell, the first of its kind.
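Tiger's intuition, that dropping a shot looms larger than gaining one, is loss aversion in miniature. With the standard value-function parameter estimates (loss-aversion coefficient around 2.25), a one-stroke loss relative to par hurts more than twice as much as a one-stroke gain feels good. A small sketch, with the usual caveat that these parameters are population-level estimates, not Tiger's:

```python
ALPHA = 0.88   # curvature (standard Tversky-Kahneman estimate)
LAM = 2.25     # loss-aversion coefficient

def value(x):
    """Subjective value of x strokes relative to par (the reference point)."""
    return x**ALPHA if x >= 0 else -LAM * (-x)**ALPHA

bogey, birdie = value(-1), value(+1)
assert abs(bogey) > 2 * birdie   # losing a stroke outweighs gaining one
```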
These researchers, Treadwell and Lenert, were interested, for other reasons, in what the reference point is, and for most people the reference point is your current state of health. So most people think: I'm in this current state of health; yeah, I'd like to be in a better state of health, that would be great, but I really don't want to get worse. Think about weight: it would be pretty nice if I lost a few pounds, but I really don't want to gain any; that's the worse outcome for me. So I'm more motivated to prevent myself putting on weight than I am to lose weight. That's a common pattern. So for most people, your current state of health is your reference point. But if a sick patient fixes on the goal of returning to their former state of health, their reference point shifts, and they're going to become risk seeking in regard to any outcome that falls short of that goal. OK, so you get the idea. If a health care professional can persuade them to take on the goal of returning to their former state of health, that's going to shift the reference point, and they're going to be willing to take more risks than they otherwise would in order to get there. So they'll be more willing to accept the risks of side effects that might come along with an operation, more willing to risk the side effects that might go along with taking a medication,
and more willing to undertake exercises that might backfire through injury as part of a rehabilitation programme, than they would be otherwise. So we think this is another thing that's important about hope in health care: hoping for the goal of getting better, of restoring yourself to good health, makes you more risk seeking in the decisions you make; it makes you more likely to go along with a health care programme that involves more risk than you would otherwise accept. Hope sets goals, goals shift reference points, and reference points change attitudes towards risk. So that's the other reason we think hope is important: not just that it feels better than the alternatives, but that it changes your attitude to risk. OK. Now, having said this, we think we can also pin down the difference between the mere absence of hope and despair. This has been a side issue, but it's kind of interesting to fix on. So a patient who has no particular hope of recovery, but who accepts that recovery is possible, is not thereby in despair; their reference point is just the status quo, where their health is right now. A patient who is in a state of despair, by contrast, has accepted that their health will further deteriorate. So their reference point is where, in their head, they actually expect to end up: they think it's going way down. And so with the concentration camp victims' despair.
They just assumed the reference point was down there. And other people who are in a state of despair about their health: their reference point is going to be wherever they feel their health is going to end up, often given the disease that they have. So that's the difference, we think, between being in the neutral state between hope and despair, and being in despair. That's just a side insight. OK, now, given that imbuing a patient with hope can change their attitude to risk, this surely raises concerns about manipulation. There is already a literature in this vicinity, but it plays out slightly differently: there's quite a literature about what's known as false hope. That's when, say, a doctor makes stuff up for a patient, gives them a very unrealistic assessment of things. So there's a literature there, but we're concerned with something slightly different. We're concerned with the issue of manipulating a patient by moving their reference point, and with whether you could do something like that appropriately. So, as we've noted, in considering their options patients will often use goals as reference points and be more willing to take greater risks to achieve those goals. That raises concerns about manipulation, particularly if they're led to accept risks that they would not endorse on reflection. But there are going to be situations where this is absolutely fine and not wrongful.
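This reference-point mechanism, hope installs a goal and outcomes short of it get coded as losses, can be sketched numerically before we turn to the cases. All the numbers below are hypothetical illustrations (a made-up 0-100 health scale and made-up treatment probabilities), with the standard value-function parameter estimates; the point is only that the same treatment is declined under one reference point and accepted under the other.

```python
ALPHA = 0.88   # curvature (standard Tversky-Kahneman estimates)
LAM = 2.25     # loss aversion

def value(x):
    return x**ALPHA if x >= 0 else -LAM * (-x)**ALPHA

def prospect_value(outcomes, reference):
    """Value of a lottery of (probability, health_state) pairs,
    coded as gains/losses relative to the reference point."""
    return sum(p * value(h - reference) for p, h in outcomes)

current, full = 40, 100                   # hypothetical 0-100 health scale
stay  = [(1.0, current)]                  # decline the risky treatment
treat = [(0.5, full), (0.5, 10)]          # risky: cure, or bad side effects

# Reference = status quo: the gamble's downside looms large -> decline.
assert prospect_value(stay, current) > prospect_value(treat, current)

# Hope shifts the reference to full health: staying put is itself a loss,
# the patient becomes risk seeking -> accept the treatment.
assert prospect_value(treat, full) > prospect_value(stay, full)
```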
In the ordinary case, to inspire someone with hope is fine, we think. So suppose, to think this through, we have a patient and a doctor. The patient, patient A, has an advanced cancer which has proven resistant to most treatments, and all the remaining drugs are very unlikely to cure the cancer and highly likely to cause a premature death. So she's in a bad way. She no longer has hope, but she's not in a state of despair: her reference point is just the status quo. So if you come along and say to her, well, there's this other possible drug therapy you could take, but it has very high-risk side effects, blah blah blah, she's probably going to be unwilling to try it, which is a reasonable position: more drugs, more side effects, I don't want to go through this. Now, suppose the doctor instead offers patient A access to a promising new drug that's being trialled, which appears to be far less likely to cause premature death; it has some side effects, but they don't seem to be as bad as the others. That's a different story. So the doctor may be able to infuse patient A with hope: the drug therapies that are proven and on the market are pretty bad, they have these terrible side effects, but this one that's in trial really does seem promising; yes, there are some side effects, but look at what we've seen in the trials.
OK, now if the doctor can persuade the patient to take on this hope, that's going to change her attitude to risk, and she's more likely to take the drug. So this is where the question of manipulation lies. Now, we think in most cases this is perfectly fine, morally acceptable, because the doctor just straightforwardly says to the patient: here are the known risks attached to this new drug, the known side effects; here's what we know about the chances; it might well not work, but it looks much better than the other therapies. The patient is, in effect, making a rational decision about what to do. If she is inspired with hope, she's going to move from being risk averse to risk seeking. But we don't think that's a problem, because on prospect theory all of us are already both risk seeking and risk averse, just in different circumstances. It's not irrational to be risk seeking, and it's not irrational to be risk averse. This is a perfectly normal process: every one of us has the experience of acquiring hope and changing our behaviour. So we think this is perfectly consistent with rationality, and it can be something the patient herself endorses. The key thing is that the decision remains the patient's. However, there might be some variations on this which are not good. Now, there's a paper that bears on this.
Justin in particular is very impressed by an account of manipulation in a 1978 paper by Rudinow. On that account, roughly, one person manipulates another when they attempt to motivate the other's behaviour by means of deception, or by playing on a supposed weakness of theirs. So think about this. The worry here is the playing on a supposed weakness. Suppose the doctor finds out that the patient has a tendency to clutch at straws: tell her about any slim prospect and she immediately jumps on it without thinking it through. So the doctor says to the patient: well, you know, you've tried all those mainstream therapies and they produce these bad side effects, but there's this new drug in trial. And the patient says: yes, I want it, without thinking it through. That's the worry, because if the doctor is aware of this, there's the potential to exploit the patient's tendency to clutch at straws. And you can imagine a doctor prepared to do this who doesn't really want to cure the patient; what they want is to have another person take the experimental drug, because, say, they need a certain number of patients enrolled in the trial. So there might be some bad motives here. Well, in that case, we think there's a pretty simple distinction to draw.
In the first story, where the patient simply acquires hope and her attitude to risk switches, we think that's acceptable. Influence by way of a decision-making weakness, we think, is not morally acceptable. So that should be clear. OK. And here's another sort of possibility: the patient might be obsessed with preventing death at all costs, and exaggerate the efficacy of anything found in any trial. So it might be that the doctor has a patient they know to be clutching at straws, and the doctor is worried about their own professional record: they want to try everything, they don't want to be seen as someone who gives up, even though the patient wants to give up and doesn't want to suffer all these side effects. And so they exploit her tendency to clutch at straws to get her to take these drugs. Once she's imbued with hope, her reference point shifts away from the status quo of being unwell with cancer; anything short of recovery now registers as a loss, and she's willing to take more risks. And so she's been manipulated, wrongfully. Now, there's another case that comes up, slightly different from the clutching-at-straws case, which is more of a straightforward deception. In the clutching-at-straws case no information has been withheld from the patient; it's just that a known decision-making weakness has been exploited. But there are other cases worth worrying about. Basically, there's a patient with no hope either way.
But the doctor pushes their buttons and gets them to hope that there will be a miracle, and just neglects to mention the significant risks of the experimental procedure, or mentions them in a way the patient can ignore, discount, or rationalise away, so that they're not engaging with the risks at all. So again, this is another way of manipulating a patient. It's not quite as devious as exploiting a known weakness; it's more a matter of exploiting the circumstances than exploiting who the patient is, but it's still wrong. But even though there are these wrongful ways of taking advantage of patients' propensity to change their attitude to risk as a result of acquiring hope, there are also going to be cases where instilling hope saves lives. OK, and I've covered that already. Just to finish up: we were going to say something about religion and hope, but the paper kept getting longer and longer, so that section dropped out. The gist of what we wanted to say is this. Religion is a common source of hope, and this often comes up: the religious will exhort the ill to be hopeful. Insofar as religion is doing the work of getting patients to hope for an outcome that they rationally endorse anyway, namely that they get better, that they return to their previous state of health, that's good. If religion is actually pushing you towards some other goal, then we think that's bad.
The goals set by religion are then no longer in line with the goal of a cure. And so that's what we would say. But, as I said, that section simply dropped out of the paper for reasons of space. So I'll leave it there. Thank you, and I look forward to your questions.