Hello, this is the third of the many podcasts; this one's on utilitarianism. Welcome to everyone listening to the podcast, and welcome to you, too.

Right, let's get started. Many people have strong intuitions to the effect that it's only the consequences of an action that matter morally, and these people are called consequentialists. Utilitarians are a type of consequentialist: they believe that the only thing that matters morally is that we produce the greatest happiness of the greatest number. Other consequentialists might be people who believe that what matters is liberty, for example, or people who believe that what matters is equality. So if you're a libertarian, you believe that every action should be looked at to see what its consequences are for people's liberty. But we're going to be talking about utilitarianism today because, well, because that's what I'm going to talk about. That was a lousy argument, wasn't it?

OK, John Stuart Mill was a very important utilitarian. Notice that I've got too many brackets on that; we're going to be doing logic soon, and you'll want to watch brackets. I ought to watch brackets. Okay, that's John Stuart Mill.

And here are some actions that might make you feel that utilitarianism is the right way to go morally. Imagine a patient with a terminal and very painful illness. He desperately wants to die, his family are exhausted, and they beg the doctor to help him, and the doctor gives him a dose of morphine intending it to kill him. OK, now you might think to yourself, well, what's wrong with that? He wants to die; he's going to die any minute. Why shouldn't the doctor break the rule, the deontological rule if that's what it is, that says 'don't kill'? Why shouldn't he? The consequences of this doctor's act are the best that can be; therefore, it's fine.

Second example: a high-ranking officer, knowing that the enemy will attack a particular hotel, tells the hotel manager to close the hotel on the grounds of an outbreak of food poisoning, and the manager does this. Well, OK, the manager is being dishonest, and therefore the deontologist wouldn't be very happy about this; he's not, perhaps, expressing the virtue of honesty. And yet if you're a utilitarian, surely what he's doing is right, because he's managing to get everybody out of the hotel when the hotel is going to be attacked. What matters, according to the utilitarian, is the consequences of the action.

And finally, a father, knowing his unemployed son is depressed, forces him to work in the family business in order to regain his self-esteem.
Well, forcing somebody is not something that a deontologist is going to go for. It's not something that a virtue ethicist is going to go for either, because on the whole it's not treating somebody as an end in himself to force them to do anything, nor is it showing the virtue of patience, for example, or understanding, or whatever. But if you're a utilitarian, you might think, well, if he does regain his self-esteem, it will have been worth it.

So for the utilitarian, the end really does justify the means, and there's no action that can't be performed so long as performing it would produce the greatest happiness of the greatest number. And this can lead to problems, because we might think, well, surely genocide is wrong, even if it does produce the greatest happiness of the greatest number. So imagine that those of us in this room, say 50 of us, would be very, very happy if we took Steve over there and killed him. Well, that's 50 of us. I mean, Steve wouldn't be very happy, and his parents wouldn't be either, and things like that, but nevertheless we would be happy. I mean, we balance these things up, and if it turns out that the greatest happiness would be achieved by killing Steve, we kill Steve, or we kill all those people over there, or whatever. Well, is that acceptable just because it produces the greatest happiness of the greatest number, or not? If it's not, then you're going to think there's something wrong with utilitarianism. But if you really do think that what we should do is produce the greatest happiness of the greatest number, then why not?

And again, we might say surely slavery is wrong, even if it does lead to the greatest happiness of the greatest number. Slavery treats people as tools, as means to our ends, and that's unacceptable, surely, even if it does lead to the greatest happiness of the greatest number. So we can imagine that we treat our slaves really well, and that because we've got slaves we can be very productive, and therefore the productivity benefits the slaves as well, etc. You could tell a story where it looks as if slavery is going to produce the greatest happiness of the greatest number. Does that justify it?

And some have said that utilitarians, because they don't recognise any act as absolutely wrong, can't recognise rights at all. And if so, this is a big problem for utilitarianism, because we do think that people have rights, and we do think that these rights sometimes overrule any action that might produce the greatest happiness of the greatest number.
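To make the Steve example concrete, here is a minimal sketch of the kind of naive aggregation a crude act utilitarian might be imagined to use. It is only an illustration of the worry, not anything from the lecture: the names, the numbers, and the helper function are all invented, and the figures are deliberately rigged so that the sum comes out the troubling way.

```python
# Toy model of the "greatest happiness of the greatest number" sum.
# All names and numbers are invented for illustration.

def total_happiness(option):
    """Naive utilitarian aggregation: simply sum everyone's change in happiness."""
    return sum(option.values())

# 50 onlookers each gain a little happiness; Steve and his parents lose a lot.
kill_steve = {f"onlooker_{i}": 2 for i in range(50)}
kill_steve.update({"steve": -60, "steves_mum": -15, "steves_dad": -15})

# Doing nothing changes nobody's happiness.
spare_steve = {person: 0 for person in kill_steve}

options = {"kill Steve": kill_steve, "spare Steve": spare_steve}

for name, option in options.items():
    print(name, total_happiness(option))       # kill Steve: 10, spare Steve: 0

best = max(options, key=lambda name: total_happiness(options[name]))
print("The naive calculus recommends:", best)   # "kill Steve"
```

On these made-up numbers the bare sum favours killing Steve, which is exactly the result that makes people doubt that a simple aggregate of happiness can be the whole of morality.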
And to cope with this sort of difficulty, utilitarianism broke into two camps, or at least people thought that maybe there are two different types of utilitarianism and that one is better than the other. So act utilitarians are utilitarians who refer every single action to the greatest happiness principle, to the idea that what we should do is produce the greatest happiness of the greatest number. Rule utilitarians don't do that. What they do is first take an action of a certain type, slavery, for example, and ask: would doing that, as a general rule, conduce to the greatest happiness of the greatest number or not? If it wouldn't, then we should have a rule that we shouldn't do it. And then every token action (is this an act of enslaving someone or not?) has to be referred to the rule.

Whereas act utilitarians only ever use rules of thumb. So they might say, well, lying on the whole doesn't produce the greatest happiness of the greatest number, therefore I won't lie. But if they reach a lie that would create the greatest happiness of the greatest number, they would lie; they just break the rule. So although they use rules of thumb, there are no unbreakable rules in act utilitarianism, whereas in rule utilitarianism you say that there are some rules that have value in themselves, and the reason they have value in themselves is that obeying them brings the greatest happiness of the greatest number.

But there is a problem for this distinction. Some people believe that rule utilitarianism is incoherent because it collapses into act utilitarianism. So how does this work? Well, imagine a rule utilitarian in a situation in which he's got three options: he can keep his rule, he can break his rule, or he can modify his rule. So here's a situation in which telling a lie would, let's say, obviously produce the greatest happiness of the greatest number. Should he keep his rule, should he break it, or should he modify the rule to say we mustn't lie except in circumstances C? And the trouble with this is that whatever he does, he either becomes a deontologist, which is rudely called a 'rule worshipper' by utilitarians, or he becomes an act utilitarian.

So let's say he keeps his rule. OK, so even when it's obvious that telling this lie is going to produce the greatest happiness of the greatest number, he nevertheless tells the truth. I think I got my words mixed up there; I hope it's obvious what I meant, even if not what I said. You should always listen to what I mean, not what I say. So, he can keep his rule. Well, if he does, then in what sense is he a utilitarian?
He sees that breaking the rule would produce the greatest happiness of the greatest number, yet he doesn't break it. He's not a utilitarian; he's a rule worshipper, a deontologist. On the other hand, let's say he breaks his rule. Well, if so, then surely he's just using rules of thumb and he's nothing more than an act utilitarian; there's no difference between him and the ordinary act utilitarian, who uses rules just because they're simple.

The only chance he has, then, of making rule utilitarianism different from act utilitarianism is if the rule utilitarian can modify the rule and thereby somehow keep what he does different from both the deontologist and the act utilitarian. But the trouble with that is this: the first time we think, OK, we mustn't lie except in circumstances C1, and then the second time it's we mustn't lie except in circumstances C1 or C2, and then we mustn't lie except in circumstances C1, C2 or C3, and you start to think, well, hang on, actually this chap isn't going to do anything different from the act utilitarian. He claims to be using rules, and perhaps he is using rules, but because he's modifying his rule every time, his use of that rule isn't making him do anything different from what an act utilitarian does. And so the actual actions he performs are going to be the same as those of an act utilitarian, and the fact that he calls himself a rule utilitarian is actually irrelevant.

So rule utilitarianism is extensionally equivalent to act utilitarianism, even if it's not intensionally equivalent, in the same way that 'having a heart' and 'having kidneys' are extensionally equivalent: if you point to the things with hearts, you're pointing to exactly the same things as the things that have kidneys, because the two always go together, but of course you understand the two concepts differently.

So that's the idea: you can't save utilitarianism from charges that it doesn't respect rights, etc., by splitting it into act and rule utilitarianism. But this collapse argument rests on the assumption that utilitarianism is really a very unsophisticated creed, and that it recognises only one sort of rule and one sort of relationship towards a rule. And this just isn't true. I mean, you can think of utilitarianism as very unsophisticated, but if you do, you're not exercising the principle of charity; you're thinking of utilitarians as idiots, and they're not. Human society is much more complicated.
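For concreteness, here is a rough sketch of the collapse the argument alleges. It is my own toy illustration, not something from the lecture: the circumstances C1 to C3, the happiness numbers, and both decision procedures are invented. The point is just that once an exception has been added for every case in which lying maximises happiness, the rule-based procedure returns exactly the same verdicts as the act utilitarian's, so the two are extensionally equivalent even though they are stated differently.

```python
# Toy illustration of the alleged collapse of rule into act utilitarianism.
# Circumstances, numbers and both decision procedures are invented.

def act_utilitarian_lies(situation):
    """Act utilitarian: lie whenever lying maximises total happiness."""
    return situation["happiness_if_lie"] > situation["happiness_if_truthful"]

def rule_utilitarian_lies(situation, exceptions):
    """Rule utilitarian: 'don't lie', except in the listed circumstances."""
    return situation["circumstance"] in exceptions

situations = [
    {"circumstance": "C1", "happiness_if_lie": 9, "happiness_if_truthful": 2},
    {"circumstance": "C2", "happiness_if_lie": 1, "happiness_if_truthful": 7},
    {"circumstance": "C3", "happiness_if_lie": 8, "happiness_if_truthful": 3},
]

# Each time the rule forbids a lie that would maximise happiness,
# the rule utilitarian adds that circumstance as an exception.
exceptions = set()
for s in situations:
    if act_utilitarian_lies(s) and not rule_utilitarian_lies(s, exceptions):
        exceptions.add(s["circumstance"])

# After the revisions, the two procedures agree on every case:
# extensionally equivalent, even though they are stated differently.
print(all(act_utilitarian_lies(s) == rule_utilitarian_lies(s, exceptions)
          for s in situations))  # True
```

The legislator-and-judge distinction that follows is one way of resisting the unsophisticated, single-rule picture this sketch embodies.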
Consider the difference between a legislator and a judge. Let's use the knee-jerk example that's often used in discussions of utilitarianism: the sheriff who has in his prison a tramp who has often wanted to die, has no family, etc. And the sheriff has a community who are really very worried about all these rapes that are going on. The sheriff knows that the rapist is dead because he shot himself, but he also knows that he can't justify this claim, because when the man shot himself he fell into a river and the body was carried away, etc. You can add to the story as you like. The key question is: should the sheriff kill the tramp, who is innocent of these crimes, in order to avoid the social riots that will happen if someone isn't accused of these rapes?

Well, let's take a legislator who's asking: should we pass a rule that says innocent people should be killed in any circumstance in which doing so would lead to the greatest happiness of the greatest number? Is that a reasonable law? Would that lead to the greatest happiness? Would it? I think clearly it wouldn't, because we'd all be thinking, well, hang on, that's all very well; first they came for the slaves, then they came for the... I can't remember how the thing goes, but maybe it would be me who is the innocent who gets caught in the next situation. No, of course, that couldn't be a rule. So you wouldn't think that the greatest happiness of the greatest number would be served by that rule, and the legislator would never pass a law of that kind. The sort of law that a legislator would pass would be: don't kill. There is an act called murder; murder is the illegal killing of another person; this mustn't be done, because if we allow it willy-nilly, we'll end up without the greatest happiness of the greatest number.

But then when you have a judge, does he have the right to say, well, actually, in this situation it would produce the greatest happiness of the greatest number? No: the judge must look to the law and actually act on the law. He cannot then start thinking, well, should I do this or shouldn't I do this? It's the legislator that makes that sort of decision, not the judge. So different roles in society have different tasks to do in relation to rules. The legislator might act rather more like an act utilitarian, whereas the judge acts more like a rule utilitarian: the legislator looks directly to the greatest happiness principle, whereas the judge looks to the rule.

And also consider the difference between an act of fraud and an ordinary lie. If you were a legislator, you might say, well, why shouldn't we make lying illegal?
Lying actually can lead to a lot of trouble in society; it can really cause difficulties. Why don't we just make it illegal? Well, I think you can see clearly that that's not going to lead to the greatest happiness of the greatest number. So the legislator is unlikely to rule that ordinary lies are illegal, but there should be an act called fraud, which is, again, a legal concept rather than a moral concept. Fraud is when you lie to somebody in a situation where you've got a contract or whatever. So there's a difference between an act of fraud, which is something for which the judge would punish somebody, and an ordinary lie, which just doesn't come under the judicial system.

So as a utilitarian, when you're faced with the question of should I tell this lie or not, you can act as an act utilitarian. If the Nazis are at the door saying 'are there any Jews here?', you think, well, would it produce the greatest happiness of the greatest number or not to tell this lie? And you might tell the lie; you should tell the lie in that case. But if it's an act of fraud, there you look to the rule. You look to the fact that this is actually forbidden in society and you will pay great penalties; society ensures that you will pay penalties because it's wrong. It's not wrong because you pay penalties; you pay penalties because it's wrong. And to recognise complications like these is to see that a utilitarian can recognise both rules of thumb, such as the act utilitarian uses, and unbreakable rules, such as the ones that deontologists recognise.

And I'm going to leave it with you: do you think this argument means that utilitarianism can overcome the objection that it can't recognise rights? That's a huge question, and I'm going to let you think about it in your own time.

Now for other problems for utilitarians. Firstly, are there really no actions that are intrinsically wrong? Actually, a very good example of this is the discussion about torture. Is it acceptable to torture somebody because in doing so you might be able to save a lot of lives, or is torture intrinsically wrong? And to make it really difficult, you should think about perhaps torturing a baby or something like that. Could you, by torturing a baby, produce the greatest happiness of the greatest number, and if so, should you? That's a question that will really pull your intuitions about utilitarianism this way and that, which is what should be happening if you're a philosopher.

And secondly, how do we know in advance what the consequences of our actions will be?
So if I take my elderly aunt out for tea and, as we cross a busy road, she is unfortunately mown down by a truck, the consequences of my action are not good. My aunt isn't made very happy by this action, nor am I, nor anyone else in the family. So the consequences of my action are not good; therefore I have not acted morally in taking my aunt out for tea. Well, we might think that's not fair, because I didn't know that those would be the consequences; surely I did the right thing. One way we might solve this problem is to make a distinction between the agent and the action. So an action is wrong if it doesn't produce the greatest happiness of the greatest number, but the agent is evaluated in terms of different things, for example in terms of their intentions, what they believed the consequences of the action would be.

Oh, and incidentally, one thing I haven't mentioned is that you can be a utilitarian without thinking that everything you do must be done in order to produce the greatest happiness of the greatest number. So it could be that what makes your action morally right or wrong is whether it does produce the greatest happiness of the greatest number, rather than whether you intended it to, if you see what I mean.

Another difficulty is: must we always act to produce the greatest happiness of the greatest number? This is linked to what I've just said. I'm tired, I go home at night, I have a glass of wine, I sit in front of the television and watch Newsnight or whatever. Well, I could be working in the Oxfam shop, couldn't I? Or I could be giving another lecture, thereby making loads of people really, really happy. Shouldn't I be doing that instead of having a glass of wine and watching Newsnight? Again, the answer here might be, well, there's a limit to how much you can actually do to produce happiness. At some point you've got to count yourself in the calculus. Well, actually, at every point, if you're a utilitarian, you've got to count yourself in the utilitarian calculation. So it could be that you will make people happier the following day by having a night off; you needn't spend every single night in the Oxfam shop.

A big problem for utilitarianism is: what is happiness, and how do we measure it? The question of what happiness is has produced as many books as would fill this room several times over, I should imagine, so I won't go into it here. And how do we measure it? One thing we'll have to be careful about here is that we mustn't get hung up on the idea of measurement as using a ruler or a calculator or something like that.
Sometimes we measure each other subjectively. I can see understanding on your face; I can also see lack of understanding on your face. How do I do that? I don't know. I certainly couldn't write it down; I couldn't tell you how I do it. Human beings just do it. We are able to see it, perhaps not all of us, and perhaps some of us see it better than others, but we can measure happiness without taking out a ruler or something like that. We're actually very good at doing that.

A very important question: whose happiness must be counted? Maybe Hitler was a good utilitarian; it's just that he didn't count Jews. If we don't count animals, that's going to change the utilitarian calculus quite a lot. Maybe our grandchildren are going to say of us, 'they ate lamb for Sunday lunch'. Whether you should be a vegetarian or not, if you're a utilitarian, is going to depend hugely on whether you count animals in the utilitarian calculus or you don't. And also think about embryos: do we count embryos or not? Think of the decisions that we would make about abortions, depending on whether we count embryos or not.

And finally, we might ask: can the utilitarian account for personal integrity? Imagine that I'm working for the nuclear industry, or rather that I'm offered a job in the nuclear industry, but I think the nuclear industry is a really appalling place. Somebody points out that if I take this job, I can work within it to subvert it. And I might think, well, hang on a second, that may produce the greatest happiness of the greatest number, but my integrity would be shot working for the nuclear industry.

And so I'll leave you again with those problems, and if you'd like to go further, there, as usual, is the list of things that you might do.