Okay. Nice to see you all again. Have you had a good week? Today we're going to leave logic and argument — or are we? Not really, because you can't get away from logic and argument in philosophy. I'm going to talk about ethics and politics, so it might not sound quite as foreign to you today, perhaps.

I'm going to look at three ethical theories. Obviously I've had to choose, because I could have talked about any number of things, but these three are probably the three most popular theories. Firstly, Aristotle and what has become known as virtue ethics. Then I'm going to talk about Kant, and then about utilitarianism, and I'll compare and contrast them with each other. I think by the end of today you'll at least have a feel for some of the key issues of each of these theories.

Okay, so we're going to start with Aristotle. But tell me first, because it all turns on this: what is it that we're trying to answer? In ethics, when we're doing ethics, what do you think the question is that we're concerned with? [Audience response.] That's not quite a question — I mean, yes, we're concerned with right and wrong, but what is the question to which 'it's morally right' is the answer? [Audience response.] Okay, so one question might be: is it right to do whatever — to lie, to kill, or whatever it might be? Good, that's right. There are two sorts of ethical question. [Further audience response.] Sorry — I've got the question after the answer. Got it.

Okay. The first answer was that the question we're looking to answer is: is doing this right or wrong? And the second was about the principles behind decisions about what to do — what are the principles behind those decisions? And between the two of you, you've come up with exactly the distinction I wanted: the distinction between first-order ethics and second-order ethics.

First-order ethics looks at the world. It looks at certain action types — lying, killing, cloning, whatever — and it asks: is an action of this type morally acceptable? That's a first-order question, because it pertains to the world, if you like. Then there are second-order moral questions, which ask: what is it that makes an action right or wrong? Do you see? We've stepped back a level, if you like. In just the same way, if I talk about first-order beliefs, I'm talking about beliefs about the world. So my belief that that chair is blue is a first-order belief. But my belief that my belief that that chair is blue is true —
that's a second-order belief, because it's a belief about a belief. Got it? You can have second-order desires as well. Does someone want to give me a first-order desire? [Audience response.] Okay: I want a cream cake, or I want something to eat, or whatever. A second-order desire is a bit more difficult — desires are harder, I realise, now I've started. Can anyone think of a second-order desire? [Audience response.] Yes, absolutely. You might think: I want a cream cake, but I also want to be slim — both of those are first-order desires. But then you might think: I want to be the sort of person who wants a salad rather than a cream cake, or I want to be the sort of person who wants my health more than I want a cigarette, or something like that.

So you should be noticing instantly that philosophy works at the second order. 'Is that belief true?' is a very philosophical question. What is truth? What constitutes truth? What constitutes rightness? Okay, we can say whether cloning is right, or killing is right, or keeping promises is right; but as philosophers, what we're interested in is what it is for something to be right. What is it for an action to be right? What is it for an action to be wrong? We step back from it.

So what we're interested in today is not so much first-order ethics but second-order ethics — or meta-ethics, as it's sometimes called. We're interested not so much in judgements about particular types of action as in judgements about that sort of judgement: what it is that makes something right.

Now, very importantly, you can only test your second-order ethical theories against your first-order ethical judgements. So if an ethical theory says 'what it is for an action to be right is this', it would be perfectly reasonable to reply: well, there's one problem with that, and the problem is that it comes out saying it's fine to kill people. Because it clashes with your first-order ethics, you think there's probably something wrong with your second-order ethics. Do you see what I mean? You test the two against each other — theory and practice interact with each other.

To link this to last week, when I was doing logic, I said quite a few times that you are all rational animals. You all make logical judgements the whole time, because whenever you decide whether an argument is good or bad, you're making a logical judgement.
The difference between us is that I know what I'm doing when I do it, whereas you do it just from instinct, from intuition. What a logician does is identify what you do when you reason and then formulate a theory about it. But of course, if that theory came out telling you to do something that you think is blindingly, obviously wrong — all of you — then there's something wrong with the theory. So theory and practice interact: you test your intuitions about an ethical theory against your ordinary, everyday ethical judgements. We're going to be doing a bit of that today, so I'll draw your attention to the fact that we're doing it when we do it — you'll see what I mean by what I'm saying now.

Okay, so let's talk first about Aristotle. Aristotle, as you can see, states that the right action is the action that would be chosen by a virtuous person. You might feel like saying: not very useful — not a good decision-making theory, this one — because now I've got to know who the virtuous person is, and ask them, and make sure I've understood them, and so on. But anyway, that's what Aristotle says: the right action is the action that would be chosen by the virtuous person.

And what is a virtuous person? Well, here we are: a virtuous person knows three things, or at least has three characteristics. The first is that he or she knows which is the right action in a situation. Now, this is quite an important feature of Aristotelianism: Aristotle believes that morality has nothing to do with rules — rules like 'keep promises', 'don't tell lies', and so on. Let me give you an example of that.

I want you to imagine a situation. Your mum comes back from the hairdresser — or your wife, or your husband, or your daughter, or whoever you like — comes back from the hairdresser and says, 'What do you think?' And you think: yuck. Okay, you've got a problem, haven't you? What's your problem? [Audience: you might hurt her feelings.] Okay, that's one element, but that's only one side of it. [Audience: you might have to lie.] Well, you're not going to have to lie — you've got a choice. [Audience: you might be wrong.] If it's what you think, you can't be wrong about what you think — and after all, your mum has asked you what you think.

So here's your problem: there are two rules out there — be kind, be honest — and it looks as if in this situation you've got a conflict. You can't be both kind and honest. Right, so what do you do?
Okay, well, Aristotle says this is a classic example of a moral dilemma. You cannot use rules here, because the rules run out. The thing about rules is that they're general claims — don't lie, don't kill — that have to be applied in particular situations, and in various particular situations they're going to come into conflict, aren't they? So you're caught in this tug: you're thinking, my goodness, if I tell her the truth, she's going to think I'm really cruel; if I tell her a lie, I'm going to think I'm really awful. What do you do? You've got to make a choice between the two. Does it change your mind if I say your mother's been depressed for six months, and this is the first time you've seen her smile? Does that push you in one direction? [Audience responses.] Okay — I hope you'd thought of that before.

Now, I'm going to embarrass some of you, because what you do in this situation, if you're not a proper moral agent, is make yourself a set of rules. You're going to say: oh goodness, I can't bear this sort of moral dilemma — I don't want to be cruel and I don't want to be dishonest either — what am I going to do? I'll just have to say that I value truth more than I value being kind, so whenever this situation arises, whenever I'm put in a dilemma of this kind, fine: I'm going to be honest. Now, we all know people like that, don't we? Some of you may be people like that. And there are other people who hit that situation and say: oh, this sort of dilemma — I'm going to make myself a little rule, and the rule is that whenever I hit a situation of this kind I'm going to be kind; I think kindness trumps honesty. And we all know that sort of person too, don't we?

Aristotle says you shouldn't be a person of either of those kinds, because what you should do is maintain the value of both honesty and kindness and make a decision in this particular situation — a decision that doesn't necessarily have any ramifications whatsoever for any other situation. So you don't make a rule that says 'I value truth more than kindness', or 'kindness more than truth'. You say: in this situation, given the particularities of the whole situation — given that my mum has been depressed for so long, and so on — I'm going to go for kindness. But in another situation, pretty much the same, you'd say: I'm going to go for honesty. And the point, according to Aristotle, is that you don't make yourself rules. You just do whatever seems to you,
in that situation, to be the right thing to do.

[Audience: surely she shouldn't ask in the first place?] Well, I'm glad you said that — you and so many other people. 'Does my bum look big in this?', 'What do you think of this dress?', 'Did you like the stew I made tonight?' We're constantly seeking the opinions of others on things, and therefore constantly putting them, potentially, into that situation of moral dilemma. And anyway, you can generalise the example I'm using. The example I'm using is fairly trivial, but any moral dilemma has exactly this structure: you've got two values, two general rules, which in a particular situation come into conflict, and you can't obey both. Of course, you can rationalise it: 'I'm being cruel to be kind,' you're going to say, or 'it's not really a lie, it's only a white lie.'

Okay — did you all weigh it up like that before you gave your answer? I'm sure you did, because one of the things about being a virtuous person, says Aristotle, is that you've got to be a wise person. He thinks all the virtues come together. The thing about a wise and virtuous person is that they do know what to do in a situation — unfortunately, not in a way that lets them give you a rule. All they can do is act. [Audience: but wouldn't they be too modest to say?] Well, of course. But if you think they're virtuous, you should watch them, see what they do, try to act intuitively as they would act, ask yourself how they would act in this situation — that sort of thing. So the virtuous person knows what the right action is, even though knowing what the right action is is very difficult and there aren't any rules you can give to anyone else to help them.

The second thing about the virtuous person is that he performs the right action. Well, we all know about this, don't we? You know what the right thing to do is — but do you do it? Again, you gave in to temptation. You needn't be malicious or anything like that; it could be just a moment's weakness. Clearly, knowing what the right action is is not sufficient for being virtuous: you've actually got to do the right action as well.

And Aristotle's thinking about the virtuous person — and this is actually very important — is this: you can be born benevolent. You're just a naturally benevolent person.
But that doesn't mean you're going to acquire the virtue of benevolence. The comparison here is this: you could be born strong, with the potential to be a real athlete or something like that. But if you sit around eating crisps and watching television all day, that disposition is going to disappear, isn't it? You were born with the potential to be strong, but you're not strong; born with the potential to be athletic, but you're not athletic. Similarly, you might be born with the potential to be virtuous — to be benevolent, say — but not be benevolent. The difference is that you actually have to exercise the benevolence; you have to do it. If you're born strong, you actually have to exercise, you have to practise, you have to train — and if you do all those things, you will become strong, properly strong. Similarly, it doesn't matter that you were born benevolent, so that you occasionally do kind things just because it's your nature: for Aristotle, you've actually got to do it because it is the right thing to do. And you've got to practise — you've got to get the habit of telling the truth. We all know it: the first lie is quite difficult, but the second one's a bit easier, and the third, and the fourth, and the fifth. You can get into the habit of being dishonest. You can get into the habit of not fulfilling your intentions — of going swimming this afternoon; that reminds me. You've got to try to get the good habits, not the bad ones. So you've not only got to know what the right action is, you've actually got to act on that knowledge — and, what's more, make a habit of acting on that knowledge.

And finally, you've got to perform the right action for the right reason. Each of us is the guardian of our own morality, if you like — of our own values. Imagine the situation with your mum again. She says, 'What do you think?', you think 'yuck', and you make a lightning decision: I'm going to be kind here. Now, you can justify that, can't you? It's very easy to say 'I was just being kind.' And you can also justify being honest. But we all know that sometimes, when we were 'honest', what we actually did was give in to a moment's spite. Have we ever done that? You don't need to tell me — I won't ask you to put your hands up. You can claim to be being honest when actually you're giving in to a moment's spite. And similarly, you can claim to be being kind when actually you've just failed in moral courage. Isn't that also a common situation?
So it's no good just knowing what the right action is and performing the right action: you've got to perform the right action for the right reason. One's intention is very important to Aristotle. And if throughout your lifetime you do these things — obviously you get better as you get older — if throughout a whole lifetime you do these things, you will become a virtuous person. And at that point you could be looked to as someone whose actions to emulate, whose decisions to emulate, somebody whose advice to ask, and so on. So, for example, when the government gets together a group of the great and the good to form an advisory committee or something like that, what the government is doing is acting in a very Aristotelian way. They're saying: look, you're all wise and virtuous, so let's put you together and ask your advice on these issues; let's see what you would do, or what you would tell us to do. What we're doing is consulting people who have a reputation — who have proven themselves in some sphere of life to be wise and virtuous.

Okay, that's Aristotle then. Yes?

[Audience:] Excuse me — doesn't this add an unnecessary layer of complexity, this business about having to find a virtuous person? People can be virtuous in some situations and not in others; it's not just a black-or-white thing. And doesn't it presuppose that in every situation there is a right action?

No, it doesn't presuppose either. I'll deal with that one first, because it's very easy. Does it presuppose that there is a right action? As a matter of fact, in the situation I gave you with your mum, either of those options could be right, and it would be perfectly reasonable for you to make either of those decisions if you were sincerely making them on your best judgement. So there can be many different right answers, but that doesn't mean there isn't a wrong one. Do you see what I mean? There are lots of different right answers in this situation; that doesn't mean there isn't a wrong one.

On the first point, I don't think it is a layer of complexity, because if I have a moral dilemma — if I have a problem that I think is a moral one — I'll go and ask lots of different people. But I'm not going to ask anyone I think is stupid, or likely to give me biased advice, or anything like that. The people I choose will be people I think can give me wise advice.
And of course I'm not going to choose just one; I'm going to choose several. And I don't see why that shouldn't be described as looking for a virtuous person to help me out of the situation. Of course, the decision is eventually mine, but I think I'd be stupid not to go seeking advice.

[Audience:] I would have thought it's hard enough making the decision in complex situations — knowing what the right action is and doing it for the right reasons — without also having to make decisions about which person is virtuous in this respect and which person isn't.

But I think you've got it the wrong way round, because it seems to me that when I have a moral dilemma, what I usually mean by that is that I don't know what the right action is in this situation, or that I can give arguments for several different actions when it might be that only one of them is right. And that's exactly why I would seek the opinion of somebody else whom I respected.

[Audience comment: they might not consult anyone else — it depends.]

Right — well, that's a personal anecdote. What I would say is that you might not consult them directly: you might ask yourself, what would so-and-so do in this situation? What would so-and-so do in that situation? And anyway, you don't have to agree with Aristotle — this is his theory. I do agree with him that there's something in it, but I certainly wouldn't make a serious moral decision, where I wasn't sure of the pros and cons myself, without asking some other people. Then, of course, I'd weigh the pros and cons for myself, because eventually the decision must be yours, mustn't it? You can't get away from weighing the pros and cons; but you can either do it without advice, or you can take advice from people whom you respect and admire. Okay — any other questions?

[Largely inaudible exchange: an audience member suggests the haircut example is trivial, and raises a case about someone who tells you which way they intend to vote.]

Well, I would suggest that you argue it through with her first, and maybe support her or not, depending on how the argument went. But I would also say that you should ask someone else. The key question is: why are you going to vote for X?
Why? I used a trivial question as an example, but as I said, the same applies in every case of moral dilemma. Should we allow cybrids, for example — that is, the use of a cow egg to incubate a human nucleus, producing something that's 99 per cent human and one per cent bovine? Should we do that? Is it morally acceptable? Well, I could just weigh the pros and cons by myself, or I could ask just one friend. But I think what I would actually do is ask lots of people: I would set up a commission to look into it, get lots of people on the commission who have a reputation for being wise, ask them to discuss it, and then weigh the pros and cons on the basis of that. It doesn't matter how trivial or important the question is; it involves the same process, which is that you consult as many virtuous people as possible and then weigh up the situation yourself.

[Inaudible audience question.]

No, no — I wouldn't think that was the case. You don't become virtuous until you've been performing the right action, for the right reason, on many occasions in the past. We can all get it wrong occasionally, but we all know that there are some people who get it wrong a lot. Either they never know what the right action is — you think, 'he's a car crash waiting to happen, this one' — or they know what the right action is but never perform it (they're weak, or malevolent, or something), or they do perform the right action but for the wrong reason. For example, if I'm a dishonest person, my best bet is to tell the truth most of the time, isn't it? What the dishonest person does is merely hold themselves ready to be dishonest when it's going to benefit them. If I want to be dishonest, it's important to get you to trust me, so what I do is tell the truth — I know telling the truth is the right thing to do, and I'm often telling the truth — but the reason I'm doing it is to get you to trust me, so I can then sell you that nice timeshare that sprang a leak the other day, or whatever it is. I'm doing it for the wrong reason. So it's not in a single situation that you can tell whether someone is virtuous: they're virtuous only if they have been virtuous over a long period. And you can't really be virtuous when you're young — which ought to mean we've all made it by now, and we haven't. So, you know, we can all make mistakes.

[Inaudible audience comment.]
No, no, nobody's denying that — you can make mistakes. And this is not a foolproof procedure, because even the great and the good can get it wrong. I mean, Mary Warnock came out recently and said she made the wrong decision on special schools, and she's a prime example of the sort of person the government consults when it wants a wise and virtuous person. Well, she's come out and said she believes she was wrong about something. That's entirely consistent with being a virtuous person — but if someone did it too often, you'd stop consulting them, wouldn't you?

We're spending too long on Aristotle, so just one quick question. [Inaudible audience question.] Yes — okay.

So that's Aristotle. You've now got one ethical theory, one account of what the right action is: the right action is the one that's performed, or chosen, by a virtuous person. Not very action-guiding, but a good theory, I think.

Okay, here's another one. Kant believes that an action is right only if the person performing it does so out of reverence for the law — or, as he would also put it, out of a sense of duty. With 'duty' you've got to be quite careful here: we have a tendency to think of duty as something dry and horrible, whereas Kant doesn't think of it like that at all. In the Groundwork of the Metaphysics of Morals, which is on your reading list, Kant says that the only way you can tell whether someone is acting morally or not is whether, in a situation of moral conflict, they're prepared to act out of duty rather than inclination.

We always have many motives. For almost everything we do there are several different motives or reasons for doing it, and reasons for not doing it, and some reasons are better than others; we act morally when we act on the right reasons. So Kant is very similar to Aristotle in this: he thinks it's intention that's important — acting out of reverence for the law, out of duty.

So let's say I'm coming along the street from one direction, and — what's your name? Alison — Alison's coming from the other. A beggar sits there with her young child in winter, wrapped up against the cold, asking for money. Alison gives her a pound and I give her a pound. So we've each performed exactly the same action. But Alison gave her a pound because she thought it was the right thing to do; I gave her a pound because I wanted Alison to think I'm that kind of person. Okay: have we both acted morally, or have I acted morally and Alison acted in self-interest?
Or did I get that the wrong way round? You know what I mean — just a little test to make sure you're listening. Who thinks that I have acted just as morally as Alison? Okay — why? [Audience response.] Sorry, what's your name? Georgia. Georgia says Alison did it just as I did, because she wanted to make herself feel good. Is that right? So you think I did act as morally as Alison — in fact, you think everything is done out of self-interest, even if the reason is subconscious?

Okay. You're in good company here: David Hume, the Scottish philosopher, would say the same thing — that everything is performed out of self-interest, that you do absolutely nothing out of altruism. Kant thinks that is absolutely wrong. He thinks that if you do something out of inclination — and by inclination he means self-interest at any level — then it's not a moral act. And he actually thinks that most of the actions we perform, and that people think are moral, are in fact not moral, because they're done from this wish to appear good, or — you know, you give something to charity and it gives you a nice warm glow, and you know that's a nice thing. Well, if you do it for that reason, it's not a moral action, says Kant. It's only a moral action if you perform it because you think it's the right thing to do. And the real test is when there's an action that you really, really want not to perform, and yet your duty tells you that you should perform it. If you do it because you believe it's the right thing to do, rather than because it's what you want to do, you are acting morally in that particular instance; whereas if you give in to your inclination, you're not acting morally. So Kant would disagree with you, Georgia, about the belief that every action is self-interested. He thinks most actions are self-interested, but not all of them: there are occasional acts that are performed out of reverence for the law.

[Audience question, largely inaudible, touching on religion and wanting to do the right thing.]

I see what you mean — you're saying that that could give you an inclination towards performing the right thing. Okay, here is what Kant would say, and this will remind you of something I said last week; in fact, it's exactly the same. Here are a premise and a conclusion. Premise: doing A is right. Conclusion: I should do A.
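Set out schematically — this is an editorial gloss on the exchange that follows, not Kant's own notation — the two readings being contrasted are:

$$\textit{With a desire premise (Georgia's reading):}\quad \text{I want to do what is right;}\ \ \text{doing } A \text{ is right}\ \ \therefore\ \ \text{I should do } A$$

$$\textit{Kant's reading:}\quad \text{doing } A \text{ is right}\ \ \therefore\ \ \text{I should do } A \qquad \text{(no desire premise needed)}$$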
315 00:34:29,590 --> 00:34:35,650 A German bride said the count would say that that is entailed by that. 316 00:34:35,650 --> 00:34:44,380 You don't need didn't hear anything like I want to do the right thing. 317 00:34:44,380 --> 00:34:52,090 Now, in effect, Georgia, this is this is what you will say, that there must always be a desire of that kind. 318 00:34:52,090 --> 00:34:57,460 In between this premising and this conclusion, I'm Scouts' would say, well, no, 319 00:34:57,460 --> 00:35:04,290 because actually, if you think you need to add in that you want to do the right thing, 320 00:35:04,290 --> 00:35:10,420 you were implying that there might be an occasional which you didn't want to do the right thing. 321 00:35:10,420 --> 00:35:16,120 And if that's true, you don't have the concept of rights at all. 322 00:35:16,120 --> 00:35:25,420 You don't understand what rights is. If on an occasion, you might not want to do it. 323 00:35:25,420 --> 00:35:34,480 You see how it would work. So you do the very important thing here is that this without the want. 324 00:35:34,480 --> 00:35:39,650 Here is what cancer is call a categorical imperative. 325 00:35:39,650 --> 00:35:54,040 The imperative I should do A is not contingent upon your having this desire without doing the right thing, because this desire doesn't make any sense. 326 00:35:54,040 --> 00:36:01,510 Because if you know what it is to do right. You couldn't not want to do right. 327 00:36:01,510 --> 00:36:07,230 Doesn't mean that you will always do right. Because we do quite often do things that we tend to do. 328 00:36:07,230 --> 00:36:15,580 But the fact that we believe we were wrong will be manifested in shame and guilt. 329 00:36:15,580 --> 00:36:27,460 So if you really knew what's doing right, doing wrong is you cannot want to do wrong. 330 00:36:27,460 --> 00:36:40,810 That's what counts as so. OK, so there's a very long tradition stemming from him that says you can't do any action without a self-interest behind it. 331 00:36:40,810 --> 00:36:49,150 Now, if that's so and if so, please write to the ancients believe that actually it has to be reason, not desire. 332 00:36:49,150 --> 00:36:53,890 That's storing propelling most moral actions. 333 00:36:53,890 --> 00:36:59,020 It would mean that actually none of our actions is moral. Wouldn't it? 334 00:36:59,020 --> 00:37:04,440 Because not one of them is altruistic. So can you sorry. 335 00:37:04,440 --> 00:37:12,190 Humans come up with another account of morality that's consistent with no actions being altruistic anyway. 336 00:37:12,190 --> 00:37:16,180 So that's kind. And I thought about doing it tough with reverence for the law. 337 00:37:16,180 --> 00:37:21,010 So what is this moral law? Okay, well, this actually counts. 338 00:37:21,010 --> 00:37:29,860 Skip six different accounts of the categorical imperative of which this is one. 339 00:37:29,860 --> 00:37:33,400 And I'll give you this one because I like this one best. But there are six others. 340 00:37:33,400 --> 00:37:41,290 But there's supposed to be equivalent. So it shouldn't really matter if I'm giving you one rather than six act in such a way that you treat humanity, 341 00:37:41,290 --> 00:37:52,090 whether in your own person or the person of another, always at the same time as an ends and never solely as a means. 342 00:37:52,090 --> 00:37:57,860 So what's your name? Dorothy. Lend me your pen for a second. 343 00:37:57,860 --> 00:38:01,810 Thank you. Now I use Dorothy as a means to my end. 
in that I wanted to tell you something — to show you something — I asked her to give me her pen, and she gave it to me. So she was a means to my end of giving you an example. Is that right? But I also, at the same time — you can have it back, by the way — treated Dorothy as an end in herself, because she could have said no. She could have said, 'I'm sorry, I'm using it,' or 'It's the only pen I've got,' or 'I've left my pen at home' — she had the choice. No? Really? That's interesting. But you can see what I mean: I did two things. We're always using each other as means — when we say 'pass the salt', 'would you carry my suitcase', 'will you do this for me', and so on. What's important is that we always treat others as ends in themselves as well. In other words, we allow them to make their own choices. If I trick you into carrying my suitcase, then I'm not treating you as an end in yourself, because I'm not giving you the choice.

So what Kant says is that you've always got to treat humanity as an end in itself, because — and he allows that there may be other rational beings, but let's stick with humans — the thing about humans is that they're rational: they make choices, and they make them freely. Not all my choices are free, but some of them are, and the fact that we can make free choices is what makes us moral animals. And notice that it's not just other people we've got to treat as ends in themselves: it's also ourselves. You're every bit as wicked if you treat yourself as nothing more than a means to somebody else's end, because that conflicts with your integrity as an autonomous being, as somebody with free will.

So that's the moral law. Notice, incidentally, that Kant is also pretty lousy as a decision procedure. What's the right action? Oh, it's the one the law says we should perform. What's the law? The one that says we should always treat others as ends in themselves. A pretty lousy set of rules, isn't it?

[Audience question about the phrase 'at the same time'.]

Fine — all I'm saying is that as you treat them as a means, you must also be treating them as an end. There's no duration to it; it's simultaneous. You can't make up for having treated someone merely as a means by then treating them as an end: you were wrong to treat them merely as a means in the first place.

Okay, so that's Kant. The next one we're going to look at is utilitarianism.
Utilitarianism is quite different from either Aristotle or Kant, because utilitarianism tells us that the right action is the one that produces the greatest happiness of the greatest number. This is a consequentialist doctrine, not one based on intention or will. Whereas Kant believes that the only thing good in itself is the will — the choices, the intentions on which you act — the utilitarian thinks it's the consequences of your actions that matter.

So let's think about the nature of an action for a minute. Here's an act — whatever it is; it might be me scratching my nose. Say I've told David that if Mike isn't back today filming me, I'll let him know by scratching my nose. So I scratch my nose. In effect, that scratching of my nose is a lie — you see, Mike is in fact here. If Mike weren't here, it wouldn't be a lie, but it would still be a scratching of the nose. And that's quite important. (Oh dear — I'll never find this [inaudible] again.)

Think of it this way. Forgive me: I'm an object — an object with many properties — and there are many, many descriptions that pick me out uniquely, aren't there? 'She's the only person in this room who's on the stage.' 'She's the person who is Director of Studies in Philosophy.' 'She's the person wearing a turquoise jumper' — well, several other people here are wearing turquoise, but I'm the one on the stage wearing a turquoise jumper. These are all uniquely identifying descriptions of me; there are lots of different ways of getting to me. The same thing is true of an action.

Here's the class of nose-scratchings. This one is a lie — it's the one where I'm looking at David, telling him that Mike's not in the room. This one isn't a lie — it's just a nose-scratch. And this one means 'I'm very bored', and so on. So a token nose-scratching can also be a lie. And the fact that a nose-scratching is an action means that it must have an intention, mustn't it? If I come in and trip over the mat and you all laugh, I might think to myself: oh, that's interesting, that made them laugh — I'll do it again next week. So when I come in next week, I trip over the mat again. The first one was unintentional, wasn't it? The second one is intentional.
So in order for something to be an action, it's got to be something you've chosen to do. If you haven't chosen to do it, it's not something for which you're fully morally responsible: it can be manslaughter rather than murder. You might still be culpable — you really shouldn't have been cleaning your gun while it was pointed at David and loaded, and so on — so there's more or less guilt attached to manslaughter; but for it to be murder, there has to be an intention there.

So you've got the intention, you've got the action, and you've got the consequences of the action — there are always going to be consequences. Now David knows that Mike isn't here; he's got to go and ask where Mike is, or something like that. So there's a consequence. Every act has an intention and a consequence. Kant and Aristotle think that the moral evaluation of the action goes on back here, at the intention, whereas utilitarianism thinks the action can only be evaluated there, at the consequences. It's not the intention with which we act that makes the act right or wrong; it's the consequences of the act.

So if I take my dear old aunt out for tea, and as I take her out we cross the road and she's squashed by a bus, the action I have performed was wrong, even though my intention was a good one. I'm going to feel guilty, and other people are going to worry about me; but what counts in making the action wrong is its consequences. And you can see one huge advantage of this: if we're looking at courts of law, it's pretty well only consequences that we can look at, isn't it? I have no way of getting at your intentions other than through your actions, including your linguistic actions — I mean, you might tell me that your intention was this, that, or the other. So in a court of law it's nearly always the consequences that matter more than the intention. Of course, we may want to get at the intention — but do you think somebody could always intend well and yet, as a matter of fact, everything they do turns out wrong? They intend to do the right thing but never actually succeed? Wouldn't you be getting a bit suspicious? 'So this is the third aunt you've taken out and squashed under a bus — and funnily enough, they each left you money. What's going on here?' We look at the consequences in order to determine the intentions. 'Oh my goodness me, I just meant to make her happy' — you're not going to believe me.
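The structural point here — one and the same act, judged at the intention by Kant and Aristotle but at the consequences by the utilitarian — can be sketched in a few lines of Python. This is only an illustration: the Act type and the scoring are invented for the example, and real moral evaluation is obviously not a two-field record.

```python
from dataclasses import dataclass

@dataclass
class Act:
    description: str
    good_intention: bool        # what the agent was trying to do
    net_happiness: int          # what actually resulted (positive = happiness produced)

def intention_verdict(act: Act) -> str:
    """Roughly the Kantian/Aristotelian focus: judge the intention."""
    return "right" if act.good_intention else "wrong"

def consequence_verdict(act: Act) -> str:
    """The utilitarian focus: judge the consequences."""
    return "right" if act.net_happiness > 0 else "wrong"

# The aunt example from the lecture: a kind intention, a disastrous outcome.
tea_trip = Act("take my aunt out for tea; she is hit by a bus",
               good_intention=True, net_happiness=-100)

print(intention_verdict(tea_trip))    # prints: right
print(consequence_verdict(tea_trip))  # prints: wrong
```

The two verdicts come apart on the same act, which is exactly the disagreement being described.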
As I said, in a court of law it tends to be the consequences, not the intentions, that we can get at.

There's another nice thing about utilitarianism, and that is that it gives us a bit of a decision procedure, doesn't it? Aristotle has us searching around for virtuous people, and Kant has us searching our own intuitions for what the moral law tells us to do on this occasion. But utilitarianism tells us that whatever action will produce the greatest happiness of the greatest number is the right action. So it looks as if it's much easier to work out what you should or shouldn't do. But do you think that might be misleading? Oh, you all do. Okay, let me give you one example: was dropping the bomb on Hiroshima the right thing to do or not? Well, if you're a utilitarian, it was the right thing to do if dropping the bomb produced the greatest happiness of the greatest number, and it wasn't the right thing to do if it didn't. Now, what's the truth of the matter — did dropping the bomb produce the greatest happiness of the greatest number or not? Actually, we could give arguments on both sides, couldn't we? There is a fact of the matter, but the chances of our ever knowing that fact are virtually nil, aren't they?

So utilitarianism might seem to give us a very easy decision procedure, but actually it doesn't. And there are all sorts of other problems as well. Let's say that David is a happy, cheery person — he smiles all the time; it's very easy to make David happy — whereas Alison is a miserable so-and-so. I can produce much more happiness in this class by concentrating on David, because he's so easy to make happy, and simply ignoring Alison, because nothing I do will make her happy. And am I talking about average happiness or total happiness? What is happiness, anyway? So actually utilitarianism is not as easy to apply as it looks.
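To see one way the slogan underdetermines a decision, here is a toy sketch in Python — the names and the happiness scores are invented for the example — comparing two policies under a 'total happiness' rule and an 'average happiness' rule.

```python
# Two invented "policies" for cheering up the class, scored per person (0-10).

def total_happiness(scores):
    return sum(scores.values())

def average_happiness(scores):
    return sum(scores.values()) / len(scores)

# Concentrate on David, who is easy to please, and ignore Alison entirely.
concentrate = {"David": 9, "Alison": 1}
# Spread the effort more thinly, bringing a third person in as well.
spread = {"David": 5, "Alison": 3, "Georgia": 4}

for name, outcome in [("concentrate", concentrate), ("spread", spread)]:
    print(name, total_happiness(outcome), round(average_happiness(outcome), 2))

# 'concentrate' wins on average happiness (5.0 vs 4.0),
# while 'spread' wins on total happiness (12 vs 10).
```

The two readings of 'greatest happiness' rank the policies in opposite orders — and that is before we face the harder question of whether happiness can be measured and compared across people at all.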
Or you might worry that we cannot know what the consequences of an action will be — we can't know all the facts. But that doesn't mean there isn't a fact of the matter. So, for example: are there three consecutive sevens somewhere in the decimal expansion of pi? (Excuse me a second — I'm just looking for something here... something I can't find... right.)

[Inaudible audience question.]

Well, no — ending a war does make people happy; and of course the alleviation of unhappiness produces happiness as well. But going back to your question: you can't say that because we'll never know what the consequences are, there's no fact of the matter. The decimal expansion of pi is infinite. If there are no three consecutive sevens in it, we will never know that — ever; we couldn't know it even in principle. That doesn't mean there's no fact of the matter. Was there a tree here — right here — twenty thousand years ago? There's a fact of the matter about that, isn't there? Will we ever know it? [Audience: but with the bomb, what would otherwise have happened is something you could never know.] Yes, but that's a different problem. The tree was a long time ago; the point here is that there is a fact of the matter now as to whether the action would produce more happiness or not.
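The asymmetry in the pi example can be made concrete: a finite search can confirm that a string of digits does occur, but no finite search can show that it never occurs in the infinite expansion. A minimal sketch, assuming Python with the third-party mpmath library available:

```python
import mpmath

# Work to a couple of thousand decimal places and search that finite prefix.
mpmath.mp.dps = 2010
pi_str = mpmath.nstr(+mpmath.pi, 2005)   # "3.14159..." as a string
fractional = pi_str.split(".")[1]        # digits after the decimal point

pos = fractional.find("777")
if pos != -1:
    print(f"'777' first occurs at decimal place {pos + 1}")
else:
    # A miss here settles nothing: the expansion goes on forever.
    print("'777' not found in this prefix")
```

Whatever the search turns up, the negative claim stays out of reach of any finite check — which is the lecturer's point about consequences: our inability to know a fact doesn't abolish it.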
It looks as if utilitarianism could justify genocide. 488 00:54:32,280 --> 00:54:44,340 If you have a situation where the two of you downhill, the three of you down here are a different race for us or something, rather, from us. 489 00:54:44,340 --> 00:54:47,630 And we don't like them, do we? But they're not here. 490 00:54:47,630 --> 00:54:56,150 And many of them. All of it. So let's just get rid of them. OK, now that's produced the greatest happiness of greatest number. 491 00:54:56,150 --> 00:55:01,270 Because, you know, there are not many of you to be happy. Your happiness doesn't really count. 492 00:55:01,270 --> 00:55:08,350 And we all wanted them dead. So that's produced the greatest happiness of gross numbers, hasn't it? 493 00:55:08,350 --> 00:55:15,410 That looks a bit worrying. To say the least. Now, it might be that some of us feel pretty outraged that these people have been shot. 494 00:55:15,410 --> 00:55:22,470 So. So four of you over here think that was really awful. So we can actually put your unhappiness with their unhappiness. 495 00:55:22,470 --> 00:55:32,310 But can we ever be sure that the unhappiness of those who are against gendercide will outweigh the happiness of those who are for it? 496 00:55:32,310 --> 00:55:38,340 Can we be sure? No. It looks if we can't, in which case, if you go for utilitarianism, 497 00:55:38,340 --> 00:55:44,100 it rather looks as if you're opening yourself to the possibility of justifying genocide. 498 00:55:44,100 --> 00:55:50,630 And that's because the utilitarian human rights, the right to life, for example, 499 00:55:50,630 --> 00:55:58,550 is only a right on the understanding that your right to life isn't conflicting with the greatest happiness and grace. 500 00:55:58,550 --> 00:56:02,850 No. You have a right to life because on the whole, 501 00:56:02,850 --> 00:56:08,040 it produces the greatest happiness of the greatest number to treat people as if they have a right to life. 502 00:56:08,040 --> 00:56:16,680 But if somebody, your right to life comes into conflict, to the greatest happiness in the greatest number, then your right to life lapses. 503 00:56:16,680 --> 00:56:23,820 Okay. I have a duty to kill you. If killing you will produce the great sadness of the greatest number. 504 00:56:23,820 --> 00:56:29,370 So there seemed to be counterexamples to utilitarianism. 505 00:56:29,370 --> 00:56:36,100 On the other hand, it's so incredibly useful. I mean, we're we're actually using it to the National Health Service movement. 506 00:56:36,100 --> 00:56:44,530 And it's only One-Third of Collie's. So quality adjusted life years for something policy. 507 00:56:44,530 --> 00:56:51,870 Somebody out there, some quality adjusted life years when when a surgeon or a doctor is considering some 508 00:56:51,870 --> 00:56:56,240 sort of intervention and she can only afford to do one of these perhaps wrong. 509 00:56:56,240 --> 00:57:00,900 The other, you decide by looking at the quality of the. 510 00:57:00,900 --> 00:57:08,220 So the number of life years that would be produced by this intervention. So for you, it would be ten, and for you it would be five. 511 00:57:08,220 --> 00:57:11,970 And then you look at the quality of those years that you're producing. 512 00:57:11,970 --> 00:57:19,260 So you would have five actually pretty mediocre years where I would say I can give you five really good ones. 513 00:57:19,260 --> 00:57:22,360 Of course, I could not make it difficult for myself. 
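To make the QALY idea concrete, here is a minimal sketch of the kind of comparison being described. The numbers, the quality weights, and the function itself are purely illustrative; this is not the NHS's actual methodology.

```python
# Illustrative sketch only: a toy QALY comparison, not the NHS's real method.

def qalys(life_years: float, quality: float) -> float:
    """Quality-adjusted life years: years gained, weighted by quality (0 to 1)."""
    return life_years * quality

# Hypothetical interventions: five mediocre years versus five really good ones.
intervention_a = qalys(life_years=5, quality=0.4)  # 2.0 QALYs
intervention_b = qalys(life_years=5, quality=0.9)  # 4.5 QALYs

chosen = "A" if intervention_a > intervention_b else "B"
print(f"Intervention {chosen} produces more QALYs")  # Intervention B
```

The point of the lecture stands either way: the calculation looks mechanical, but the quality weights and the predicted life years are exactly the sort of facts we can only estimate.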
514 00:57:22,360 --> 00:57:29,640 But you see that you can look at QALYs to give you, in some way, a measure of which intervention you should do. 515 00:57:29,640 --> 00:57:46,790 And that's based very, very firmly on utilitarianism. [A question from the audience, partly inaudible, asks whether utilitarianism could be used to justify going to war.] 516 00:57:46,790 --> 00:57:56,220 Well, do you remember the sailors, and the captain having to shut off the oxygen to the engine room on purpose? 517 00:57:56,220 --> 00:58:02,180 Well, utilitarianism did seem to justify shutting off the oxygen to the engine room. 518 00:58:02,180 --> 00:58:08,540 And it might justify war. But, of course, Kantians could also perhaps justify war on some occasions, 519 00:58:08,540 --> 00:58:18,110 if the war seems to be the only way of treating others as ends in themselves, or the thing that the virtuous person would suggest. 520 00:58:18,110 --> 00:58:24,320 So it's not just utilitarianism that can justify war; every theory can do that. 521 00:58:24,320 --> 00:58:31,700 But what utilitarianism could do is justify genocide, and it's not obvious the other theories can do that. 522 00:58:31,700 --> 00:58:36,680 So if you want to justify genocide, utilitarianism is probably your theory. 523 00:58:36,680 --> 00:58:40,440 That makes it sound as if I've got it in for the utilitarians. 524 00:58:40,440 --> 00:58:48,950 This is absolutely not the case. I think that there are many ways of interpreting the theory. In particular, 525 00:58:48,950 --> 00:58:58,370 I think that you might want to say that utilitarianism is a descriptive theory, not a prescriptive one. 526 00:58:58,370 --> 00:59:10,040 In other words, it's not prescribing what we ought to do; it's describing what is true of the right action. 527 00:59:10,040 --> 00:59:23,130 [An interjection from the audience is largely inaudible.] 528 00:59:23,130 --> 00:59:28,110 Stop. OK, what was I saying? 529 00:59:28,110 --> 00:59:33,890 Right. The descriptive theory just says that when you do the right action, 530 00:59:33,890 --> 00:59:37,680 it will turn out that that was the action that produced the greatest happiness of the greatest number; 531 00:59:37,680 --> 00:59:43,500 not that that's what you should do. Because actually, there's something wrong with saying it's prescriptive. 532 00:59:43,500 --> 00:59:51,090 I mean, I intend to go home tonight, after I've been swimming, of course, and have a glass of wine. 533 00:59:51,090 --> 00:59:57,270 Now, I could actually go and work in the Oxfam shop for a bit, or something like that, 534 00:59:57,270 --> 01:00:03,960 which would presumably make more people happy than me having my glass of wine. 535 01:00:03,960 --> 01:00:05,520 Should I be doing that? 536 01:00:05,520 --> 01:00:14,460 I mean, surely, if what I'm supposed to do is produce the greatest happiness of the greatest number all the time, I'm going to get very tired. 537 01:00:14,460 --> 01:00:18,730 So you might think it's not a prescriptive theory but a descriptive one. 538 01:00:18,730 --> 01:00:22,950 So those are the bad-news headlines about utilitarianism, 539 01:00:22,950 --> 01:00:35,830 but I promise you that there are ways of understanding it that make it much less simplistic than I may be giving the impression of. 540 01:00:35,830 --> 01:00:41,200 But I wonder, isn't it partly that doing good makes me feel good? 541 01:00:41,200 --> 01:00:44,530 That's it, isn't it, in child development: 542 01:00:44,530 --> 01:00:54,010 there's the crucial switch from morality being externally imposed, by consequences, by
543 01:00:54,010 --> 01:00:58,400 "Mummy will punish me", to it being internalised. 544 01:00:58,400 --> 01:01:03,900 That's one of the major factors in childhood moral development, isn't it? 545 01:01:03,900 --> 01:01:16,440 Yeah. And doesn't any moral philosophy have to take into account that one of the major components of our moral reasoning is 546 01:01:16,440 --> 01:01:20,550 empathy, 547 01:01:20,550 --> 01:01:25,540 which seems to be pretty central? 548 01:01:25,540 --> 01:01:34,400 Well, if I'm concerned for your happiness, I've got to exercise empathy, I think. But empathy is only a method of determining what others are feeling. 549 01:01:34,400 --> 01:01:44,350 If I put myself in your position, then I'm using empathy, but I'm using empathy to determine what's going to make you happy, or I'm imagining, 550 01:01:44,350 --> 01:01:48,820 you know, who's going to be affected by my action and how they will be affected by it. 551 01:01:48,820 --> 01:01:53,980 Or I'm thinking: what is it, in this situation, to treat you as an end in yourself? 552 01:01:53,980 --> 01:01:58,330 What would you choose if I were able to ask you? 553 01:01:58,330 --> 01:02:10,480 I've got to use empathy. But isn't empathy supposed to be simply actually experiencing what the other person is experiencing? 554 01:02:10,480 --> 01:02:14,230 It's not reasoning; it's not the rational part of the person. Maybe not. 555 01:02:14,230 --> 01:02:22,450 But the fact is, we still act on it. I mean, think back to what Aristotle said about how you can be naturally benevolent. I mean, children: 556 01:02:22,450 --> 01:02:27,850 if one child in the nursery starts screaming, the others are going to join in in sympathy very quickly. 557 01:02:27,850 --> 01:02:31,130 And if you cry in front of a child, the child gets very distressed. 558 01:02:31,130 --> 01:02:37,210 I mean, children are naturally empathetic, but, by golly, they're not naturally moral. 559 01:02:37,210 --> 01:02:42,400 In order to become moral, they've got to learn what the right action is, 560 01:02:42,400 --> 01:02:48,660 they've got to learn that they've got to do the right action, and they've got to do it for the right reasons. 561 01:02:48,660 --> 01:02:56,520 But surely that is the beginning of it: knowing that other people have feelings and experiences like you have yourself? 562 01:02:56,520 --> 01:03:02,690 Yes, and then from that you begin to see: I find such and such hurts me and I don't like it, 563 01:03:02,690 --> 01:03:09,040 and so it will hurt them too. Yes, but that's a psychological theory of morality. 564 01:03:09,040 --> 01:03:16,270 I'm interested in the philosophical theory of morality. In the same way, we can be interested in language: 565 01:03:16,270 --> 01:03:20,890 what is language? What is meaning? How do we manage to communicate with each other? 566 01:03:20,890 --> 01:03:25,960 That's a philosophical interest. Or we can ask: how does language develop in a child? 567 01:03:25,960 --> 01:03:30,760 That's a psychological question. And I'm teaching you philosophy. 568 01:03:30,760 --> 01:03:40,210 I don't know; isn't it just a more sophisticated development of that, of empathy and things like that? Otherwise, where does it all come from? 569 01:03:40,210 --> 01:03:43,940 Yeah. 570 01:03:43,940 --> 01:03:49,670 OK. Margaret, did you have a specific question, though?
571 01:03:49,670 --> 01:03:57,570 I was asking because you described this theory as descriptive; a prescriptive theory tells you what you ought to do. 572 01:03:57,570 --> 01:04:01,870 Surely it's a prescription of what you ought to do, not just a description? 573 01:04:01,870 --> 01:04:10,370 You obviously have to do the right actions. 574 01:04:10,370 --> 01:04:13,350 Or, you know, you don't have to do them. 575 01:04:13,350 --> 01:04:20,650 It's just that utilitarianism as a prescriptive theory says you should produce the greatest happiness of the greatest number; 576 01:04:20,650 --> 01:04:22,950 as a descriptive theory, it says that 577 01:04:22,950 --> 01:04:31,520 when you have done the right action, it will be the case that it did produce the greatest happiness of the greatest number, whatever your intention was in doing it. 578 01:04:31,520 --> 01:04:37,330 Yes. OK. Including if you intended to do wrong. 579 01:04:37,330 --> 01:04:43,720 OK. I'm going to move on to politics now, because otherwise we're not going to get politics done. 580 01:04:43,720 --> 01:04:48,750 OK, so that's ethical theory. And just to go back to the developmental thing: 581 01:04:48,750 --> 01:04:57,270 of course empathy is important. I think empathy is actually, well, I've called it charity, the principle of charity. 582 01:04:57,270 --> 01:05:03,840 I think there's very good reason to think that when we're trying to understand the world, 583 01:05:03,840 --> 01:05:09,480 we're constrained in all our thinking by something called the principle of the uniformity of nature. 584 01:05:09,480 --> 01:05:17,460 In other words, if we want to understand whether A causes B, we've got to see if we can get an A without a B, because if we can, 585 01:05:17,460 --> 01:05:22,320 that shows A doesn't cause B. So we're assuming nature is uniform. 586 01:05:22,320 --> 01:05:27,510 But in the case of understanding each other, we've got to use the principle of charity, 587 01:05:27,510 --> 01:05:33,120 which is a form of empathy. If you say something that strikes me as mad, 588 01:05:33,120 --> 01:05:38,100 so you say P and I think, look, you know, it's obviously not-P, 589 01:05:38,100 --> 01:05:46,500 now, I could just dismiss you as stupid. But actually, if you're right, I'll be losing my opportunity to learn something. 590 01:05:46,500 --> 01:05:59,420 So the principle of charity tells you always to assume that the other person's error is less likely than your own bad interpretation. 591 01:05:59,420 --> 01:06:08,100 OK, so if you seem to be saying something mad, it's probably because I haven't understood you, and I need to ask you why you're saying that. 592 01:06:08,100 --> 01:06:17,070 Whereas if I just dismiss you as mad, then there's something wrong with me, because you're a worthy collaborator in the pursuit of truth. 593 01:06:17,070 --> 01:06:23,250 You're a rational animal; I'm doing you wrong by dismissing you as stupid. 594 01:06:23,250 --> 01:06:34,020 OK, moving on to politics. I'm going to talk about just one political issue, because it's a nice, central issue: distributive justice. 595 01:06:34,020 --> 01:06:42,720 How do we distribute the goods of society in such a way as to do justice to everybody? 596 01:06:42,720 --> 01:06:48,910 So in this country, we not only have a tax regime, we also have a benefits regime, 597 01:06:48,910 --> 01:06:52,110 and we redistribute wealth in various ways.
598 01:06:52,110 --> 01:07:02,040 And we presumably do this because we think that this is the most just way of distributing things like education, the vote, 599 01:07:02,040 --> 01:07:11,780 freedom of speech, and so on. We try to equalise things, though we don't go for strict equality; we go for redistribution of the sort that we do. 600 01:07:11,780 --> 01:07:17,430 And I'm going to be talking about two philosophers, John Rawls and Robert Nozick. 601 01:07:17,430 --> 01:07:26,040 First, I want to talk about Rawls, and then Nozick, if we get to him. 602 01:07:26,040 --> 01:07:33,490 Right. I've no idea how that picture got there. 603 01:07:33,490 --> 01:07:39,330 No idea. I think he's still alive. 604 01:07:39,330 --> 01:07:44,390 I'll have to look it up. Anyway, it's definitely not Rawls; 605 01:07:44,390 --> 01:07:54,590 or if it is, I don't know. Okay. Rawls wrote a hugely influential book called A Theory of Justice. 606 01:07:54,590 --> 01:08:03,560 It's very big. It's very boring. I don't recommend it, but it has some very interesting stuff in it. 607 01:08:03,560 --> 01:08:12,140 And what Rawls sets out to do is to choose the principles of justice for society. He asks: 608 01:08:12,140 --> 01:08:20,460 what are the principles on which goods should be distributed in such a way as to make society as just as possible? 609 01:08:20,460 --> 01:08:32,090 And the real hallmark of Rawls's originality was something called the original position, which is his way of choosing the principles of justice. 610 01:08:32,090 --> 01:08:43,850 This is how he did it. The original position is a position in which people of a certain kind are put in a certain situation 611 01:08:43,850 --> 01:08:52,100 and asked to decide on the rules by which goods should be distributed. 612 01:08:52,100 --> 01:08:58,830 The people are like this. They're rational. They're self-interested. 613 01:08:58,830 --> 01:09:04,760 OK, so they're not stupid; they care about themselves. 614 01:09:04,760 --> 01:09:10,490 But also, you know, they're quite happy to be kind to other people. 615 01:09:10,490 --> 01:09:15,860 They're also risk-averse. They don't really want to put themselves into a difficult position. 616 01:09:15,860 --> 01:09:25,850 They don't want to take risks. And the original position is the position of being put behind the veil of ignorance. 617 01:09:25,850 --> 01:09:32,660 Now, the veil of ignorance means that you don't know anything about yourself. 618 01:09:32,660 --> 01:09:37,010 So you don't know whether you're male or female. You don't know whether you're old or young. 619 01:09:37,010 --> 01:09:45,930 You don't know whether you're rich or poor, black or white, intelligent or stupid, ill or fit. 620 01:09:45,930 --> 01:09:51,740 OK. You know nothing about yourself. You could be any of these things. 621 01:09:51,740 --> 01:10:01,770 OK. The only knowledge you have is what Rawls calls the thin theory of the good. What the thin theory of the good is: 622 01:10:01,770 --> 01:10:07,770 it tells you things like: human beings need warmth. 623 01:10:07,770 --> 01:10:15,780 They need comfort. They need a certain amount of property over which they have autonomy. 624 01:10:15,780 --> 01:10:26,100 Humans gestate for nine months. So you have the basic physiological, psychological, political and economic facts about human beings. 625 01:10:26,100 --> 01:10:29,130 But you don't know anything about you in particular.
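A toy illustration of this set-up, not Rawls's own formalism: one way to picture the choice from behind the veil of ignorance is a risk-averse chooser who does not know which position in society she will occupy, and who therefore ranks societies by how their worst-off position fares (a maximin rule). The welfare numbers below are invented.

```python
# Toy model of choice behind the veil of ignorance (invented numbers,
# not Rawls's own formalism): a risk-averse chooser ranks societies
# by the welfare of their worst-off position.

societies = {
    "laissez-faire": [100, 40, 5],
    "strict equality": [30, 30, 30],
    "regulated market": [70, 45, 35],
}

def maximin_choice(options):
    """Pick the option whose worst-off position is best off (maximin)."""
    return max(options, key=lambda name: min(options[name]))

print(maximin_choice(societies))  # regulated market
```

On these made-up numbers the chooser prefers the society whose worst-off position is least bad, even though it is neither the richest nor the most equal.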
626 01:10:29,130 --> 01:10:39,700 Now you've got to decide what the principles of justice should be. And the thing is that, because you're behind the veil of ignorance, 627 01:10:39,700 --> 01:10:44,220 you're forced to be fair. OK. 628 01:10:44,220 --> 01:10:45,920 You don't know whether you're well or sick, 629 01:10:45,920 --> 01:10:53,460 so you're not going to make the set-up such that sick people are discriminated against, because that might be you. 630 01:10:53,460 --> 01:10:58,200 You don't know whether you're rich or poor, so you're not going to give the rich everything or leave the poor with nothing, 631 01:10:58,200 --> 01:11:01,500 because you might be poor. You don't know 632 01:11:01,500 --> 01:11:06,660 whether you're black or white, so you're not going to set up a system in which black people are discriminated against 633 01:11:06,660 --> 01:11:10,350 or white people are discriminated against, because you don't know who you are. 634 01:11:10,350 --> 01:11:16,380 Ditto with male and female. Because you don't know who you are, 635 01:11:16,380 --> 01:11:24,030 your self-interest is not going to work for any particular type of person. 636 01:11:24,030 --> 01:11:31,800 Your self-interest is going to work on behalf of anyone, or rather everyone, you might turn out to be. 637 01:11:31,800 --> 01:11:45,360 Isn't that good? I think it is. Anyway, Rawls thinks that the two principles of justice that will come out of this process are these two. 638 01:11:45,360 --> 01:11:52,060 First: everyone is entitled to maximum liberty compatible with equal liberty for all. 639 01:11:52,060 --> 01:11:57,830 OK. So actually, he's a libertarian of a sort. He's like the consequentialist, 640 01:11:57,830 --> 01:12:03,840 but whereas the utilitarian puts happiness as the summum bonum, the thing we all want, 641 01:12:03,840 --> 01:12:08,550 he puts liberty there. It's liberty that's the most important thing. 642 01:12:08,550 --> 01:12:14,190 And you could say: we want equality, things should be distributed equally. 643 01:12:14,190 --> 01:12:20,700 But if you do that, you're not taking account of the fact that we're not given an equal distribution to start with. 644 01:12:20,700 --> 01:12:27,240 For all we know, if I'm ill, I possibly need more goods than you need, because you're fit. 645 01:12:27,240 --> 01:12:34,390 So he says, secondly, that inequalities are permitted, but only when they make the worst off better off. 646 01:12:34,390 --> 01:12:37,230 OK, so those are the two principles of justice. 647 01:12:37,230 --> 01:12:45,060 So to evaluate Rawls, you've got to think: OK, what do we think of the original position in the first place? 648 01:12:45,060 --> 01:12:49,470 And what do we think of the principles of justice that come out of it? 649 01:12:49,470 --> 01:12:56,880 I mean, we might dismiss the original position as not a good idea, or we might dismiss the principles of justice that 650 01:12:56,880 --> 01:13:03,120 come out of it. Let me say just a little something on the original position. 651 01:13:03,120 --> 01:13:14,850 One important thing you've got to work out is just which information goes where: behind the veil of ignorance, or into the thin theory of the good. 652 01:13:14,850 --> 01:13:21,740 Let's say that you believe that women are very emotional, okay, 653 01:13:21,740 --> 01:13:30,420 and that therefore they shouldn't be allowed to fly planes. Now, do you put "women are very emotional" into the thin theory of the good, 654 01:13:30,420 --> 01:13:38,820 because, you think, this is a fact about half the human race, just as you'd put in that it's women who have babies, because it would be very important 655 01:13:38,820 --> 01:13:47,940 to have that in the thin theory, wouldn't it? Do you put "women are emotional" in there, or do you put it behind the veil of ignorance? 656 01:13:47,940 --> 01:14:01,080 So your decision about where you put the various bits of information is actually going to be garbage in, garbage out, in effect, isn't it? 657 01:14:01,080 --> 01:14:05,820 So somebody in apartheid South Africa might have put all sorts of things about black people into the 658 01:14:05,820 --> 01:14:11,580 thin theory of the good that we would think probably belong behind the veil of ignorance. 659 01:14:11,580 --> 01:14:19,560 People two hundred years ago would have put a lot of claims about women into the thin theory of the good that we think ought to be behind the veil of ignorance. 660 01:14:19,560 --> 01:14:27,210 So surely the whole thing is just question-begging: 661 01:14:27,210 --> 01:14:34,110 all our prejudices are going to be exercised simply in the decision about where we put things. 662 01:14:34,110 --> 01:14:38,710 That's one problem. Then there's a second problem. We might think this is OK: 663 01:14:38,710 --> 01:14:43,270 inequalities are permitted when they make the worst off better off. 664 01:14:43,270 --> 01:14:50,350 Well, OK. Let's say that there's something I can do in this room. 665 01:14:50,350 --> 01:14:55,530 You're the worst off; sorry, you're doing really badly there. You're the worst off, 666 01:14:55,530 --> 01:15:02,970 and we're the best off. And I can do something that is going to make us a lot better off. 667 01:15:02,970 --> 01:15:09,820 We really can do something. Unfortunately, it's not going to shift them by so much as a halfpenny. 668 01:15:09,820 --> 01:15:19,560 OK. Now, this is ruled out. We can't do it, because this is an inequality that isn't making the worst off better off. 669 01:15:19,560 --> 01:15:27,030 On the other hand, I could do it if I made them just half a penny better off, 670 01:15:27,030 --> 01:15:32,100 if we can up theirs by just a tiny little bit, something like that. 671 01:15:32,100 --> 01:15:42,390 So there seems to be something of a politics of envy that could work in here, quite worryingly, 672 01:15:42,390 --> 01:15:47,490 because it might prevent changes that would make an awful lot of people better off, 673 01:15:47,490 --> 01:15:52,470 just because they don't make the worst off better off. 674 01:15:52,470 --> 01:16:01,470 So that's all I can say about Rawls. That's a very quick romp through distributive justice according to A Theory of Justice. 675 01:16:01,470 --> 01:16:05,550 And the original position is the key thing in it, 676 01:16:05,550 --> 01:16:12,320 the original position together with the thin theory of the good. 677 01:16:12,320 --> 01:16:17,970 Fine. Moving on to Nozick now. Nozick holds a Lockean theory of property, 678 01:16:17,970 --> 01:16:22,720 Lockean after John Locke, the English philosopher. 679 01:16:22,720 --> 01:16:35,340 And what that says, in effect, is that you own the labour of your own body and everything with which you mix that labour. 680 01:16:35,340 --> 01:16:42,930 Now, this theory of property actually underpins the American constitution. 681 01:16:42,930 --> 01:16:45,330 It also underpins much of our law.
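Going back for a moment to the difference principle and the halfpenny example above, here is a minimal sketch of the test as the lecturer states it; the welfare numbers are invented.

```python
# Minimal sketch of the difference-principle test from the halfpenny example
# (invented numbers, following the lecturer's formulation).

def permitted(before, after):
    """An inequality-increasing change is permitted only if it leaves the
    worst-off position strictly better off than before."""
    return min(after) > min(before)

status_quo = [10, 10, 10]
gain_for_the_best_off = [50, 30, 10]     # worst off unchanged: ruled out
gain_plus_halfpenny = [50, 30, 10.005]   # worst off up by a halfpenny: allowed

print(permitted(status_quo, gain_for_the_best_off))  # False
print(permitted(status_quo, gain_plus_halfpenny))    # True
```

Which is exactly the worry voiced above: a large gain for many people can be blocked, or waved through, on the strength of a halfpenny to the worst off.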
682 01:16:45,330 --> 01:16:56,700 I mean, for example, there was a time when you were allowed to enclose all the land that you and your family could plough between sunrise and sunset. 683 01:16:56,700 --> 01:17:02,860 And one thing that was, well, fair about this is that if you were very strong and you had a family of lusty sons, 684 01:17:02,860 --> 01:17:08,580 you could get up there and plough lots of land; you could enclose a lot of land between sunrise and sunset. 685 01:17:08,580 --> 01:17:12,630 But you could also work it, you know, with all those sons, 686 01:17:12,630 --> 01:17:23,670 and you'd need it, too. Whereas if it was just you and your little old mother doing the tilling, you know, you couldn't get much done, 687 01:17:23,670 --> 01:17:29,800 but on the other hand, you wouldn't need that much either, would you? So that was the idea behind it: 688 01:17:29,800 --> 01:17:34,570 you can own what you mix your labour with. Big problems, though. 689 01:17:34,570 --> 01:17:40,590 Somebody pointed out: if I tip my glass of wine into the sea, have I lost my wine or gained the sea? 690 01:17:40,590 --> 01:17:50,790 And if I'm tilling the land, do I also get to own the mineral rights under the land, or just the topsoil? 691 01:17:50,790 --> 01:17:55,720 So there are big problems there. And there's another condition Locke specified: 692 01:17:55,720 --> 01:18:01,900 you've got to leave enough and as good behind for the people who come after you. 693 01:18:01,900 --> 01:18:08,910 And the trouble with this is that it can zip backwards. If there are ten apples, then you take one, you take one, you take one, you take one, 694 01:18:08,910 --> 01:18:14,240 but when it gets to the tenth, I can't take it: I'm not leaving enough and as good behind. 695 01:18:14,240 --> 01:18:20,370 But if I can't take the tenth, neither can the ninth person take theirs, nor the eighth, and so on back, person by person, 696 01:18:20,370 --> 01:18:25,110 and it looks as if no property can be owned at all, 697 01:18:25,110 --> 01:18:34,020 if you take that proviso seriously. So there are problems with the Lockean property theory which underpins Nozick's theory. 698 01:18:34,020 --> 01:18:39,180 But one of the big things that he claims is that taxation is forced labour. 699 01:18:39,180 --> 01:18:45,330 And his argument for this: he talks about Wilt Chamberlain, who was a basketball player. 700 01:18:45,330 --> 01:18:50,370 Now, Wilt is a wonderful basketball player. He's actually fantastic. 701 01:18:50,370 --> 01:18:53,990 And you all have a certain number of holdings. 702 01:18:53,990 --> 01:18:58,060 Okay, you've got this money. Let's say we've all got equal amounts of money, 703 01:18:58,060 --> 01:19:02,160 so that everyone, including Wilt, has the same amount of money. 704 01:19:02,160 --> 01:19:10,890 But Wilt has this talent. Okay. And he says, I'll only exercise this talent if you pay me. And we all say: that's fine, we'll pay. 705 01:19:10,890 --> 01:19:17,370 We'll give you 25 cents each to see you play basketball. So Wilt does play basketball, 706 01:19:17,370 --> 01:19:25,440 and he ends up richer than the rest of us. Now, Nozick says that's fair. 707 01:19:25,440 --> 01:19:31,830 You all chose to give him the extra 25 cents; he chose to exercise his talents. 708 01:19:31,830 --> 01:19:35,970 He didn't have to work like that. And his unequal 709 01:19:35,970 --> 01:19:43,070 wealth is legitimately owned by him. If you now take 25 per cent of that away in taxation, you're in effect
710 01:19:43,070 --> 01:19:51,040 forcing him to work for 25 per cent of the time. I mean, take the hour and a half I've been lecturing, and imagine 25 per cent of that 711 01:19:51,040 --> 01:19:56,010 being taken away from me. Yeah, that feels wicked. 712 01:19:56,010 --> 01:19:59,940 Well, shouldn't I be left with that 25 per cent? 713 01:19:59,940 --> 01:20:06,000 We could then, I mean, we could have private medicine, private education, toll roads and so on. 714 01:20:06,000 --> 01:20:11,780 Why should the state take that money away and spend it on things I don't use? I don't have any children: 715 01:20:11,780 --> 01:20:19,500 why should I spend money on education? I don't drive: why should I spend money on the roads? 716 01:20:19,500 --> 01:20:24,240 So there's a conflict between liberty and equality. The only way, says 717 01:20:24,240 --> 01:20:33,580 Nozick, of avoiding that conflict is that you either have to interfere with Wilt's ability to choose whether or not to exercise his talents, 718 01:20:33,580 --> 01:20:42,790 that is, you've got to make him exercise them for no extra money, or you've got to stop you from choosing to spend your money freely. 719 01:20:42,790 --> 01:20:49,650 OK, you can't spend it on Wilt. Either way, there's a conflict between liberty and equality. 720 01:20:49,650 --> 01:20:54,920 And that, says Nozick, is the key problem for all liberal 721 01:20:54,920 --> 01:21:01,770 theories of distributive justice. What do you think of that as a theory? 722 01:21:01,770 --> 01:21:09,750 Rubbish. That's what I think. 723 01:21:09,750 --> 01:21:15,420 I don't want to cause a furore here, but it does look troubling: 724 01:21:15,420 --> 01:21:27,880 the super-talented, through luck of birth or this, that and the other, escape and perpetuate inequality, 725 01:21:27,880 --> 01:21:33,750 whereas taxation redistributes it. 726 01:21:33,750 --> 01:21:43,980 That sounds good, but Nozick could say that there's nothing unfair about this inequality 727 01:21:43,980 --> 01:21:48,120 if we're starting from a position of equality. 728 01:21:48,120 --> 01:21:54,390 Look, that's why I set the example up like that: we've all started with equality, 729 01:21:54,390 --> 01:22:04,200 OK? I don't have children and I don't have a car, but I do like swimming occasionally. 730 01:22:04,200 --> 01:22:13,800 I would like to spend my money on decent leisure facilities, the National Trust, things like that. 731 01:22:13,800 --> 01:22:18,550 If you had a car, you would be prepared to pay to have toll roads. 732 01:22:18,550 --> 01:22:23,250 Ditto private hospitals: any of you might get sick, 733 01:22:23,250 --> 01:22:30,640 and probably all of us would want to pay for private insurance to make sure that we had hospitals available when we needed them. 734 01:22:30,640 --> 01:22:37,200 So it's not that there wouldn't be any hospitals if we cancelled taxation. 735 01:22:37,200 --> 01:22:40,850 Well, but that's exactly his point. 736 01:22:40,850 --> 01:22:50,930 His point is that we're not all equal, and you want to encourage everyone to use their talents; 737 01:22:50,930 --> 01:22:59,610 if you set taxation high enough to stop them from using their talents, then you're going to lose: 738 01:22:59,610 --> 01:23:04,350 they're not going to use those talents. I mean, in some ways you can see that this happens.
739 01:23:04,350 --> 01:23:11,130 If taxation is set too high, then people with talents are going to leave the country and not pay it. 740 01:23:11,130 --> 01:23:20,040 So it's quite crucial, isn't it, to set the taxation level so that you don't lose those people, because there is this conflict. 741 01:23:20,040 --> 01:23:23,380 So it's not to say there shouldn't be any taxation, 742 01:23:23,380 --> 01:23:35,700 but he's saying that taxation is forced labour and therefore you don't want too much of it. 743 01:23:35,700 --> 01:23:39,510 Well, I mean, one thing you might say is that if you tax people, sorry, 744 01:23:39,510 --> 01:23:45,650 if you don't tax people, there are some people who fall through the net. 745 01:23:45,650 --> 01:23:51,990 You might say there has to be a safety net, and someone like Nozick wouldn't even accept that. 746 01:23:51,990 --> 01:23:58,050 He thinks that, very importantly, charity must be supported in a big way. 747 01:23:58,050 --> 01:24:02,820 So it must be voluntary: 748 01:24:02,820 --> 01:24:10,230 giving becomes very important in a society where there is very little taxation, 749 01:24:10,230 --> 01:24:20,830 and you'd have to encourage that, well, in schools and so on; I mean, there are ways of encouraging giving. 750 01:24:20,830 --> 01:24:27,460 They do it in America very much more than we do it here. 751 01:24:27,460 --> 01:24:43,130 [A question from the audience, partly inaudible, about whether Wilt really has a choice.] 752 01:24:43,130 --> 01:24:49,770 Yeah. Yes, he has, because Wilt Chamberlain might want to be a milkman. 753 01:24:49,770 --> 01:24:55,620 Let's say he did not want to play basketball: he needn't play basketball 754 01:24:55,620 --> 01:25:07,670 just because you're prepared to pay, if he doesn't like playing basketball. 755 01:25:07,670 --> 01:25:14,910 Okay, well, we've got five minutes for questions. So let's have some questions on ethics and politics. 756 01:25:14,910 --> 01:25:18,630 Surely if you made everybody equal on Monday, 757 01:25:18,630 --> 01:25:22,560 by the next day they wouldn't be. That's exactly his response. 758 01:25:22,560 --> 01:25:28,800 Yeah. His point is that in order to get equality, you've got to interfere with liberty, 759 01:25:28,800 --> 01:25:42,270 and if you go for liberty, you cannot assume equality will follow, because there's a key conflict between liberty and equality. 760 01:25:42,270 --> 01:26:01,050 [Another question, partly inaudible: isn't distributive justice, in the end, just about money and what you pay in taxation?] 763 01:26:01,050 --> 01:26:06,800 But no, distributive justice is about the goods in society, 764 01:26:06,800 --> 01:26:14,490 so it includes things like the vote, education, the roads, hospitals, health, everything. 765 01:26:14,490 --> 01:26:23,430 That's fine. But of course, given the way our societies are as a matter of fact set up, it's money that pays for these things, 766 01:26:23,430 --> 01:26:41,590 so it does tend to come down to money. Please, sit down. 767 01:26:41,590 --> 01:26:58,160 [A further question is taken from the audience; it is largely inaudible.] 768 01:26:58,160 --> 01:27:06,830 Well, the question remains. I also want to make one point quickly, because I'll forget it otherwise. 769 01:27:06,830 --> 01:27:18,140 One contradiction in the body of Nozick's work is that he does think that you have to have property to be autonomous.
770 01:27:18,140 --> 01:27:24,260 So for him, autonomy, the ability to make choices, is 771 01:27:24,260 --> 01:27:30,620 the key value. But if you need property in order to make choices, and if, on his theory, 772 01:27:30,620 --> 01:27:41,620 you could be left without any property at all, because there are no redistributive mechanisms in society, then surely that can't be right. 773 01:27:41,620 --> 01:27:43,100 There's got to be something wrong with that. 774 01:27:43,100 --> 01:27:50,120 Surely there needs to be some basic redistributive mechanism to at least ensure that everybody has some property, 775 01:27:50,120 --> 01:28:06,190 however much is needed to be autonomous, if being autonomous is the most important thing. 776 01:28:06,190 --> 01:28:19,870 And I was just wondering: is he possibly, well, he's a libertarian, he's not just a philosopher? 777 01:28:19,870 --> 01:28:27,920 Oh, yes. I mean, with philosophers as people, 778 01:28:27,920 --> 01:28:35,210 one would hope that their political positions are rather more thought out than those of non-philosophers, 779 01:28:35,210 --> 01:28:39,790 but I'm afraid this isn't necessarily so. But yeah, 780 01:28:39,790 --> 01:28:43,130 he's a philosopher with a political position. Roger Scruton, for example, 781 01:28:43,130 --> 01:28:48,640 is another famous philosopher with pronounced political positions, 782 01:28:48,640 --> 01:28:53,060 and there are philosophers who are political sceptics as well. 783 01:28:53,060 --> 01:29:04,500 So you can go either way. Going back to virtuous people: 784 01:29:04,500 --> 01:29:11,170 I'm a little bit suspicious; it's very hard to find somebody who is equally reliable about everything. 785 01:29:11,170 --> 01:29:34,220 [The rest of the question is largely inaudible.] 789 01:29:34,220 --> 01:29:40,820 Well, I mean, I think it's all right in practice, because it's what we actually do in practice. 790 01:29:40,820 --> 01:29:52,190 I mean, Aristotle would think that somebody who's virtuous would be virtuous about everything, 791 01:29:52,190 --> 01:30:07,800 although he would admit that they can get it wrong. Or not be influenced by framing effects? 792 01:30:07,800 --> 01:30:16,670 Yes. I mean, one thing you would want to do if you were a wise and virtuous person is to look at the problem as objectively as possible. 793 01:30:16,670 --> 01:30:18,770 I mean, you would try not to let those biases in. 794 01:30:18,770 --> 01:30:26,520 And also, you might say: I'm sorry, this is something on which I don't want to advise you, because I know that I'm going to be biased. 795 01:30:26,520 --> 01:30:34,220 I know my judgement is going to be biased; therefore this is an issue on which I don't want to be consulted. 796 01:30:34,220 --> 01:30:39,270 I think that's perfectly reasonable and consistent with being virtuous. 797 01:30:39,270 --> 01:30:43,220 OK, perhaps two more questions; two very good ones. One here. 798 01:30:43,220 --> 01:30:48,240 [The question, partly inaudible, is about whether these theories take morality to be absolute rather than something that can simply be imposed or set aside.] 799 01:30:48,240 --> 01:30:51,740 Absolutely. That's right. That's right. 800 01:30:51,740 --> 01:30:56,840 So actually, all the theories we've looked at here are absolutist theories. 801 01:30:56,840 --> 01:31:04,166 Yes.