This is the second of the mini podcasts that we're doing, so welcome to everyone watching the podcasts, and welcome to you, too. On this one we're doing deontology. So let's get started.

One of the most famous deontologists, in fact the most famous deontologist, is Immanuel Kant. This is the man himself. In this session I'm going to be talking mainly about Kantian deontology, although we'll see at the end that there are different types of deontologist.

Kant believes that morality is a system of categorical imperatives. A categorical imperative is an absolute rule, a rule that binds us irrespective of our desires or any other consideration. And lots of people are anti-deontology precisely because they don't believe in an absolute morality. They think there couldn't be an absolute morality, that this is just wrong.

So let's have a look at why Kant believes this. Well, first, a little bit more about what he believes. He believes that we're truly moral agents only when we act out of reverence for the moral law. If you watched the podcast on virtue ethics, you'll remember that virtue ethicists say that our reason for acting is a very important part of what makes us moral. Well, Kant would agree with that. In order to act morally, says Kant, we've got to act out of reverence for the moral law. When we obey the categorical imperative, when we act because we see that it's a categorical imperative and that's why we do it, then we are acting morally. But if we act for any other reason, we're not acting morally, even if our act is identical to the one we would have performed if we were acting out of reverence for the law.

Let me give you a quick little illustration of that. Imagine a road; I think of it actually as Brasenose Lane, just behind my old college. Here's Mike coming from one direction and me coming from the other direction, and in the middle is sitting a beggar, holding her baby and asking for money. As we come along, we each give her a pound. But Mike gives her a pound because he thinks it's the right thing to do, and I give her a pound because I want Mike to think that I'm a nice person. And you might think: well, we've done exactly the same act; we've each given the beggar a pound. But one of us has acted morally, according to Kant, and the other hasn't. And we can see why he says that, can't we? I'm acting out of self-interest; I'm acting so that Mike thinks I'm a nice person.

OK, but what are categorical imperatives? An imperative is a 'should' or a 'must' statement.
So both of these are imperatives: 'You should do this' or 'I must do that'. Both of those are imperative utterances.

And here's an imperative of the sort that you act on daily, a little piece of practical reasoning: I want to get to London by noon; I believe I can only get to London by noon if I catch the 10:30 train; I must therefore catch the 10:30 train. Do you see that? There's a desire of yours and a belief of yours, and together they generate an imperative. So you're rationally bound by that imperative. You must act on it, by dint of your capacity for reason, so long as you have that desire and you have that belief.

OK, that's what an imperative is. But that one, the one I had up there, is a hypothetical imperative. You are rationally bound by the conclusion of that piece of practical reasoning only if you have the desire and you have the belief. So looking at these again: at eight o'clock in the morning you do want to get to London by this time, and you do believe this. Therefore you're bound by it; you believe you must catch the 10:30 train. But at nine o'clock in the morning somebody rings up and says, 'I want you to do this', and this becomes very important, so you lose that desire. Now, at that point, this imperative is no longer binding on you, is it? As long as you had that desire, you were rationally bound by it. As soon as you lost that desire, you weren't rationally bound by it.

So that was the hypothetical imperative. And what Kant is talking about is categorical imperatives, not hypothetical ones. So what's a categorical imperative? It binds you in virtue of your rational nature; your desires are completely irrelevant. Let me give you an example of one. OK, here's a categorical imperative: I believe it's right to do A; therefore I should do A.

So let me ask you: could you believe that it is right to do something without also believing that you should do it? And by the same token, could you believe that it is wrong to do something without believing that you shouldn't do it? Kant thinks that there's a logical link between the moral concepts of right and wrong and imperatives like should and must and so on, because Kant believes that we can't believe that it's right to do something without believing that we should do it, and we can't believe that it's wrong to do something without believing that we shouldn't do it. So the moral concepts are intrinsically action-guiding, and they're the only concepts that are: the imperative comes straight from the belief. No desire is needed. There's no desire here; desire isn't needed. You logically get the 'should' straight from the belief, straight from a way of thinking about the world.
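As a rough schema of the contrast (the notation here is an editorial sketch, not Kant's own), the two inference forms look like this:

\[
\text{Hypothetical:}\quad \mathrm{Desire}(E) \;\wedge\; \mathrm{Believe}(\text{doing } A \text{ is the only way to get } E) \;\Rightarrow\; \text{``I must do } A\text{''}
\]
\[
\text{Categorical:}\quad \mathrm{Believe}(\text{it is right to do } A) \;\Rightarrow\; \text{``I should do } A\text{''}
\]

Remove the desire premise and the hypothetical imperative loses its grip on you; the categorical imperative has no desire premise to remove, so there is nothing to lose.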
OK, now I can see on your faces that you may have an objection to this, and here's the objection I think you may have. You probably immediately think: but hang on, there's a premise missing from this categorical imperative. What about 'I want to do the right thing'? Shouldn't that be in there? I believe it's right to do A; I want to do the right thing; therefore I should do A. A lot of people think this. But actually Kant would say that if you think this, you haven't properly understood the moral concept 'right' (and I should have put 'wrong' up there too). To say that in order to do the right thing you must want to do the right thing implies that you will only do something you recognise to be right if you want to. And actually, if you say that, what you're manifesting is that you haven't understood what it is for something to be right. Because the thing about morality is that it requires us to act whatever we want. If I need to save your life, my desire not to get my new boots wet becomes completely irrelevant. The thing I should do is save your life.

So the categorical imperative differs from the hypothetical in that the imperative is not conditional upon any desire of yours. Your recognition of the moral law, of the fact that in this situation it's right to do A or wrong to do A, tells you straight away what you should or shouldn't do. So it binds you in virtue of your rational nature, not your affective nature, not your desires.

OK, so that's the sort of thing a categorical imperative is: it rationally binds us irrespective of our desires. But that's only the form of a categorical imperative. At the moment it hasn't got any content at all, has it? All I've told you is what a categorical imperative is. So just as there can be lots of different hypothetical imperatives, there can be lots of different categorical imperatives. Let's have a look at them.

Kant offers several formulations of the categorical imperative, and he says they're all equivalent. Whether they are or not is a big philosophical question, not one we'll address here. But here are two of the formulations he offers. One of them he calls the principle of humanity: always treat humanity, whether in yourself or in another person, as an end in themselves, never solely as a means. Well, what does that mean? What it means is that I must always treat you as somebody who makes your own decisions about your own actions. I should never use you only as a means to my own ends. So that doesn't mean I can't use you.
Can I borrow a pen, please? Peter, thank you. Thank you. OK, I just used you as a means to my own ends there. I wanted to make an example, and I chose you as the means of making my example. But of course you had the choice to say no. You could have said, 'No, I'm sorry, I need the pen, I'm making notes', or something like that. You did have the choice to say no, so I wasn't using you solely as a means to my end. That's what I mustn't do. If I lie to you in order to get you to do what I want, I'm using you as nothing more than a tool, and that's what's ruled out by that formulation of the moral law.

The second principle is that we must act only on that maxim we could will to be a universal law. Well, that's less easy to understand. But what Kant means is that before we act for any given reason, we should always ask ourselves: what if everyone were to do this? So let's say that I think about saying to my mum, 'No, you look terrible', out of a feeling of spite, momentary spite. Before I do that, I should ask: well, what if everyone were to be truthful only because they felt spiteful? That would be terrible, wouldn't it? Therefore I shouldn't do it. What's important with this one is that Kant is saying that you shouldn't make exceptions based on who you are. You shouldn't treat yourself as any more privileged than anyone else. You should assume that if you have this reason for acting, everyone else will have this reason for acting too. And if they shouldn't act on it, neither should you. So the principle of universalisability means that you shouldn't privilege yourself above other people.

OK, you'll note that Kant doesn't seem to think that the absolute moral rules that bind us are the ones that we were taught as small children. So 'don't lie', 'keep promises': those were the rules that you were given, that we were all given, as small children. But that's not what Kant says the moral law is. Although I have to say that in his books he does give examples that use those, and sometimes people do believe that he thinks of those as categorical imperatives. There are certainly deontologists who do think that lower-order moral rules like 'keep promises' and 'don't lie' are the categorical imperatives. But on the way I'm interpreting Kant, he's a different sort of deontologist, one who believes that the categorical imperative is a higher-order moral law like 'treat others only as ends in themselves'.
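Putting the two formulations side by side in the same rough editorial shorthand (again, not Kant's own notation):

\[
\text{Humanity:}\quad \text{for every person } p,\ \text{never treat } p \text{ solely as a means, always also as an end}
\]
\[
\text{Universal law:}\quad \text{act on maxim } M \text{ only if you could will that everyone act on } M
\]

On the reading given here, these higher-order laws are the categorical imperatives, and rules like 'don't lie' are at most lower-order applications of them.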
So, OK, for Kant the moral law binds us absolutely in virtue of our capacity to act rationally, and to act immorally is, for Kant, to act irrationally, which is a very important thing.

And here are some considerations that might prompt you to accept deontology. First, a doctor saves the lives of seven of his dying patients by gently killing a healthy tramp and using his organs for the transplants. Well, you might think: hang on, no, he can't do this. Even if he's saving seven other people, he's using the tramp as a means to the ends of the other people, and he can't do that.

Or a government official, desperate to avert a food crisis and believing the fear of genetically modified food, for example, to be overblown, orders his underlings to remove the labels and to distribute GM food as non-GM. Well, again, the deontologist will say: no, he can't do that, because he's lying to people. What he's got to do is say to these people: OK, here you are, you're starving; there is this food here, but I should tell you it's GM food. And then let them make their own decision. If they don't want to eat it because it's GM, that is their decision, even if it leads to their death, because they are ends in themselves; they have the right to make their own decisions about their own lives.

And the third example: a father of two daughters, believing the family will starve if they have to pay another dowry, forces his unwilling wife to abort a pregnancy when he discovers it's another daughter. Well, again, the deontologist will say: no, you can't do that. And the reason you can't do it is that you can't force someone to do anything, even if it's for their own good, because you're being paternalistic. You think they ought to do this, therefore you're saying they ought to do this. And actually that doesn't follow at all, because they are different from you and they might decide differently.

And now, quickly, problems for deontology. How do we know which actions are intrinsically right or wrong? Well, deontologists like Kant believe we have a moral sense: just as I can see that the chairs in this room are blue, I can see that an action is right or wrong. And there are many actions of which that is true. Of course, there are many actions for which it isn't true as well. Though if you make a distinction between token actions, just one individual action, and type actions, it becomes much more convincing that we can look at a particular action and think of that action that it is right or wrong.

Another problem: we might ask, well, are there any actions that are intrinsically right or wrong?
The consequentialist, who we'll look at in the next podcast, doesn't think there are, and maybe he's right.

Thirdly, how could blindly following a set of rules make us moral? This is probably an objection to a lower-order deontologist rather than a higher-order deontologist like Kant, because I think it's clear that you couldn't blindly follow the rule that says 'treat others as ends in themselves'. And actually, that's a very difficult rule to follow, because what is it to treat someone as an end in themselves? What do I do in a given situation to avoid doing this, and so on? And do rules like 'do as you would be done by', which is another very difficult rule to follow, provide any guidance on action?

And we might ask whether there are sometimes moral reasons for breaking moral rules. For example, are we allowed to kill a person in order to stop many other people dying, or many other people being killed? So am I allowed to kill one person if I'm told that if I don't, twenty other people will be killed, for example?

So those are the problems for deontology, a romp through the problems, anyway; they're not all the problems, of course. And if you'd like to look further, again, there are the slides that'll tell you all about how to learn a bit more philosophy, including more deontology.