[Auto-generated transcript. Edits may have been applied for clarity.]

And a very, very warm welcome to the 2015 Tanner Lecture on Human Values at Oxford University. Hosted by Linacre College, the Tanner Lectures, now in their 37th year, were established by the American scholar, industrialist and philanthropist Obert C. Tanner. In creating the lectureship, Professor Tanner said: "I hope these lectures will contribute to the intellectual and moral life of mankind. I see them simply as a search for better understanding of human behaviour and human values."

Appointment as a Tanner Lecturer is a recognition of exceptional, distinguished and important scholarship in the field of human values, and that description is fully justified in the case of tonight's lecturer, Professor Peter Singer.

Professor Singer has been variously described as the world's most influential living philosopher and the best known and most widely read of all contemporary ethicists. He is currently the Ira DeCamp Professor of Bioethics at the University Centre for Human Values at Princeton University, and simultaneously a Laureate Professor at the University of Melbourne. In his long career he has held positions at Oxford, New York, La Trobe and Monash Universities, but he started his vocation as a utilitarian philosopher here in Oxford with a BPhil.

He is without doubt a man who does not shy away from voicing challenging ideas and stirring up controversy. His books Animal Liberation, Practical Ethics and The Life You Can Save have provoked both plaudits and outrage. I hope that his lecture tonight will do just that. It is entitled "From Moral Neutrality to Effective Altruism: The Changing Scope and Significance of Moral Philosophy". Professor Singer.

Thank you very much. It is indeed an honour to be invited to give a lecture, and especially a Tanner Lecture, at Oxford, so I'm very happy to be here. And I've chosen a topic that I think is particularly appropriate for Oxford, a topic that relates to the time that I spent here as a graduate student in the early 1970s.

I want to look at the dramatic change that has occurred in moral philosophy over that time. So this is a rather broad topic; I'm not going to examine one particular question in detail. To those who wanted that sort of argument, I apologise, but I think it's interesting occasionally to take the opportunity to reflect and look back at where we've come from, and why, and where we're going. So that's the type of talk you're going to get tonight.

In 1972, I was a Radcliffe Lecturer at University College, Oxford.
It was my first academic appointment. And I published a brief article in Analysis under the title "Moral Experts", in which I challenged the view that, as one of the philosophers I'm going to discuss, C. D. Broad, put it, "It's no part of the professional business of moral philosophers to tell people what they ought or ought not to do."

So today I want to revisit that little article and ask why it was that people thought that was the case, when those of you who are involved in moral philosophy today will obviously see things quite differently. A lot of moral philosophy as done by many people in this university, as elsewhere, could indeed be described as telling people what to do. That's maybe a little too crude, but at least as putting arguments about what one ought or ought not to do in particular situations. So why did people think differently? And where have we come since then, with that change in practice that has led us to where we are now?

In the original article I quoted from two philosophers: one of them the just-mentioned C. D. Broad, perhaps best known for his book Five Types of Ethical Theory; and the other A. J. Ayer, who certainly at that time was one of the leading philosophers, who was then Wykeham Professor of Logic here at Oxford, but had been a leading philosopher since the publication of Language, Truth and Logic in 1936, when he was only 26. That book was a kind of manifesto for logical positivism, which may have had something to do with the views that he held about ethics, and they clearly did have something to do with the views he held about ethics, and perhaps about what the proper role of philosophers is. And today I'm going to add a third philosopher who takes, in some respects, a similar view. That might surprise you: the third philosopher is Bertrand Russell.

Okay, so in 1949 Ayer published an article called "On the Analysis of Moral Judgements". It was later reprinted in his Philosophical Essays, and it was probably most widely read then, because that was a book that was very widely used in teaching undergraduates philosophy at the time. In "On the Analysis of Moral Judgements", Ayer restates in a somewhat more sophisticated form the view of ethics that he had already put forward in one chapter of Language, Truth and Logic.

That view was that ethical judgements don't state propositions that can be true or false. Crudely put, the theory could be described as the "boo-hooray" theory of moral judgements: if you say something is good, you're saying "hooray for X", whatever it is.
If you say something is bad, you're saying "boo to X". And that's roughly what it is to make a moral judgement. The view that Ayer put forward in "On the Analysis of Moral Judgements" was somewhat more sophisticated than that, but it was clearly in the same family, recognisably in that family, which came to be known as emotivism and was developed further by C. L. Stevenson in a number of articles around that time.

But in explaining the implications of his theory of moral judgements in "On the Analysis of Moral Judgements", Ayer makes some firm statements about the role of moral philosophy. Here's a longish quote from that article:

"I am not saying that morals are trivial or unimportant, or that people ought not to bother with them, for this would itself be a judgement of value which I have not made and do not wish to make. And even if I did wish to make it, it would have no logical connection with my theory, for the theory is entirely on the level of analysis. It is an attempt to show what people are doing when they make moral judgements. It is not a set of suggestions as to what moral judgements they are to make. And this is true of all moral philosophy, as I understand it. All moral theories, intuitionist, naturalistic, objectivist, emotive and the rest, insofar as they are philosophical theories, are neutral as regards actual conduct. To speak technically, they belong to the field of meta-ethics, not ethics proper. That is why it is silly, as well as presumptuous, for any one type of philosopher to pose as the champion of virtue. And it is also one reason why many people find moral philosophy an unsatisfying subject, for they mistakenly look to the moral philosopher for guidance."

And then he adds: "It is indeed to be expected that a moral philosopher, even in my sense of the term, will have his moral standards, and that he will sometimes make moral judgements. But these moral judgements cannot be a logical consequence of his philosophy. To analyse moral judgements is not itself to moralise."

Clearly, in saying this, Ayer is dismissing a long tradition of moral philosophy (I'm sorry, I shouldn't have turned the page there), a long tradition of moral philosophy that goes back to Socrates, at least to Socrates as portrayed in Plato's dialogues, where he goes around Athens questioning Athenians about their views about justice or piety or something of that sort, and showing that they don't really have coherent views about these things.
And that presumably would influence their thinking, and their conduct too, or at least unsettle them in some way. It's not that Socrates was himself specifically propounding a view, although certainly in the later Platonic dialogues he's portrayed as propounding views. But still, he is clearly doing things that relate to showing up particular normative views, as we would now call them.

And that tradition then clearly goes on through Plato, through the Epicureans and the Stoics, and through the medieval scholastics; certainly telling people what they think is right or wrong goes on through later philosophers. Hume clearly did some things we would today call metaethics, but he also wrote, for example, on suicide, and nothing would have suggested that he was not doing that as a philosopher. Kant lectured on normative ethics, clearly, and wrote on it, giving examples of right and wrong behaviour as he saw it. The utilitarians, Bentham, Mill and Sidgwick, certainly all did normative ethics, in which you can find their opinions on various things that we should or should not do. And right up to the time just before Ayer wrote Language, Truth and Logic, or overlapping with it, you would get Ross, the intuitionist, who also clearly thought that in doing normative ethics, suggesting views about what's right and what's wrong, he was doing what was part of moral philosophy, what was the role of the moral philosopher.

But for Ayer, the fact that he was turning his back on that tradition probably wasn't in itself something that would trouble him, because that was the logical positivist view that he had been, and was to some extent still, a part of: the idea that a lot of things that had been done before in philosophy, a lot of metaphysics, a lot of talk about God, for example, is really meaningless because it is unverifiable, the statements made are unverifiable, and therefore we ought to reject all that and do philosophy in a very different sort of way. So I think it's not too difficult to see where Ayer is coming from and why he holds that restricted view of moral philosophy.

The other philosopher that I quoted in "Moral Experts", C. D. Broad, turns out to be a little more puzzling. Here's the quote that I used in the original article: "It's no part of the professional business of moral philosophers to tell people what they ought or ought not to do.
Moral philosophers, as such, have no special information not available to the general public about what is right and what is wrong. Nor have they any call to undertake those functions which are so adequately performed by clergymen, politicians and leader-writers."

Now here's a confession: when I used this quote in 1972, I didn't know very much about Broad. But here was a handy quote that I could use to make the point in the article that I wanted to make, so I didn't go very much further into it, and I took the quote at face value. Now that I know a bit more about Broad, I think that at least the last part of it, about the functions so adequately performed by clergymen and so on, must have been meant ironically. I've learned, for example, that Broad was a homosexual who in 1958 signed a letter to The Times asking for the repeal of the law criminalising homosexual conduct. Clergymen at the time were certainly not supportive of the view that Broad must have held to sign that letter. So he can't really, I think, have meant that very seriously.

A second puzzling aspect of the quote is that it's from an essay entitled "Conscience and Conscientious Action", which was first published in wartime, in 1940. And the opening sentence indicates the topic of the essay: "At the present time, tribunals appointed under an Act of Parliament are engaged all over England in dealing with claims to exemption from military service based on the grounds of conscientious objection to taking part, directly or indirectly, in warlike activities."

The article then goes on to examine this, and indeed to argue against the possibility of these tribunals adequately performing that role. So this, in fact, is what we would now consider to be an article in practical or applied ethics. How is that compatible with the statement that it is not the business of the philosopher to tell people what they ought or ought not to do?

What Broad in fact goes on to argue is that an analysis of the notions of conscience and conscientious action suggests that the tribunals have been given a task which, from its nature, is impossible of being satisfactorily performed. And from this he draws the conclusion that this is a strong ground against exemption from military service on grounds of conscience, and against setting up tribunals at all. He acknowledges that there could be other reasons going in the opposite direction.
So he doesn't think that he has finally resolved the issue, but he's clearly putting up an argument on one side of that issue, which would have to be rebutted, or else one would have to conclude that we ought not to have set up, or the government ought not to have set up, those tribunals.

How can we resolve the apparent contradiction between the article itself and the claim that philosophers have no business telling people what they ought or ought not to do? If we look a little further on from the passage I quoted, Broad goes on to say: "But it is the function of a moral philosopher to reflect on the moral concepts and beliefs which he or others have, to try to analyse them and draw distinctions and clear up confusions in connection with them, and to see how they are interrelated and whether they can be arranged in a coherent system. Now, there can be no doubt that the popular notions of conscience and conscientious action are extremely vague and confused. So I think that by devoting this paper to an attempt to elucidate them, I may succeed in being topical without being impertinent."

What Broad here describes is, in fact, the basis for a lot of what we now think of as practical or applied ethics. In fact, what I was arguing in that early article is part of that: that clearing up confusions in people's ideas, their moral concepts, can be an important role for a moral philosopher and can affect discussions about what we ought to do. Clarity, I said, is not an end in itself, but it is an important aid to sound argument. And if philosophers are clearer about concepts than popular usage is, then philosophers may have a valuable role in aiding sound argument by clearing up those confusions about concepts, which is just what Broad was doing.

But Broad also says something that is actually more far-reaching when he recognises that it's the function of a moral philosopher to see how moral concepts and beliefs are interrelated, and whether they can be arranged in a coherent system. And I think Broad could hardly deny that this was properly the function of the moral philosopher, because Broad himself had earlier written that he thought the best book on ethics, the best moral treatise written by a philosopher, was Henry Sidgwick's The Methods of Ethics, a judgement that I would be in agreement with.
But a lot of what Sidgwick does, particularly in Book III of The Methods of Ethics, where he's considering what he calls the morality of common sense, or a form of intuitionism in which we intuit certain moral principles, is to examine particular concepts of things that are regarded as desirable by common-sense morality, like benevolence or veracity or gratitude and so on, and to show that they're not really fully coherent concepts, or that they're not employed in a straightforward way on the basis of those concepts, and that they need to be supplemented by some other principle. And later Sidgwick suggests that the utilitarian principle is the appropriate supplement for them. So Broad would have known, of course, that this, seeing whether a set of moral concepts is coherent and self-sustaining, is a part of moral philosophy, and it's clearly not normatively neutral. It wasn't normatively neutral in Sidgwick's hands, and in fact a good deal of my own work in practical ethics can be seen as arguing that views that we hold are not coherent, and that therefore we need to rethink the moral beliefs that we hold.

So let me give you three examples where I've used that to argue for the need for change.

The traditional doctrine of the sanctity of human life, as upheld to some extent by popular morality and certainly most strongly by the Roman Catholic Church, insists that all human life is of equal worth, and that we should not choose to end a life because of its poor quality. Yet this doctrine also permits the withdrawal of life support in certain circumstances: for example, the withdrawal of a respirator from somebody who is in an irreversible coma, or who has had other damage that suggests they will have an extremely poor quality of life. Now, there are various justifications for this permission that are given by the doctrine, including the idea that we don't have an obligation to provide extraordinary means of life support, or that some treatment is disproportionate to the benefit that it creates for the patient. But I've argued in various works that those justifications themselves rely on quality-of-life judgements; they're really disguised quality-of-life judgements. And if I'm right about that, then the doctrine is not really coherent.

The second example will be familiar to many of you.
Many of us believe that if we were passing by a shallow pond and saw a small child drowning in it, and realised that unless we rushed into the pond and rescued the child, the child would probably drown, and also realised that there was some cost to us in doing so, that we would ruin an expensive pair of shoes we were wearing by jumping into the pond, nevertheless it would be seriously wrong for us not to rescue that child. This is not just a case where it would be good to rescue the child but not wrong not to rescue the child: virtually everybody thinks that it would be wrong not to rescue the child, that you would have an obligation to do so in this case.

But then, of course, we can broaden the picture, and we can recognise that there are children dying from preventable causes all over the world, and that very often we can quite cheaply save those lives. Maybe not exactly for the cost of an expensive pair of shoes; I admit that the estimates we now have are probably rather higher than they were when I originally wrote about this case. But still, for things that are not enormous sacrifices for us, we can save a child's life. So if we think that it's not wrong not to do that, that in other words you can be an ethical person without giving anything to help people who are much less well-off than you and whom you could help without making a huge sacrifice, then that needs to be explained. And simply the fact that the child is further away than the child in the pond doesn't seem to be a sufficient explanation. So something else needs to be said about that. So that's the second challenge that I think depends on showing incoherence in the moral concepts, or the set of moral views, that people hold.

And thirdly, my work on animal liberation has relied on a parallel between our objections to racism and our objections to sexism, and between those and speciesism. And I want to welcome Richard Ryder to the room, by the way, from whom I first encountered the term speciesism. Richard was at Oxford many years ago, when I was here at the beginning of the 70s, and produced a leaflet with a picture of a chimpanzee who had been infected with syphilis, looking very miserable, under the heading "Speciesism". And I thought, hmm, that's a very interesting way of putting the idea that
there's something similar going on in terms of a dominant, powerful group which defines moral status in its own interests, in a way that enables it to make use of another class of beings who are denied moral status, and where it's very convenient or helpful for it to have higher status over those beings. That's exactly the phenomenon we see in even the most blatant form of racism, the racism of the slave trade, and we can see it in the idea that males have a moral status that females don't have. So again, there's a question of whether you can consistently be against racism and against sexism but accept speciesism, and of what would have to be done to justify drawing a distinction, not on the basis of humans' higher cognitive capacities, which would not be speciesism, but on the basis of species itself. And of course the views we standardly hold don't draw the line on the basis of cognitive capacities, because if they did, then some humans would not have a higher moral status than some animals. Since we don't draw the line in that way, but rather on the basis of whether you are a member of the species Homo sapiens, there's this difficulty in showing how that can be defensible when racism and sexism are not.

Now, these arguments are only arguments about the coherence of a set of beliefs. So it is possible to restore coherence not by going in the direction that I've suggested in my writings about these questions (not by saying that it's okay to take quality of life into account in making life-or-death decisions, not by saying that we do have an obligation to help people in great need in developing countries, not by saying that we ought to reject speciesism) but by saying, well, in the first case, yes, it is wrong to withdraw a respirator no matter what someone's quality of life; we've got to keep valuing human life, and so we've got to keep ventilating patients even if we can know for sure that they will never recover consciousness. Or, in the shallow pond case, you could say: oh well, I guess when I think about it more, I realise it wouldn't be wrong to just walk past the pond and let the child drown, because you didn't want to bear the expense of replacing your expensive shoes. It's a possible position. And equally, though I think people would probably be even more reluctant to buy this one, you could possibly say: well, now I realise that racism and sexism are not wrong, because clearly speciesism is not wrong.
So the argument from coherence does require that people not want to retreat from positions that they generally hold.

So I think that a more careful reading of Broad than the one I gave in my 1972 article would suggest that what he's doing is enough to allow a lot of argument in practical or applied ethics. And it's interesting, therefore, that he didn't himself want to draw that conclusion, but was so concerned to say that he wasn't really doing this. He uses the word "impertinent": what he's doing in analysing the concepts, he says, he hopes won't be considered impertinent. So it's as if he thinks that for a moral philosopher to tell people what they ought or ought not to do would be seen as impertinence. And there certainly seems to have been some kind of modesty, a reluctance perhaps, on the part of academics of that time, philosophers of that time at least, to get out of the university environment and actually engage fully with the public about moral questions.

But there was one philosopher of the time who was certainly not modest or reluctant to engage with the public about moral questions, and that's my third example, Bertrand Russell. Russell was probably the best known British philosopher of the 20th century, and he also wrote a series of popular books on issues ranging from social justice to sexual morality, and from the nature of happiness (or the conquest of happiness, to use the title of one of his books) to campaigning against nuclear weapons, for nuclear disarmament. So if Russell was both the best known British philosopher of the 20th century and wrote all those works, how can he be somebody who thinks that it's not the professional business of the philosopher to engage with the public? The answer is that although Russell was certainly doing what we would now think of as practical ethics, he too didn't regard what he was doing as philosophy.

Russell's work in ethics is well discussed by Charles Pigden in his contribution to The Cambridge Companion to Bertrand Russell, edited by Nicholas Griffin, and in what follows I'm drawing on Pigden's work. Pigden points out that Russell himself contributed to a narrow view of what philosophy is, and particularly of what moral philosophy is. Russell wrote that the only matter concerned with ethics that he could regard as properly belonging to philosophy is the argument that ethical propositions should be expressed in the optative mood, not the indicative, that is, the mood suited to expressing wishes and desires, not the indicative.
So this is clearly an issue of metaethics, how we understand moral judgements, again, as Ayer was doing, and roughly in the same general family of theories that I was talking about. So Russell held that normative ethical judgements can't be true or false, and Pigden suggests that because Russell thought that philosophy is the pursuit of truth, ethical judgements about normative matters could not be part of philosophy.

And this appears to have affected, I think, not only the way Russell regarded his own writings on ethical issues, but also the way he went about that writing. Consider this passage: "Persuasion in ethical matters is necessarily different from persuasion in scientific matters. According to me, the person who judges that A is good is wishing other persons to feel certain desires. He will therefore try to rouse those desires in other people. This is the purpose of preaching, and it was my purpose in the books in which I have expressed ethical opinions."

Elsewhere, defending himself against a charge of lack of precision in his writing, Russell refers to his Principles of Social Reconstruction, one of his early popular books, and says that this, and to some extent his other popular books, is not intended as a contribution to learning, but rather as having an entirely practical purpose. So Russell is distinguishing preaching from philosophy, and putting practical ethics, or indeed all normative ethics, into the former category.

The exception he makes is where you are just drawing implications from a principle. So he says, for example: suppose you accept the principle that we ought to maximise pleasure and minimise pain. You might then discuss whether capital punishment does or does not do this. And that can be a scientific question, looking at the empirical facts of whether capital punishment does or doesn't do that; but it's not really a normative activity either, because you're just taking the values for granted and exploring their implications.

I think we see some of the effects of this view of practical ethics in Russell's own writing, where he can be very dogmatic. He's not very careful about the arguments he puts forward, particularly when he got on to the nuclear disarmament issue. He just says things like: it's the plain duty of everyone, he says, to make known two key facts, that nuclear war is not improbable and that it would cause the death of all or almost all human beings.
Given this, Russell says, a philosopher, or any other person of academic capacity, must devote himself by whatever means are open to him to persuading other people to agree with him as to these facts, and to joining him in whatever protest shows the most chance of success.

So I think you could say that when Russell uses the term preaching, it does apply to some of his writings in this case. And you could say, well, maybe that was justified: if you believe, as he claimed, that nuclear war was not improbable, and given the devastation it would cause, then maybe that is what you ought to do, at least if you're a utilitarian. But it's not really practical ethics, I think, because you're simply trying to persuade people by whatever means you can. And doing practical ethics is not persuasion by whatever means you can, but by standards of reasoning and argument that are consistent, I would say, with philosophical thinking.

So both Broad and Russell, then, could be said to have been doing practical ethics. Pigden argues that some of Russell's work should count as philosophical work because it uses philosophical arguments, but some of it clearly does not. And perhaps the idea that you should avoid preaching when you're doing philosophy was, for Russell, part of this. But as I said, as academics we are familiar with assessing arguments as well put, well constructed, or not, independently of whether we agree or disagree with their conclusions. Certainly now that we've had many years' experience of practical ethics being a part of philosophy curricula, a lot of us have spent a lot of time grading papers, and I think we know reasonably well how to assess a well-argued paper. And often a well-argued paper will not be a paper whose conclusions we agree with; it might get a higher grade than a paper where we do agree with the conclusions but think the argument is rather sloppy. And the same standards, at a higher level of course, are applied by reviewers when we submit papers to peer-reviewed journals: they're not supposed to recommend papers because they agree with their conclusions, but rather because they think they meet the journal's standards of argument.

So I think it's on that basis that we can actually resist the idea that normative ethics, practical or applied ethics even, is some form of preaching.

Now, what changed, and how did we get to where we are today?
I think the change occurred during the period when I was here at Oxford, and in that sense I guess that early article of mine was sensing the winds of change that were already beginning to blow. I'm not a historian, but it seems obvious to me that the student movement that had arisen at that time, motivated particularly by the war in Vietnam and opposition to that war, but also by social justice causes, by the movement for civil rights in the United States and against racial discrimination, and by the rise of the feminist movement around that time as well, led to a demand from students that the courses they were taking should be relevant to the big issues of the day. And some philosophers began to realise that they were part of a tradition that had discussed those issues: a tradition that had discussed, for example, when it is right to go to war, when a war is just; a tradition that had discussed when we have an obligation to obey the law, given that there was a lot of civil disobedience going on at the time. So I think they began to go back to this, in response to that student demand, and to think about ways of making philosophy more relevant.

There was at this time also a group, an organisation called Radical Philosophy, that was trying to make philosophy more radical in some way. I went to the conference at which this organisation was founded in London, and was involved with it for a while. But I became disenchanted with it, because the idea of radical philosophy that began to be propounded, and that you can see reflected if you go to the library and dig out early issues of the journal called Radical Philosophy that they put out, was that what was really radical was partly the philosophy that was being done on the Continent and partly Marxist philosophy; and if you combined the two of them and got something like Louis Althusser's rather obscure analysis of Marx, then you had the epitome of what radical philosophy might be. And in contrast, Oxford philosophy was seen as something inherently conservative, done in stuffy academic drawing rooms by elderly philosophers who were just analysing language and were not interested in change.
But I thought that it was actually important, if we wanted to be radical and wanted to do philosophy in a radical way, to do it in a way that people outside the universities could understand, and to talk about issues that were important to them and mattered to them, in ways that were comprehensible, and that would, I hoped, raise the level of discussion of some of these major issues that were being talked about at the time. And going into continental or Marxist philosophy was certainly not going to make you more easily understandable in taking philosophy outside the academy. So that's why I wrote that article: I was interested in trying to put across the idea that philosophers do have something to say, that they do have some kind of expertise on issues, that they could contribute to these debates.

And in fact, despite the image that Oxford had among radicals at the time, I think I was pushing on a door that was already half open, because there were people here in Oxford who were doing what we would now call practical ethics. There were seminars given, for example, by Derek Parfit, Jonathan Glover and Jim Griffin, where Parfit was presenting his well-known material on population ethics, which, although fairly abstract, does have implications for a lot of practical issues. Jonathan Glover was doing early work that developed into his book Causing Death and Saving Lives, which raised questions about the sanctity of life and was very relevant for medical decision-making. So this sort of thing was actually going on.

And when I suggested to my supervisor, R. M. Hare, that I'd like to write my BPhil thesis on civil disobedience in a democracy, he was perfectly happy to accept that, and said that he had actually always thought that the point of doing moral philosophy was to make a difference to our views about what we ought or ought not to do. Although his best known works at the time, The Language of Morals and Freedom and Reason, didn't totally do that, in Freedom and Reason, you can see, looking at it now, lots of examples he gives of how moral argument can be used, and those examples do suggest that you can get to certain normative conclusions.

So I think it wasn't that difficult to start doing moral philosophy in that way. And another important source of encouragement at the time was the founding of the journal Philosophy & Public Affairs in the United States,
which provided an academically first-rate forum for discussion of practical issues. And you only have to look at the first couple of volumes to see that it immediately elicited articles that did challenge people's thinking in important ways: a couple of them on abortion, Michael Tooley's "Abortion and Infanticide" and Judith Jarvis Thomson's famous "A Defense of Abortion", in which she uses the violinist example; and my own "Famine, Affluence and Morality" was published in that journal as well.

So I think that this turn to practical ethics pretty rapidly began to make a real difference, in philosophy and in the real world as well. And let me just mention some of those differences in the areas that I'm most interested in. If we look at the question of animals and ethics, there's a bibliography published by Charles Magel in 1989 listing works that deal with animals and ethics. And up to 1970, up to the time I'm talking about, putting aside a couple of works in antiquity, in the first 1970 years of the Christian era Magel found only 94 works that discuss animals and ethics. In the next 18 years, before the bibliography was completed, there were 240. And I'm sure that if you tried to do a bibliography now, it would run well into the thousands. And that debate has spread around the world; it's being discussed in a lot of non-English-speaking countries as well, including Japan, Korea, and even to some extent China.

And I think philosophers played an important role here. It's not that there wasn't an emerging animal movement as well; there was. But the work of philosophers helped to grow that movement and to make it something that was seen differently: it transformed it from a movement that was widely seen by the public as something based on sentiment, something for animal lovers, people who cared about animals, so that they were the ones who were going to be concerned about cruelty to animals, and if you were not an animal lover, well, this was not really for you. I think the involvement of philosophers helped to make it simply a moral issue, in the same way that racism and sexism, for instance, are moral issues, and therefore enabled other people, who may not have thought of themselves as animal lovers and may not have been particularly concerned about living with cats and dogs or other animals, to see that this was an issue that they should also be involved in. So I think that clearly philosophers have played a role in that.
And that has in turn led to various legislative changes, not, in my view, nearly enough in the way of change, but significant changes nonetheless. And of course it has also led to many people changing their own diet, which is a pretty personal kind of transformation in terms of the way we live.

Now, is that a form of preaching rather than a form of philosophy? If I look back on my own work, if I look back on Animal Liberation, the first chapter sets out a philosophical argument, roughly the argument that I summarised for you; the fifth chapter talks critically about the history of speciesist views of animals; and the sixth chapter looks at some objections. The second, third and fourth chapters are really factual, descriptive material. So yes, the book as a whole is clearly a work of advocacy, but I would say that at least parts of it are properly the work of a philosopher, putting forward arguments that are subject to criticism and discussion, and people have tried to rebut them. Obviously I don't think they have done so successfully, but that's a debate that can continue.

So let me move from that to the other issue that I've been most involved with, which I've also already mentioned: the question of our obligations, or the obligations of the affluent, to the global poor. And the story here is slightly different. With Animal Liberation, once the book was published, it started to get taken up by a number of organisations, new organisations that were founded, and started to have an influence reasonably rapidly in the direction that I wanted. With "Famine, Affluence and Morality", which, as I said, was published in 1972, it did get widely discussed by philosophers, and it got reprinted in a lot of anthologies, and so it found its way into philosophy courses. But for quite a long time it didn't get read outside philosophy courses and philosophy itself. And even when it was used in philosophy courses, it wasn't always used in the way that I think more people now see it.

This was brought home to me just a couple of months ago. I was giving a talk at Harvard University, organised by a group of effective altruists there (I'll talk about that in just a moment), and Josh Greene, who's a professor of psychology at Harvard, introduced me. Josh had been an undergraduate philosophy student at Harvard.
And he said that when he was an undergraduate, "Famine, Affluence and Morality" was part of the curriculum of a course that he took, but it was put by the professor teaching the course more or less like this: here's an article that seems to make a strong argument for a conclusion that is clearly impossible, just too demanding; it can't be true that we are all doing something wrong because we're not helping people in poverty elsewhere in the world; so tell me what's wrong with the argument. That was the attitude with which it was taught. And that resonated with me, with things that I'd certainly heard people say.

But things did start to change a few years ago in that area, in an interesting way. I think a number of people, going back something like eight years ago now, started to think about this rather differently. I'm not quite sure why this happened, but some of it, a significant part of it, happened here at Oxford. One of the first intimations I had that something different was going on was when Toby Ord, who was then a graduate student in philosophy here and is now at the Future of Humanity Institute, told me that he was interested in starting up a group which would promote the idea that we can quite inexpensively do a lot of good in the world, and would indicate to people some of the charities that were most effective in doing good. Toby himself took a public pledge to live, for the rest of his life, on an amount that was not much above his graduate studentship, adjusted for inflation, and to donate the rest to effective causes. And he was telling other people about this and suggesting that they might think about, if not doing quite as much as that, then at least pledging 10% of their income for the rest of their lives to the organisations that were most effectively improving the world.

So Toby was doing that, and Giving What We Can was launched here in Oxford in 2009. Toby was assisted in this by another Oxford graduate, William MacAskill, who also then set up another organisation called 80,000 Hours, which is involved in helping people to choose ethical careers. It has a website, you can look at 80000hours.org; there's also givingwhatwecan.org. And 80,000 was Will's calculation of the number of hours that people spend in their career.
443 00:50:19,750 --> 00:50:25,630 And Will thought people don't spend enough time on that decision. Given that you're going to spend 80,000 hours doing something, 444 00:50:26,080 --> 00:50:35,860 wouldn't it make sense to spend maybe 1% of that time deciding what it is you're going to do with the other 99%? But 1% of 80,000 hours is 800 hours. 445 00:50:35,860 --> 00:50:43,270 Very few people spend 800 hours thinking about their career choice, or thinking systematically about it, so Will wanted to help them do it. 446 00:50:44,540 --> 00:50:50,959 And he came up with some interesting ideas, which I won't go into now, about 447 00:50:50,960 --> 00:50:56,030 what might be ethical career choices that you might not really have thought about. 448 00:50:56,630 --> 00:51:05,450 So this has been part of, I wouldn't say the whole of, the growth of this effective altruism movement. 449 00:51:05,690 --> 00:51:11,750 And if you're not familiar with it, effective altruism is a philosophy and a social movement. 450 00:51:11,990 --> 00:51:17,280 It's a philosophy that says we want to make the world a better place. 451 00:51:17,300 --> 00:51:25,460 We want to do the most good we can, to use the title of a recent book on this that I've got for sale out there, if you want to have a look at it. 452 00:51:25,760 --> 00:51:35,330 But we want to do this in a way that uses evidence and reason to work out how we can actually do the most good. 453 00:51:36,440 --> 00:51:40,350 And I think it's been really interesting to see how this movement has grown. 454 00:51:40,370 --> 00:51:45,970 I was talking to Toby, in fact, earlier this afternoon, and he told me that 455 00:51:45,980 --> 00:51:50,180 Giving What We Can now has over a thousand people who have pledged to give that 10%. 456 00:51:50,540 --> 00:51:58,490 And if you estimate their likely income over their future and how much that will mean, 457 00:51:58,790 --> 00:52:07,040 we're talking about £400 million that will be donated to effective charities if they fulfil that pledge. 458 00:52:07,250 --> 00:52:11,530 So this is not a small thing, and this is just one organisation. 459 00:52:11,540 --> 00:52:16,430 It's only one part, alongside other organisations that are encouraging people to be effective altruists. 460 00:52:16,970 --> 00:52:24,110 And there is a Centre for Effective Altruism that is based here in Oxford, connected 461 00:52:24,110 --> 00:52:29,810 with the Future of Humanity Institute and the Uehiro Centre for Practical Ethics. 462 00:52:30,200 --> 00:52:38,329 And we have a lot of people in Oxford who are working on this in a wide variety of ways, a lot of different philosophers, 463 00:52:38,330 --> 00:52:44,900 some of whom you will know and some of whom you may not, who are involved in this activity. 464 00:52:46,340 --> 00:52:52,040 But what I think is really most interesting about this whole change that we've had, and about practical ethics, 465 00:52:52,280 --> 00:52:55,730 is how it shows the power of philosophy to change people's lives. 466 00:52:56,240 --> 00:53:01,040 I've already mentioned changing people's lives in the direction of changing their diet. 467 00:53:01,700 --> 00:53:09,470 Changing people's lives, as you've seen, in the direction of what they do with their income, in pledging to give a certain part of that to 468 00:53:09,530 --> 00:53:13,880 effective charities and to live on less than they otherwise would.
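[As a minimal sketch of the back-of-envelope arithmetic mentioned above (the 80,000-hour career, the 1% career-planning rule of thumb, and the £400 million pledge estimate), the following Python snippet reproduces the calculations. The "implied average lifetime income" figure is simply derived from the numbers quoted in the talk and is not a statistic the speaker gives.]

```python
# Back-of-envelope arithmetic for the figures mentioned in the lecture.
# The career-hours, pledger-count and £400m numbers come from the talk;
# the implied average lifetime income is derived here for illustration only.

CAREER_HOURS = 80_000          # 80,000 Hours' estimate of hours spent in a career
planning_share = 0.01          # spend ~1% of that time choosing the career
planning_hours = CAREER_HOURS * planning_share
print(f"1% of {CAREER_HOURS:,} hours = {planning_hours:,.0f} hours")  # 800 hours

pledgers = 1_000               # Giving What We Can members cited in the talk
pledge_rate = 0.10             # each pledges 10% of future income
total_pledged = 400_000_000    # £400 million, the estimate quoted in the talk
implied_avg_lifetime_income = total_pledged / (pledgers * pledge_rate)
print(f"Implied average lifetime income per pledger: £{implied_avg_lifetime_income:,.0f}")
```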
469 00:53:14,540 --> 00:53:17,959 Changing people's lives in terms of the direction of the career they take. 470 00:53:17,960 --> 00:53:23,000 And I know a number of people who've been influenced by 80,000 Hours to change their career. 471 00:53:23,390 --> 00:53:28,400 In fact, you might be sorry to learn that Oxford lost a very promising graduate student, 472 00:53:28,400 --> 00:53:32,930 a Princeton student who had been accepted to do graduate work at Oxford, 473 00:53:33,200 --> 00:53:38,570 but decided that a career in philosophy was not the most effective thing that he could do with his life. 474 00:53:39,680 --> 00:53:44,000 And I talk about him in the book, if you want to know more. 475 00:53:44,360 --> 00:53:48,560 But indeed, I've got an even more dramatic example of that, 476 00:53:48,590 --> 00:53:52,880 and I want to close with that example and a couple of remarks about philosophy. 477 00:53:53,360 --> 00:54:02,780 So a couple of years ago, I received an email out of the blue from someone I'd never heard of that began as follows: in The Life You Can Save, 478 00:54:03,110 --> 00:54:10,430 my previous book on this issue, you remarked that, as far as you know, no student of yours has ever actually donated a kidney. 479 00:54:11,300 --> 00:54:17,540 Last Tuesday, I bit the utilitarian bullet. I anonymously donated my right kidney to whoever could use it the most. 480 00:54:18,410 --> 00:54:23,510 By doing so, I started a kidney chain that led to a total of four people receiving kidneys. 481 00:54:24,140 --> 00:54:31,340 This came from Chris Croy, who was a student at Saint Louis Community College, not a particularly elite university. 482 00:54:31,610 --> 00:54:38,630 But he came to this decision after a philosophy class that read not just my Famine, Affluence and Morality, 483 00:54:38,870 --> 00:54:44,540 but a critical article by John Arthur that tried to refute my views by saying, well, 484 00:54:44,810 --> 00:54:49,090 if this is true, then one obvious means by which you could aid others is with your body. 485 00:54:49,100 --> 00:54:56,960 For example, you could donate a kidney to a stranger. And someone in the class said, no, that isn't right, because you die if you donate a kidney. 486 00:54:57,290 --> 00:55:02,059 But Chris realised that that wasn't true, that actually you can do quite well with one kidney. 487 00:55:02,060 --> 00:55:09,230 In fact, the chances that you will shorten your life by donating a kidney, according to one estimate, are only 1 in 4,000. 488 00:55:09,740 --> 00:55:16,550 So he did this: talked about it to various people, and eventually called up a hospital and offered to do it. 489 00:55:16,880 --> 00:55:23,300 And he's not the only person, in fact, who has been influenced by taking a class in philosophy to donate a kidney. 490 00:55:24,140 --> 00:55:28,610 Now, this is interesting in that it relates back to some things that philosophers have 491 00:55:28,610 --> 00:55:33,650 said in the past about the nature of ethics and the nature of human beings. 492 00:55:34,010 --> 00:55:38,270 I'm thinking particularly here, for instance, of Bernard Williams's 493 00:55:38,630 --> 00:55:43,910 criticisms of Henry Sidgwick's idea of taking a universal point of view.
494 00:55:44,120 --> 00:55:53,060 That is, we should act ethically by taking the point of view of the universe and recognising that the welfare of others counts as much as 495 00:55:53,330 --> 00:55:56,930 our own welfare. And Williams says that's impossible. 496 00:55:56,930 --> 00:56:04,490 Humans just aren't that kind of being. You can't really detach yourself from your own projects and take the point of view of the universe. 497 00:56:04,820 --> 00:56:07,950 But I think effective altruists come close to doing that. 498 00:56:07,970 --> 00:56:10,270 I'm not going to say that they're purely doing that. 499 00:56:10,280 --> 00:56:16,280 Maybe nobody is purely doing that, but they're certainly thinking much more from that perspective. 500 00:56:16,460 --> 00:56:23,120 And they're doing things that other people, I think, would have said it's quite unrealistic to expect people to do. 501 00:56:24,110 --> 00:56:32,870 And I think it's evidence of the importance of philosophy and the power of practical ethics that it can lead people to do this. 502 00:56:33,590 --> 00:56:35,510 People often talk nowadays of 503 00:56:35,570 --> 00:56:43,820 a crisis in the humanities: that enrolments in the humanities are falling, that the humanities seem less relevant in a world that is 504 00:56:43,820 --> 00:56:49,880 much more digital, and that people find other, more attractive and exciting things that they want to do. 505 00:56:50,860 --> 00:57:00,220 Well, I don't have a brief to defend the humanities as a whole, but I certainly think that philosophy is a key part of the humanities. 506 00:57:00,490 --> 00:57:06,030 And I don't think that philosophy is less relevant than it was. 507 00:57:06,040 --> 00:57:11,200 I think it's more relevant. And I think that when Ayer said that 508 00:57:11,410 --> 00:57:15,940 one of the reasons people find philosophy, or moral philosophy, 509 00:57:15,940 --> 00:57:21,190 disappointing is that they mistakenly look to moral philosophy for guidance, 510 00:57:21,460 --> 00:57:25,450 he had it exactly wrong, exactly in reverse. 511 00:57:25,780 --> 00:57:33,490 I think that, although certainly there's a lot of interest and importance in that discussion as to whether moral judgements can be true or false, 512 00:57:33,670 --> 00:57:40,090 and it's good to see that discussion revived again, with Oxford philosopher Derek Parfit, 513 00:57:40,090 --> 00:57:44,290 I think, playing an important role in it with On What Matters, 514 00:57:44,500 --> 00:57:47,470 and that is an important discussion, 515 00:57:47,830 --> 00:57:56,920 I think the cutting edge of this, if you like, where the rubber hits the road, is really when we start discussing normative ethics. 516 00:57:57,100 --> 00:58:02,950 And I don't think any of us should have any doubt about the importance of doing that, and about the fact that 517 00:58:03,250 --> 00:58:06,730 here philosophy undoubtedly changes lives. 518 00:58:07,270 --> 00:58:07,990 Thank you very much.