This is the first flight of a new paper that I've been thinking about for a long time and that I'm not totally happy with, so I'd be very interested to hear your take on it.

So I have two theses: an empirical thesis and a normative thesis. The empirical thesis is that most people aim to be morally mediocre. They aim to be about as morally good as the people around them, not especially better, not especially worse. This mediocrity has two aspects. First, it's peer-relative rather than absolute. That is, they aim to be good or bad not by absolute standards, but by peer-relative standards. Second, it's middling rather than extreme. Most of us notice the typical behaviour of the people around us, and we aim to be morally about within that range, not among the very best, not among the very worst. We look around us and then calibrate toward so-so.

So that's the empirical thesis. And the normative thesis is that this is a somewhat bad way to be, but it's not a terribly bad way to be. It's also a somewhat good way to be, but it's not a wonderfully good way to be. It's got good and bad points. It's morally mediocre to aim for moral mediocrity. Now, that way of phrasing it sounds so bland as to be almost a tautology. But it's not a tautology.
Someone with a stringent normative view might think it's horrible to aim to be merely about as good as your peers; someone with much less stringent views might think it's perfectly fine to aim for mediocrity, as long as you avoid being among the worst. So I'll argue that aiming for mediocrity is neither perfectly fine nor inexcusably rotten. It's kind of blah. We're morally blameworthy for not aspiring to be better, but we can also be legitimately praised for not being worse. All right. So that's the general setup of the theses.

So let's work on the empirical stuff: following the moral crowd. Robert Cialdini and collaborators went to (sorry, my shoe's untied) went to Arizona's Petrified Forest National Park. The park had been losing about a ton of petrified wood per month, mostly stolen by casual visitors, one small piece at a time. Cialdini and his collaborators posted four different signs to discourage theft at the heads of different trails, and then they rotated the placement of the signs. Two of the signs were explicit injunctions. One said, "Please don't remove petrified wood from the park," with a picture of a visitor stealing wood, crossed by a red circle and bar. Another one said, "Please leave petrified wood in the park," with a picture of a visitor admiring and photographing a piece of wood. So those are the two normative signs. And then there were two descriptive signs. One said:
"Many past visitors have removed petrified wood from the park, changing the state of the petrified forest," with a picture of three visitors stealing wood. And then the final sign said, "The vast majority of past visitors have left petrified wood in the park, preserving the natural state of the petrified forest." So there are these four signs: two injunctive, one positive and one negative, and two descriptive, one positive and one negative. What Cialdini and collaborators then noticed (is this going to restart on me? no), what they noticed was how frequently visitors stole wood from the paths with the different signs. What they found was that visitors took the least wood from a path when they were explicitly enjoined not to take wood, whether by the positive or the negative injunction. And the rates of theft were highest, with up to 8% of visitors stealing, when visitors were told that many past visitors had removed wood. Being told that many past visitors had removed wood, changing the state of the forest, might even have increased the rates of theft, which are normally estimated to be 1 to 4% of visitors. So it might have doubled or tripled the rates of theft. They didn't find these visitors and ask them to return the wood. All right. Here's another one, also by Cialdini; he's got a whole series of these.
He found that hotel guests were substantially more likely to reuse towels when a message to help save the environment was supplemented with the information that 75% of the guests who stayed in this room (and it listed the actual room number they were in) "participated in our new resource savings program by using their towels more than once," end quote. And that was more effective than a longer injunctive message, or a message that included other kinds of descriptive detail.

Similarly, there's evidence that people are more likely to heed injunctions to reduce household energy use when they are shown statistics indicating that they are using more energy than their neighbours. And when they are shown statistics suggesting that they're using less energy than their neighbours, they might tend to increase their energy use, even if the statistics are framed in the context of trying to reduce energy use. Littering, lying, and suicide all appear to be contagious.

Now, dictator games: these are laboratory situations in which randomly chosen participants are given money and then told they can either keep it all for themselves or share some with less lucky participants. In dictator games like that, if participants have been led to believe that previous participants were mostly selfish, they tended to share less than if they'd been told that previous participants mostly shared about equally.
Cialdini concludes that injunctive norms, that is, social or moral admonitions, most effectively promote norm-compliant behaviour when they align with descriptive norms, that is, facts about how people actually behave. People are more likely to abide by moral rules if they see that others are already doing so. Cristina Bicchieri has a similar view. She argues that people normally comply with social norms only if they have the empirical expectation that others are also complying. People tend to follow the moral, or immoral, crowd.

So I'm going to add some caveats to this in a minute. But first, let me talk a little about moral self-licensing. All half-hearted dieters know about self-licensing, right? You've been good all day: the salad with only a forkful of dressing, the orange, the cup of plain yoghurt, the turkey sandwich with no mayo. So tonight you can afford a brownie. Moral self-licensing is that style of thinking applied to the moral domain. You've been morally good for a while, or maybe you've just done something especially morally admirable, so now you can indulge in a little sin or selfishness. Conversely, in moral cleansing, you might react to having done something particularly bad by being especially morally good for a little while, to kind of even things out.

So, in a well-known series of experiments, Nina Mazar and Chen-Bo Zhong showed participants an online store with mostly environmentally friendly, green products.
Participants then made hypothetical purchases of up to $25, with a chance they'd actually receive these products to take home. Mazar and Zhong hypothesised that participants who'd been randomly assigned to select products from the green store would experience a boost to their moral self-concept and subsequently be less motivated to avoid moral transgression than participants who'd been assigned to a non-green, more standard store. And as predicted, participants who had selected products from the green store offered less money in dictator games and were more likely to lie to an experimenter for money.

Other researchers have found that merely expressing moral intentions, or imagining positive moral traits attributable to yourself or positive actions you will do in the future, tends to produce more selfish choices in dictator games and more lying or cheating in laboratory conditions.

In another influential series of studies, Benoît Monin and Dale Miller presented some Princeton undergraduates with an opportunity to display anti-racist or anti-sexist credentials, either by affirming an egalitarian position in a questionnaire or by selecting an obviously best-qualified black or woman candidate over four white men in a hypothetical hiring decision.
When later presented with a hypothetical hiring decision for a stereotypically male job, or with a decision about hiring a black versus a white police officer into a racially charged police force, participants who had earlier displayed anti-sexist or anti-racist credentials expressed more preference for the white men in the hypothetical hiring decision than did participants who had just previously done a hypothetical hiring decision among only white men, or who had done a questionnaire on a different topic. Monin and Miller interpret these results as showing that after displaying egalitarianism, participants feel freer to express honest responses that might be partly based on sexism and racism.

All right. So now, these studies about moral self-licensing and following the moral crowd broadly support my empirical thesis that most people aim for mediocrity, but they are also consistent with the falsity of my thesis. Let me tell you why.

First, in almost all of these studies, only a minority of participants changed their behaviour as a result of the experimental interventions. So for example, although posting a sign saying that many visitors have stolen wood from the Petrified Forest might have doubled or tripled the rates of theft, the observed rate with the sign was still only 8%. So that's consistent with 92% of park visitors abiding by an absolute, rather than a relative, norm of not stealing wood.
Bicchieri and Xiao found a variation from about 50% to about 70% selfish choices in the dictator game, depending on what participants in the experiment were told about previous participants' behaviour. That's a substantial change, but again it's one consistent with a majority adhering to absolute norms.

Second, although most participants might follow the crowd given the right conditions, for example if they actually saw almost all visitors stealing wood from the park, or if they directly witnessed person after person keeping the money in dictator games, although that might be the case, existing experimental research doesn't show it to be so. So even if the experiments I've cited are taken at face value, it doesn't follow straightaway that most people aim for moral mediocrity.

Another caveat here is that people might interpret the normative situation differently in light of evidence about peers' behaviour, or in light of their own past behaviour. So for example, if they believe that lots of other people take wood from the park, they might think: it's only fair that I should be able to do the same. Right? That's a changed opinion about fairness. Or they might think: if everyone does it, then maybe it's not so bad after all. Switching examples: participants might decide that if most lucky participants in dictator games just keep the money, well, I guess that's the social norm for how you're supposed to play these games.
Furthermore, as Mullen and Monin emphasised with respect to the licensing experiments, some of them might be better understood as moral credentialing rather than as compensating. Credentialing is different from compensating; they're not quite the same. Credentialing is more epistemic. So if you have positive anti-sexist credentials, say, that might lead you to interpret an ambiguous action or behaviour as not bad, for example as not sexist after all. And that's different from a calibration reaction, in which you accept that an action is bad and then do it anyway.

And then a final caveat here. The literatures on moral licensing and following the moral crowd are relatively new. The results come mostly from the same few labs. Now, although a recent meta-analysis of the licensing literature does seem to confirm an effect, caution is warranted, I think, because the lower bounds of the 95% confidence intervals of most of the positive studies are quite close to zero. That's consistent with the possibility of underpowered research and a publication bias favouring positive results. So this fits with a lot of replication-crisis kind of stuff. One systematic replication attempt that I'm aware of, of some of this literature, does confirm a positive result.
This is a replication of the classic Monin and Miller 2001 study. Participants in this systematic replication were in fact more likely to choose a man for a hypothetical hire after having previously expressed egalitarian opinions, but the effect size was very small: only 0.1 points on a seven-point scale. So the empirical evidence, I think, including effect sizes and replicability, remains uncertain.

So I'm offering the moral mediocrity hypothesis as a conjecture, an empirical conjecture. It has some empirical support. It also, I think, hopefully has some appeal on common-sense grounds, or based in life experience. Most of us seem to calibrate more or less toward the moral middle. People don't typically choose to follow a moral rule if they see many others gathering the benefits of vice. Nor do people typically want to be worse violators than their peers. If we sense that others are honourable and true, we tend to admire that and follow along. In exceptional cases, we might take a lone stand for morality, but then afterwards it's hard to resist the thought that the world owes you something. Conversely, we might sin exceptionally, but then we often feel we should compensate somehow.

So, as I was mentioning, I've done a lot of work on the moral behaviour of ethics professors. Professional ethicists appear to behave no differently, on average, than professors who don't specialise in ethics.
They behave no differently despite tending to endorse much more stringent or demanding moral views, on some issues at least, like charity and vegetarianism. And the moral mediocrity hypothesis is one way to make sense of these findings.

Starting in 2007, Josh Rust and I began looking for empirical evidence about whether professional ethicists behave differently than other people of similar social background, across several published articles encompassing 19 different main measures of arguably moral behaviour. Our evidence, which we reviewed in a 2015 paper, suggests that ethics professors behave basically the same as other socially comparable people, like philosophers not specialising in ethics and professors from departments other than philosophy.

Some of our measures: peer-rated overall moral behaviour; theft of library books; voting participation in public elections; membership in the Nazi Party in 1930s Germany; littering; failing to pay required conference registration fees; staying in regular contact with one's mother, if one's mother is still alive; charitable giving; vegetarianism; replying to emails from students; paying membership dues to support one's main disciplinary academic society; being an organ donor; being a blood donor; responding dishonestly to survey questions; and letting the door slam when leaving or entering in the middle of a talk.
So these measures vary from the trivial, like door slamming and littering, to substantial life decisions that some people would regard as highly morally important, like joining the Nazi Party or donating a large sum of money to charity. They vary from behaviour toward strangers, like blood and organ donation, to behaviour toward family members, like calling mom, or toward students, like replying to emails. They also vary from issues on which moral opinion is divided, like vegetarianism or whether there's a duty to support one's main disciplinary society by paying dues, to issues on which there's high consensus, like the badness of joining the Nazi Party (by consensus now, at least), and like email and voting: we have evidence that over 80% of professors think that you should respond to emails from undergraduates, and over 80% of professors think that there is a moral duty, or at least that it's morally good, to vote in public elections.

In one case, we found evidence that ethicists behave a little worse: library ethics books were more likely to be missing from academic libraries than were other comparable books in philosophy, similar in age and checkout rate. But mostly we just found no detectable differences in behaviour.

Now, in some cases we also had measures of expressed moral opinion about those same issues, which we could compare with self-reported behaviour, directly observed behaviour, or both.
So we could compare expressed opinion about responding to student emails with self-reported behaviour: how often do you respond to student emails? what percentage of emails do you neglect? And then we actually sent them emails designed to look as though they were from students and saw whether they replied; I can talk more about that if you want. I would emphasise that these data were always de-identified, transformed into numerical codes; we couldn't reach any inferences about particular individuals.

We found several issues on which professional ethicists tended to endorse more stringent norms, but we didn't detect any corresponding differences in behaviour on those issues. We also did not find that ethicists' expressed moral opinions were either more or less consistent with their self-reported or their directly measured behaviour than were non-ethicists' opinions. So attitude-behaviour correlations were similar across the board.

So there are various possible interpretations of these results. Let me just throw out a few. One is that philosophical moral reflection is behaviourally inert. Jonathan Haidt, of course, famously argues for that, and he's used our research as part of his support for that thesis. Another interpretation is that ethics would not be inert if it were applied to cases that arise in ethicists' daily lives.
But instead, ethics is focussed mainly on the abstract, the obscure, or general public policy. Another interpretation is that people who are drawn to ethics are disproportionately those who need to reason their way to behaving morally well; non-ethicists kind of just do it intuitively or automatically. And so ethical reflection is effective in getting hyper-intellectual people to improve their behaviour up to average.

So it's an open empirical question which of these is true. I don't think we have enough evidence to even really have a firm conjecture about that. But recently I've been attracted to the consistency of these results with the moral mediocrity hypothesis. If the moral mediocrity hypothesis is correct, we can explain my results.

So consider the issue on which we have the most striking results, I think, which is the vegetarianism issue. In 2009, Josh Rust and I sent a questionnaire to professional philosophers, including ethicists and non-ethicist philosophers, and also a comparison group of professors in other departments at the same universities. In the first part of the questionnaire, we asked the respondents their normative opinions about a variety of moral issues. In the second part of the questionnaire, we asked them to self-report their behaviour on those same issues. For some of those issues, though not for the vegetarianism issue, we also had some direct measures of behaviour.
So on the vegetarianism issue, in the first part of the questionnaire, we asked respondents to rate various behaviours, like "regularly eating the meat of mammals, such as beef or pork." We asked them to rate that on a 1-to-9 scale from very morally bad to very morally good, where the midpoint, 5, was morally neutral. And what we found was that 60% of the ethicists rated it somewhere on the bad side of the scale, 45% of the non-ethicist philosophers did, but only 19% of the professors in departments other than philosophy did. So, a pretty big difference in moral opinion.

Then we asked them later in the survey: did you eat the meat of a mammal, such as beef or pork, during your last evening meal, not including snacks? And there we found no statistically significant difference among the groups. Overall, 38% of the respondents answered yes, including 37% of the ethicists. And we found a similar pattern by age and gender: women were more likely to say it was bad but were not more likely to report having eaten less meat; younger participants were more likely to say it was bad but were not more likely to report having eaten less meat.

So here's a thought experiment: Max. Oh, let me tell you how I choose the names in my thought experiments; I commend this for your consideration. To avoid implicit bias and to increase cultural representativeness, I now choose my example names at random.
I have an Excel document with the names of all the undergraduates who've taken lower-division classes with me since 2014, and I just randomly choose a name from that.

So: Max is an ethics professor teaching an applied moral issues class. He introduces his students to the philosophical arguments for and against vegetarianism. He admits to his students that he personally is convinced that the arguments for vegetarianism are sound and that vegetarianism is morally required. He considers various objections, including objections raised in class discussion, and he rebuts each one. And after class, he goes to the school cafeteria and has a cheeseburger.

Now, students might be surprised to see him do this and might say: hey, what are you doing? I thought you just said it was bad to eat meat, and here you are eating meat.

Now, if the moral mediocrity hypothesis is correct, we can explain Max's behaviour. Max believes, through philosophical reasoning, that he has discovered something about morality: it is morally bad to eat meat. Having discovered this, what's he going to do? Well, according to the moral mediocrity hypothesis, he doesn't aim to be morally good by absolute standards. Instead, he aims to be about as morally good as the people around him, the people he regards as his peers. And what are his peers doing? What are other professors doing, his family, his friends, the people around him in the cafeteria? Well, they're mostly eating meat.
So if he's aiming for mediocrity, he's going to do the same. Instead of leading to a change in behaviour, Max's moral discovery has led him only to have a lower moral opinion of almost everyone's moral behaviour, including his own, while he keeps on doing what he's always done.

Now, it does seem likely that some people are convinced by philosophical arguments not only to endorse vegetarianism but also to practice vegetarianism. In fact, I'm doing a study on this right now. Peter Singer, Brad Cokelet, and I are collaborating on a study in which we're taking undergraduates and exposing half of them to arguments for vegetarianism and half of them to arguments for charitable giving, and then they're going to get cafeteria vouchers, and we'll see what food choices they actually make. I can talk about that more if you want during the question period.

Now, if some people are convinced by philosophical arguments for vegetarianism and then change their behaviour, that can still be consistent with the moral mediocrity hypothesis if the result is a licensing of worse behaviour in other areas. So for example, if you put in the work and the self-deprivation, or perceived self-deprivation, to become a vegetarian, then maybe you have an excuse to ignore student emails more than you otherwise would have, or to pollute more, or to be ruder to your neighbours, or to shirk departmental duties, or to give less to charity.
248 00:27:39,330 --> 00:27:45,240 So on this view, and I don't think this is the only thing going on, and I'm not absolutely committed to this, but. 249 00:27:47,400 --> 00:27:49,139 The simple version of the moral mediocrity hypothesis 250 00:27:49,140 --> 00:27:59,610 is this: the main practical effect of moral discovery on your personal day-to-day behaviour is that it helps you 251 00:27:59,730 --> 00:28:07,650 more accurately calibrate your mediocrity, with a clearer understanding of how morally good or bad we all are. 252 00:28:12,650 --> 00:28:18,080 I'm going to skip this part of the talk about whether people mostly believe they're morally above average. 253 00:28:21,440 --> 00:28:24,530 I do think they tend to believe that, but kind of in a self-deceptive way. 254 00:28:25,460 --> 00:28:27,410 All right. So let's get on to the normative thesis. Right. 255 00:28:27,420 --> 00:28:39,050 So this is the mediocrity thesis: that it's morally mediocre to aim to be morally mediocre, or, to phrase it in a less totally biased-sounding way, 256 00:28:39,440 --> 00:28:40,190 somewhat bad, 257 00:28:40,940 --> 00:28:49,370 but also somewhat good, to try to calibrate yourself so that you behave in ways that are overall similar to the moral behaviour of your peers. 258 00:28:51,530 --> 00:28:55,430 So mediocre has a negative connotation in ordinary English. 259 00:28:55,790 --> 00:29:00,650 Partly it means somewhere in the ballpark of average or ordinary. 260 00:29:01,400 --> 00:29:10,880 But in contrast to a less loaded term like average, it also implies that the thing in question is somewhat bad. And yet the mediocre isn't horrible, 261 00:29:11,330 --> 00:29:18,230 and being mediocre is compatible with having some redeeming features, with being in some respects good. 262 00:29:19,130 --> 00:29:24,140 So mediocre coffee is good enough for me, 263 00:29:24,170 --> 00:29:31,340 most of the time. Mediocre students mostly pass their classes and get their degrees.
264 00:29:32,330 --> 00:29:41,960 So aiming for moral mediocrity is like aiming to be a B-minus student, or to be a kind of cheap doughnut-shop drip blend. 265 00:29:46,780 --> 00:29:55,540 So the simplest opposing views to this are that it's perfectly fine to aim to be about as morally good as your peers, 266 00:29:56,170 --> 00:30:02,260 and that it's horrible to aim to be about as morally good, or maybe I should say morally bad, as your peers. 267 00:30:05,410 --> 00:30:08,500 So I'm not going to criticise the 268 00:30:08,860 --> 00:30:13,900 it's-horrible view at length. I don't think many of us regard our peers as morally horrible. 269 00:30:16,450 --> 00:30:23,290 Some people might think that most of humanity is morally horrible, apart from their valued in-group of friends or co-religionists. 270 00:30:23,620 --> 00:30:28,720 But then they probably treat that in-group as their peers, toward whom they morally calibrate. 271 00:30:30,910 --> 00:30:34,930 Others might think that their peers, maybe even especially their peers, 272 00:30:35,470 --> 00:30:40,540 are morally horrible on grounds that there's something morally horrible about our shared lifestyle, 273 00:30:41,680 --> 00:30:45,220 such as its luxuriousness in the face of global poverty. 274 00:30:46,300 --> 00:30:52,720 So I'm not going to try to address that view. Still others might be just ordinary curmudgeons who see the worst in people in general. 275 00:30:52,840 --> 00:30:56,170 Again, that's a kind of hard view to rebut in a brief space. 276 00:30:56,860 --> 00:31:03,790 But let me note that people do often lend a helping hand to strangers for no obvious benefit. 277 00:31:04,240 --> 00:31:08,020 They treat their fellows kindly. They share. They laugh. They maintain friendships. 278 00:31:08,020 --> 00:31:16,870 They take principled stands against injustice. So following the moral crowd can be good when others act with kindness and integrity.
279 00:31:17,320 --> 00:31:24,010 It inspires us to do the same. Attempting to compensate for having acted badly can also be good. 280 00:31:24,700 --> 00:31:30,340 The memory of guilt can motivate improvement. We're not horrible, only mediocre. 281 00:31:34,070 --> 00:31:41,420 Now, against the view that it's perfectly fine to aim to behave in a morally mediocre way, 282 00:31:42,020 --> 00:31:51,440 I offer three arguments. First, your peers: they fail to reply to your important emails. 283 00:31:53,060 --> 00:31:56,660 They shirk their duties and neglect their promises. They're rude. 284 00:31:56,870 --> 00:32:04,310 They're grumpy for no good reason. They have annoying dogs, loud parties, bad driving habits, and an unjustified sense of entitlement. 285 00:32:04,550 --> 00:32:12,440 They make you wait and then concoct some glib excuse. They form obnoxious opinions on too little information and then vote for horrible things. 286 00:32:13,580 --> 00:32:17,810 In all these little ways, they behave badly, and we ought to try to be better than that. 287 00:32:22,430 --> 00:32:28,760 Second, our treatment of others is pervaded with unjustified bias, unjustifiable bias, 288 00:32:29,960 --> 00:32:38,600 bias based on race, sex, age, beauty, disability, class, cultural background. 289 00:32:40,310 --> 00:32:51,380 It's not the worst of all things to be biased, but bias is pervasive, and we're all rightly criticisable for being biased. 290 00:32:53,450 --> 00:33:00,950 The range of biases based on disability in particular is huge and unavoidable, since disability is so various. 291 00:33:03,050 --> 00:33:10,520 Bias toward conventionally beautiful people is also pervasive and substantial across a wide range of social measures. 292 00:33:12,650 --> 00:33:20,510 So unjustified bias of this sort is a moral failing for which we are rightly criticisable, and we ought to aim to do better. 293 00:33:27,020 --> 00:33:34,160 Third.
Even if we aren't morally horrible for living middle-class 294 00:33:34,430 --> 00:33:40,280 European or North American lifestyles, or British lifestyles, as I assume most of you do, 295 00:33:41,960 --> 00:33:44,480 history might not judge us so kindly. 296 00:33:45,920 --> 00:33:54,830 Our typical lifestyles harm the environment, by which we collectively contribute to the death and immiseration of many millions of future people. 297 00:33:56,420 --> 00:34:00,920 Arguably also, most of us ought to give more to charitable causes, local or global, 298 00:34:01,310 --> 00:34:06,920 in time or in money, than we do, given our relative privilege and luxury. 299 00:34:08,270 --> 00:34:11,450 Most of us eat meat, which most of us ethicists think is morally bad. 300 00:34:12,530 --> 00:34:17,240 We purchase consumer goods from companies we know, or ought to know, engage in bad practices. 301 00:34:18,140 --> 00:34:22,969 Now, it's contentious how bad all of this is, and my argument doesn't hinge upon this third bullet point. 302 00:34:22,970 --> 00:34:29,150 But if something like this is even approximately correct, 303 00:34:29,960 --> 00:34:37,910 then every normal middle-class person in our society is morally criticisable for a wide range of actions every day. 304 00:34:40,670 --> 00:34:46,730 So it's not perfectly fine to aim to be morally mediocre. 305 00:34:48,620 --> 00:34:53,580 But probably that's about where you're aiming. 306 00:34:57,950 --> 00:35:05,780 So I'm now going to consider, and conclude with, two lines of reasoning by which we might hope to wiggle out of this somewhat depressing conclusion. 307 00:35:07,880 --> 00:35:15,130 There are two arguments I'm going to skip. One is what I call the batch argument, where you say: oh no, just evaluate me as a batch, as a whole. 308 00:35:15,180 --> 00:35:23,030 And another one is one that draws on work on sainthood and why it might be good to avoid being a saint.
309 00:35:24,230 --> 00:35:27,650 If you want to raise either of those questions later, we can. 310 00:35:28,730 --> 00:35:35,000 The two that I'm going to focus on are what I call the Happy Coincidence Defence and The-Most-I-Can-Do 311 00:35:35,000 --> 00:35:46,250 Sweet Spot. So, on the Happy Coincidence Defence: here are four things that I care intensely about. 312 00:35:48,900 --> 00:35:58,600 Being a good father. Being a good philosopher. Being a good teacher. Being a morally good person. 313 00:36:00,850 --> 00:36:05,920 It would be lovely if there were never any conflicts among these four things. 314 00:36:08,220 --> 00:36:15,450 Explicitly acknowledging these trade-offs is unpleasant, sufficiently unpleasant that it's tempting to try to rationalise them away. 315 00:36:16,440 --> 00:36:26,490 It is distinctly uncomfortable to me, for example, to acknowledge that I would probably be a better father if I did less travel for work. 316 00:36:29,220 --> 00:36:39,180 But here I am. Similarly uncomfortable for me is the thought that the money I'll be spending on a family vacation to 317 00:36:39,180 --> 00:36:45,900 Iceland later this month could probably save a few people from death due to poverty-related causes, 318 00:36:46,140 --> 00:36:57,850 if I gave it to the right charity. So here's a tempting way of rationalising: the Happy Coincidence Defence. 319 00:36:58,440 --> 00:37:05,100 Consider travel for work. Now, I don't have to travel around the world giving talks and meeting people. 320 00:37:05,460 --> 00:37:11,550 It's not part of my job description. No one will fire me if I don't do it, and some of my colleagues do it a lot less than I do. 321 00:37:12,630 --> 00:37:20,520 So it seems like I'm prioritising my research career at the cost of being somewhat less good a father, teacher and global moral citizen.
322 00:37:21,480 --> 00:37:25,350 The latter, given the luxurious use of resources and the pollution of air travel. 323 00:37:26,280 --> 00:37:32,520 So the Happy Coincidence Defence says: no, I'm not sacrificing any of these other goals at all. 324 00:37:33,060 --> 00:37:36,150 Although I'm away from my children, I'm a better father for it. 325 00:37:37,320 --> 00:37:41,100 I've enriched my life, and now I can mingle that richness into theirs. 326 00:37:41,580 --> 00:37:51,990 I'm a more globally aware, wiser father. Similarly, although I might cancel a class or two and deprioritise lecture preparation, 327 00:37:52,740 --> 00:38:01,110 research travel improves my teaching in the long run, since it makes me a better philosopher. And my philosophical work, 328 00:38:01,500 --> 00:38:07,860 isn't that an important contribution to society? Maybe it's important enough to justify the expense, pollution and waste. 329 00:38:09,990 --> 00:38:14,010 I do more good for the world travelling around discussing philosophy than I could do 330 00:38:14,010 --> 00:38:19,440 leading a more modest lifestyle at home, donating more to charities and being more fully involved in my local community. 331 00:38:20,430 --> 00:38:22,590 So after enough reflection of this sort, 332 00:38:23,130 --> 00:38:28,290 it can come to seem that I'm not making any trade-offs at all among these four things that I care so intensely about; 333 00:38:28,620 --> 00:38:31,860 instead, I'm maximising them all. 334 00:38:32,220 --> 00:38:40,800 This trip I'm on now is the best thing I can do as a father and as a teacher and as a researcher and as a global moral citizen. 335 00:38:42,120 --> 00:38:47,249 Now, 336 00:38:47,250 --> 00:38:54,840 it might be true that the morally best action is always the one in one's enlightened self-interest. 337 00:38:58,060 --> 00:39:03,820 If so, it would be a happy coincidence.
And sometimes there really are happy coincidences. 338 00:39:04,420 --> 00:39:10,150 It would be wonderful if we could structure our societies and our lives to increase the frequency of these kinds of happy coincidences. 339 00:39:11,440 --> 00:39:17,170 But I think, and I hope you'll agree, that this pattern of thinking is suspicious. 340 00:39:19,420 --> 00:39:23,410 Life is full of trade-offs among important things. 341 00:39:24,520 --> 00:39:28,600 Happy Coincidence reasoning seems likely often to be epistemically 342 00:39:28,720 --> 00:39:35,230 dubious post-hoc rationalisation. It seems likely, then, that I'm illegitimately convincing myself 343 00:39:35,500 --> 00:39:38,500 that something that I want to be true really is true. 344 00:39:46,080 --> 00:39:55,370 Here's another way to accept your moral near-averageness while rejecting the idea that you're criticisable for being mediocre. 345 00:39:57,620 --> 00:40:00,680 It's to insist that you are doing the best you can do. 346 00:40:03,530 --> 00:40:07,939 Now, it's true that trying too hard can sometimes backfire, right? 347 00:40:07,940 --> 00:40:11,540 If you try too hard to be funny, you could end up being less funny. 348 00:40:11,810 --> 00:40:16,340 If you try too hard to win the race, you might exhaust yourself at the beginning and then collapse midway through. 349 00:40:17,660 --> 00:40:20,720 You need to pace yourself. Maybe morality is like that. 350 00:40:21,500 --> 00:40:29,200 Maybe moral idealists sometimes push themselves so hard that they would have been better off pursuing a more moderate, sustainable course. 351 00:40:29,210 --> 00:40:37,880 Right? So for example, someone moved by the arguments for vegetarianism who decided instantly to become the strictest possible vegan might be 352 00:40:37,880 --> 00:40:45,260 more likely to revert to cheeseburger eating than someone who had chosen a more moderate vegetarian course to begin with. 353 00:40:47,540 --> 00:40:54,050 Somewhat differently,
someone who was passionately committed to living up to high moral standards 354 00:40:55,310 --> 00:41:00,170 might become prone to self-deception, or might become unbearably sanctimonious, 355 00:41:00,620 --> 00:41:04,370 or might become insensitive to nuance and to others' legitimate excuses. 356 00:41:05,570 --> 00:41:15,920 So it's possible that you are currently in the sweet spot, such that if you were to try to be any morally better 357 00:41:15,920 --> 00:41:18,620 than you in fact are, you would actually become morally worse. 358 00:41:20,000 --> 00:41:25,610 You might be doing the best you can do. Since you couldn't be morally better than you actually are, 359 00:41:25,610 --> 00:41:31,750 you're not criticisable. You were rude to the cashier. 360 00:41:33,010 --> 00:41:36,010 You were biased in viewing the handsome student more favourably. 361 00:41:37,180 --> 00:41:38,709 You were thoughtless of your spouse, 362 00:41:38,710 --> 00:41:45,040 negligent in your months-overdue comments to the Ph.D. student who can't advance until she receives your comments. 363 00:41:47,200 --> 00:41:53,560 But if you tried to be any morally better, you would, well... something. 364 00:41:56,140 --> 00:41:58,610 So this may be true of some people, right? 365 00:41:58,660 --> 00:42:04,000 If you're a homeless mother of three who's managed to keep it together through a cold winter and physical abuse, 366 00:42:04,750 --> 00:42:08,320 I'm totally ready to believe you have no more resources to do any better than you're doing. 367 00:42:08,710 --> 00:42:14,560 But for most of us, it's probably good policy to be sceptical of any tendency 368 00:42:14,560 --> 00:42:18,130 you have to think you are already in The-Most-You-Can-Do Sweet Spot. 369 00:42:24,620 --> 00:42:34,850 All right. So. Most of us do not aim to be morally excellent by absolute standards. 370 00:42:36,320 --> 00:42:44,930 Instead, we aim to be about as morally good as our peers.
Our peers are somewhat morally criticisable, but not morally horrible. 371 00:42:46,850 --> 00:42:54,570 They're morally mediocre. If we succeed in approximately matching their mediocrity, 372 00:42:54,960 --> 00:43:02,520 we too are mediocre, and we're somewhat criticisable for having such low personal moral ambitions. 373 00:43:05,230 --> 00:43:13,540 It's tempting to try to rationalise one's mediocrity away by employing the Happy Coincidence Defence or The-Most-You-Can-Do Sweet Spot. 374 00:43:15,010 --> 00:43:29,440 But those defences don't stand up to critical examination. Own your mediocrity and accept the reasonableness of the moral criticism you deserve. 375 00:43:31,340 --> 00:43:31,630 Yes.