Our speaker today is Ingmar Persson, professor of practical philosophy at the University of Gothenburg, and he's going to be speaking to us about prioritarianism. We're lucky to have another expert on prioritarianism, Derek Parfit, as respondent. Ingmar is going to talk for about half an hour, then they'll have a discussion, and there's time for questions at the end. So let me make a start right away by welcoming Ingmar Persson.

Thank you very much. I'm very honoured to have Derek as my respondent. I've learned more from Derek, much more, than from any other philosopher, dead or alive. And for that reason I should perhaps be afraid rather than honoured to have him as a respondent, because there's a risk that I will learn today from him that I've been mistaken about certain things.

Anyway, what I'm going to talk about is some claims that Derek makes in his lecture 'Equality or Priority?'. In that lecture he claims that egalitarianism is exposed to the levelling down objection; more precisely, he claims that teleological, not deontological, egalitarianism is exposed to this objection. The difference between these views is that, while both are concerned about unjust or unfair inequality, deontological egalitarianism takes an injustice to be a situation that has been brought about by wrongdoing by moral agents, whereas according to teleological egalitarianism there are also so-called natural unjust inequalities, brought about by some people being born disabled and things like that.

The levelling down objection, as I suppose many of you know, is to this effect: if we consider a population in which some are better off than others, and the welfare of the better off is lowered down to the level of the worse off (we don't have to do it; suppose it simply happens), then a change like that is, according to teleological egalitarianism, in one respect a change for the better, because the inequality no longer exists, or there is a smaller inequality. In another respect it's a change for the worse, because some people have become worse off and nobody has become better off. So one might think of teleological egalitarianism as consisting of two principles.
One principle says something like: it's bad in itself if some individuals are unjustly worse off than some other individuals. That is a principle of equality. The other is a principle of utility or beneficence, saying that it's better, morally better, if individuals are better off rather than worse off, or saying something like: it's better if there is a greater sum of benefits. And it is the first principle, the principle of equality, according to which the levelling down situation is in one respect better, namely better in terms of equality, whereas it is worse according to the principle of utility. So, all things considered, levelling down might be a change for the worse. But all the same, egalitarians, teleological egalitarians, are committed to saying that in one respect it is a change for the better, although nobody is better off and some, those who were better off, are now worse off.

Now, Derek claims that there is another view, called prioritarianism or the priority view, which avoids this objection. Prioritarianism says something like, and he usually formulates it by saying, that it's morally more important to benefit those who are worse off. Or, alternatively, you can say that the weight or value of a benefit is determined by the absolute level of the recipient of this benefit. So if the recipient is somebody who is worse off, the benefit has a higher moral weight or value than if the recipient is somebody who is better off.

I believe that prioritarianism, just like egalitarianism, is committed to an impersonal value. Egalitarianism is committed to an impersonal value of equality: it says that it's bad that there is unjust or unfair inequality, and this isn't something that is necessarily bad for people. Take a personal value such as, for instance, pleasure: it's good for people to feel pleasure. But to feel as much pleasure as somebody else does isn't something that is good for you, for the person who has the pleasure. Now, the prioritarian impersonal value is to this effect: prioritarianism says that if we are to distribute a benefit such as a pleasure, it has greater moral weight if we give this benefit to somebody who is worse off than if we give it to somebody who is better off; but to feel the pleasure, to have the benefit, is as good for the person who's better off as for the person who is worse off.
So the pleasure, the benefit, isn't better for the worse off than it is for the better off. The weight is impersonal: the fact that the benefit goes to the worse off is something that makes the outcome better than giving it to the better-off person, without its being better for anyone.

Now, as I said, Derek claims that prioritarianism avoids the levelling down objection. The reason he thinks so is that it makes no difference, he says, to the prioritarian concern for the worse off whether there are people who are better off. So it makes no difference to that concern if we level down the better-off people. Now, that claim is perfectly correct. But it doesn't follow from that claim, I will argue, that prioritarianism isn't exposed to the levelling down objection, because there might be other respects in which the levelled-down outcome is better. So it could still be true on prioritarianism that in one respect levelling down is better.

To show that this is the case, I want to look at a series of cases where the sum of benefits in all outcomes is the same, and where there is also equality: nobody is better off than anyone else within any of these outcomes. What differs between the outcomes is the number of individuals existing. I'm going to write the series on the blackboard; it's so simple that you hardly need it, but I'll write it down anyway.

The first is an outcome I call 1 x 100: one person at level 100. I'd better write it down so that I don't have to repeat it every time. Then you have 2 x 50, 4 x 25, and so on, ending up with 100 x 1. That's clear enough. So we have a first outcome in which one person is at a level of 100, the outcome I call 1 x 100; then a second outcome, 2 x 50, in which two people are each at a level of 50; and eventually we have the outcome 100 x 1, in which a hundred people are each at level one.

Now, according to both utilitarianism and egalitarianism, these outcomes are equally good, all things considered. If we apply those views to these cases involving different people, different numbers of people, we get the result that they are all equally good.
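To make the comparison concrete, here is a minimal formal sketch of the two views as applied to this series; the weighting function w is purely illustrative and is not something either speaker specifies.

\[
V_{\text{util}}(O) = \sum_i u_i, \qquad V_{\text{prio}}(O) = \sum_i w(u_i),
\]

where \(u_i\) is the welfare level of individual \(i\) in outcome \(O\) and \(w\) is strictly increasing and strictly concave, so that a unit of benefit counts for more the worse off its recipient is. Since every outcome in the series contains 100 units of welfare in total, \(V_{\text{util}}\) is the same for all of them; and since within each outcome everyone who exists is equally well off, the egalitarian, on the speaker's reading, also has no basis for ranking them.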
But if we apply prioritarianism to these outcomes, we get the result that this last outcome, 100 x 1, is the best one, all things considered, and the first one is the worst. Why is that? It's because the benefits in 100 x 1 have the greatest moral weight they can have in this series, since they go to people who are as badly off as anyone in the series, and prioritarianism says that benefits to the worse off have greater weight. In the first case, by contrast, the benefits have a very low moral weight, because they go to somebody who is very well off. The weight of the benefits increases as we go down the series.

This is what I summarise in the first point on the handout: according to prioritarianism, an outcome 100 x 1, in which 100 individuals each have one welfare unit, is better, all things considered, than an outcome 1 x 100 in which one individual enjoys all 100 units, whereas according to egalitarianism and utilitarianism these outcomes are equally good. This means that prioritarianism implies something I call the desirability of the diffusion of welfare: it's better, all things considered, if a quantity of welfare is distributed over as many recipients as possible, so that each recipient gets a minimal benefit.

Now, if the last outcome is better, all things considered, then, as the second point on the handout says, there must be some respect in which it is better, and that respect is that the average moral weight or value of a unit of welfare is higher in the outcome 100 x 1, since the recipients of benefits there are worse off. And this is, as I said, an impersonal sort of value, because a benefit is not better for those who are worse off than for those who are better off.

Now suppose we gradually reduce the number of people who receive benefits in that last outcome, 100 x 1: reduce it to 99 people, 98 people, and so on, and thereby decrease the sum of benefits in that outcome. For a while it will still be the case that that outcome is better, all things considered, according to prioritarianism, than the first outcome, because even though the sum of benefits is smaller, the average moral weight per benefit is so much greater that it outweighs the fact that there is a smaller sum of benefits.
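With the illustrative choice \(w(x) = \sqrt{x}\) (again only a sketch, not a function the speaker commits to), the weighted totals and the average moral weight per unit of welfare come out as follows:

\[
\begin{aligned}
1\times 100 &: \; \textstyle\sum_i w(u_i) = \sqrt{100} = 10, & \text{weight per unit} &= 10/100 = 0.1,\\
2\times 50 &: \; 2\sqrt{50} \approx 14.1, & &\approx 0.14,\\
4\times 25 &: \; 4\sqrt{25} = 20, & &= 0.2,\\
100\times 1 &: \; 100\sqrt{1} = 100, & &= 1.
\end{aligned}
\]

The weighted total, and with it the average weight per unit, rises as the fixed sum of 100 units is spread over more, and therefore worse off, recipients, which is the pattern the speaker is describing.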
If that weren't the case, that is, if the greater weight of a benefit couldn't outweigh its smaller size, then it couldn't be the case that, when we have to decide whether to give two units of benefit to the best off or one unit to somebody who is much worse off, the latter could be better. That presupposes that the moral weight of benefits can outweigh the size of the benefits.

Eventually, however, if we go on reducing the number of people in this outcome, we get to a point where the outcome is no longer better, all things considered, but it will still be better in one respect, namely this: the average moral value or weight of the benefits is higher than in the first outcome, 1 x 100. So we then have a situation in which an outcome is better in one respect, though it's not better, all things considered, because we have reduced the sum of benefits so much. And that, I claim, is precisely what happens in the levelling down situation.

What this reasoning shows is that we can reformulate prioritarianism as consisting of two principles. The first principle says that it's better if the sum of benefits is greater. (This isn't on the handout as a third point; it's something that struck me this morning, and it might be a mistake, but anyway.) That's the one principle. And the second principle is that it's better if the average moral weight of the benefits is higher. These are the two principles of which, on this reformulation, one can say prioritarianism consists. The second principle is a counterpart of the teleological egalitarian principle, which we can reformulate as follows: it's better if unjust inequalities are minimised.

Now, it is this second principle which makes it the case that prioritarianism is exposed to the levelling down objection, because in the levelling down situation the average weight of the benefits increases: the benefits with the lower moral weight, the benefits of the better off, disappear, so what remains are benefits that have a greater weight. So the average weight of the benefits goes up.
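Read formally, the two principles into which the speaker is factoring prioritarianism seem to be roughly the following (a reconstruction, not the speaker's own notation): P1, other things being equal an outcome is better the greater the sum of benefits \(\sum_i u_i\); and P2, other things being equal an outcome is better the greater the average moral weight per unit of benefit,

\[
\frac{\sum_i w(u_i)}{\sum_i u_i}.
\]

Multiplying the two quantities together recovers the weighted total \(\sum_i w(u_i)\); the disagreement in the discussion that follows is over whether prioritarianism is committed to P2 as a separate respect in which outcomes can be better.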
And that second principle is the counterpart, for prioritarianism, of the principle of equality in teleological egalitarianism, the principle which exposes teleological egalitarianism to the levelling down objection. So the reason why prioritarianism is exposed to the levelling down objection is that it contains a principle that is essentially a counterpart of the egalitarian one, which may be obscured by the way prioritarianism is ordinarily formulated. But I have tried to bring out that prioritarianism consists of these two principles by considering this series of cases, in which we have an outcome that is better, all things considered, if we apply prioritarianism, even though the sum of benefits is the same; so there must be another respect in which that outcome is better.

Now, because both prioritarianism and egalitarianism fall foul of the levelling down objection, they violate a principle that Derek calls the person-affecting claim. This is on the handout, in the middle. The person-affecting claim says that an outcome can't be better or worse than another in any respect unless it's better or worse for someone. Both prioritarianism and egalitarianism violate it, for the reason that they are committed to an impersonal value. But there's a weaker person-affecting claim, which I call PAC*, and it says that an outcome can't be better than another, all things considered, unless it's better for someone. So it talks about what's better all things considered, not just better in one respect. This is a principle that both egalitarianism and prioritarianism can respect. Egalitarianism can respect it if it is what Derek calls moderate, and this is also something you find on the handout. Egalitarianism is moderate if, when a gain in equality is achieved simply by a loss of benefits, the latter always outweighs the former: the loss of benefits always outweighs the gain in equality. Somebody has to gain something in order for the greater equality to make the outcome better.
So, for instance, if we were to make a situation better in respect of equality by removing two units of benefit from the better off and adding one to the worse off, that's an outcome which would be worse according to utilitarianism, because the sum of benefits is reduced, but it could be better according to the moderate version of egalitarianism. An outcome in which we simply remove benefits from somebody, however, can't be better on that moderate version.

And there's another, similar doctrine, also on the handout, which says that an unjust inequality in respect of the distribution of benefits is something bad in itself that detracts from the value of an outcome, but that just equality is not anything good in itself that adds positively to the value of an outcome. This principle implies that if we have an outcome in which there are no benefits whatsoever, in which everyone is at the zero level, there's nothing good about that outcome, although there is equality in it. Just equality doesn't add positively to the value of an outcome; it's only the case that an unjust inequality detracts from the value, makes it less than it would be if we considered only the total sum of benefits.

Now, prioritarianism could respect the weaker person-affecting claim if prioritarians accept what I call the value of beginning to exist, namely, if they say that it's good for somebody to begin to exist and lead a worthwhile life. If they don't say that sort of thing, but still say that an outcome like 100 x 1, the last one I've got on the blackboard, is better, though not better for anybody, they will violate this weaker person-affecting claim, because in that case we will have an outcome which is better, all things considered, though it isn't better for anyone. So prioritarians, I think, will have to accept this view of the value of beginning to exist if they want to conform to the weaker person-affecting claim. And I think they will have to do that, because otherwise it would be strange for them to criticise egalitarianism for violating the stronger person-affecting claim.

But let me emphasise one thing at the end, namely that I haven't presupposed that we should apply these doctrines, prioritarianism, egalitarianism, utilitarianism and so on, to these cases where we have different numbers of recipients of benefits.
All I'm saying is what these doctrines would say if they were applied to those cases. That is all I need, because what I wanted to arrive at is the claim that prioritarianism consists of these two principles on the blackboard. For if that is the right way of construing prioritarianism, it is exposed to the levelling down objection, and even to worse problems, because prioritarianism will imply that in the levelling down situation the outcome continues to be better in one respect even after we have passed the point where the better off are at the same level as the worse off: when the benefits of the formerly better off go down even further, the average moral value of the benefits goes up, and so the outcome will still be in one respect better. So shifting from egalitarianism to prioritarianism because of the levelling down objection is jumping out of the frying pan into the fire, I would claim. But that's all I have to say.

All right. Okay, go ahead with your response.

First I want to say something about the levelling down objection, which was in your paper though you didn't actually say it in the talk. You say: I grant that egalitarianism is exposed to the levelling down objection, but I don't regard this objection as serious; egalitarians should simply deny that there's anything counter-intuitive about an outcome being better in one respect although it's not better for anyone. Well, I think you there assume that, in presenting the levelling down objection, I'm appealing to the person-affecting claim: how can it be in one way better if it's better for no one? Now, I actually wrote that the person-affecting claim has less force than the levelling down objection, and I reject the person-affecting claim. I reject even the weak person-affecting claim that an outcome can't be worse, all things considered, if it's worse for no one; I think an outcome very often can be worse although it's worse for no one, as is shown in the cases I call the non-identity problem. However, that's just about the strength of the levelling down objection as applied to egalitarianism. I think we can say there are some ways in which one of two outcomes might be better although it's better for no one; that's certainly possible. But not if the only change is that the better-off people become as badly off as everyone else. It's to that particular claim that the objection appeals. Now, you make the striking claim that the priority view is also open to the levelling down objection, in a slightly different form.
Actually, it can be a simpler case than the one you gave. Suppose that everyone starts at level one hundred and then everyone becomes much worse off, falling to a level of ten. Well, you say that that's in one way better, because the average value of the benefits that people are receiving is higher. But as I intended the priority view, it doesn't appeal to the average value of benefits; it appeals to the total moral value of benefits. The difference between the priority view and utilitarianism is simply this: utilitarians say that it's better if there's a greater sum of benefits; prioritarians say it's better if there's a greater weighted sum of benefits, with benefits doing more to make the outcome better the worse off the people who receive them are. And since the priority view appeals to the total sum of benefits weighted in that way, there's no respect in which it's a bad feature of an outcome if the average benefit per person has less moral value because people are better off, or a good feature if, because everyone's worse off, the average value is greater. So I'm just puzzled why you think that the priority view doesn't appeal to the total sum of weighted benefits but only to the average. Now, in an earlier talk you did suggest that a prioritarian might say that if everyone becomes much worse off, so that the average value of the benefits they receive is greater, that isn't a good feature of the outcome but a bad feature of the outcome. And then you suggested that this would imply that on the priority view it would be in one way worse if everyone became better off. But I just didn't follow that argument. It seems to me that on the priority view it's always better if people become better off. If they become better off, then the extra value gets less, but there's nothing bad about the average value getting less. So I just wondered why you thought that the priority view has to appeal to the average rather than the total.

Well, the reason I don't talk about the total moral value is that I want to separate two principles.

But the priority view differs from utilitarianism just in saying that it's not the total sum of benefits, it's the total weighted sum of benefits. That's the only difference.

Yes. But consider the two principles into which I divided prioritarianism.

I just reject the second one. The priority view never appeals to the average.

But that gives the same result.

No.

Those two principles just separate out two components that are combined in the claim about the weighted sum. That's how I interpret the difference.
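The levelling-down case at issue can be put in the same illustrative terms, again with the purely illustrative \(w(x)=\sqrt{x}\) and, say, 100 people:

\[
\text{everyone at } 100: \; \textstyle\sum_i w(u_i) = 100\sqrt{100} = 1000, \quad \text{weight per unit} = 1000/10000 = 0.1;
\]
\[
\text{everyone at } 10: \; \textstyle\sum_i w(u_i) = 100\sqrt{10} \approx 316, \quad \text{weight per unit} \approx 316/1000 \approx 0.32.
\]

On the total-weighted-sum reading the fall is simply worse and in no respect better, since the weighted total drops from 1000 to about 316; on the average-weight reading (P2 above) it is in one respect better, since the weight per unit rises. That is the crux of the disagreement in the exchange that follows.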
But the difference is this. If you take the priority view to say that things are in one way better if the average moral value of benefits is greater because everyone's worse off, you get the absurd result that things are in one way better if everyone's worse off. If instead the priority view combines the principle of utility with the principle that how much a benefit does to make the outcome better depends on how well off its recipient is, then you never refer to the average value, only to the total weighted sum of benefits. So you can't, as it were, claim that the priority view ought to take the form which makes it open to your objection rather than the form which avoids the objection, which is the form I intended.

But if we consider this series of cases...

I'm about to turn to those, because those raise a different question.

Yeah. I use those cases to argue that in one respect prioritarianism must say the levelled-down outcome is better.

But look, those cases involve different numbers of people in existence, and they raise different questions, and I'd like to turn to them. But before we do that: if you take the cases that I actually considered, where the same people exist in all of the outcomes, I'm simply saying to you, surely I can be right in saying that the average moral value of the benefits people receive is irrelevant. It's only the total sum of weighted benefits to which the principle appeals. I don't see why you think a prioritarian can't say that there's nothing good about everyone becoming badly off so that the benefits they receive have greater average value. There's nothing good about that, and neither do you think there is. And that's because the view differs from utilitarianism only by giving extra weight to benefits to the worse off.

But I can't make my case without appealing to the other cases.

Okay. All right, I'll turn to the second point then. Well, before I turn to your cases, I want to consider a range of similar cases. This involves a set of different possible outcomes in all of which the same people would exist, so different from yours in that respect. In one outcome one person is at level 100 and everyone else is at zero. In a second outcome two are at 50 and everyone else at zero; then four at 25 and everyone else at zero; and at the end you get everyone at one. Now, it is true that on the priority view it would be better if these hundred benefits were shared equally than if they all went to one person or if they all went to two people.
But then the egalitarian thinks that too, and so does the utilitarian. So when you've got the same people in all the outcomes, it is indeed desirable that the welfare be diffused among these people, but that's no objection. Now, if we turn to your cases, the difference is that in your cases different numbers of people would exist in the different outcomes.

Now, I think I may have misled you here by a phrase with which I introduced the priority view. I said it's enough to consider different states of affairs in all of which the same people would exist. That was misleading. You might have taken me to mean that if we reach a view about those cases, which I call same-people cases, then we can apply our conclusions to cases in which different people, or different numbers of people, would exist. But when I said it's enough to consider same-people cases, I didn't mean that you can then apply the view to all cases; I meant that we've got enough to discuss without also trying to discuss cases with different numbers of people, which raise a whole new set of questions. So my priority view was explicitly designed for cases that don't raise the special puzzles that arise when different numbers of people would exist. However, I'll now turn to what you say about what the priority view implies in those cases.

You say, in an earlier paper, that on the priority view it's better to bring the new people into existence, because that's benefiting people whose level is lower, since they're at the neutral level of non-existence. And that involves assuming that, when the priority view says benefits do more to make the outcome better when they come to people who are worse off, we include all the merely possible people, people who never exist, because, being at level zero, they're worse off than people who are having a good life. No: I didn't intend the principle to apply to people who are merely possible and never exist. You could press a similar objection against the egalitarian view. You might say: rather than giving more resources to people who exist, you should bring new people into existence, because they're worse off, being at the neutral level, than all the other people. Now, you rightly say that egalitarian principles aren't meant to cover people who never exist: it isn't a relevant inequality that we exist and they don't. And the same is true of the priority view. When I say it's more important to benefit people who are worse off, I'm not including merely possible people who never exist.
Now you might say there's a difference here, because you can consider different outcomes without considering the bringing of people into existence. You just consider the outcomes as they are: you have one outcome where there's one person who's well off at level 100, and you're comparing that with an outcome in which that person doesn't exist.

That's why you say that in this example, on the priority view, rather than having one person at level 100 it would be better to have 100 people at level one. Now, that's implying that the second is better because the benefits would go to people who otherwise wouldn't exist, and who are in that respect badly off. And I'm saying the view was only ever meant to apply to people who exist.

But if you make a judgement about these two outcomes involving different numbers of people, you can hold the view that one outcome is better, or that they're equally good, or whatever, without taking the view that one is better because it's better for somebody, namely the people who come into existence.

Well, I thought your main objection to the priority view was that it claims it's better to have 100 people at level one because that's benefiting people who otherwise wouldn't exist.

No, the claim is just that that's where the weight of the benefits is highest.

Yeah, but that's because you're saying, on the priority view, that the weight of a benefit is higher if it goes to someone who's worse off; you're assuming that if you bring someone into existence, that benefit is going to someone who's pretty badly off.

No, that's a different claim. You can make the claim that the benefits have a greater moral weight because they go to somebody who is badly off, at the level of one, in the last outcome, without saying that you're benefiting somebody by having brought that person into existence.

Well, that's why I switched to my case, in which the same people exist in all the outcomes, and either one of them is at 100, or two at 50, or they're all at one.

Yeah, that's okay. Some people, Jeff McMahan say, would take the view that in a case where you have more people existing, that outcome can be better even though it's not better for anyone.

Yes, I agree it might be better although it's not better for anyone. But there are really two points. One is that the priority view, as I defined it, applies to cases in which all of the same people would exist.
When you turn to cases in which different people, or different numbers of people, would exist, that's completely different; it raises very different problems. And I think that if some principle looks pretty plausible in cases where the same people exist, you can be justified in accepting that principle even though it's not clear, when you turn to cases in which different numbers of people would exist, what the wider principle is which in same-people cases would imply the conclusions you accept.

Here's another example. Many people think that an outcome can't be worse, all things considered, if it's worse for no one; that's the person-affecting claim, PAC*, that you discuss. Well, that's plausible in cases in which all and only the same people exist. But once you turn to cases in which different people exist, it's very easy to see how an outcome can be worse, all things considered, even though it's worse for no one. That's true if the result of our energy policies is that the people who live two centuries from now are at a low level, although their lives are worth living, rather than different people who would have been at a higher level. So I think it isn't an objection to the priority view as I described it, which applies only to cases with the same people, that it's not obvious what we should say about cases in which different numbers of people would exist.

No, but then...

But then, I mean, the other point I made: I wasn't just saying that my version of the priority view doesn't apply to cases in which different numbers of people exist. I was going to say that if you look and see what the priority view might imply with respect to those cases, there's a huge difference between saying there's a greater moral value to a benefit of being at level one if it comes to some actual person who would otherwise be at zero, and saying there's a greater value if it comes to someone who otherwise wouldn't exist at all. In other words, just as it's plausible to say that, if you believe inequality is bad, you're not saying that one of the things that's very bad is that existing people are much better off than people who never exist (that isn't a bad inequality, because the principle applies only to actual people), I think the same is true of the priority view: the extra moral weight attaches to people who actually exist in the outcomes. You're considering people who exist in only one of the two outcomes, and you're saying that they are benefited by being brought into existence.
If they come into existence, you're saying, the priority view has to say the benefit matters more because it comes to people who would otherwise be at the zero level, because they wouldn't exist. In other words, I'm saying two things. One is that I intended the view to apply only to same-people cases. Two, you can see why that's natural, because there's a huge difference between the way in which some people are worse off than others when they exist and are worse off, and when they never exist. You'd expect those to be quite different questions, and I think they are.

Well, you make it impossible to separate the two principles. In order to separate them, you need to consider cases where the sum of benefits is held constant while the moral weight of those benefits varies. And you can't consider that situation unless you bring in different numbers of people, because otherwise...

In my example it's the same people, and the sum of benefits is the same whether one person gets all 100, or two get 50 each, or four get 25 each, or all of them get only one. The sum of benefits is the same, and on the priority view, in that case, it's better for the benefits to be shared equally between all these people. That's a case in which they all exist in all of the outcomes. So you can draw the distinction you want when you're considering a case like that.

But your main objection to the priority view, I think, wasn't the one that applies to bringing extra people into existence; it was the claim that on the priority view, when everyone falls from a high level to a low level, that's in one respect a change for the better, because the average value of the benefits they get is greater. And my initial response is that I don't know why you think the priority view gives any weight to the average value of the benefits rather than to the total weighted value of all the benefits. That was the view that I put forward, and I don't see why you think...

Well, if you consider those cases first, this series of cases where the sum of benefits is constant...

And that's true in my example, in which there are 100 people in all of the outcomes, the same people, and the sum of benefits is the same. Either 100 goes to one person and the other 99 get nothing, or 50 to two and the other 98 get nothing, or 25 to four and the other 96 get nothing. That is a case in which the total sum is the same but it's distributed differently between these people. And there, I think, most of us would think that it's better if it's shared.

Yeah, but that's a situation where egalitarianism would yield the same result.
Yes, I know. But I thought you were saying that this is a case in which the priority view has the implausible implication that when the average value of the benefits is greater, that's in one way better.

Yeah, but in that sort of situation you change the relative positions as well, whereas the relative positions don't change in my case.

But surely we're considering two cases. On the simpler form of your objection, you say: suppose everyone is at 100, then they all suffer a misfortune and everyone ends up at ten. Now, your main objection is that on the priority view that is in one way a change for the better, because it's always better if the average moral value of the benefits is greater. And I'm just denying that the priority view says that.

Yeah, no, I know it doesn't say that.

And then I thought you said: well, we can show that the priority view has to give weight to the average value of benefits, because... and then you turned to a case in which different numbers of people exist. Now, I think that doesn't show that the priority view has to give weight to the average value.

But I don't think you've said what your view would be about this case on the blackboard.

Well, my view is that it would be clearly worse if, instead of one billion people at level 100, you've got a hundred billion people at level one.

That's like the Big World case. That would be worse on the priority view?

No, not on the priority view.

But that's what my question is about.

Well, the priority view doesn't even apply to those cases.

But what would it say if it did? Because, as I said, my claim isn't that prioritarians do apply the view to those cases. The question is what it might say.

If you compare one billion people at level 100 with a hundred billion people at level one, the priority view as I intend it says the second is in no way better, because if these extra people come into existence, that isn't benefiting people who would otherwise be at a bad level.

But the prioritarian can just say, irrespective of whether anyone is being benefited, that there are benefits here at a low level, and so they have greater weight.

Well, no, because the point is that the priority view says a benefit has greater weight if it comes to someone who, without that benefit, would be at a lower level. And I'm saying that doesn't include people who never exist. People who never exist don't count as being at a low absolute level; they just don't enter into the reasoning.

All right. I mean, doesn't that...
Sorry, have I missed something? Well, I just think there is a way you can formulate prioritarianism without saying that a benefit goes to somebody who otherwise wouldn't exist.

No, no, you don't have to say 'who otherwise wouldn't exist'. You just look at an outcome and see how well off this person is in this outcome compared with the other outcome.

You don't have to do that. You can just look at the outcome and say: here are benefits to somebody at a very low level; they are all benefits to somebody at the level of one.

Where you could similarly say, in applying the egalitarian principle, that we ought to bring more people into existence, because then fewer people will be at low levels. And all that I'm saying is that whether you benefit people by bringing them into existence is a separate issue from judging the outcomes.

You could say: look, these outcomes are equally good, or one is better, even though it's not better for anyone, because somebody is brought into existence.

But I think we should let other people have a go as well.

All right. Well, there's plenty of time for questions, so let me see who's interested. We'll start with Julian.

Take a bunch of cancer patients divided into two different groups: there's the older, good-prognosis group, and there's the younger, worse-prognosis group. So you could give one person who's 40 years old 100 months of life, or you could give 100 people who are 30 years old one month of life each. I take it these are all people who exist; just think of them as having different sorts of cancer. And on your priority view it would be better if we gave the hundred 30-year-olds one month of life each.

Prima facie, yes. And I think that's what most people would think. If you're giving a greater sum of benefits, which is more fairly shared, and it's coming to people who would otherwise die younger and thus be worse off, that sounds unequivocally better.

Well, why? I mean, this does seem to be a kind of version of the repugnant conclusion, if you apply it to larger and larger numbers of cancer patients who get smaller and smaller increments of life.

Well, look, I agree that some people think that if there are lots of people who exist, and some of these people are very well off and the others are very badly off, there's no point in taxing the rich and spreading the money out among the starving masses, because each of them would get such a small benefit that it doesn't count. Well, I think that's just a mistake.
Each of these many people would get the benefit, and the total sum of benefits would be much greater. Not only would the total be greater, but it would be more fairly distributed. So I don't think there's any difficulty in thinking that, when you can give a greater total sum of benefits to people who would otherwise be worse off, that's a change for the better.

Now, some people think that if you go down to a benefit as small as, say, an extra minute of life, no sum of benefits like that could add up to anything significant. And I think that's just a very deep mistake. I described the case of the torturers: in the bad old days each turns the dial on an instrument, inflicting great pain on one victim; but under the new system the machines are so arranged that, going round them all together, the torturers impose just as much suffering on all their victims, yet none of them makes any perceptible difference to any of the victims. Now, that's perfectly possible, and it's clear that they're imposing great suffering on their victims, although, because the effect of each act is so thinly spread, it makes no perceptible difference. So I agree that when you consider benefits and harms that are very small, there's a tendency to think they just don't count. I think that's a huge mistake.

But now, how does your question bear on the main issue between Ingmar and me? Because I say that on the priority view what makes the outcome better isn't that the average benefit that people get has more moral value; it's that the total sum of weighted benefits is greater. That was my version. But Ingmar's objection is directed at the average version, which says that it isn't the total sum but the average that counts. Fine, but that's a completely different view, and I don't see why that is my view.

My view is that it consists of two principles. You can divide your one principle into two principles, and I was using these cases to...

You say one is that it's in one way better if there's a greater total sum of benefits, and another is that it's in one way better if, because people are badly off, the benefit to each person has greater average moral value. And I'm saying I just reject that version of the priority view, which seems to me to have no plausibility at all.

Now, we agree about that.

Yeah, but I thought your objection was to the priority view. Well, then you should consider the version that says that when people benefit, that makes the outcome better, and a benefit does more to make the outcome better the worse off its recipient is.
465 01:01:45,770 --> 01:01:51,080 Now you add up all of the benefits, and you give them greater weight if they go to people who are worse off. 466 01:01:51,560 --> 01:01:53,420 And that's nothing to do with the average. 467 01:01:53,420 --> 01:02:01,610 But I was considering a version of prioritarianism which applies to cases where your version doesn't apply at all. 468 01:02:02,990 --> 01:02:06,860 Oh, you mean cases involving different numbers of people? Yes. I wasn't discussing those cases. 469 01:02:07,130 --> 01:02:11,570 No. I'll take you next, and we have plenty more people in the queue. 470 01:02:13,340 --> 01:02:16,500 Is this on the same thing? Okay. Sorry, that's disappointing. 471 01:02:18,170 --> 01:02:20,260 I'll come back to you later. Sorry. 472 01:02:20,450 --> 01:02:27,530 I was just wondering about the fact that the priority view is not supposed to apply, I guess, I presume, to cases with different identities. 473 01:02:27,980 --> 01:02:35,540 Right. Those are different cases, right? Yeah. But it's also true that you need to think carefully about those, because, as I say, 474 01:02:35,660 --> 01:02:41,600 many people think that if one of two outcomes is worse for some person and better for no one, 475 01:02:41,600 --> 01:02:49,760 it must be worse, at least in one way. But once you turn to cases in which different people exist, that leads to a contradiction. 476 01:02:50,300 --> 01:02:53,800 So those are different cases which need to be handled in a different way. 477 01:02:53,890 --> 01:02:55,730 Right. I mean, you know, handled differently. 478 01:02:55,780 --> 01:03:04,810 Is there going to be some sort of priority principle for them, for example? I mean, you talk about the cases with the same number, 479 01:03:04,840 --> 01:03:08,320 where you have this gradual, you know, different distribution of how these resources go. 480 01:03:08,710 --> 01:03:16,690 And I think it's worth considering cases that give the same description of the schemes but with different identities of people in these various groups. 481 01:03:18,310 --> 01:03:20,680 Well, those need different consideration, right. 482 01:03:20,700 --> 01:03:29,370 I'm just wondering, is there going to be any consideration, any view, that will make one of these situations credibly better overall than another? 483 01:03:29,380 --> 01:03:33,640 And if so, will it be something like priority, or something else? 484 01:03:33,790 --> 01:03:38,560 Well, look, I mean, let me try to be more positive. I'm sorry, I've been rather, rather rude. 485 01:03:39,280 --> 01:03:49,540 Ingmar has well pointed out that if you consider cases in which different numbers of people exist and you ask 486 01:03:49,540 --> 01:03:53,350 what the priority view should say about such cases, 487 01:03:54,340 --> 01:04:00,580 if you said that people's coming into existence and being at a low level 488 01:04:01,210 --> 01:04:08,950 should be treated as a case of a benefit that has greater value because it goes to someone who would otherwise be badly off, 489 01:04:09,580 --> 01:04:15,940 then the priority view, so applied, leads to something like the repugnant conclusion. 490 01:04:16,240 --> 01:04:20,740 Now, that's well worth pointing out. And the moral is: well, the priority view 491 01:04:20,750 --> 01:04:24,159 shouldn't take that form. 492 01:04:24,160 --> 01:04:27,670 And if someone says to me, Look, I agree,
493 01:04:27,670 --> 01:04:33,190 the priority view, as you've described it, seems to do jolly well in cases in which all the same people exist, 494 01:04:33,550 --> 01:04:37,060 but what about these other cases? What do you say about them? 495 01:04:37,330 --> 01:04:41,140 That's a perfectly fair question, but I think we can be confident. 496 01:04:41,440 --> 01:04:48,970 I mean, let me put it this way. We can be confident that some principle is right in cases where all and only the same people exist, 497 01:04:50,230 --> 01:04:56,500 though not knowing what the wider principle is from which this could be derived. 498 01:04:56,980 --> 01:05:05,530 And that's because it will normally be the case that there are several quite different, wider principles which will all have the same implications. 499 01:05:05,980 --> 01:05:12,820 And one example of that is the difference between the average principle of utility and the total principle of utility. 500 01:05:13,210 --> 01:05:17,710 They coincide in cases with all the same people, but they're completely different. 501 01:05:18,760 --> 01:05:26,800 And so I agree there may be quite different principles that would all support the priority view in same-people cases, 502 01:05:26,980 --> 01:05:30,970 and we haven't yet discovered which of those different principles we should appeal to. 503 01:05:31,300 --> 01:05:40,750 So there's still work to be done. But the claim that there's work to be done isn't the claim that there's an objection to the priority view in same-people cases. 504 01:05:41,710 --> 01:05:45,850 That's really all I'm saying about this. 505 01:05:46,760 --> 01:05:50,169 Yes. So, I mean, this is just a similar point. 506 01:05:50,170 --> 01:05:59,880 So, you know, you use these cases and say, well, this shows that prioritarianism violates the person-affecting claim. 507 01:06:00,370 --> 01:06:06,790 But in these two cases, you seem to see that the tension arises in conjunction with the total view of population ethics; 508 01:06:06,790 --> 01:06:10,540 the total view by itself is enough to violate the person-affecting claim. 509 01:06:11,380 --> 01:06:16,230 So if, for example, it's in conjunction with utilitarianism, which is the classic case, 510 01:06:16,250 --> 01:06:26,379 take a population case that has one person in the first outcome and a hundred people in the second. 511 01:06:26,380 --> 01:06:36,020 The second outcome is better according to utilitarianism, but the first is not worse for anyone, because some of those people don't exist. 512 01:06:36,970 --> 01:06:45,520 Well, I actually think that when you bring someone into existence with a life worth living, that should be counted as a case of benefiting the person. 513 01:06:46,030 --> 01:06:52,960 I wouldn't want actually to say, Oh, you're not benefiting them. The point I was making was a slightly different one. 514 01:06:53,560 --> 01:07:00,730 It's whether we should think, Look, there are these possible people who may never exist; are we to worry about them? 515 01:07:00,940 --> 01:07:04,090 Because, after all, if they never exist, they'll be at the zero level. 516 01:07:04,570 --> 01:07:09,010 And I was saying the intuitions behind the priority view don't, I think, say, well, 517 01:07:09,010 --> 01:07:13,710 we need to think about them. I mean, there's a very nice quotation I have from Ingmar here. 518 01:07:14,380 --> 01:07:19,630 He says, maybe if the priority view assigns no weight to possible people, that's unfair,
519 01:07:20,320 --> 01:07:31,660 and refusing to take into account the possible people whom you could benefit is morally on a par with racism, sexism and speciesism. 520 01:07:31,840 --> 01:07:36,610 And I'm saying it isn't: giving priority to benefits to actual people 521 01:07:37,150 --> 01:07:49,420 isn't objectionably arbitrary or unfair, because people who never exist are not being treated unfairly when we fail to bring them into existence. 522 01:07:50,590 --> 01:07:52,530 But I was trying to make a claim which I 523 01:07:53,680 --> 01:08:03,840 can't get Derek to accept, apparently, and this is that in cases where you compare populations of different numbers, 524 01:08:03,880 --> 01:08:12,700 you can say that one outcome is better than another because the total amount 525 01:08:12,700 --> 01:08:18,190 of benefits is greater, or whatever, even though it isn't better for anyone, where you haven't 526 01:08:18,340 --> 01:08:23,319 benefited anyone by bringing them into existence. You can separate those two issues. 527 01:08:23,320 --> 01:08:29,230 That's the point I've been trying to make, which Derek hasn't bought. 528 01:08:30,400 --> 01:08:37,959 But, I mean, I reject the person-affecting claim more sweepingly than you do, because I think you 529 01:08:37,960 --> 01:08:42,610 thought we could at least accept the claim that one of two outcomes can't be worse, 530 01:08:42,640 --> 01:08:46,780 all things considered, if it is worse for no one. I think that's a mistake. 531 01:08:46,780 --> 01:08:51,370 Yes, but irrespective of what claims we're inclined to make, 532 01:08:51,940 --> 01:09:02,170 I'm just trying to make the point that it is possible to take the view that, among those outcomes involving different people, 533 01:09:02,500 --> 01:09:11,620 some might be better than others when, in fact, you have more people at a high level. 534 01:09:11,620 --> 01:09:20,050 For instance, you can say that the outcome with more people at a higher level is better than one with fewer people, even though it isn't better for anyone. 535 01:09:20,260 --> 01:09:24,370 You haven't, I agree, benefited anyone by bringing them into existence. 536 01:09:26,050 --> 01:09:35,780 Yeah. And how is that an objection? You know, my point is that in talking about these cases in the series, 537 01:09:36,190 --> 01:09:42,250 I'm not saying that in the larger case we've made the situation better for anyone. 538 01:09:42,550 --> 01:09:50,620 I'm only saying that you could make the claim that that outcome is better, even though you don't claim it's better for anyone. 539 01:09:50,830 --> 01:09:57,580 Yeah, but the only question I can see is this: the priority view says a 540 01:09:57,580 --> 01:10:03,490 benefit has more value if it comes to someone who's at a lower absolute level. 541 01:10:03,700 --> 01:10:11,920 Yes. And the question is, should we count among the people who are at a low absolute level people who don't exist? 542 01:10:13,150 --> 01:10:20,590 And we could answer that either way. You beautifully pointed out, which was new to me, that if we say yes, 543 01:10:20,590 --> 01:10:26,110 let's include among the people who are at a low absolute level all the people who never exist,
544 01:10:27,490 --> 01:10:34,149 the priority view then leads to a worse version of the repugnant conclusion. To which I think, 545 01:10:34,150 --> 01:10:41,350 well, that shows that you shouldn't take people who never exist as being at a low level. 546 01:10:42,790 --> 01:10:46,900 People who never exist just don't count as far as the priority view is concerned. 547 01:10:48,160 --> 01:10:57,190 But, I mean, you're right to show that that would not be a plausible way to extend the priority principle to those cases. 548 01:10:58,720 --> 01:11:02,170 You're right. So we'll take another question. 549 01:11:03,620 --> 01:11:06,669 Yeah, sure. I haven't understood 550 01:11:06,670 --> 01:11:16,959 why you would concede it; the very first question doesn't show why prioritarianism would be susceptible to that conclusion. 551 01:11:16,960 --> 01:11:23,680 In the example, I thought they were all people who were going to die at 30, but go on with the question. 552 01:11:24,400 --> 01:11:28,480 And so why would that... sorry? 553 01:11:28,690 --> 01:11:39,790 Well, the question is, is it repugnant to think that, instead of giving 40 months to one person, you should give one month to 40 people? 554 01:11:41,680 --> 01:11:49,410 I'm inclined to think that if 40 people have one more month in which to make their farewells and so on, 555 01:11:49,630 --> 01:11:59,130 you're going to be doing more good and it'll be more fairly shared than giving the whole 40 months to one person. And every time you do it with many, 556 01:11:59,140 --> 01:12:04,510 many more people, the time given to the individuals gets smaller. 557 01:12:04,510 --> 01:12:07,700 Exactly. I thought that was precisely the repugnant conclusion. 558 01:12:08,530 --> 01:12:12,370 No, no, that isn't the repugnant conclusion at all. I mean, the point is, 559 01:12:13,030 --> 01:12:21,879 if you have a choice of either saving a single person from a whole year of pain, or giving a single 560 01:12:21,880 --> 01:12:30,790 person a whole year of life, or giving one more minute of life or one fewer minute of pain 561 01:12:31,120 --> 01:12:37,870 to each of a million people, lots of people say, well, obviously you should give the one person the whole year 562 01:12:37,870 --> 01:12:42,499 rather than just giving one minute to each of a million people. Well, a year is about half a million minutes. 563 01:12:42,500 --> 01:12:46,750 So if you always did that in those cases, everyone would die younger. 564 01:12:47,080 --> 01:12:52,060 I mean, yes, it would be worse for everyone. And I mean, I do agree that when the benefit 565 01:12:52,340 --> 01:12:58,940 is so small as to be imperceptible, it's plausible that people think, well, it just doesn't count. 566 01:12:59,810 --> 01:13:05,090 And I think that's a huge mistake, as my case of the harmless torturers was meant to show. 567 01:13:05,240 --> 01:13:12,680 None of my torturers makes any perceptible difference to any of the victims, but together they inflict great suffering. 568 01:13:13,280 --> 01:13:18,350 And that's what we're doing with global warming. You know, each act is not going to make a perceptible difference, 569 01:13:18,440 --> 01:13:24,740 but together we're doing tremendous harm. And the repugnant conclusion isn't about those cases. 570 01:13:26,210 --> 01:13:29,270 It's the claim that, rather than a billion people with a wonderful 571 01:13:29,270 --> 01:13:33,710 quality of life, it would be better to have an enormous number whose lives are hardly worth living.
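For the year-versus-minutes comparison a few exchanges back, a quick back-of-the-envelope calculation; the rounding is mine rather than the speakers':

\[ 1 \text{ year} \approx 365 \times 24 \times 60 \approx 5.3 \times 10^{5} \text{ minutes, roughly half a million,} \]

\[ 10^{6} \text{ people} \times 1 \text{ minute each} = 10^{6} \text{ minutes} \approx 1.9 \text{ years of life in total.} \]

So always preferring the single large benefit in such choices forgoes the larger total, which is the sense in which, if you always chose that way, everyone would end up worse off.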
572 01:13:34,010 --> 01:13:36,680 That's completely different. But in the thousand torturers case, 573 01:13:36,680 --> 01:13:44,360 you've got individuals that suffer a lot in the end, whereas when you're simply giving an extra second to a large number of individuals, 574 01:13:44,630 --> 01:13:50,120 no single individual... No, on the contrary. The point is just the same. 575 01:13:50,150 --> 01:13:57,950 The point is, if you keep doing that, if you keep doing that, and you can do it once for everyone in the population, there are a million people. 576 01:13:58,610 --> 01:14:02,460 Okay. And on a million occasions you are to give one more second. 577 01:14:02,480 --> 01:14:07,580 Yes, but we're talking about just a one-off case: a million people each 578 01:14:07,580 --> 01:14:14,080 get a second. On your view, that's a bit like giving somebody a whole extra year, on the priority view. 579 01:14:14,480 --> 01:14:17,930 Well... Yeah, but none of those individuals gets any more than a second. 580 01:14:18,200 --> 01:14:21,680 Yeah, but look, look, the point is the reasoning is just the same. 581 01:14:22,130 --> 01:14:29,090 You're saying that if a person's suffering is shortened by only one second, that doesn't make any difference. 582 01:14:29,240 --> 01:14:34,700 It does. You're imposing a greater sum of suffering. 583 01:14:35,180 --> 01:14:39,650 I mean, an extra torturer comes along, and he turns the dial once on each machine. 584 01:14:39,920 --> 01:14:46,670 Each of the victims suffers a bit more, and he's imposing a greater sum of suffering. 585 01:14:48,530 --> 01:14:59,030 But you are right: some people think that it's okay to act in a way like this provided you know that few people will do it, 586 01:15:00,020 --> 01:15:06,020 although if you know that many people are going to do it, then it's wrong. I actually think that's not defensible, 587 01:15:06,560 --> 01:15:14,990 because you get to see the harm you're doing when you consider that many people are doing it, but you're doing just as much harm either way. 588 01:15:16,370 --> 01:15:24,020 But those really are different cases, because those are about what weight we should give to benefits and harms that are trivial or imperceptible, 589 01:15:24,470 --> 01:15:27,700 and that's got nothing to do with the priority view. John. 590 01:15:27,930 --> 01:15:31,820 John. All right, that's mine. 591 01:15:32,270 --> 01:15:38,690 Sorry, my question. 592 01:15:39,770 --> 01:15:44,050 And I'm sorry I had to come in late. Yeah, me too. 593 01:15:46,160 --> 01:15:51,230 But let me ask you anyway. So you focus on these two outcomes, one compared with the other, 594 01:15:53,450 --> 01:16:00,680 and I think you think that the first one is worse than the second. 595 01:16:02,770 --> 01:16:07,640 No, it's not worse; it can't be worse for anybody. 596 01:16:09,260 --> 01:16:13,280 And yet you say on the sheet that the principle that gives you that conclusion 597 01:16:13,470 --> 01:16:17,570 could comply with this weak stipulation, 598 01:16:17,580 --> 01:16:21,739 the approach on which an outcome can't be worse unless it's 599 01:16:21,740 --> 01:16:26,270 worse for someone. So that seems a contradiction. 600 01:16:27,410 --> 01:16:30,980 Could I ask Derek this question? Derek? 601 01:16:31,280 --> 01:16:34,490 Yeah. This is about, this is about levelling down. 602 01:16:34,640 --> 01:16:38,890 Yeah. And it isn't actually written down here. 603 01:16:38,900 --> 01:16:47,240 But you think that levelling down can't be better in any respect than not levelling down?
604 01:16:49,010 --> 01:17:00,680 No. My claim was that it's not a decisive objection to egalitarianism that on this view it's in one way a change for the better 605 01:17:01,040 --> 01:17:02,920 if the better off suffer misfortune. 606 01:17:05,650 --> 01:17:12,110 As Ingmar says, there are other cases in which it's right to claim that something is in one way better, although it's better for no one. 607 01:17:12,500 --> 01:17:20,360 And the egalitarian might say, well, I think that's true here too. But quite a lot of people find that implausible. 608 01:17:21,080 --> 01:17:26,330 And the point about the priority view is that it doesn't imply that it's in one way better. Okay? 609 01:17:26,870 --> 01:17:34,010 So you do actually accept that levelling down can be better in some respect? 610 01:17:35,070 --> 01:17:49,220 No, I think it's implausible, but people like Tim can say, well, yes, it is. It would just be nice to have your view about it. 611 01:17:50,240 --> 01:17:56,520 Well, my view is that it is an objection which has some force, but I don't think it's decisive. 612 01:17:57,090 --> 01:18:04,920 I think we have to decide whether the priority view, which avoids this implication, is in other ways more plausible. 613 01:18:05,250 --> 01:18:11,280 And, I mean, here's one point. I read quite a lot of anti-egalitarian literature. 614 01:18:12,480 --> 01:18:21,240 And what they're all talking about is that there's this, you know, impersonal thing, inequality, and it's got nothing to do with anyone's well-being. 615 01:18:21,750 --> 01:18:25,080 And the priority view does wholly avoid those worries. 616 01:18:25,800 --> 01:18:31,620 So all I meant was, look, some people find this a consideration in favour of the priority view. 617 01:18:31,990 --> 01:18:40,290 There's an objection to telic egalitarianism which has some force and which is completely avoided by the priority view. 618 01:18:40,440 --> 01:18:46,680 That was my claim here, but I don't want to overstate the levelling down objection. 619 01:18:47,130 --> 01:18:50,880 Yeah, may I ask you... Yeah. All right. Let me take it step by step. 620 01:18:51,210 --> 01:19:01,440 I was claiming that, on prioritarianism, this one would count as better than that, which implies that that outcome is worse. 621 01:19:02,460 --> 01:19:15,560 Yes. And I also say that if prioritarians don't accept the view that coming into existence is a benefit, 622 01:19:19,070 --> 01:19:25,370 they're claiming that that outcome is better, all things considered, even though it's better for no one, 623 01:19:25,700 --> 01:19:31,310 and that will contradict the weak claim here. 624 01:19:31,700 --> 01:19:35,240 Yeah. Which you believe we should reject, rightly. 625 01:19:36,890 --> 01:19:43,670 I don't have a view myself. I was just pointing out that it creates a problem for what I'm... 626 01:19:43,670 --> 01:19:49,999 Does your version of prioritarianism conflict with this weak claim, because you think that the one on the left is worse than the one on the right, 627 01:19:50,000 --> 01:19:51,620 which contradicts this restriction? 628 01:19:52,340 --> 01:20:03,730 No, I think that on prioritarianism, which is the view I... you know... 629 01:20:04,110 --> 01:20:07,400 You know, what I meant is your view about P. Ah. 630 01:20:07,700 --> 01:20:13,490 So P itself, as described, says that the left is worse than the right. 631 01:20:15,860 --> 01:20:26,870 The left and the right? Yes.
But that in itself is in conflict with the weak person-affecting claim, unless 632 01:20:27,530 --> 01:20:34,460 you take the view that bringing somebody into existence benefits those people. 633 01:20:34,550 --> 01:20:40,130 So, as I see it, the one on the left is supposed to be worse than the one on the right. 634 01:20:41,060 --> 01:20:49,070 Now, unless we take the view that bringing people into existence is good or bad for them, there is no one for whom the one on the left is worse than the one on the right. 635 01:20:49,550 --> 01:21:01,820 And not on this view, unless you take the view, which I would be inclined to take, 636 01:21:02,000 --> 01:21:08,570 namely that the non-existent are worse off, because in this case they have a benefit, 637 01:21:08,840 --> 01:21:16,660 but in that case, in this case, they have neither anything of value nor anything of disvalue. 638 01:21:16,820 --> 01:21:21,020 So they're actually neutral. So the non-existent people are still... 639 01:21:22,640 --> 01:21:29,390 Yes, comparatively speaking, worse off, though not if they would not exist. 640 01:21:30,510 --> 01:21:35,810 Yes. But it would only be true over there that they don't exist. 641 01:21:36,500 --> 01:21:44,870 There's nothing intrinsically good about non-existence, but there is something intrinsically good about this existence, 642 01:21:44,870 --> 01:21:50,410 even though it's very small. And you might say this is true of them, but you haven't got anybody to refer to. 643 01:21:53,270 --> 01:22:01,190 I mean, it could be the case that we can't talk about specific individuals, because, you know, 644 01:22:01,350 --> 01:22:13,879 there are non-existent people so non-existent that they're just not actual. We have time for one more. I have a question about 645 01:22:13,880 --> 01:22:22,160 this position about non-existent people on the priority view, and specifically in the context of transgenerational justice. 646 01:22:22,250 --> 01:22:26,660 So I'm trying to ask what prioritarianism says about time slices as opposed to whole histories. 647 01:22:27,140 --> 01:22:35,300 So take a situation where you have 100 people, but in 20 years, because more people have come into existence, 200 people. 648 01:22:36,650 --> 01:22:41,440 And then you're assessing justice at two different time slices: the time slice, say, 649 01:22:41,630 --> 01:22:45,530 where you have 100 people and not yet the following hundred people, 650 01:22:45,980 --> 01:22:47,060 and the time slice 651 01:22:47,300 --> 01:22:58,040 where you have 200 people. For the prioritarian, then, compare outcomes between situation one, that is, before the further 100 people have been born, 652 01:22:58,510 --> 01:23:01,820 and situation two, where those hundred people will exist eventually. 653 01:23:02,300 --> 01:23:07,730 Oh yeah, certainly. If you're comparing two ways in which history might go, 654 01:23:07,730 --> 01:23:13,470 one somewhat better for people in the twentieth century but somewhat worse for people in the twenty-second, I think, 655 01:23:13,490 --> 01:23:17,780 and the very same people would be better or worse off in the twenty-second or the twenty-third, 656 01:23:18,380 --> 01:23:23,330 there's no difficulty about comparing those, because you've got the very same people existing in both outcomes. 657 01:23:24,140 --> 01:23:32,090 I was merely saying that the principle doesn't apply to cases in which you don't have the same people in the outcomes.
658 01:23:32,570 --> 01:23:37,130 So suppose, say, that the consequences of one's actions 659 01:23:37,520 --> 01:23:43,250 could, in hindsight, result in 95 people existing in the second situation rather than 100. 660 01:23:43,580 --> 01:23:46,600 Oh, then there are going to be different people existing, different people, 661 01:23:46,610 --> 01:23:49,780 and the priority view no longer applies. No, it doesn't apply to them. 662 01:23:49,790 --> 01:23:59,480 So even the circumstance we're talking about may have a direct bearing on whether the comparison 663 01:23:59,510 --> 01:24:02,080 between the two outcomes can be made. 664 01:24:02,220 --> 01:24:08,240 Yes. I'm just trying to clarify exactly which principle we're talking about and its exact scope. 665 01:24:09,680 --> 01:24:15,490 I'm sorry. Well, it just doesn't apply to cases in which different people exist in the different outcomes. 666 01:24:16,470 --> 01:24:23,980 It's only intended to apply to same-people cases. And I can see someone might say, well, you know, you have to have a view about these other cases. 667 01:24:24,000 --> 01:24:37,200 Yes, but they're much harder. And because the priority view could be implied by a wide range of quite different, wider principles, 668 01:24:37,530 --> 01:24:41,730 you can be fairly confident that it's the right view about same-people cases, 669 01:24:41,910 --> 01:24:46,680 even though you haven't yet worked out which is the wider principle from which it should be derived. 670 01:24:47,220 --> 01:24:52,020 That's my view. So please join me in thanking Ingmar.