The topic of my talk is what connexion there is between an individual's moral status and his or her proper — and that means morally proper — political and legal status. Now, I'm not a fan of moral status talk, but that's all right — no offence to the conference organisers. I think we can minimise the damage as long as we're very clear about what we mean by it. So I'm going to use "moral status" in the verdictive sense. The way I'm going to think about it is that an individual's moral status is just a summary way of referring to all the verdictive facts about that being, i.e., all the facts about what we do and do not owe to that being — what kind of treatment we owe that being, or what we owe with respect to other things that being cares about.

So my thesis, as you probably guessed from the title of the talk, is that the connexion between moral status so understood — moral status in the verdictive sense — and proper political and legal status is tenuous or weak. Now that I've explained what I mean by moral status, another way to express my thesis would be this: the fact that we morally owe or don't owe a variety of things to an individual does not have much by way of implications for what their political or legal status ought to be. It does have some — "very little" doesn't mean "none" — but in my short time here I'm going to focus on what I think will be the more controversial aspect of my view, which is where the connexion is absent. I think the standard view would recognise quite a few places where the connexion is present, and I agree with that; I'm only going to focus on the areas where I disagree with the common wisdom — where other people think there is a connexion and I think there isn't. So to start, I'm going to talk about political status, then legal status, then run through a few implications, and then I'll be done.

In political philosophy, I endorse a weak form of political liberalism. I'm not sure it's ever been enunciated before, but here it goes. My version says that the fact that something is good never grounds a reason for the state to pursue it. I have an argument for this version of political liberalism, and that argument also, in effect, grounds half of my thesis — the thesis that there is a tenuous connexion between moral status and proper political status.
I think that when we consider what the state ought to do, we have to focus on what kind of thing the state is. I think it ought to be uncontroversial — I don't know if it is uncontroversial, but it ought to be — that the state is an institution. Borrowing from a philosopher named Michael Hardimon, I'll define an institution as an ongoing, self-reproducing structure that includes offices and positions, the occupants of which can change over time.

In order to help us think in an unbiased way about what the institutional nature of the state implies for what the state ought to do, we should set the state aside and think about other institutions and what their institutional nature implies about what they ought to do. I think it's uncontroversial that corporations are institutions, so let's think for a second about what the institutional nature of a corporation implies for what it ought to do.

Suppose you set up a corporation. Then you have a question: how are we going to use it? Now, there are some moral constraints that are simply given. If you are the owner of a corporation, you're not allowed to use it to kill people. That's just a given: morality sets certain constraints on what you may or may not do with your corporation. But there's quite a bit of leeway. Brute morality doesn't really answer that many questions about what ought to be done with the corporation. What does answer those questions? I think we have to look to how and why the corporation exists in order to get more answers.

I think that, as with all other institutions, the sustaining of a corporation is a joint action made possible by shared intentionality. What I mean by a joint action is that it's a numerically single action that many people are doing; and when I say it's made possible by shared intentionality, what I mean is that the way that action is accomplished is via the various people doing it having individual intentions that mesh with each other in the right way.

So when we think about this corporate analogy and about how corporations are sustained, we at least may plausibly arrive at the conclusion that the corporation is to be used so as to serve the ends of those who sustain it.
So we get something like the shareholder theory of corporate obligation. I know that's controversial — not everybody accepts the shareholder theory — but if they don't accept it, chances are the reason is that their understanding of which set of individuals partakes of that joint intentionality is more expansive. I hope most, if not all, of us can agree that whatever that set of individuals is — the set that sustains the joint intentionality that makes the corporation's sustenance possible — that is who the corporation ought to be used for: what the people employed to work for the corporation, the managers, et cetera, ought to do is serve the ends of those who sustain the corporation.

I think the same is true of the state. There is a set of individuals who sustain the state through joint intentionality, and therefore the state is to be used to serve their ends — again, within moral limits: even if they want to use the state to go around killing people, that's not OK.

Now, with that much on the table, we can return to the question of political status and who should have it. The state has all sorts of powers and does all sorts of things, and one of the things it does is allocate status. As with all the other levers at the disposal of the state, I think the most general thing we can say about how political status ought to be allocated is that it ought to be allocated in a way that serves the ends of those who sustain the state. So if and when it's true that some individual ought to be accorded political status, the reason it's true is that doing so serves the ends of those who partake of the joint intentionality that sustains the state. It's never the individual's moral status that does the grounding work; it's always something else, something quite different.

So really, there are two ways it can be morally required of the state that it accord political status to some individual. One is that the individual him- or herself partakes of the shared intentionality that sustains the state — operating here under the assumption that, although it's a contingent truth, it will be a nearly exceptionless truth that those who sustain the state will want political status for themselves.
The second way is that the individual is central to the ends of those who partake of the shared intentionality that sustains the state — in other words, that they are cared about: their good is thought of, amongst those who partake of this shared intentionality, as central to why they are partaking of it, why they're going to the trouble.

Now, I admit that what I've said here seems to open the door to institutionalised bigotry. It seems that what I'm implicitly allowing is this: if a group of people sustain a state and they're all bigots, and there's some other group of people whose well-being they don't care about at all — as a matter of fact, they'd be happy to see these people do really badly — then their bigotry actually justifies withholding political status from this other group of people, or non-people, whoever these other individuals may be. But ultimately, I think this is a question of excluding individuals from partaking of shared intentionality, and I never said that was OK. Indeed, I never said it was possible — never mind morally OK — for a group of people who want to partake of shared intentionality with each other to prevent other people from so partaking. Nothing I've said commits me to saying that the bigoted project is morally permitted even to get off the ground.

OK, so that's the end of what I have to say about political status. Moving on to legal status: if we ask whether an individual's moral status entails that they ought to have some kind of legal status, basically we're asking a question about the truth of legal moralism. Now, the kind of legal moralism we are being forced to talk about here is not your father's legal moralism; it's not Hart–Devlin legal moralism. For those of you who know the Hart–Devlin debate, it's not about whether the fact that some moral proposition is widely accepted, or is part of the social morality, entails that the law ought to embody or enforce it. This is new-school legal moralism: whether the fact that some moral proposition is actually true entails that the legal system ought to be structured so as to embody it or enforce it or discharge it, or whatever.
When it comes to this kind of legal moralism, I take my cues from a legal philosopher named R. A. Duff. Duff has a very helpful analogy, I think, for thinking about philosophy of law. What he proposes is that before we can know what content the law ought to have, we need to first think about why we have such a thing as the law at all. And I think that's absolutely right. He demonstrates the usefulness of this way of approaching it by pointing out that if we wanted to know what content some other body of rules ought to have, we would definitely stop to first think about why we have that body of rules at all. His analogy is bodies of rules for professional societies, and he often comes back to medicine as his key example.

Take an organisation like the American Medical Association, the main professional body for physicians in the United States. Let's say they had a clean slate: imagine a time when they had no code of conduct for doctors yet and were trying to come up with one. How would they have started to think about what content their code ought to have? Well, Duff proposes — and I think this is right — that the first thing they would have had to think about is: why practise medicine? What's medicine for? What are we doing it for? Once we have an idea of what we're doing medicine for, then we can think about what a good set of rules governing the conduct of medicine would look like.

Under the supposition that that sounds intuitively right, Duff wants to say we should say the same thing about legal systems. Why have legal systems, as opposed to not having them? What Duff says is that, just as the AMA's code of professional conduct is to govern the conduct of physicians as physicians, the law is a code of conduct to govern the behaviour of citizens living together as citizens living together. So, analogously — just as in the case of the AMA the first thing you have to think about is why do medicine at all — the first thing we have to think about here is: why live together? Why live together in some sort of organised way, with political structures, instead of living in anarchy?

We don't need to give an answer to that question. We can actually get pretty far in thinking about what the law ought to say just knowing that that question needs to be asked, never mind actually answering it.
We can just put in a placeholder for whatever the answer is — whatever the answer is to why live together under political structures instead of having anarchy. The placeholder word for it, or at least Duff's placeholder word for it, is the res publica: that is, the body of ends that we together are trying to achieve through political means instead of leaving it to anarchy to solve. What the law is for, then, is to promote and make more likely the achieving of the res publica. And this essentially makes legal philosophy a branch of political philosophy. I think that's entirely right and proper — that it should be conceived of in that way. We need to do a little political philosophy first, i.e., we need to determine what the res publica is, and then we can find out what sort of laws we ought to have.

So if it's right that legal philosophy is a branch of political philosophy, then I've kind of already answered the question about legal status by answering the question about political status. My weak political liberalism — my claim that it's never the fact that something is good that grounds the fact that the state ought to pursue it, and so, even more clearly, that it's never an individual's moral status that grounds any fact about what status the state ought to accord that individual — entails a legal liberalism. The way I'm going to define it is this: the fact that some action would be bad for someone, or would wrong someone, never grounds a reason for us to use the law to address it. What does ground reasons for us to use laws to address things is that the doing of those things undermines the res publica.

So we get two conclusions from this: it's never an individual's moral status that grounds their proper legal status, and legal moralism is false. I'm something more like a legal liberal — indeed, a legal liberal in a sense even stronger than Joel Feinberg, who is considered sort of the standard-bearer of legal liberalism. Interestingly, Duff himself ends up endorsing legal moralism, but I think he just didn't properly follow through the implications of his own argument. But that's by the by.

All right, so that's it for my defence of the idea that moral status has very little to do with proper political and legal status. So what are the policy implications?
Well, I think at least a few of the policy implications of what I've said will be quite intuitive — they will accord with people's pre-theoretical intuitions. But I don't have that much time, so I'm just going to tell you the controversial implications, not the uncontroversial ones — and even then only a couple of them, because of limits on my time. I think the most interesting implications of what I've said so far concern what we are morally permitted to do. There are also implications for what we're morally required to do, but I want to focus on the implications for what we're morally permitted to do.

If we want to extract some implications from what I've said, we have to ask at least these two questions: which individuals partake of the joint intentionality that sustains the state — call that set of individuals X — and which individuals are such that their good is central to X's ends? The reason we have to ask those questions is that those are the individuals for whom we're going to have a good argument that we are morally obligated to accord them some political and legal status.

I'm going to focus on artificially intelligent machines, just because I think it's interesting. I submit to you that AIs — at least the ones that get talked about, the ones that are most likely to exist first — will be ones that cannot partake of joint intentionality. The reason is that I think partaking of joint intentionality requires seeing other individuals as rational agents. Remember, joint intentionality involves each individual who partakes of it having one or more intentions that mesh in the right way with the intentions of others. And I think that if you look at what experts in this area of social metaphysics will tell you — even people like Bratman, who has quite weak requirements for shared intentionality — they will tell you that the intentions have to have some sort of second-personal reference: they have to be intentions to do something with others. And I think AIs can't do that; they can't have thoughts like that. The reason I think so goes back to a distinction I think we owe to P. F. Strawson's "Freedom and Resentment".
I could be wrong about the provenance. It's the distinction between two different ways of viewing other objects — and by "objects" I include living things, and even intelligent living things. We can view other objects as things to be managed or as things to be reasoned with. That is a very basic, fundamental distinction between two ways of approaching other individuals. Hurricanes, I think, all of us view as things to be managed; it wouldn't make sense to view them as things to be reasoned with. They just are what they are. We can try to predict what they're going to do and then react accordingly, in whatever way best promotes our ends. Other human persons — of average intelligence and maturity, at least — we can, and often do, interact with as individuals to be reasoned with. We don't just take what they're going to do as a given and then try to protect our interests as best we can; we think we can engage with them and propose reasons why they ought to change their intentions, and things like that.

My understanding of AIs is that they only do the former: they interact with all of us in that first way, and exclusively in that way. For that reason, they cannot partake of shared intentionality, so they are not amongst the individuals who partake of the shared intentionality that sustains the state. I would also propose — and this is an empirical claim — that they are not central to the ends of those who do partake of the shared intentionality that sustains the state.

So one implication, then — and this is making a big leap; there are a bunch of intermediate premises that I'm leaving out just for the sake of time — is that I think it's permissible to withhold citizenship from them, along with all the things that go along with citizenship, like the right to vote. I also think it's permissible to make no law criminalising the destruction of AIs. Again, I'm leaving out a whole bunch of intermediate premises, but I do think that by a series of sensible premises we can get from what I've just said about joint intentionality to that conclusion.

So that's just a taste of some of the implications we get out of what I've said. I have to stop there in consideration of time. I look forward to your questions, and thank you for listening.