Thank you very much. Thank you for inviting me, and thank you all for attending, both in person and online. I am very excited and honoured to be here today.

As you can see from the title, and as is the reason why you're here, for the next half hour or so I will be speaking to you about autonomous weapons and superior responsibility. This presentation is based on an article that I am finalising, and I will try to explain things as clearly as possible for an audience that is not made up entirely of lawyers, which is also why I prepared this short PowerPoint with some keywords that will hopefully help you follow along my reasoning, which at times can perhaps be a bit complex. I welcome all comments, questions and requests for clarification at the end. I should also add, as a disclaimer, that I am recovering from a cold, so my voice is not at its best; if at some point it drops, please let me know.

Before getting to my main argument, perhaps I should give you a bit of background about how I got to this topic in the first place. Like many lawyers, I am not particularly familiar with the technical aspects of autonomous weapons. But what I do find interesting about the advent of these technologies is that they prompt dozens of international lawyers to dissect some old rules that we had, in a way, taken for granted. They make us question their different elements, and they make us ask how these old rules could potentially apply in light of these challenges.
For instance, if you take international humanitarian law, the advent of autonomous weapons is making us question whether, in order not to violate the principle of distinction and the other targeting rules and standards, targeting decisions need to be made by a human, or whether they can be delegated to an artificial intelligence. And in the realm of international criminal law, I think the doctrine of superior responsibility offers the opportunity to think about some elements of this doctrine that we did not necessarily have the chance to question before.

Superior responsibility is, as you are probably all aware, a criminal law doctrine that entails the responsibility by omission of a military commander or a civilian superior who failed to prevent or repress the crimes of their subordinates. We find it in Additional Protocol I to the Geneva Conventions, and then in the statutes of the International Criminal Court, the ad hoc tribunals, the hybrid courts and so forth.

Superior responsibility is a bit unique in international criminal law, and because of this it has attracted some criticism, because it is a form of sui generis liability. It has been described as a fallback option: when everything else fails, when the charge would not stick under any other mode of liability for people in a leadership position, then we go for superior responsibility. In this sense, superior responsibility accords with our moral intuitions.
Those in a leadership position who could have done something to prevent or punish the crimes of their subordinates and did not do so should be held to account. But on the other hand, the criticism is that there is a gap between the culpability of the superior, who simply omitted to act, and their actual responsibility, which is for the crimes, the war crimes, of their subordinates.

Now, in all of this, enter autonomous weapons. As you might know, different stakeholders have been engaged in a series of debates about these weapons: what characteristics should they have, how to regulate them, how can they be used in compliance with international law. From the perspective of criminal law, the debates have focussed on the difficulty, or even impossibility, of ascribing criminal responsibility to an individual when autonomous weapons are employed, the so-called responsibility gap that is created by the use of autonomous weapons.

Now, various proposals have been put forward as solutions to this problem, real or perceived, of the responsibility gap, and one of them is superior responsibility. The way superior responsibility has been presented as a solution to the responsibility gap is either to consider autonomous weapons themselves as subordinates whose crimes need to be prevented or repressed, or otherwise to rethink and change the doctrine so as to accommodate, in a way, machine criminality.
I am not particularly convinced by either of these solutions, and I will tell you why in the course of my presentation, which is mostly a response to the first kind of proposal: those who believe that we can interpret superior responsibility as entailing the responsibility of superiors for crimes committed by the autonomous weapons themselves as subordinates. I will focus on what I believe are the challenges, or some of the challenges, to the applicability of the doctrine in these circumstances.

The first is that the type of control that humans exercise over autonomous weapons is different, from a qualitative perspective, from the type of control that is needed for a superior-subordinate relationship to exist, which in turn is needed for superior responsibility to apply. Second, I argue that another challenge is represented by the underlying crime: the crime that needs to be prevented or repressed must have been committed by a punishable subordinate, and it needs to be a crime committed in all its elements. And finally, I will explain how, if we focus instead on the superior's supervisory duties, we can still find some use for the doctrine of superior responsibility within the debate concerning autonomous weapons.

So let's start with the first element, the superior-subordinate relationship. This is the main challenge, I think, to the applicability of the doctrine, and I will come back to this. It emerges even if we look at how the doctrine has been codified for various purposes.
For instance, in Additional Protocol I and the various statutes, we see that the people the doctrine refers to, the superiors and the subordinates, are identified in various ways. There are the superiors and their subordinate forces; there are the subordinates, people or persons; there are members of the armed forces under the superior's command, or otherwise persons under the effective command and control of a superior, and so forth. Based on this wording, we never really had any doubt that all of these words clearly identified individuals, and that therefore the superior-subordinate relationship would be an interpersonal relationship between individuals.

I think that this interpretation still stands, and it is, in a way, underlined by going back to the origins of the doctrine of superior responsibility. Today we know that the doctrine applies to military superiors, or commanders, and to non-military superiors, but the origins of this doctrine are in the military sphere. The doctrine of superior responsibility is the criminal law corollary of the principle of responsible command. And responsible command means that commanders must foster a command culture that is in line with international humanitarian law: they must train their troops, discipline them, and sometimes threaten them with punishment, so as to make sure that they will use force only in ways that are lawful and permissible in line with international humanitarian law.
Responsible command also implies a certain level of organisation and the imposition of discipline. Within responsible command structures, authority can of course be delegated, and legal obligations can be delegated, but they can only be delegated to another responsible moral agent. And in case obligations are not complied with, someone will be responsible, and compliance with those obligations will still be overseen by someone; so we have the responsibility of the superior as a consequence.

I think there is really no question about the fact that autonomous weapons are not responsible agents, so we should not be giving them any obligations under the law. We cannot threaten them with punishment. Of course, they can be programmed in compliance with international humanitarian law, and they should be programmed to be used in line with it. But I don't think that this is the same thing as educating troops and fostering respect for the command culture on which the principle of responsible command is based, because autonomous weapons certainly do not understand the legal character of prohibitions.

It is also important to note that in order for a superior-subordinate relationship to exist, we need what is called effective control of the superior over the subordinates. We know this from the case law of international criminal courts and tribunals. Of course, you know that the term effective control in international law means many different things.
In the context of superior responsibility, it means something very specific: the material ability of the superior to prevent or repress the commission of the crimes, or to submit the matter to the competent authorities. And the case law of international criminal courts in particular lets us come up with a list of indicative factors that might show the existence of effective control. You can see them on the PowerPoint. For instance, they include the power to issue orders to engage in hostilities, the commander's capacity to ensure compliance with orders, the power to order units to redeploy, to remove or replace subordinates, to initiate investigations, and so on.

Now, according to some authors, these factors can be translated to the type of relationship that a commander would have with an autonomous weapon. I think instead that these factors are quite clearly anthropocentric in nature; of course, they were developed in the first place with the relationship between individuals in mind. But I think that, for instance, adjusting an algorithm that guides an autonomous weapon, or deciding to withdraw it from one place and deploy it elsewhere, are not indicia of effective control. It is not the same as redeploying troops; it is not the same as issuing orders for troops to act in compliance with international humanitarian law.
These decisions are similar to decisions made about any other type of weapon in the arsenal, and yet it is only in relation to autonomous weapons that it is suggested that these decisions instead show that the weapons themselves are subordinates for the purposes of superior responsibility, and that someone has effective control over them as they would have over their subordinate troops.

I think this is a fundamental misconception of the nature of effective control as opposed to the relationship between humans and any type of weapon, including autonomous weapons. Under the law as it stands today, when autonomous weapons are used in warfare, they are not combatants; they are not somewhere between combatants and means of warfare. They are weapons, tools. They are objects of the law, not subjects of the law, so they are not addressed by legal obligations or prohibitions. And the type of control that is exercised over these weapons is not different from the type exercised over any other weapon.

Here I refer you to the work of Tim MacFarland, a lawyer who also understands and knows about the technical side of autonomous weapons, so someone with more authority than me to speak about these things.
And I think he has convincingly demonstrated that autonomous weapons do not differ in any legally relevant way, on the basis of their physical properties or functional aspects, from other types of weapons, and that the type of control that is exercised over how autonomous weapons are operated, and how their use is planned and executed, is the same as for any other weapon. For instance, in operations, decisions will be made about what type of weapon to use, what type of explosives to employ for a certain attack, the range of the weapon, the blast radius and so forth. And in a way, these are all decisions that are not dissimilar from deciding what level of autonomy to use in a certain operation when autonomous weapons are employed.

In the realm of autonomous weapons specifically, we speak about autonomy in the sense that a weapon can be human-supervised, fully autonomous or semi-autonomous, and autonomy is basically a function of human control over the weapon. But it is still a function of control that is a technical type of control over a tool. It is not the type of control that a commander has over their troops.

We also speak about, for instance, meaningful human control or sufficient levels of human control, and states are still trying to decide what exactly they mean by this. But all of these notions are employed to mean the level of human control that is needed over these weapons.
These weapons must remain tools that do what you want them to do.

And so it is on the basis of all of these arguments and this research that I conclude that a weapon is not a subordinate for the purposes of superior responsibility. I do think that the doctrine remains applicable with respect to the human subordinates who operate or supervise a weapon and their superiors. But then the responsibility of the human operator needs to be established in order for the derivative responsibility of their superior to also be established.

And this brings me to my second point, which is about the underlying crime. In relation to the underlying crime that needs to be prevented or repressed by the superior, two issues arise. The first is that the subordinate needs to be susceptible of punishment; the second is that the crime must have been committed in all its elements.

So, as I mentioned before, autonomous weapons are not considered to enjoy legal personality. They are not addressed by criminal rules or international humanitarian law, and they cannot be punished under criminal law. It is true that in some jurisdictions there are entities that are not individuals but have been granted legal personality, for instance corporations in some states.
But if we take, for instance, the doctrine of respondeat superior in the United States, according to which a corporation itself can be punished under criminal law for the acts that its agents have undertaken on behalf of the corporation, I think this presents basically the opposite problem to the one we have with autonomous weapons. In the case of corporations, the responsibility of the corporation still stems from the actions of a natural person, the agent; in the case of autonomous weapons, instead, we are trying to ascribe responsibility to an individual for the acts of the autonomous weapon, including on the basis of superior responsibility.

We should also keep in mind that in many other jurisdictions, for instance in Germany, criminal law is instead rooted in the principle of blameworthiness. This means that an actor must be able to decide between doing right and doing wrong, and can only be punished if they are blameworthy, that is, if they have decided to do something wrong. This principle excludes the criminal responsibility of legal entities, and it would also exclude the responsibility of machines, no matter how autonomous they may be, because they will never be able to be blameworthy in this sense.

The principle of blameworthiness would also affect, however, the possibility of holding a human operator or their superior responsible, if it was not possible for them to control the weapon, for instance because they were out of the loop, or if they did not understand how the technology works.
This could be, for instance, because the weapon is based on machine learning and its workings are inexplicable to the operator. In these circumstances, the operator would not be blameworthy for the acts of the weapon, and if we were nonetheless to hold that operator's superior responsible, we would be exacerbating even further the gap between the culpability of the superior and their responsibility, which I mentioned at the beginning.

Consider also that superior responsibility arises for the failure to repress, which includes punishing the subordinates and submitting the crimes to the competent authorities. Punishment is sort of the most visceral corollary of legal personality, and autonomous weapons cannot be punished: their "punishment" could only be, for instance, reprogramming or deactivation, and this would not pursue any of the goals that we normally associate with punishment in criminal law. So the superior simply cannot exercise this duty. If you cannot punish the weapon, you cannot exercise criminal repression over the weapon. The superior can also not submit the matter to the competent authorities, because the weapon cannot be put on trial, and so forth. And this means that if the material ability of the superior to punish is not there, there is also no effective control in the sense of superior responsibility, which brings me back to my initial point about the superior-subordinate relationship.
Relatedly, when we think about the underlying crime for which the superior's responsibility would arise, we should ask ourselves whether this has to be a crime committed in all its elements, namely the objective element, or actus reus, and the mental or subjective element, or mens rea, or whether the actus reus alone is enough. And generally the answer to this question is yes, we do need all of the elements of a crime to be present; otherwise it is not a crime. Not every wrongdoing is a crime, after all.

According to some, instead, when it comes to autonomous weapons it would be sufficient for only the actus reus to be committed, without the mens rea requirement. Take, for instance, the war crime of attacking civilians, which consists of intentionally attacking civilians. If an autonomous weapon does so, so this argument goes, it is sufficient that civilians have been attacked; it is not necessary to prove that the machine did so with intent.

I don't think that there is any reason to change the interpretation of what we mean by the commission of a crime when it comes to autonomous weapons. The commission of a crime, I think, implies that it is committed in all its elements. The only partial exception is when the subordinate can avail themselves of an excuse, although the interplay of justifications and excuses is even more complex than what I have been saying so far, and is only recognised under criminal law to a limited extent.
However, I think it is pretty clear that autonomous weapons cannot form the required mens rea for a crime; they cannot act with intent. Again, the answer is linked to the notions of blameworthiness and culpability, and I think that these are inherently human notions.

And if autonomous weapons are thus unable to form the mens rea, maybe the correct analogy is with dangerous animals. But if you look at countries that do have laws on dangerous animals, the responsibility of the owner of a dangerous animal is some form of strict liability; it is not a parallel of superior responsibility. The fact that an underlying crime must have been committed in all its elements also affects the possibility for the operator to be held responsible, because they too might not have formed the required mens rea, and as a consequence this person's superior would not be responsible for the wrongdoings of the weapon.

Now, so far I have told you about all the reasons why I think superior responsibility is not the way to go. I still think that it can be useful in the discussions concerning autonomous weapons if we think about the supervisory duties of the superior, that is, those obligations that stem from the duty to prevent. Again, from the case law of international criminal courts and tribunals, we have some examples of measures that can be taken to prevent the crimes of the subordinates.
And I think that at least some of these measures can be interpreted, again of course with respect to human subordinates, so as to keep autonomous weapons within human control and within the human chain of command and control.

For instance, take the obligation to give adequate training to the subordinates. Of course this means making sure that the subordinates, and again I am talking about humans, are aware of the rules of international humanitarian law. But it also means making sure that they are trained in how to use the means and methods of warfare at their disposal in compliance with international humanitarian law. And if these means of warfare are autonomous weapons, it means they need to understand how the system works, how to operate and supervise it, when it is not functioning, and what to do when it is not functioning. And in turn, the commander must be able to understand sufficiently how this technology works so that it can be employed in accordance with international humanitarian law and, if anything goes wrong, they should prevent the use of the weapon.

Take also the issuing of orders aimed at bringing the relevant practice in accordance with international humanitarian law. This can be done in different ways when it comes to autonomous weapons.
For instance, commanders can order their subordinates to use a specific level of automation for a certain attack, or they can circumscribe the geographical or temporal use of the weapon so that it is used in compliance with international humanitarian law. Superiors also have a duty to prevent further or recurring crimes, and in this sense securing reports about the use of the weapons in past attacks is important, because the first time that a weapon malfunctions, this might not be a war crime. But at that point the operator is on notice, and their commander is on notice as well, that something is potentially going wrong. And at that point the commander might even have to order that the weapon not be used until it can be reprogrammed or adjusted.

Now, to conclude. I understand the appeal of superior responsibility as a solution to the responsibility gap, but I do not think that in its current formulation the doctrine allows us to close it, and I have told you why. Of course, the doctrine could change as a matter of treaty or customary law. I don't see this happening, but even if it did, I think we would run the risk of turning superior responsibility into a form of strict liability or criminal negligence, so that anyone who is in putative control of the weapon would be responsible as a superior for whatever malfunctioning happens. And I don't really think that this is a development we should wish for.
So instead of closing the responsibility gap, I think this would create further problems concerning the culpability of the superior. And I think that, instead of creative interpretations of criminal law, we should focus on supervisory duties and on making sure that the weapons remain under the adequate control of humans, under a human chain of command and control, so that one person is responsible and that person's superior can also be responsible. At the end of the day, the responsibility gap is closed when we fix the technical specifications of the weapon and we agree on a sufficient level of human control, not by reinterpreting criminal law. Thank you so much for your attention.