Very good. So we've got Professor Marcus Gardiner, who's an associate professor at the University of Miami School of Law. It's good to have you come all the way from Manhattan and speak to us on the topic that he's been working on. He works in international law, in this field in particular, and in fact he has worked at the Israeli Supreme Court, where he worked for perhaps the most distinguished president of that court, Aaron Barak, who handed down numerous judgements on international humanitarian law and, of course, was at the Supreme Court in 2003. I told Marcus yesterday that I have, you know, great admiration for his academic work, but for me his most distinguished accomplishment (and I'm not sure he took this as a compliment, but I intended it as a compliment) was that in 2008, just before we met, actually, he made his way all the way from Germany to China. How many thousand miles? Just under 10,000 miles. And I asked, "and back again?", and he said no, just one way. But thank you very much for coming over to speak to us.

Well, thank you for inviting me, and let me thank you for the great privilege. At some point, I hope you will see that my writing is also some kind of accomplishment. But let me just jump right into this. The title of my talk is "The Battlefield From Afar: Independently Operating Weapon Systems and the Law of Armed Conflict".

So the military landscape, I think, has undergone a considerable number of changes. We see asymmetric warfare: the traditional paradigm of protracted interstate war with large armies on both sides has, to a certain extent at least, been replaced by opponents which are much more flexible. And finally, the means with which armed conflict is carried out have already undergone significant modifications, one of which I'm going to be talking about today, namely what we know as "drones", quote unquote, and what we see in the future, namely autonomous weapon systems.

So, as a brief overview: after this introduction I want to talk very briefly about the current state of technology and future trends, and then go into what I call dehumanisation in two dimensions, namely the legal and the moral dimension, before offering some concluding remarks.

The phenomenon that we're most familiar with is probably the unmanned aerial vehicle, which to a very large extent flies reconnaissance missions. A number of the smaller missions that are carried out are armed attacks. The operator sits thousands of miles away, depending on who carries out the operation:
if it's the Air Force, in Nevada; in the case of the CIA, in Virginia. But regardless of whether an unmanned system operates in the air, on land, or on or under the water, they share one characteristic: they are the visible piece of a network that operates, at least to this point, with direct human input. The next generation of unmanned systems, however, is designed to operate fully independently of human input. That means target selection, target acquisition, and the decision whether to employ a weapon system, and what type of weapon system, at any particular moment in time will be made by an unmanned system. The technology is either in place or in the process of being developed, yet military leaders and political leaders repeatedly claim that this will not happen, that the human will remain "in the loop", so to speak.

I think that mantra is starting to wane, and certainly the last report by the Department of Defence of the United States is very much indicative of that when it speaks of a completely autonomous weapon system being deployed about 20 years from now.

Now, let me go back very briefly to the history. There were certain vehicles before and during World War Two that operated remotely. They did not operate autonomously, but they did operate remotely. The first one that I'm going to show you is this: Nikola Tesla developed this contraption and offered it to the US and UK navies, which both declined, for various reasons, partially because they were blue-water navies. The German military in World War Two took this machinery on very handily. It also created what is called the Goliath: basically a bomb strapped onto this contraption, which drove along connected by a line (not remotely operated by other means, but controlled through a wire) and would then explode next to a tank. All of these were single-use and didn't do much in terms of extending human capabilities as far as what we're going to be talking about today is concerned.

But at this time already, what we saw were statements to the effect that this war may have been fought by heroes flying around in planes, but the next may be fought by aeroplanes with no men in them at all. That statement may have been premature, but we're slowly coming to see this reality.

Now, let me just offer a very brief definition of what I'm going to be speaking about. I'm going to follow the Department of Defence again and say that it is an electromechanical system that is able to exert its power to perform designed missions and includes no human operator aboard, and the system is designed to return or to be recoverable.
That's an unmanned system. The autonomous element thereof is where the unmanned system receives its mission from the human and accomplishes that mission with or without further human-robot interaction. The level of that interaction, along with other factors such as mission complexity and environmental difficulty, determines the level of autonomy for the unit. So what you see are already different types of autonomy: remote-controlled, as of today; semi-autonomous, also already in existence; and truly autonomous systems, which are still in the future.

Also worth remembering is that many of the systems we're talking about operate on a spectrum that may go from remote control all the way to autonomous. The so-called Aegis system, which some of you may be familiar with and which I'll talk about in a second, is one of those. This particular system became famous, or slightly infamous, when it shot down Iran Air Flight 655 in 1988 off the Iranian coast. The system, operating in a semi-autonomous manner, could have been interrupted by the crew, but wasn't, because it indicated that the approaching airliner was an F-14. The system engaged and ultimately shot down the plane, leading to the death of 290 people. Further systems that operate in a similar fashion include the Patriot missile system, which shot down a British Tornado in southern Iraq in 2003.

And so what you see is a development towards more autonomy. NATO has said that currently, at best, this type of autonomy is very ambitious and at worst improbable to achieve. The US takes a very different stance on this and moves ahead.

So what we see are a variety of scenarios. On the left you see the current systems and future systems. Current systems operate with one operator; in reality it's more than one, far more than one, at least two for flying and targeting purposes at this point with Predator drones. That runs all the way to a yet-to-be-determined time where the total number of pilots that you see here, 154 in total, would control up to over a hundred drones. This is an Air Force slide; it's not a slide that I made up myself.

So what are we talking about? We're talking about reconnaissance and combat. In terms of aerial vehicles, we're talking about vehicles that have a very long flight time, that operate from ground bases and communicate over satellites with their ground stations. Again, this could be half a world away.
They are both reconnaissance assets as well as combat assets, firing the so-called Hellfire missiles but also laser-guided bombs of up to 500 pounds. This is the next generation, which is not yet deployed and just had its maiden flight not long ago. On the ground, and on ships, you see systems such as these installed; this is the land-based version of the same system. And then you have unmanned ground vehicles, so far used mainly for ordnance detection, but also designed to carry loads that otherwise would have to be carried by humans. Military planners like them, of course, because they increase the ability of soldiers to cover greater distances. And of course, if you see a platform carrying machinery, you also see a weapons-based system that is designed to operate autonomously in the future; it's the very same platform that you see. The final pictures are smaller robots that are already in operation. All of these are still remote-controlled. And then, of course, there are some naval-based versions.

Now, what are the rationales for deployment? Force multiplication is an obvious reason why these systems are being employed. The expansion of the battlespace: combat can be carried out over a larger area than before. Reach: longer reach of a military weapon system usually means longer times being spent, say in the case of UAVs, over a certain target area. Reduction of casualties. After 2001 this took off to a much, much larger degree than before because of the need for reconnaissance in asymmetric warfare. It is being said that we no longer have armies standing on both sides; rather, you have asymmetric warfare, which means that you may have to follow particular individuals or groups of individuals for a longer period of time. That increases the need for reconnaissance, so the narrative goes, and therefore increases the need for these vehicles. They also possess higher capabilities in terms of detection. Some say it's a morale boost: there are various interviews with British Army personnel who say it's a morale boost to have ground vehicles next to you, or to know that there are aerial vehicles above that watch out over you. You also, of course, have the rationale of the reduction of costs. Think about pilots; this is the obvious example. It costs a lot to train a pilot. If that pilot gets shot down and passes away, then you incur costs.
But you also have social security and pension payments. You eliminate the costs associated with upkeep, basing and recruitment, quite apart, of course, from the human costs involved. Finally, should the network be taken out (this is then the next step), you need greater autonomy in any such system, meaning we're moving away from remote control towards fully autonomous.

But consider this, on the other hand. A current system operates to distinguish between friends and enemies based on the sound a weapon system makes: say, an AK-type rifle sounds very different from an M-16. I've never been in the military, but I'm told people can distinguish that very easily. The program is designed to distinguish based on those characteristics. You can easily make up scenarios where that dichotomy breaks down: an allied soldier may carry a Kalashnikov, an allied soldier may have to pick up a Kalashnikov because he or she ran out of ammunition, or, vice versa, an enemy comes in, picks up an M-16, and is no longer engaged.

Now, the extent of the use of unmanned systems in current conflicts is dubious at best to ascertain, because there are no clear numbers. If the fiscal side is any indication, the U.S. Department of Defence will spend more than 5.4 billion on unmanned aerial vehicles alone in 2010. Compare that number with roughly 200 million in the 1990s, and you see a sharp increase in the money that is being allocated. Before Congress, in a hearing, a member of the administration said that every second of every day 40 Predator-series aircraft are airborne worldwide, while the hours that various UAVs are flown by the Air Force more than tripled between 2006 and 2009 and now stand at 295,000 hours per year. That is still very minuscule compared to aeroplane flight time, but the indication is that it's growing exponentially at this point.

The future of autonomous weapon systems is complicated, and I'm not going to talk much about it, because I am a lawyer and not an engineer. But there are indications that in about 10 to 15 years, intelligence experts say, technology will have reached a point at which systems will be able to act fully independently. I simply assume right now that this will happen, and again, the indications from the Department of Defence are that this is absolutely true.

Now let me turn to what I call dehumanisation one, the legal dimension.
New technologies have always posed a challenge to the law of armed conflict. The law of armed conflict rules were designed with an anthropocentric paradigm in mind. Combatants squaring off face to face is certainly no longer the traditional paradigm. And regarding unmanned systems, we see roughly two tendencies, two schools of thought, on the question of how to deal with them, legally speaking.

One is that unmanned systems require the creation of a completely new legal regime, and that the existing law is inadequate. The other school of thought says the existing body of rules is sufficient to capture these developments, and that it is essential to design unmanned systems that are compatible with the existing framework rather than the other way round. According to that school of thought, there may not really be a need for additional legislative action if it turns out that the approach taken by the body of the law of armed conflict overall, i.e. one that usually does not focus on a single weapon system or technology, is adequate to deal with this new development.

Additional Protocol I spells out this requirement in more detail in Article 36, which requires that any employment of a new weapon, means or method of warfare that a country studies, develops or acquires shall be prohibited if it runs counter to the principles laid down in the law of armed conflict. Now, more concretely, Article 48 of AP I, which some of you, I'm sure, are familiar with (I'll just read that one out loud), puts forth some requirements: in order to ensure respect for and protection of the civilian population and civilian objects, the parties to the conflict shall at all times distinguish between the civilian population and combatants and between civilian objects and military objectives, and accordingly shall direct their operations only against military objectives.

So what you see is this general rule being fleshed out: number one, with regard to the principle of distinction; number two, with regard to the principle of proportionality. And then there's an underlying aspect that I think permeates the law of armed conflict, which I will deal with subsequently, namely that combat be carried out in a humane fashion.

The law of armed conflict, I think, is best described as a tension between two elements, namely military necessity on the one hand and humanity on the other. There's considerable disagreement over where on the continuum between those two poles the balance should be struck, and there's disagreement over to what extent circumstances may play a role in finding that balance.
Advances in military technology (the topic of this talk), the acceptability of civilian casualties in the court of public opinion, and the role accorded to state sovereignty all play a role in this. There appears to be a tendency, I would submit, to interpret this area of the law in a less military-centric way than in the past, and rather in one that takes humanitarian considerations into account to a greater extent, at least, than it has in the past. Some claim that this is evident from the change in designation that this legal field as such has undergone: we spoke of the law of war in the past, that moved on to the law of armed conflict, and now it is oftentimes designated as international humanitarian law.

This is somewhat counterintuitive, I think, if you look at the large-scale atrocities that we've all witnessed and seen in Cambodia, Somalia, the former Yugoslavia, Sierra Leone, Afghanistan and the Congo, all of which have had civilians at the centre of the action in many, many instances.

Whether computers, which, I would like you to recall, are much better at quantitative analysis than at qualitative analysis (you have heard me say this a number of times), are capable of making this distinction remains an open question. And while there have been impressive advances in cognitive technologies, it remains to be analysed whether the principles of distinction and proportionality can safely be entrusted to a digital code consisting of zeros and ones.

Now, the principle of distinction: essentially, in simple terms, it mandates that military attacks distinguish between civilian targets and military ones. This clear theoretical line of distinction is oftentimes blurred by the circumstances of any particular engagement. A variety of provisions in AP I speak to this notion, namely that indiscriminate attacks are prohibited under Article 51(4). Furthermore, Article 52(2) tries to take account of the fact that military objectives are only those objects that by nature, location, purpose or use make an effective contribution to military action and whose total or partial destruction, capture or neutralisation, in the circumstances ruling at the time, offers a definite military advantage.

The element of use in this particular provision makes clear that the law of armed conflict incorporates a dynamic element, in that civilian objects may become military ones if they are being used by the enemy for military aims. The same applies, obviously, to individuals, in that a civilian can potentially be considered to directly participate in hostilities. However, the flip side, of course, is that even military bases could include non-military targets.
Imagine any military base and you will see what I mean: housing of civilians or of military families, for example, on military bases.

What does all of that mean for purposes of unmanned-system deployment? The system must, and it must do so under all circumstances, be capable of determining whether it can properly engage in accordance with Article 51(4) of the Additional Protocol, for those states that have ratified the treaty, of course; those that have not must follow the parallel rule of customary international law. How do we achieve that? Well, the sensors, the guidance technology and the ability to react to circumstances that may quickly change are important elements. In that context, the system must also determine whether a particular weapon that it may carry at any moment in time is able to respect that distinction through its engagement; i.e., weapons that are too large or that do not discriminate would be prohibited under the law of armed conflict. And let it be clear that even outside of the legal system, there is very little clarity in all of this.

Peter Singer describes a situation in which a high-ranking general in the US gave orders to destroy a compound in Afghanistan despite the presence of civilians, and a considerably large number of them, because insurgents had entered and left openly carrying weapons. The presence of insurgents, the general said, should have been a signal to the civilians that the compound was now a legitimate target; after a few hours, the civilians should have realised this and should have moved away.

In my mind, this is a good example of the potential shortcomings. If a high-ranking general interprets the provisions of the law of armed conflict pertaining to distinction in an, at the very least, dubious manner, it is not clear that the same mindset does not enter the code that may eventually determine what to do in similar situations. The machine will not say no to the orders of a commander; it will simply carry out what has been fed into it. So it all depends on the particular input.

Now, the principle of proportionality creates similar challenges. The autonomous unmanned systems that we're talking about take the challenges that UAVs are currently facing considerably further: they remove the combatant entirely from the individual decision-making process and shift the burden of the decision-making process to the programming stage of the system software.
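To make concrete what shifting that burden to the programming stage might look like, here is a minimal, purely illustrative sketch, not drawn from any actual system, of an engagement routine that encodes the acoustic friend-or-foe heuristic mentioned earlier and that defaults back to a human operator when it cannot establish distinction. The class names, weapon signatures and confidence threshold are all hypothetical assumptions.

```python
# Illustrative sketch only: a toy engagement-decision routine, not any real system.
# The classifier output, confidence threshold and escalation path are hypothetical.

from dataclasses import dataclass
from enum import Enum


class Decision(Enum):
    HOLD_FIRE = "hold fire"
    ENGAGE = "engage"
    ESCALATE = "refer back to a human decision maker"


@dataclass
class Contact:
    weapon_signature: str          # e.g. "ak_family" or "m16_family", from an acoustic classifier
    classifier_confidence: float   # 0.0 to 1.0
    civilians_nearby: bool


FRIENDLY_SIGNATURES = {"m16_family"}   # assumption: friendly forces carry M-16s
HOSTILE_SIGNATURES = {"ak_family"}     # assumption: enemy forces carry AK-pattern rifles
MIN_CONFIDENCE = 0.95                  # hypothetical threshold


def decide(contact: Contact) -> Decision:
    # Conservative default: if the system cannot make the determination,
    # it hands the decision back to a human operator over the network.
    if contact.classifier_confidence < MIN_CONFIDENCE or contact.civilians_nearby:
        return Decision.ESCALATE
    if contact.weapon_signature in FRIENDLY_SIGNATURES:
        return Decision.HOLD_FIRE
    if contact.weapon_signature in HOSTILE_SIGNATURES:
        return Decision.ENGAGE
    return Decision.ESCALATE


# The dichotomy breaks down exactly as described earlier: an allied soldier who
# picks up a Kalashnikov is classified as hostile with high confidence.
print(decide(Contact("ak_family", 0.99, civilians_nearby=False)))  # Decision.ENGAGE
```

The point of the sketch is simply that whatever assumptions or shortcuts inform the rules are executed without question; the machine will not say no, it will carry out what has been fed into it.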
The legal basis for the proportionality principle, as most of you may know, is again Additional Protocol I. There is no actual mention of the proportionality principle, but it finds reflection in a number of provisions. Again, I will read some for those who are not as familiar. Article 51(5)(b) of Additional Protocol I prohibits an attack which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated.

Now, "excessive" is the trigger term here. It is simply not clear what that term means in the abstract; it can only be determined in the specific circumstances of a particular situation. Again, Article 57(2) speaks of damage that would prove to be excessive in relation to the concrete and direct military advantage anticipated.

Some have suggested that the discrepancy between loss of life, injury and damage to objects on the one hand and the direct military advantage anticipated on the other must be clearly disproportionate. The insertion of such a requirement, at least in my mind, does nothing to solve the problem and, if anything, adds further confusion; the language simply does not bear that out. Reference to Article 8 of the Rome Statute, which some authors make, is not very helpful either. The one good thing I can say for my own purposes right now is that I don't have to decide this particular debate, because regardless of where you come out on it, the machine that we're talking about, an autonomous unmanned system, would have to be able to determine exactly what is excessive and what is not excessive in a particular circumstance at a particular moment in time.

Now, proportionality obviously plays a part in target selection. The program would have to be designed to anticipate all potential decisions, either by programming all of them in, which is impossible to do, or by designing a decision-making process and decision rules that are capable of making such decisions with a myriad of factors to be weighed. Proportionality also plays a role in determining what extent of civilian losses is acceptable in any given situation; the law of armed conflict does not, as such, prohibit the taking of civilian lives. The system would also need to take account of which weapon should be deployed: an autonomous weapon system in the future would have to know which weapon has what type of effect on what kind of terrain, under what type of circumstances.
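To see why that is hard to reduce to code, here is a hypothetical sketch of the Article 51(5)(b) test; the harm and advantage scales, the weapon options and the "excessiveness" threshold are all invented assumptions, which is precisely the difficulty, since the treaty supplies no such numbers.

```python
# Hypothetical sketch of a machine-readable proportionality check.
# Every constant below is an assumption a programmer would have to invent;
# the law itself gives no numeric scale for "excessive".

from dataclasses import dataclass


@dataclass
class StrikeOption:
    weapon: str
    expected_civilian_harm: float       # invented scale for incidental loss, injury and damage
    expected_military_advantage: float  # invented scale for concrete and direct advantage


EXCESSIVENESS_THRESHOLD = 1.0  # hypothetical harm-to-advantage ratio deemed "excessive"


def is_lawful(option: StrikeOption) -> bool:
    """True only if expected incidental harm is not 'excessive' in relation to
    the concrete and direct military advantage anticipated (Art. 51(5)(b) AP I)."""
    if option.expected_military_advantage <= 0:
        return False  # no concrete and direct military advantage anticipated
    ratio = option.expected_civilian_harm / option.expected_military_advantage
    return ratio <= EXCESSIVENESS_THRESHOLD


def select_weapon(options: list[StrikeOption]) -> StrikeOption | None:
    # Among options passing the check, prefer the least expected civilian harm;
    # if none passes, the system would have to decline and refer back to a human.
    lawful = [o for o in options if is_lawful(o)]
    return min(lawful, key=lambda o: o.expected_civilian_harm, default=None)
```

The sketch merely restates the legal question as arithmetic; whether a given strike is "excessive" is exactly the kind of qualitative judgement that the invented numbers above pretend to settle.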
All of this may be easy in the abstract, but the closer proximity of civilians compared to the past may make this determination much more difficult now and in the future.

Finally, if any of these decisions could not be made, the default in such a system would have to be: I cannot make this decision; I will have to revert back to a human decision maker, who would have to have all this information. Again, we're talking about a network that would have to shoot that information back to a human operator.

Now, all of these rules refer to qualitative assessments rather than quantitative ones. What this means is that fully autonomous unmanned systems may be used in situations in which a target is completely remote and the situation presents the potential for minimal or no civilian involvement at all, i.e. in cases of certainty over such circumstances at the beginning of the mission. This in turn would preclude the use of fully autonomous unmanned systems in all other situations, which, I submit, constitute the very large majority of the cases that we're talking about. The case that I just outlined is extremely rare and can be expected to occur only very, very infrequently. So this aspect of the legal framework is highly subjective, even more so than the principle of distinction; military expertise and experience as well as legal expertise are essential to the decision making. And even engineers who write in this particular area concede that the principle of proportionality clearly highlights the difference between quantitative and qualitative decisions and the need for human decision making.

Now, let me very briefly touch on command responsibility. I will keep this extremely brief: who is to be held responsible in case of a violation of the law of armed conflict? If I start with the machine itself, and I hope you will bear with me, there are proposals out there that the machine itself should be punished. I don't know how that works; how you establish criminal responsibility when most legal systems require intent is beyond my comprehension. The traditional deterrence factors do not work vis-a-vis an autonomous unmanned system, and deterrence is a powerful element of criminal law and ultimately of the law of armed conflict. A machine cannot be punished, nor does it possess moral agency. So there are a variety of factors that militate against holding the machine accountable, quote unquote.

Then there is the scientist or the programmer who developed the software upon which the robot relied.
The design is ultimately the foundation upon which the robot makes its determinations. But as a programmer, I would want to be indemnified from criminal responsibility once I hand that particular system over to the military. Military officers, finally, who set the parameters for a given engagement are probably the ones who would have to be held accountable. Again, this debate is far from settled, and I intend my comments right now as conversation starters.

Finally, I very briefly touch in the paper upon state responsibility. I don't take it as highly controversial that the state that employs a particular weapon system would be held liable under the rules of state responsibility. A secondary question that may flow from this is whether, if the software code, for example, was faulty, a state can recover any liability from the designer of that particular system.

Finally, there are institutional concerns that I want to raise. The International Committee of the Red Cross has been fundamental in the past in shaping the rules of the law of armed conflict. There are very few indications until now that the ICRC has recognised the potential future challenges that autonomous unmanned systems actually represent. If, as Singer reports, the response to the phenomenon is "there is so much terrible going on these days, why waste time on something crazy like that", then that is deeply disconcerting. And if the developers and manufacturers of today's systems are not tied into a network that gives, at the very least, some rough guidance on the principles that permeate the law of armed conflict, this development is even more disconcerting, because software oftentimes is very much front-loaded: you need to design the software in a certain way, one that takes account of these concerns early on.

Now, let me move on to the moral dimensions of autonomous weapon systems, namely what I call dehumanisation through the removal of the individual from the battlefield, question mark. Historically, we've always seen military law as an anthropocentric endeavour. Taking the person out of the equation by relying upon a computer chip or software code could increase the likelihood that a state will resort to force, as its citizens are not necessarily being placed at risk any longer. Note, for example, the increased reliance of developed states on bombing campaigns in the late 20th century, NATO's bombing campaign in the Balkans and the US and UK bombings of Iraq.
Asaro examines all of this in terms of whether autonomous unmanned systems alter pre-conflict proportionality calculations and the resort to war only as a last resort. I made a similar argument with respect to private military contractors: if the loss of human life is an impediment to going to war, the argument there is that it's no longer our soldiers, no longer the blood of the nation, that is being lost, but the dogs of war. If the loss of human life is an impediment to going to war, then sending an army of machines to war rather than friends and relatives may not exact the same physical and emotional toll on the population. If the cause is just, one could celebrate this new calculus, which more readily permits legitimate self-defence. But this reduced cost may in turn reduce the rigour with which non-violent alternatives are pursued and thus encourage unnecessary and therefore unjust wars.

Second, there may be a dehumanisation through the decision making taking place through an algorithm based on zeros and ones. This problem is not quite the same as the one that the removal of soldiers has raised through, for example, being in a plane and not seeing the devastation that one's action may cause. But as this distance increases, it oftentimes becomes psychologically easier to deal with the aftermath of one's actions, and if we remove ourselves one step further by creating a code that carries out that particular action, you see where the argument is heading. Now, the question is, does it really take the person out of the equation? You can say it is the operator or the military personnel who have decided to set the parameters for a particular engagement.

There is also a concern about what I call the loss of the battlefield. Individuals no longer fight on the battlefield but rather are 20 minutes away from home. There is a fairly vivid description of how the drone operator lives very close to where he works. There is no longer a physical distance, nor even the decompression of the flight home that used to be the case after engaging in combat. And so 20 minutes, make it an hour, after having engaged in combat by remote control, he or she sits down with the kids over dinner. For some soldiers, that has created a considerable amount of stress.

Now the question turns to whether we can create ethical robots. There are a number of researchers who say yes, that is perfectly possible, and one author in particular in this area has become fairly prolific: Ron Arkin.
Arkin argues that the use of unmanned systems will increase ethical behaviour on the battlefield, for the following reasons. First, an autonomous unmanned system does not need to protect itself. Second, the systems are emotionless, and nothing can cloud their judgement. This is very similar to an argument put forth by Walzer, namely that fear and hysteria are always latent in combat, often real, and they press us toward fearful measures. Taking out the human psychological element removes the potential for what is called scenario fulfilment, which is believed to have contributed to the incident I mentioned before, the downing of Flight 655 by the USS Vincennes in 1988. That phenomenon leads to distortion or neglect of contradictory information: in stressful situations, humans use new and incoming information in ways that fit only their pre-existing belief patterns, a form of premature cognitive closure. Autonomous unmanned systems, so the argument goes, can be developed so that they are not vulnerable to such patterns of behaviour.

Another argument is that, due to the increase in sensing abilities, unmanned systems will have the ability to observe all relevant aspects, or if not that, certainly a greater ability than humans. The data can arise from multiple remote sensors and intelligence sources, all of which have to work together through what the US Army is developing at this point, the Global Information Grid, which in itself is a technological challenge at this point.

Then there is an argument which I will let you decide what to make of: when working in combination with humans, so the argument goes, you may have the ability to reduce the number of unethical behaviour patterns by humans through reporting mechanisms. Autonomous unmanned systems have the potential capability of independently and objectively monitoring ethical behaviour on the battlefield by all parties and reporting infractions that may be observed. There is an anecdote that this actually happened in Iraq, whereby a ground-based vehicle filmed a US soldier kicking a civilian in the head; the soldier was later reprimanded and punished. So you see where the arguments are coming from.

Finally, Arkin notes that in 2006 the US Army studied the behaviour of its forces and found that approximately 10% of its forces had knowingly mistreated civilians at some point or another.
Based on that, Arkin suggests that so long as artificial intelligence does not attack civilians or friendlies more than the current rate, namely 10% of the time, armed artificial-intelligence-driven robots would be preferable to human soldiers, because they would lower the overall potential for harm. The question that I would simply pose is whether this is really the benchmark that we want to employ in these situations.

Before I come to my conclusion, there is one final concern that I want to raise, namely what I call the feedback loop: the impact that autonomous weapon systems may themselves have on the law of armed conflict. The indeterminacy of the rules of armed conflict, as you have hopefully gathered from what I tried to explain, especially as it pertains to the principle of proportionality, may have consequences that go beyond the immediate decision over whether or not to attack a particular target. Generally speaking, indeterminacy of legal rules of course undermines their efficacy, and this may be even more so in the realm of the law of armed conflict, in which situations oftentimes require expeditious decisions, decisions which may also concern the lives of human beings.

Whether, as Thomas Franck posits, the principle of proportionality has truly matured to an advanced degree is, I think, a matter of debate. But following Franck, the argument would be that the increase in transparency that these technologies bring with them also increases the compliance pull exerted upon decision makers within states. However, the reverse may also be true: as this material is oftentimes classified and access highly controlled, the increase in transparency may not be as great as anticipated. Even more important is the potential that the rules, as they are being reshaped through the introduction of autonomous unmanned systems, may actually give less traction to proportionality over time, namely because the inability of autonomous systems to carry out certain functions could be brought forth as an argument to circumvent the strictures of the law of armed conflict. Given that some authors and military lawyers follow a reading of the proportionality principle under which it prohibits attacks in only some cases, the potential, I think, for decreasing traction is not wholly without basis.

Now, the Judgement Day scenario that I hope did not come to mind when you heard the title may not be a realistic concern, but I think it would also be foolish to dismiss the legal and ethical issues arising from armed autonomous weapon systems.
The technology to implement such devices is currently either available or in the process of development, and in the near future advanced militaries will have the capability to employ such devices at their disposal. The path they choose to follow could undermine decades of development in international humanitarian law and human rights unless care is taken to ensure, with a reasonable degree of certainty, compliance with international legal principles by the code that underlies the autonomous unmanned system. This will, as has become evident, be difficult or impossible to achieve, namely because the current design architecture does not build these concerns in at the front end of the projects currently under way. The law is far from clear, sometimes even indeterminate; some would say, in the US parlance, void for vagueness. Furthermore, attacks that have been carried out may have been approved by high-ranking military officers, but their legality may still be questioned.

It may be too late to take a wait-and-see approach, and maybe the genie is truly out of the bottle. But sometimes it may be a good thing to be a beaver, and that refers exactly to the statement by Donald Rumsfeld. With that, I'll close. Thank you for your time and your attention.