Today we're very, very fortunate to have with us our very long-time friend and colleague, Professor George Lucas. George holds one of the great chairs in military ethics, I think it's fair to say, in the United States: the chair in ethics at the United States Naval Academy in Annapolis. And he's also professor of ethics and public policy at the Naval Postgraduate School in Monterey, California. His topic today will be an extraordinarily topical one: permissible preventive cyber warfare.

Thank you, David. Good afternoon, everybody. And thank you. It's a great honour to be with you. Despite the exotic topic this afternoon, I need to warn you first that I'm not going to show you any fancy film clips of killer robots running amok, or of the devastation of the globe in a cyber conflagration of some kind. In fact, if there's any residual confusion on this, I'm going to have to disappoint those of you who may have been confused on this point. I do come from California, at least part of the year. I do work on robots and cyber war. And my name is George Lucas. Despite all of those things, you know, I'm not the guy. I figured when I saw this great audience this afternoon: oh, they're surely expecting my neighbour to the north, the famous filmmaker. No such luck. And if you doubt that, because I've been told that we bear some physical resemblance to one another, I have to say that I take a bit of umbrage at that.
He's quite old. His beard and hair are grey, and he's let himself go. But if you still doubt it, you can use some of the techniques we'll talk about this afternoon and check the meagre balance of my retirement account. That will prove beyond a shadow of a doubt that I am not he.

So the title of the talk today was intended to be provocative, by suggesting that there might be occasions when a cyber war, if there is such a thing as a cyber war (some doubt that there is), can be justifiably fought or pursued, and that those occasions may even include cases of what would normally be considered unethical and illegal wars: preventive wars of self-defence. A better title than the one I've picked, however, would probably have borrowed from the example of my colleague Henry Shue, and of another colleague, Peter Asaro at the New School for Social Research in New York, both of whom published papers with titles posed as questions. In Henry's case: is a just preventive war possible? Asaro published a paper asking: is a just robot war possible? So this is really the question we're addressing: is a just cyber war possible?
To answer that, I begin with the caution of the great theorist of conventional warfare, Carl von Clausewitz, who warned at the end of that magisterial work of his that every age has its own unique kind of war, its own peculiar preconceptions, its own limiting conditions. And that's what we want to look at this afternoon: what are those unique conditions, those preconceptions, those limitations that attend cyber conflict?

To give you an outline in advance of where I want to go with the talk, I'm going to suggest that there are several really unique and troubling features of cyber war to which we need to attend more carefully. The first is the one that always strikes us as so extraordinary, and which we've only been coming to terms with in the last few years, through the warnings we've been getting from a number of quarters about our vulnerabilities in the cyber domain: and that is that any otherwise ordinary object or artefact, or mundane process or conventional procedure, industrial or otherwise, if it involves computers controlling or guiding it in any way, running software, and is connected to the internet, then that object, that process, that procedure can in principle be weaponized.
So the most ordinary kinds of things that we take for granted, from cell phones to ATMs to our own laptops, to, of course, more extraordinary and complex systems and procedures such as a nation's energy grid, or its financial system, or its system of traffic regulation, or air traffic control, or maritime security, or logistics, food supply, health care: any of these things can in principle be weaponized. That is, they can be turned against the people who rely on them, depend upon them, and generally take them for granted, and used to harm them in some way. The nature of that harm, and the degree to which such harm is possible, is another of the questions we need to wrestle with in cyber war. So point one is: if everything can be weaponized, cyber war is a ubiquitous kind of warfare. And the other piece of it that is ubiquitous is the claim that anyone, any of you sitting here, any individual in the world, in his pyjamas, in his bedroom or a home office, or any organisation, criminal, legitimate, commercial or otherwise, or any nation state, anyone can engage in cyber war. Or so the claim goes. So cyber war is ubiquitous in two senses: anything can be weaponized, and anyone can utilise those weapons. That's, at first glance, a rather startling and frankly frightening prospect.
And there's been no reluctance on the part of a number of people to frighten us with what those prospects are, and to try to convince us that the harm that can be done is very real and very severe. If any of you have seen, for example, this book by Richard Clarke, the former White House counterterrorism and cyber-security adviser: Cyber War: The Next Threat to National Security and What to Do About It. That's an example of a knowledgeable and very dramatic presentation of what the prospects of cyber warfare are, and the degree of our vulnerability to them. It's a scary scenario, particularly as he begins to describe the kind of violence and destruction that could in principle occur in cyber war. Chlorine gas escaping from chemical plants near urban areas. Trains going off the tracks. Cars running into each other at intersections. Planes falling from the sky. On and on it goes. These pictures are designed to show what the comparative devastation would be in a nuclear attack on the level of Hiroshima, over on the far side here, compared to a cyber attack on infrastructure like a dam. In this case, what I've portrayed are the Glen Canyon and Hoover Dams in the United States, which in principle, again, if you listen to these arguments, could be subject to a cyber attack that would cause the major electric generators to overheat and finally explode, damaging the dam.
The water pressure would unleash a torrential flood, and the energy released would be orders of magnitude more than a nuclear explosion, resulting in untold death, destruction and immiseration of the civilian population. So that is the kind of portrait that's painted. And the other piece of this that I think you need to keep in mind is the claim that this is all going to be done, presumably, by folks in the employ of al-Qaeda reporting to Ayman al-Zawahiri, their current leader, or perhaps by two or three people in a small flat in Hamburg, or, even worse, the worst nightmare of all: it'll be your next-door neighbour's teenage son, up in his bedroom, who's going to do all this.

And it's at this point that I want to say that we need to take a deep breath, step back, and, having been alerted to our vulnerabilities, think a bit more carefully about just how severe those are. I think a lot of this has begun to move from appropriate warning and caution to a kind of cyber hysteria and threat inflation that probably doesn't do anybody any good and may actually do some harm. It may inappropriately allocate resources. It may inappropriately shift power from one branch of government to another. It may protect certain areas of research, defence and security from the normal kind of budget scrutiny that all the rest of us are going through.
So there are self-serving elements to this that need to be carefully borne in mind, as well as the need for a realistic assessment of the threats, which we have yet to come up with. I think there's a reason behind this, which I have failed to mention thus far and which is important to keep in mind, why all this is as confusing and conceptually inchoate as it is right now: and that is that most of the warnings have come from, and most of the weapons and strategy have been developed by, people whose primary vocation is in the intelligence and security communities rather than in the military profession. That's not to say the military aren't deeply interested, but initially uniformed military personnel were deeply sceptical of the threats of cyber war. They pooh-poohed them and ignored them for a long time, and finally were persuaded that they were real. And now, of course, in the way we often do, we're going gangbusters and throwing money at it like there's no tomorrow. And again, that's because there is a realistic assessment of some threats, particularly when you're involved, as my organisation, the United States Navy, is, in something that over the last decade has been called network-centric warfare. Suddenly realising that all your advanced technologies, from Predators to satellite communications to unmanned autonomous systems of other sorts, could be hacked, disabled and compromised, used against you, or interfered with in various significant ways, is deeply unsettling.
But I call your attention to the intelligence dimension, that is, the espionage dimension of this, because the people who are involved in that kind of work come at the nature of inter-state conflict in international relations with a very different set of assumptions than do people who are trained primarily as military combatants. For example, the law of armed conflict is not seen to apply to the intelligence community; it doesn't have any real bearing on what they do, nor do they pay much attention to it, whether they should or not. So another question is just what, if you will, the cultural presuppositions are that go into thinking about this kind of conflict, and how to prosecute it, if you will, or defend against it. The assumptions are those of the espionage agent or the covert agent and the intelligence and reconnaissance personnel, not the combatant. And I think that's important.

And another problem that comes up is that the distinctions between all the things I've listed up here, ranging in a kind of continuum from mischief and vandalism, to self-interested crime, to espionage, sabotage and terrorism, all the way up to war, those boundaries are not very clear in the cyber realm. And if you listen to people like Clarke, and many others who've written on this, one minute they're talking about something that looks like full-scale warfare with massive loss of life and harm.
The next thing you know, they're talking about really extraordinary acts of espionage, that is, the theft of classified data about national security, or of commercial trade secrets, one of which is spying, the other of which is commercial espionage, or crimes of the trade-secret and patent-infringement sort. And you say: well, wait a minute, aren't those covered by rather different conceptual frameworks for thinking about them ethically and legally? Different legal regimes apply to these, but we move around among them with a kind of reckless equivocation that I think is not helpful. And in particular, I think the threat so often raised of cyber terrorism, that is, of al-Qaeda or some other organisation engaging in the kind of attack that brings down the Glen Canyon Dam, if you will, or the Hoover Dam, or the Three Gorges Dam in China, that's probably unlikely. And we can talk about why I believe that, and whether it's a valid view. But I think for the moment one needs to keep these distinctions carefully in mind, and recognise that for some of them, the ones at the beginning of the list, the ones we're most familiar with, vandalism and hacktivism and mischief-making, as well as organised crime,
we have a fairly robust legal regime in domestic law, and now in international law through the 2001 Budapest Convention on Cybercrime, to try to keep up with cyber criminals and cyber vandals: to track them down, find them, bring them to justice, and cooperate across borders, where in cyberspace there aren't any borders. That makes it very hard. But crime fighting has always been hard, and international crime fighting especially, and we're doing all right in that area, I would suggest. Where we haven't done much thinking and reflecting is in the area where David and his colleagues called us to convene this morning: the other end, if you will, of that continuum, the espionage, intelligence, covert action, sabotage, cyber war side. That's the part that I think is needing some attention.

A very nice article on the ethics of cyber war, one of the few, and of the few I think the best to come out so far, is by my colleague Randall Dipert at the University at Buffalo in the U.S.: "The Ethics of Cyberwarfare," which you'll find in last December's issue of the Journal of Military Ethics. Now, Randy Dipert is very technically sophisticated about how cyber war works, far more so than most of us who come at the ethics side of the question, or the legal side.
And so he's able to talk in knowledgeable detail about what some of the challenges are, and he argues, discouragingly, in his article that conventional just war thinking and international law are completely inadequate and have to be thrown aside, ignored, or radically revised in order to keep up with the new challenges of cyber warfare. I happen to think, once again, that that's an exaggeration, that it's part of the hysteria, but I think it's an important position to reckon with. From the legal side, there's actually a fair amount of good work out on cyber warfare, information warfare and the law. A fellow by the name of Duncan Hollis at Temple University has written some interesting papers. A colleague at the Naval War College, a retired officer whose work Henry Shue knows, Michael Schmitt, has written some papers on ethics and law in cyber war recently. Steven Bradbury, who headed the Office of Legal Counsel in the Department of Justice, gave the keynote address at the annual Harvard National Security Forum, and his paper on developing legal frameworks for defensive and offensive cyber operations will be forthcoming. It's a good survey.
And one more paper has just come out, this last month, in October, by a colleague at King's College London by the name of Thomas Rid. For a while I rather enjoyed the role of being the cyber sceptic, the naysayer who says this is all a lot of poppycock and hyper-inflated, Chicken Little, sky-is-falling stuff. Thomas Rid is even more so than I am. He denies that any cyber wars have occurred or ever will occur, and holds that what goes on in this discussion is, again, a massive equivocation: the problems are crime and espionage, not war. Okay. I think that is too extreme, as you'll see in a moment. But that's a quick survey of some of the literature; there's not a lot out there.

What kinds of ethical questions does it raise? Well, the United States, in my own case, was criticised for not having a very well-thought-through cyber strategy. I think the same debate was had here in the U.K., and so both governments quickly did as governments do: got groups and committees together and churned out documents. I brought with me a couple that have come out in just the past few months. This is a picture of the title page of one of them: the U.S. Department of Defense Strategy for Operating in Cyberspace.
One of the drafters, who would not allow his name to be used but was quoted widely in the press when this came out, said, basically: our cyber strategy, folks, is this: you shut down our power grid, maybe we put a missile down one of your smokestacks. With all the subtlety and finesse for which I and my countrymen are famous, this seems to have put rather a fine point on things. Of course, it leaves open the question: down whose smokestacks will we put those missiles? Given that it's very hard, as I'm sure you know, to determine exactly who has done what, and with which, and from where, and to whom. How many missiles would we propose to put down? That's not addressed in the Defence document. Of course, the real problem is just the kind of harm done in a cyber attack, something about which we are very unclear: what is the nature and the severity of harm that can be done in cyber war? I showed you the graphic pictures that look like conventional bombing attacks, but we're talking more like freezing your bank account. Or, as one other sceptic put it, it would be as if the Russian army invaded the U.S. and then simply stopped you from renewing your driver's licence. And you'd go: oh, wait a minute, how is that an act of war? How is that that harmful? So things like this all go on in a confusing way here. So how many missiles, if any, are an appropriate response to that?
And indeed, does that kind of thing even constitute grounds for this kind of tough, rough, obviously deterrence-oriented talk about what the military likes to call a kinetic response, which we would call bombs and destruction and loss of life? And then there's the question that interests me, and that, interestingly, has also interested the people working in the intelligence community on developing cyber defences and cyber weapons; they're the first to raise it: some of the weapons and tactics we're being asked to develop seem deliberately to target civilians. That's wrong, isn't it? That's against the law, isn't it? This is a computer scientist asking, someone who doesn't really know much about international law; that's not his or her field. But they are raising these questions in very interesting ways. So obviously that one concerns discrimination, for those of you familiar with the underlying tenets of international law. I highlighted all those questions from the defence document in red, because they correspond to some of the underlying principles of the law of armed conflict. There are five of them, three of which seem to be addressed in those preceding sets of questions: what military objective would be served by a cyber attack or a response to it?
How proportionate would such an attack be, and how would that be determined, given that we don't know what harm means in the cyber realm? And how discriminate would we require such attacks, or responses, to be? When we turn to the area with which I'm personally more familiar, which is just war theory itself, that philosophical discussion down through the centuries that forms the historical background and intellectual foundation for international humanitarian law in the Geneva Conventions, the Hague Conventions and military regulation: a decision for war requires a compelling cause or justification, among other things, a public declaration by a legitimate authority, and it has to be undertaken only as a last resort. Well, stop right there. We see already that cyber war presents a real problem, at least for the second and third of these. We have no public declaration; we have plausible deniability. People smiling and saying: we didn't do it, it was loyal citizens of the republic who launched an attack on Estonia, it had nothing to do with the state. The extent to which their authority to undertake such an attack is legitimate, let alone whether the cause is just, is therefore difficult to examine. And of course the great fear in cyber war, given its strange features, is, again, as we were discussing in our roundtable this morning, that because it can be undertaken, at least under some conceptions, without a lot of physical harm or loss of life,
might that make it easier to resort to cyber war as a form of conflict resolution, not as a last resort, but as the first and preferable resort? So, like most recent military technologies, the cyber war dilemma really raises this threshold question: will all these new technologies, robotics and Predators and non-lethal weapons and so forth, just make it easier for nations to fight, because they won't see the costs attached to it? So those are problems. On the other hand, when we look at how war is conducted in the conventional sense, with the restraints there of military necessity and proportionality and the discrimination of combatants from non-combatants, the interesting thing about the cyber realm is that it offers a lot of promise: very precise weapons that can be used only against purely military targets, with little or no collateral damage, and that can be very discriminate, very proportionate. So those are the things I think need to be examined.

Well, how would we go about doing it? We have a methodology in applied ethics. When we face these new and somewhat unfamiliar areas, whether they have been in medicine or business or whatever the areas were in decades past that applied philosophers like your colleagues here have taken on, you start with your pre-theoretic intuitions, your best reasonable positions about what you think constitutes acceptable behaviour.
You test those in actual and in hypothetical cases, and you engage in what the philosopher John Rawls called reflective equilibrium. That is, we essentially argue with each other dialectically about what would seem to constitute reasonable behaviour under the circumstances: trolleys hurtling at defenceless people, or persons devouring one another's kidneys on a desert island, or whatever the case may be. That seems to be the convention of philosophers. That's actually, somewhat more frivolous perhaps, but not unlike what goes on in international law. This is what Randall Dipert, the fellow I mentioned, recommended in the paper I cited earlier: what we need to do is see what people are actually doing, how they're going about it, and how they're reacting to what they do or what others do to them, and follow the principle of the evolution of customary law by looking at the boundaries of acceptable and unacceptable practice. So in a way, I think the methodologies of applied philosophy and international law work together very well on this problem. Let's see what's going on. Let's try out a few cases. Let's look at some actual experiences. And in this case, I'm now going to propose we look at four cases of cyber war.
And I believe they're instructive, in that some of them, not all of them, have been carried out within what might broadly be seen as the bounds of both international humanitarian law and the law of armed conflict, as well as the constraints of just war theory. The four conflicts I have in mind (and I need to flag this for you, so that you'll be clear that not all agree that these were acts of war at all; my colleague at King's College would say none of these rose to the level of warfare: these were conventional wars, this wasn't a war at all, this was just propaganda, and this was sabotage), okay, I'm going to argue that these were recent cases of cyber war. They're in the literature; you can look them up online if you haven't already. I'm sure those of you interested in this already know a lot, probably more than I do, about some of these conflicts. There is the full-scale attack on Estonia in 2007, presumably (as I've marked with a question mark) by the Russians; likewise Russia against Georgia in the battle over the secession of Ossetia in July of 2008. The third is somewhat less well known, because none of the sides, neither victim nor combatant, wanted to admit it had happened: Israel attacked, in a conventional way, a suspected nuclear weapons site in Syria on the 6th of September 2007. All of these, by the way, are dramatically described in Richard Clarke's book, if you would like.
252 00:27:09,030 --> 00:27:14,459 If you're not familiar with them, I'd recommend grabbing a copy off the local newsstand, where it will be, 253 00:27:14,460 --> 00:27:17,010 you know, in the airport bookstore, and it will be prominently featured. 254 00:27:17,430 --> 00:27:23,940 And he has a real gift for describing not just the details accurately, 255 00:27:23,940 --> 00:27:31,050 but also the sense of drama that surrounds this, a kind of excitement, and the exotic features of these attacks. 256 00:27:32,490 --> 00:27:39,150 The middle two were conventional military altercations that were preceded by cyber 257 00:27:40,630 --> 00:27:51,700 attacks of some kind. The first and the last were purely cyber attacks involving no conventional exchange of kinetic forces at all. 258 00:27:52,250 --> 00:28:00,460 Okay. So they're interesting cases. And let me talk about each in turn just very briefly, 259 00:28:01,240 --> 00:28:05,709 again keeping in mind these principles of just war theory: justifiable rationale, 260 00:28:05,710 --> 00:28:10,180 last resort, proportionality and discrimination in the conduct of war. 261 00:28:10,450 --> 00:28:20,110 I would submit that the one you're probably most familiar with, the attack on Estonia, was a violation of all of these principles. 262 00:28:20,200 --> 00:28:29,530 That is, it lacked a just cause. Moving a military statue from the centre of town in a sovereign republic to a place of honour in a military cemetery 263 00:28:29,830 --> 00:28:32,920 is hardly cause sufficient for an act of war. 264 00:28:34,480 --> 00:28:38,130 The attacks were, then, badly motivated. 265 00:28:38,200 --> 00:28:40,359 They were not a last resort. No 266 00:28:40,360 --> 00:28:49,780 real attempts other than hooliganism were mounted to try and reverse the government's decision about the statue that supposedly constituted the cause for war.
267 00:28:50,020 --> 00:28:55,719 There was a successful diplomatic negotiation, actually, that took place before all of this, 268 00:28:55,720 --> 00:29:03,850 and there really wasn't any reason for these acts of aggression other than a kind of hooliganism against Estonia. 269 00:29:04,030 --> 00:29:08,680 And they were indiscriminately and disproportionately directed at the civilian population. 270 00:29:09,220 --> 00:29:10,660 Now, nobody was killed. 271 00:29:11,290 --> 00:29:19,360 To my knowledge, no deaths have been directly attributable to the denial of services that was involved in these attacks. 272 00:29:20,140 --> 00:29:28,120 A lot of inconvenience was suffered, but I don't know that any great harm of a lasting and enduring sort, comparable to conventional war, was done. 273 00:29:28,690 --> 00:29:32,079 This is why Thomas Rid doesn't think this was a war. 274 00:29:32,080 --> 00:29:41,830 And this is also why NATO, when asked under Article 5 of the collective security arrangements, refused to come to the aid of Estonia. 275 00:29:42,730 --> 00:29:48,300 Whether this rises to the level of war is disputed. What it is, is an attack without a just cause. 276 00:29:48,940 --> 00:29:55,600 It wasn't a last resort. It was disproportionate, extreme and indiscriminately directed at civilians. 277 00:29:56,410 --> 00:30:08,290 So I think it fails the test. By contrast, the next two, as noted earlier, 278 00:30:09,280 --> 00:30:15,280 Russia's attack in Georgia and the Israeli attack in Syria: 279 00:30:15,670 --> 00:30:21,370 these strike me as falling within the boundaries of acceptable practice as defined here. 280 00:30:21,910 --> 00:30:30,520 That is, they were part of conventional wars which had legitimate political differences that could justify use of force in those cases.
281 00:30:31,210 --> 00:30:35,680 On our conventional understanding in recent history of when that is acceptable practice. 282 00:30:37,630 --> 00:30:41,560 They were directed entirely at military targets. No civilians were targeted, 283 00:30:41,560 --> 00:30:44,560 no civilians were killed, no civilian infrastructure was damaged. 284 00:30:45,910 --> 00:30:48,130 So they were both discriminate and proportionate. 285 00:30:48,640 --> 00:30:57,250 And both attacks occurred after lengthy negotiations, in the Syrian-Israeli case under the table, so to speak. 286 00:30:57,850 --> 00:31:02,799 You know, Syria denying that it was building, with North Korea's help, a nuclear reactor, very much like Iran now; 287 00:31:02,800 --> 00:31:06,970 you know, the denials occurring on one side, the denunciations from the other. 288 00:31:07,210 --> 00:31:09,790 So it wasn't as though diplomatic initiatives weren't tried. 289 00:31:09,790 --> 00:31:18,490 So I think those two are reasonably within the realm of conventional understandings, both of international law and of the theory of 290 00:31:19,750 --> 00:31:21,100 just war. 291 00:31:21,700 --> 00:31:32,740 Now, Stuxnet is the one where probably half the people in the room are experts on it and others have never heard of it. 292 00:31:33,220 --> 00:31:40,720 So a very quick rundown of the details as they have emerged over the last year or 293 00:31:40,720 --> 00:31:48,160 so. This is a cyber worm that was developed by we do not know whom; we presume 294 00:31:48,160 --> 00:31:52,960 whoever smiles the most broadly and remains silent whenever the topic is broached. 295 00:31:54,670 --> 00:31:58,970 It spread through computer networks; there are various theories about how it got around. 296 00:31:58,990 --> 00:32:06,370 One is that some secret agent took a thumb drive and plugged it into an Iranian computer somewhere, which is the way in which this often happens.
297 00:32:06,610 --> 00:32:10,540 I think that's probably not plausible or accurate in this case. The worm was developed and, 298 00:32:10,540 --> 00:32:14,170 in all likelihood, released in Indonesia, and on launching it spread. 299 00:32:14,530 --> 00:32:20,050 It spread to computers in countries all over the world, slowly; 300 00:32:20,230 --> 00:32:25,959 unlike a virus, which multiplies quickly, a worm, as its image suggests, kind of crawls around, 301 00:32:25,960 --> 00:32:30,460 slowly goes from one computer to the next and makes its way around the world on 302 00:32:30,460 --> 00:32:35,380 the Internet, and it was finally found to be residing on computers in all these countries. 303 00:32:36,700 --> 00:32:39,790 But it did no harm. It stayed 304 00:32:39,840 --> 00:32:48,330 inert on the computers on which it was detected, unless the programs that constituted the worm detected a particular kind of software: 305 00:32:48,600 --> 00:32:56,429 proprietary software produced by Siemens, the German company, configured to control an array of nuclear centrifuges. 306 00:32:56,430 --> 00:33:02,550 And not just any nuclear centrifuge, not even just any Siemens-controlled nuclear centrifuge, 307 00:33:02,850 --> 00:33:08,850 but a particular mathematical array of them, 984 to be exact. 308 00:33:09,210 --> 00:33:17,130 If you on your personal computer are running an array of 984 Siemens-controlled nuclear centrifuges, 309 00:33:17,130 --> 00:33:22,500 spinning and producing fissionable material for a nuclear weapon, this worm is a problem for you. 310 00:33:22,530 --> 00:33:27,920 If you're not doing that, even if it's on your computer, it doesn't do anything. 311 00:33:27,970 --> 00:33:31,500 It just sits there and makes itself inert. 312 00:33:31,620 --> 00:33:37,410 And it is apparently set to destroy itself on the 24th of June next year.
313 00:33:38,790 --> 00:33:45,269 So this took, you know, about a year for people like this gentleman, Ralph Langner, who is a German cyber security expert. 314 00:33:45,270 --> 00:33:51,239 He's the one who's credited with first having discovered this worm moving around the Internet and 315 00:33:51,240 --> 00:33:56,250 calling attention to it, and the warnings that were levelled by him and others as to what it meant and 316 00:33:56,250 --> 00:33:59,700 what it was doing were wildly disparate as time went on. 317 00:34:00,030 --> 00:34:06,070 Finally, it turned out it did seem to be a weapon, a very sophisticated weapon, as he described it, 318 00:34:06,090 --> 00:34:11,969 though a very narrowly targeted weapon, one that targeted only the nuclear centrifuges involved 319 00:34:11,970 --> 00:34:18,330 in Iran's presumably illegal and internationally sanctioned nuclear weapons program. 320 00:34:18,840 --> 00:34:23,490 It didn't kill anybody, didn't break anything, and certainly didn't hurt or target civilians. 321 00:34:25,050 --> 00:34:29,010 So it was a rather remarkable piece of software. 322 00:34:30,210 --> 00:34:36,210 Those are the four attacks that we know of. And here I'll wrap up and give us some time for discussion. 323 00:34:36,210 --> 00:34:45,900 I hope that they suggest some emerging norms for cyber war from the practices that we find acceptable and unacceptable, and by 324 00:34:45,900 --> 00:34:52,380 extrapolation and extension and interpretation of what we know at present about how conventional war should be conducted. 325 00:34:53,100 --> 00:35:01,530 A cyber war would be permissible if the weapons, the cyber weapons, aimed primarily or exclusively at military infrastructure, 326 00:35:02,130 --> 00:35:07,380 if they degraded an adversary's ability to undertake highly destructive offensive operations.
327 00:35:07,650 --> 00:35:16,139 That one's key, because that's the just cause: your cause would be to keep an adversary nation from doing something of a conventional sort, 328 00:35:16,140 --> 00:35:25,290 like building a nuclear weapon and using it. Then, in the process of trying to prevent that and arguing for justification in terms of 329 00:35:25,290 --> 00:35:29,880 that goal or intention, the attack neither deliberately targets nor ultimately harms civilians 330 00:35:29,910 --> 00:35:35,309 nor civilian infrastructure, and it is undertaken as a last resort in the required sense. 331 00:35:35,310 --> 00:35:43,680 That is, every other reasonable option is first exercised and tried to no avail, and further delay would only make things worse. 332 00:35:44,340 --> 00:35:47,819 For those of you who know the work of Michael Walzer on pre-emptive self-defence, 333 00:35:47,820 --> 00:35:55,500 you know that those are the conditions he uses to extend ever so slightly the notion of pre-emption from immediate, 334 00:35:55,590 --> 00:35:59,820 imminent harm to: oh, we've worked hard to get rid of this, 335 00:36:00,060 --> 00:36:06,780 it's not getting anywhere, it's a real threat, and waiting is only going to make matters worse, then you're entitled to defend yourself. 336 00:36:06,780 --> 00:36:10,710 So that suggests that this would cover, in this case, 337 00:36:10,830 --> 00:36:18,840 a cyber case with the conditions mentioned above, preventive as well as defensive measures in the traditional sense. 338 00:36:19,860 --> 00:36:23,760 Okay, so I'm done. I just want to leave you with some things to think about. 339 00:36:27,030 --> 00:36:28,920 One is that, 340 00:36:29,260 --> 00:36:33,720 well, these are sort of lists of problems that we have to come to terms with, and we really don't know how to make sense of them. 341 00:36:36,630 --> 00:36:45,209 Some of you may know that the question was raised whether the U.S.
could use a cyber attack, as Israel did against Syria, 342 00:36:45,210 --> 00:36:53,760 to take down Libya's air defence forces before NATO began the bombing missions to protect civilians in that country. 343 00:36:54,450 --> 00:36:59,040 Apparently, the U.S. decided not to do that. There were two arguments given. 344 00:36:59,400 --> 00:37:05,330 One was that, you know, cyber weapons are time intensive and they're one-off. 345 00:37:05,340 --> 00:37:08,460 That is, they are uniquely designed for a particular target. 346 00:37:08,500 --> 00:37:11,070 It takes time to do that, and expertise, knowledge. 347 00:37:11,400 --> 00:37:16,830 It isn't like we've got a shelf full of them sitting here somewhere and we just pick one up, stick it in the rifle and shoot it. 348 00:37:17,730 --> 00:37:19,490 So they're not like conventional weapons in that way. 349 00:37:19,500 --> 00:37:28,200 So we really hadn't had time to prepare a cyber weapon. Or a more sinister account would be: we have some on the shelf and they would work, 350 00:37:28,680 --> 00:37:38,549 but we wouldn't want to reveal their existence and use them for such a relatively trivial reason, partly because it would show what our capacity is, 351 00:37:38,550 --> 00:37:42,080 and partly because once they're used, you can't use them again. Okay. 352 00:37:44,030 --> 00:37:51,400 Now, that last comment ties into a criticism by Ralph Langner, this German cyber security expert, about Stuxnet. 353 00:37:51,410 --> 00:38:01,010 He regards this as a very reckless, indiscriminate attack, not because of the damage it did in the immediate situation, 354 00:38:01,280 --> 00:38:04,790 but because now it's out there, on your computer very possibly, 355 00:38:05,060 --> 00:38:13,310 and a hacker who couldn't do this on their own, who lacks the expertise, can download the software, reverse engineer it, and study how the weapon was put together.
356 00:38:13,550 --> 00:38:15,830 It becomes a kind of open source weapon. 357 00:38:16,970 --> 00:38:25,700 And he's even claimed recently in the press that he sees elements of Stuxnet reappearing and being used to do similar kinds of damage 358 00:38:26,300 --> 00:38:31,370 as was done to the centrifuges, now to, say, industrial infrastructure in other countries, by terrorists. 359 00:38:32,660 --> 00:38:36,080 I think that's mistaken as well, for a variety of reasons, 360 00:38:36,080 --> 00:38:39,830 but let me just set that aside. We have to think about that as a problem. 361 00:38:41,270 --> 00:38:42,020 Generally, 362 00:38:42,230 --> 00:38:51,889 the design of these as effective weapons requires so much time and so much expertise that the scenario of the kid or the terrorist doing this, 363 00:38:51,890 --> 00:38:56,990 I think, is implausible and extreme. The kid or the terrorist can make a lot of trouble. 364 00:38:56,990 --> 00:39:05,030 They can engage in vandalism, they can engage in crime, destruction of property, theft of property, the denial of service. 365 00:39:05,030 --> 00:39:11,570 They can shut down local power grids and do all kinds of things, as kids can do in conventional ways now, as terrorists can do. 366 00:39:12,110 --> 00:39:20,510 But when it comes to designing a weapon to do what Stuxnet did, that probably took a couple of years and hundreds, if not thousands, of experts. 367 00:39:20,930 --> 00:39:27,410 And it required something that your neighbour's 14-year-old kid probably doesn't have. First, 368 00:39:27,530 --> 00:39:32,840 he doesn't have good grades in physics and engineering, because he's been screwing around on the Internet and Facebook. 369 00:39:33,440 --> 00:39:41,780 Secondly, last time I checked, a centrifuge will not fit in the upstairs bedroom or the terrorist's flat in Hamburg.
370 00:39:42,050 --> 00:39:51,440 And what people often fail to understand is that, in a genuine cyber attack, you have to have access to the hardware that you're planning to attack, 371 00:39:51,650 --> 00:39:58,010 be able to experiment on it and work with it. That takes time and expertise, and usually that's a collective enterprise. 372 00:39:58,280 --> 00:40:00,650 So in a way that I think is guardedly hopeful, 373 00:40:00,860 --> 00:40:08,690 this keeps cyber weapons of the serious sort that people are terrifying us with more at the level of state-centred, 374 00:40:08,780 --> 00:40:14,269 state-centric conflict and violence, which means it's amenable to governance, 375 00:40:14,270 --> 00:40:20,990 treaty regulation, and agreement based on self-interest, which vandalism and terrorism are not. 376 00:40:22,400 --> 00:40:27,260 There are interesting and competing philosophies of cybersecurity in my own country. 377 00:40:27,500 --> 00:40:33,829 Of the two documents that came out, one came out of the Department of Defence I've already mentioned, and, surprise, surprise, 378 00:40:33,830 --> 00:40:38,989 it sounds like it was written in the People's Republic of China, in the sense that it's extremely defensive 379 00:40:38,990 --> 00:40:47,300 in its concern for dominance on the one hand and protection of infrastructure and of security measures on the other. 380 00:40:47,720 --> 00:40:53,780 Whereas a document written in part by a colleague from the program here, who's now 381 00:40:53,780 --> 00:41:00,410 working in President Obama's office and for the State Department, is called, sorry, 382 00:41:00,470 --> 00:41:04,730 this is the world-peace-through-the-Internet one, the International Strategy for Cyberspace. 383 00:41:04,740 --> 00:41:06,260 This is President Obama's document.
384 00:41:06,530 --> 00:41:16,159 This one is open, hopeful, aspirational, almost to the point of naiveté, in the sense that it believes that, in the end, transparency, openness, 385 00:41:16,160 --> 00:41:20,180 and the developmental capacities of the Internet will far outweigh any of these vulnerabilities, 386 00:41:20,390 --> 00:41:24,500 and it encourages the nations and peoples of the world to work in that direction. 387 00:41:24,770 --> 00:41:31,160 So which philosophy ought to reign: the one that's concerned lest my retirement account be drained? 388 00:41:31,580 --> 00:41:37,489 Or the one that argues that, in the end, what we want is the greater industrial and economic productivity, 389 00:41:37,490 --> 00:41:41,870 the learning, education and human welfare that the Internet can produce? 390 00:41:43,670 --> 00:41:50,390 Finally, I'll assert that I think this discussion has been dominated by our intelligence community thus far. 391 00:41:50,780 --> 00:41:57,079 That means that the conceptions that are brought to bear in thinking about what's acceptable and unacceptable are, from the get-go, 392 00:41:57,080 --> 00:41:59,480 very different than what most of us would presume. 393 00:41:59,960 --> 00:42:06,830 And the distinctions between genuine acts of war and these acts that are very much at the edge of war (intelligence, 394 00:42:06,830 --> 00:42:12,170 espionage, reconnaissance, surveillance, sabotage, covert action) are blurring. 395 00:42:12,350 --> 00:42:19,010 And we don't know what to do about that. What I do know to do now is shut up and see what questions you have. 396 00:42:19,040 --> 00:42:20,300 Thank you very much for your attention.