I do. I brought a copy of the book for Graham, and I don't think he's here, but I'm going to keep it and give it to someone else as well.

But yeah, the book just came out a few days ago in the US. You can get it on the American Kindle store, and people who know how to do cyber stuff can figure out how to get it in this country through the American version of the Kindle. It should be out here soon enough, but I never understand the relationship between Oxford US and Oxford UK in terms of ownership, so I don't know how that works. They're not the same entity, which probably means a lot of email.

And I mean, I think one thing to mention is that I have a lot of publications, things like that. It doesn't really matter. I did a lot of research early on in my career on conflict processes, a lot of data work, a lot of stuff I don't really want to necessarily revisit. I did a whole book on Tibetan movies for some reason. I don't know how it happened. I just did. It's a long story.

But I think what's interesting is that I feel a lot better about this stage of my career right now, because I found something that I think is interesting and important, and that for me is cybersecurity. Because there's a lot of bluster, there's a lot of hype, there's a lot of assurance that we know something about this domain, quote unquote, without any evidence. And to me, that's a huge problem. That's not what we do in political science. That's not what we do in research.
We don't say things based on the imagination of what could happen. We say things based on what we know has happened, and what will happen, based on evidence. And that's what has really structured my work and structures what I've been doing now, which I do with Benjamin Jensen and Ryan Maness; all errors are theirs. I throw the co-authors under the bus, really. And they are, of course, on the cover of the book.

But the purpose of this next book, the second book I've done on cybersecurity, is really to understand just how useful our cyber strategies are. No one has really dived into that question. No one's really asked: what good are they?

We assume there's a utility to cyber weaponry. We assume there's a usefulness. We assume we can use them against ISIS. We assume that Russia was able to leverage these technologies against America during the 2016 election. And while we have the presence of these weapons during these episodes, we have no real analysis of their utility. Did they really work?

The ISIS story is hilarious to me. One of my favourite articles, I think, was written by David Sanger, talking about America targeting ISIS, and one of the things it said is that the first thing the Americans did was take out the power grid. Then, supposedly, it happens. And my question at that point is: how do you hack someone without power? How does that work? How do you leverage high-technology tools against low-technology adversaries?
And this is a challenge that we have in America right now, because we assume future warfare will be high-tech, when the reality is our adversaries over the last 30 or 40 years have been very, very low-tech. So we've moved towards using high-technology weapons against unsuitable opponents. And this is because of the politics of defence acquisitions, this sort of hype. We have a line in the book: new toys bring new joys. I kind of get the idea that a lot of people in the military think cyber tools are like Christmas Day. You just unwrap your new toys and you want to play with them, but then you find out they don't really work against the target you're trying to use them against.

And, you know, one of the funny issues here is that in America the DOD literally asked for loud cyber bombs, because they felt that with the cyber weapons we were using, adversaries didn't know they were being targeted. And for them, that was a problem. They wanted, in effect, the same effect a bomb could have.

What we argue throughout this book is that cyber is not necessarily a new tool; it's a new method. This is really a tool of political warfare. This is something George Kennan talked about in the forties. It's something we've been doing since the beginning of time: covert operations, espionage. That is where cybersecurity really falls, yet we push it towards the domain of warfare and violence.
And I spoke in front of the Senate in the US, and they kept asking me: well, where is the red line? And I didn't really understand that question, because to me the red line is death. But until we have death, I don't know how we're really going to get towards this idea of massive cyber war. It's not really a thing, as far as I'm concerned. I think the bucket that we place cybersecurity in is utterly, utterly wrong, and we need to rethink the concept.

So cyber-enabled tools will not replace traditional elements of statecraft. They won't be new tools of leverage. The other thing is that these methods, which can be cheap, easy and fast, are no substitute for traditional strategy. If something is cheap, easy and fast, it probably isn't very useful. The problem is right there in the definition of cheap: it'll go away quickly. The effects will not linger, and that's a challenge.

People ask me about Estonia. Weren't the effects there? Wasn't that a devastating attack in 2007? Some people certainly argue that, as they argue with me in person. But then you get them in private and they make jokes about how they used that attack to get NATO to spend more money on the country. So the reality is much different. They are the ones who played up the attack. They were, of course, hacked by Russia. They, of course, were targeted. But their solution was to turn things off and basically let the whole thing blow over.
And reality is much different from the story we tell ourselves, because we want these new, sensational stories. We want to understand this sort of revolutionary war. And I think what we really have to grasp is the idea of limited utility in destructive warfare, and a more useful role in lower-level forms of covert warfare. I think that's the future. Innovation in military technology is rarely revolutionary. Instead, it's evolutionary: a process replete with unintended consequences and cascading effects.

And the challenge, as we focus more on the covert aspects of cybersecurity, is that there are many, many unintended effects. The reality is I am often in rooms of American military people where I have to kind of raise my hand and say: you just kind of described a war crime; you probably shouldn't talk about that. Because the reality is, if you use cyber tools, there is no distinction between the attacker and the civilian. There's no distinction, really; these things bleed from one side to the other. Even Stuxnet, an attack on an Iranian nuclear plant underground in the middle of nowhere, disconnected from the Internet, still affected the civilian population and still got out.

So that's the challenge: there are unintended consequences and possibly cascading effects, and we have no grasp on what the intention or the reality will be in the future.
We need to avoid the megafauna approach to cybersecurity. Most cybersecurity talks will focus on one event: they'll focus on the 2016 election, or on the Sony hack, or on Estonia, or on Stuxnet. We have a World War One-ification of cybersecurity, where we focus on these grand events and we don't really look at the scope of the problem. And that's a huge challenge. To understand any challenge in war, any issue, we should look at the broad recent history, and we do have a broad recent history of cyber events.

We have coded over 192 cyber attacks between rival states, that is, states with a longstanding history of antagonism. We will spread this out and do a lot more: we'll go to states that are not rivals, we'll add non-state actors. But in the end, counting cyber disputes between recognised adversaries, we're probably not going to reach more than 500 over the last 20 or 30 years. That is not a lot of events, but it is enough to grasp an empirical history and understand an empirical picture of this domain.

And when you do this, you get a different picture. The efficacy of cyber strategies is very, very limited, and the stories that we tell about cybersecurity are of limited utility if they only really apply to one case, when we need to examine as many cases as possible to achieve some sort of adequate leverage over the problem. And that's a challenge for cybersecurity.
If we focus on one case, we focus on the megafauna. Megafauna basically refers to people studying the big animals, but not the insects or butterflies or things that are more important for understanding, say, climate change. When we focus on the megafauna of cybersecurity, we miss the more useful information that comes from taking a larger picture of the scope of cyber actions.

We have not seen cyber actions achieve knockout effects. We haven't seen them produce dramatic change in the opposition. This is still clearly a security problem, and there does remain a path to chaos and destruction in this domain. But that path is very different from the way most people talk about it.

Major powers now attempt to apply coercive cyber strategies to gain a position of advantage relative to their rivals. On the other hand, small states and non-state actors are using cyber operations to punch above their weight, so to speak. So everyone is using cyber power. The major states have used it against adversaries to maintain their power; smaller states are using it to reach and harm larger states. These operations seek to achieve effects and compel the adversary to change their behaviour. That is the reality of compellence, the coercive use of cyber power. Yet few have uncovered how to measure and understand the effects, and of course the potential, because that's really what is at the heart of cybersecurity: this promise of effect.
The challenge in the field of cybersecurity is that there's very little analysis of what effect means, and how we can understand effects and understand change. The efficacy of cyber strategies is a critical question if we want to understand the future of cyber conflict and escalation. I don't know how you could do one without the other. I don't know how you understand the utility of a weapon without understanding how effective that weapon is.

And this goes to, you know, it's been a life-changing event for me to move from Glasgow to Cardiff to now working for the U.S. Marines. Where I work, for the U.S. Marines, we talk about battle damage assessment quite a bit. We talk about how useful our weapons are and what effect they have. But when we talk about cybersecurity in the U.S. military, we don't talk about battle damage assessment. We don't talk about how well these weapons work. It's more of a rush to use these new tools, to justify their utility without evaluating their effectiveness. And that, to me, is a huge conundrum and a huge problem.

And, of course, there's a huge promise of impact, even in the U.S. National Security Strategy, which says cyber attacks offer adversaries low-cost and deniable opportunities to seriously damage or disrupt critical infrastructure, cripple American businesses, weaken our federal networks, and attack the tools and devices Americans use every day to communicate and conduct business.
Cyber revolutionaries in government and academia see lines of code as potent cyber weapons that create strategic instability. The claim is that cyber warfare capabilities are leading to a new revolution in military affairs, wherein cyber capabilities will play an increasingly decisive role in military conflicts.

One of the things that's often said about my career is that I've been attacking straw men. I guess maybe you can say that. But the reality is there are a lot of straw men in the field of cybersecurity. There are a lot of people who say things without evidence, and promise a lot of impact and a lot of change. And to me, as an American, I want to understand: is that really true? Do we have any evidence for any of this?

For me, a lot of this falls in the range of coercion and compellence. Coercion is the use of threats, punishment or demonstrations of force during a crisis or conflict to alter foreign policy behaviour. Cyber coercion, then, would be digitally exploiting the power to hurt, escalating the cost of taking certain actions. So, in the digital domain, using these tools, basically founded on microprocessors, to escalate costs.

But of course there's a difference between deterrence and compellence: a threat intended to make an adversary do something, versus a threat intended to keep an adversary from doing something. Everyone in cybersecurity focuses on cyber deterrence, which by definition would mean deterring an adversary from doing something.
I don't know how you witness someone not doing something, so it's tough to evaluate deterrence with empirical evidence. But you can seek to evaluate, with evidence, whether a threat made to get an adversary to change their behaviour worked. That's compellence. It's a big distinction that a lot of people miss, but for us, our focus is on compellence.

Cyber operations are meant to shape how rivals will manage a crisis. An actor can compel an adversary short of physical attack, or also signal the risk of escalation. And that is a key aspect of cybersecurity. It's not all about massive war, and it's not all about massive destruction. In fact, I think a lot of people watch Battlestar Galactica too much, or even, potentially, my favourite cybersecurity movie, WarGames from 1983. There hasn't been a good one since; really, challenge me on that one.

But this focus on massive destruction, on massive effect, really belies what we think is going on here, and that is using methods short of physical attack to change behaviour: to signal intentions, to alter the balance of information, and to signal the risk of future escalation.

We're finding something strange about cybersecurity: cyber actions actually limit escalation. The entirety of the discourse of cybersecurity tells you this is going to be an escalatory domain. We find very little evidence of that, and that's a huge challenge. That's a huge problem.
We keep talking about opening the Pandora's box of cybersecurity, yet even when major powers were attacked by lesser adversaries, they refused to attack back.

In my future work, which we're doing now, we're running cross-cultural scenarios, which are basically experiments to look at the decision-making of people who leverage cyber weapons. And we're finding that military and government subjects, and of course every experiment has students too, all of them refuse to escalate, because they don't think it's really worth escalating to war over a digital effect or a digital attack. And that's a challenge for the entire discourse, because the reality is much different from the story we craft in our minds.

And this is something that bothers me about the field. There are a lot of unfulfilled promises of cyber campaigns, and they are neither as revolutionary nor as normal as they seem when evaluated with evidence. So let me offer my hypotheses. Ryan Maness and I argued in 2015 that adversaries on the digital front are restrained. They're constrained in their options. They're unlikely to engage in cyber conflict because of normative restrictions, because of the proliferation risk of cyber weapons, and because of the risk inherent in untested options. So, on top of that, norms of cybersecurity have been created. They're not very strong, but they are there.

The other thing is that cyber weapons are, I don't know how you want to put it, use it or lose it, so to speak.
You use a cyber weapon, it gets out into the wild, anyone can grab it and send it right back at you. That's an entirely different way of using a weapon than, say, a cruise missile. It's tough for us to use zero-day weapons, because once you do, you can't use them again. You've lost that from your arsenal, and that's a challenge.

And then finally, the military hates using untested weapons. They despise it, because everything the military does is about success. It's about achieving success, not getting fired, and making it to the next promotion. Why, then, would the military want to use weapons when they have little probability of knowing how useful they are, because they have not been used in actual day-to-day combat situations?

So we offered those theories before. We go one step further now, in the next book, the Cyber Strategy book, where we're suggesting that cyber strategies produce limited coercive effects and actually limit escalation. The reason we argue that cyber strategies are rarely used is that they're not very effective, and that when they are used, they're actually used as signals to limit escalation and to contain the potential for warfare.

We find that the utility of cyber warfare, in the form of political warfare optimised for the 21st century, lies in its form as an ambiguous signal. That is, a plausibly deniable signal. You do it, you send it out there, but you're not really attached to it. You can deny that you ever did it.
Putin had the best quote, when he said: I don't think we hacked the 2016 election, but we have hackers, and sometimes they wake up and they're very patriotic and they decide, hey, let's defend Russia, let's attack America. He literally said that. Basically, you know, they don't work for the state; that's the idea. And there's no assurance that you actually did these things, because it's tough to pinpoint the chain of command and who is actually running these operations.

But also, never underestimate the stupidity of the people running these operations. It was recently found that Guccifer 2.0, the one passing information to WikiLeaks, had his Twitter account logged into from the GRU in St Petersburg. So they had, like, you know, 180 logins, and they found the one where he made that one mistake. And there's always that one mistake. There's always that behaviour, that thing a hacker does that gives up who they are. It's tough to really hide them, because they are human beings.

Now, I'm not talking about AI and cybersecurity; that's a whole other thing, and I don't really know how to handle that just yet. But as long as we have humans involved, humans often make mistakes, and we can figure out where a lot of these attacks come from. But just because we know where an attack comes from, it doesn't mean we know who ordered that attack. "We don't know": our president is saying it. "We don't know": President Trump is saying it.
199 00:20:27,320 --> 00:20:30,890 We don't know how Putin is ordering these things. But we knew they were. 200 00:20:30,930 --> 00:20:37,700 We do know who does. So that way, a sort of plausible deniability is a form of ambiguous signal. 201 00:20:38,480 --> 00:20:41,510 And in political science and international relations, 202 00:20:42,080 --> 00:20:49,729 we have largely failed to understand both ambiguous signals and covert actions and convert attempts 203 00:20:49,730 --> 00:20:54,530 to demonstrate resolve to rely on seeking costs a rising the risk to shape rival behaviour. 204 00:20:55,190 --> 00:21:03,950 That's really what cybersecurity is. It's cover attempts to raise costs to shape the behaviour of the Opposition. 205 00:21:05,450 --> 00:21:13,150 They're not massive, massively destructive. They're more ways of just kind of signalling that this isn't what we like. 206 00:21:13,160 --> 00:21:18,980 Let's do something different. Or in some ways they're harassment episodes. 207 00:21:19,370 --> 00:21:23,600 So we divide cybersecurity events into three buckets. 208 00:21:24,950 --> 00:21:28,190 This is just state to state right now. So we're leaving out crime. 209 00:21:29,150 --> 00:21:33,050 But disruption is harassment. Probing signalling in. 210 00:21:34,200 --> 00:21:40,130 They may have espionage to manipulate data and perceptions to alter the balance information to steal data. 211 00:21:40,850 --> 00:21:44,750 And then you have the great operations which are covert efforts to deny and sabotage. 212 00:21:45,080 --> 00:21:51,170 Those are the most destructive types of operations over the last, you know, since 2000. 213 00:21:52,340 --> 00:21:56,060 This is the picture of cybersecurity, 182 incidents in our dataset. 214 00:21:56,660 --> 00:21:59,990 I could talk about the dataset all day long, but not really to dive into that now. 215 00:22:00,980 --> 00:22:04,910 But you get the picture that there are rising cyber conflict events. 
But these events are predominately dominated by harassment and espionage. The degrade options, the serious destruction options, are very, very minimal throughout this history. We don't see very many of them, and those are the megafauna events. So the thing is, we talk a lot about these grand cyber events, but in proportion they are very, very rare.

And these are only the significant attacks. You know, I can't capture every DDoS, every attempt to change someone's password, every sort of probe, every intrusion. That's beyond our capabilities. We're talking about significant government efforts to harass, to use espionage, or to degrade. And that's the picture we get.

Our major finding is that cyber warfare produces only limited concessions. Of 192 episodes of cyber conflict between rivals, only 5.2% achieved compellent success, which seems bad. Actually, that's pretty good for coercion, but it just tells you how bad coercion is as a tool. Coercion rarely ever works. The history of coercion is a very tough story to tell: we threaten this, you will do that. It very rarely happens. Think of it like a child: you tell a child what to do, and they're more likely to do the complete opposite. Coercion is tough to achieve.

Cyber degradations are more likely to achieve concessions than other forms of cyber strategy. So the more serious events, obviously, are more likely to change behaviour.
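The arithmetic behind that 5.2% figure, and the per-strategy comparison, can be sketched in a few lines. This is a hedged illustration only: the 192-incident total and the roughly 5.2% overall concession rate come from the talk, but the per-bucket split below is invented for the example and is not the real dataset.

```python
# Tally coercive success the way the talk describes: incidents grouped into
# the three strategy buckets, with a concession count per bucket.
# NOTE: the per-bucket counts are illustrative placeholders, not real data;
# only the 192 total and the ~5.2% overall rate are taken from the talk.
incidents = {
    "disruption":  {"total": 90, "concessions": 2},   # illustrative
    "espionage":   {"total": 80, "concessions": 3},   # illustrative
    "degradation": {"total": 22, "concessions": 5},   # illustrative
}

total = sum(s["total"] for s in incidents.values())
wins = sum(s["concessions"] for s in incidents.values())
print(f"overall: {wins}/{total} = {wins / total:.1%}")

# Per-strategy rates: degradation is rare but (per the talk's finding)
# proportionally more likely to produce concessions.
for name, s in incidents.items():
    print(f"{name}: {s['concessions']}/{s['total']} = {s['concessions'] / s['total']:.1%}")
```

Ten concessions out of 192 episodes gives the 5.2% the talk cites; the point of the per-bucket view is that the rate is concentrated in the small degradation category.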
The problem with that finding is that the majority, in fact all, of the cyber degradation events that produced concessions involve the United States. I'm not being biased here; that's just what we know. And the problem is, if your findings are dominated by one state, your findings are probably biased by that one state. So either it's an outlier, or America is super effective, or everyone else is super weak. You can tell whatever story you want to tell. As a social scientist, to me this is a challenge, because all my events are driven by one state. That doesn't make for a generalisable finding.

And even if you unpack this, the story isn't very good. The United States and China are in a continuous cycle: China attacking, China stealing; America countering with counter-espionage; China stopping, and then starting right back up again a year later. There's just this continuous cycle that always happens. And we should expect that to happen; that's just how international politics works. The reality is, until we resolve the foundations of the disputes between, say, Israel and Iran, the US and Russia, the US and China, we're never going to solve these cyber disputes. And these cyber disputes are going to continue. But are they going to be these massive, revolutionary, destructive events? We don't really think so.

Neither past cyber actions nor a state's standing as a leading cyber power is empirically associated with concessions.
Military and economic power appear to be better explanatory factors. So we ran logistic regression analyses. We controlled for all the other factors, including regime type, military capabilities, military power and economic power. We have a measure of cyber capacity based on semiconductors, based on engineering degrees, based on cybersecurity publications. And the only thing that really seemed to matter is, basically, that if you used cyber operations in the past, you're more likely to use them again. Not a confounding finding; that's what you would expect to find. But the reality is nothing else really seems to matter that much besides economic power. So is it cyber that's making these states more effective when they use these tools? Likely not.

And even when cyber exchanges between rivals escalate, they remain limited in scope outside ongoing military conflict. When we see escalation on a ten-point scale, it is often from a two to a three, not from a two to a six, not even from a two to a four. I think we have one case, I should remember which one it is, where they jumped two levels in severity.

The other, deeper problem is that we have very little grasp on what escalation is.
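The kind of logistic regression described above can be sketched minimally, without any statistics library. Everything here is a stand-in: the single "prior cyber use" predictor, the invented 192-incident split, and the fitted values are illustrative only; the authors' actual models use their dataset and a fuller set of controls (regime type, military and economic power, cyber capacity).

```python
import math

# Deterministic stand-in data at the talk's scale: 192 incidents, 10 concessions,
# concentrated among states that used cyber operations before (invented split).
# Each pair is (prior_cyber_use, concession_achieved).
data = [(0, 1)] * 2 + [(0, 0)] * 94 + [(1, 1)] * 8 + [(1, 0)] * 88
xs = [x for x, _ in data]
ys = [y for _, y in data]

# Fit P(concession) = sigmoid(a + b*x) by gradient ascent on the log-likelihood.
a, b = 0.0, 0.0
for _ in range(5000):
    grad_a = grad_b = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-(a + b * x)))
        grad_a += y - p        # d(log-likelihood)/d a
        grad_b += (y - p) * x  # d(log-likelihood)/d b
    a += grad_a / len(xs)
    b += grad_b / len(xs)

print(f"intercept={a:.2f}, prior-use coefficient={b:.2f}")
```

With this toy split, the intercept recovers the low baseline odds of a concession and the positive coefficient on prior use mirrors the one finding the talk highlights: past users are more likely to act again, while little else predicts success.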
258 00:26:43,750 --> 00:26:48,610 If I read one more application of Herman Kahn's escalation ladder to cybersecurity, 259 00:26:49,180 --> 00:26:58,540 come on. We need new ideas and new thoughts about how escalation works, because we are focussed so much on the era of nuclear power 260 00:26:59,500 --> 00:27:05,500 that we haven't been able to move from that era to the future of warfare as it is now, 261 00:27:05,500 --> 00:27:10,180 which is AI, drones and cyber and special operations. The two things don't match. 262 00:27:11,440 --> 00:27:18,100 I just declined to review a book on nuclear and cyber war because I said I'm not going to be fair to it; it's just a dumb idea. 263 00:27:18,790 --> 00:27:23,170 We cannot connect these two things. They're disconnected quite a bit. 264 00:27:24,940 --> 00:27:28,390 Country-specific results: we have case studies, of course we do have cases. 265 00:27:29,080 --> 00:27:32,370 Oh, and I did this in front of a Moscow State University professor. 266 00:27:32,380 --> 00:27:39,150 I'd say she was very upset. I used a bear to represent Russia, and I would be upset too if you used a bald eagle 267 00:27:39,160 --> 00:27:44,230 that's, like, you know, losing hair or something for America. But Russia is a diminished power. 268 00:27:45,190 --> 00:27:48,819 She was probably more upset with that. Russia has a political system, 269 00:27:48,820 --> 00:27:54,399 a flat economy and limitations on military spending; its cyber strategy of 270 00:27:54,400 --> 00:27:58,960 disruption and disinformation is a cheaper way to seek out its political voice. 271 00:27:59,290 --> 00:28:08,620 It can't do much more than that. There's an overblown view of the impact on the election of 2016, which was neither cheap nor very effective. 272 00:28:08,920 --> 00:28:15,670 But it was noticeable. And that's a challenge for cybersecurity, because there's so much attention on these events.
273 00:28:16,060 --> 00:28:19,990 You gain notice, but it doesn't mean it changed anything. 274 00:28:20,680 --> 00:28:24,850 And I really hate to parrot Trump's line about this, but I think, first, 275 00:28:25,480 --> 00:28:31,629 writing for The Monkey Cage at the time, we went out and looked at all the polling organisations in America and we asked them, 276 00:28:31,630 --> 00:28:36,280 do you have any polls that show WikiLeaks having an impact on voters? 277 00:28:37,030 --> 00:28:40,180 We've never seen one. We can't find one. 278 00:28:40,190 --> 00:28:48,480 That's not to say there's no impact there. Likely there is, because the race was so close, and in an event so close 279 00:28:48,490 --> 00:28:55,150 any little thing can have an impact. But as for a major impact, a change in how America thought about Hillary Clinton? 280 00:28:55,570 --> 00:28:59,290 No, it really just kind of reinforced the views of people who already hated her. 281 00:28:59,800 --> 00:29:03,790 That's really all it did. It didn't really change much. 282 00:29:03,790 --> 00:29:07,810 We cannot find evidence for that. So it's not very effective. 283 00:29:09,010 --> 00:29:13,480 But it doesn't mean it wasn't insidious. It doesn't mean it wasn't challenging. 284 00:29:13,870 --> 00:29:19,150 It doesn't mean it's something we don't need to worry about. I'm not saying cyber is not an issue. 285 00:29:19,540 --> 00:29:24,490 I'm saying the frame in which we talk about cybersecurity is completely and utterly off. 286 00:29:24,970 --> 00:29:31,630 We talk about it in a frame of warfare, and we really need to talk about it in a frame of espionage and covert operations. 287 00:29:31,900 --> 00:29:36,910 And we need to think about it more in the domain of psychological interpretations and perspectives.
288 00:29:37,360 --> 00:29:42,640 We need to look at it from the view of blowback or behavioural psychology, 289 00:29:43,300 --> 00:29:48,370 but instead we focus so much on this nuclear-era strategic perspective. 290 00:29:49,000 --> 00:29:55,040 Russia is using Ukraine as a testing ground? No, it's not; it's just trying to win a war. 291 00:29:55,070 --> 00:29:58,950 And it's not doing very well. It's been stuck at the Donetsk airport 292 00:29:59,420 --> 00:30:05,330 for the last two and a half years. Cyber is not helping Russia win anything at all. 293 00:30:06,560 --> 00:30:09,350 It only does it because it can't do much else. 294 00:30:10,310 --> 00:30:17,120 China is a rising great power attempting to bridge the gap economically, militarily, technologically with a status quo competitor. 295 00:30:17,930 --> 00:30:25,100 And the challenge was that in the seventies and eighties, China went out and sought to steal its way to adaptation and innovation. 296 00:30:25,880 --> 00:30:33,380 And they continue that with cybersecurity. The problem is you cannot steal your way to innovation and adaptation; it doesn't work. 297 00:30:34,160 --> 00:30:38,860 My favourite story of that is all the Japanese distilleries that bought Scottish whiskies. 298 00:30:39,200 --> 00:30:43,490 I used to live outside Glasgow. They couldn't move it. 299 00:30:44,300 --> 00:30:48,260 They couldn't take a distillery, move it to Japan and reproduce the same thing. 300 00:30:48,830 --> 00:30:54,750 There's something about technique, about knowledge, about developing the technology yourself. 301 00:30:54,770 --> 00:31:02,300 I think China has realised that, but the early history of Chinese cybersecurity was very much about stealing information. 302 00:31:03,530 --> 00:31:12,440 The problem was that you would actually see China steal the same information two or three times, and that would tell you there's a lack of coordination.
303 00:31:12,950 --> 00:31:16,790 One group would steal something and another group would steal the same thing a few months later. 304 00:31:17,360 --> 00:31:22,160 They're not talking to each other. They're not digesting that information, because there's just so much of it. 305 00:31:22,790 --> 00:31:28,240 It's tough to really do much with these volumes and volumes of information. 306 00:31:28,430 --> 00:31:30,380 China's new strategy, I believe, 307 00:31:30,830 --> 00:31:37,580 is more focussed on protecting their interests in the South China Sea, preventing internal revolt, and kind of 308 00:31:37,610 --> 00:31:45,560 becoming the leader in what we call digital sovereignty, ensuring that each state can control their digital territory. 309 00:31:46,280 --> 00:31:55,820 And that's really China's focus now. So they become less dangerous in our eyes if we have a proper perspective on what they've been doing. 310 00:31:56,210 --> 00:32:00,890 But all states do this. America does the same thing, Russia does the same thing. 311 00:32:01,100 --> 00:32:06,260 We shouldn't blame one state for espionage over another. This is the second-oldest profession in the world. 312 00:32:06,830 --> 00:32:15,110 The question is, how effective is it? I don't think we have a good grasp, in terms of espionage, of whether it really works. 313 00:32:16,160 --> 00:32:20,240 Do you really get anything out of all this analysis, out of all this information? 314 00:32:21,200 --> 00:32:25,430 You know, a lot of people who work for the CIA are just reading newspapers at this point. 315 00:32:26,090 --> 00:32:30,530 That's really what espionage has become in many ways. For America, 316 00:32:31,520 --> 00:32:34,190 there's this logic of superiority over Russia and China, 317 00:32:34,730 --> 00:32:43,100 and what America basically does is apply the precision-strike complex to cybersecurity.
318 00:32:43,430 --> 00:32:46,550 There's this view in America of targeted operations. 319 00:32:48,170 --> 00:32:53,479 Probably the worst story I've ever heard about Trump is when they showed him that they have these 320 00:32:53,480 --> 00:32:59,390 new munitions that limit civilian casualties, and how they waited for a terrorist to leave a house, 321 00:33:00,020 --> 00:33:03,110 and that's when they used a Stinger missile. 322 00:33:04,610 --> 00:33:09,320 And then Trump said, well, why would we do that? We do that because that's what America does in the military. 323 00:33:09,350 --> 00:33:13,880 That's their doctrine: to limit collateral damage, to be precise. 324 00:33:14,900 --> 00:33:20,690 America uses cyber capabilities sparingly because we don't want to give away access to 325 00:33:20,690 --> 00:33:27,920 our superior capabilities in cyberspace, and because we need to have precise strikes. 326 00:33:28,430 --> 00:33:31,610 I mentioned before there's a lot of collateral damage in cybersecurity. 327 00:33:31,880 --> 00:33:41,660 We want to limit that. In America, you know, we talk about having a cyber operative at every regiment. 328 00:33:42,740 --> 00:33:47,030 The challenge with that, though, is we would also need to have a legal analyst at every regiment. 329 00:33:47,630 --> 00:33:51,290 You cannot disconnect cybersecurity in America from legal analysis. 330 00:33:51,660 --> 00:33:56,420 There are so many rules about what you can and cannot do to other people's computers. 331 00:33:57,050 --> 00:34:03,960 It's such a legally constrained weapon, and to use it effectively, we would need to go no holds barred. 332 00:34:04,970 --> 00:34:07,670 And that's something that maybe John Bolton might be willing to do. 333 00:34:08,420 --> 00:34:17,210 But over the last 20 or 30 years, America has been very careful about the use of cyber capabilities for harm, for destruction.
334 00:34:17,300 --> 00:34:20,930 It doesn't mean America doesn't use it for spying, for espionage and intrusion. 335 00:34:20,990 --> 00:34:24,110 Of course they do. That's just what the CIA does. 336 00:34:24,140 --> 00:34:28,650 They need to do something. Everyone has a job. Escalation in cyberspace: 337 00:34:28,650 --> 00:34:37,520 is there an escalation ladder? No. That's Herman Kahn's escalation ladder, which has 42 or 44 steps. 338 00:34:38,180 --> 00:34:41,630 There's nothing like that. It's just tit for tat. 339 00:34:42,380 --> 00:34:47,930 Or you can view it as a cycle. Espionage does not lead to escalation. 340 00:34:48,860 --> 00:34:53,630 It just leads to getting caught. For escalation, you expel a few people; 341 00:34:54,560 --> 00:34:59,140 you make a lot of people persona non grata. The other favourite thing that happened: 342 00:34:59,430 --> 00:35:05,730 we kicked the Russians out of the consulate in Seattle, actually, because they were trying to recruit Microsoft people. 343 00:35:06,600 --> 00:35:13,020 They called the cops on us because all the Americans went in with the same devices and tried to sweep the building. 344 00:35:13,020 --> 00:35:16,499 So they were upset about that. But more importantly, 345 00:35:16,500 --> 00:35:30,510 there is a danger: the UAE recently used covert cyber capabilities to suggest that Qatar had said things that they did not say. 346 00:35:31,500 --> 00:35:37,470 So that's really, to me, the problem, the fear: changing data, manipulating data. 347 00:35:38,400 --> 00:35:44,520 I forget what they call it now, but they have a new term for it, when you can just, like, change someone's face, do the video analysis. 348 00:35:45,150 --> 00:35:48,990 That's the real fear. But that's not cyberwar. 349 00:35:49,920 --> 00:35:53,250 That's just espionage, covert operations, propaganda. 350 00:35:53,700 --> 00:35:54,840 It's nothing really different.
351 00:35:56,790 --> 00:36:03,780 And the other thing is the key importance of political attribution. We make a lot of the attribution problem. 352 00:36:04,110 --> 00:36:10,620 But if these events happen in the context of a rivalry, or, let's say, a company called 353 00:36:10,620 --> 00:36:14,010 Sony gets hacked because they're making a movie about the North Korean dictator, 354 00:36:15,030 --> 00:36:24,090 they were probably hacked by North Korea. It doesn't take a lot of analysis to figure that out, but a lot of people reject that idea. 355 00:36:25,410 --> 00:36:30,690 The cybersecurity and digital people believe they are above politics. 356 00:36:31,260 --> 00:36:34,260 They believe that politics is not a part of their process. 357 00:36:34,800 --> 00:36:42,350 Robert Lee, a very prominent cyber analyst formerly of the NSA, was 358 00:36:42,360 --> 00:36:45,569 in a big New York Times article the other day talking about how his company, 359 00:36:45,570 --> 00:36:49,590 Dragos, doesn't do attribution because it's a political issue. 360 00:36:51,270 --> 00:36:52,080 And that's right. 361 00:36:52,890 --> 00:37:00,180 But the problem is, if you're going to talk about who the attacker is and what their motives are, you should probably know who the attacker is. 362 00:37:00,930 --> 00:37:09,600 And limiting or dividing the buckets between digital technology and politics is not going to work. 363 00:37:09,600 --> 00:37:13,590 These two things need to be combined together, and that's a whole education challenge. 364 00:37:13,870 --> 00:37:18,690 That's a whole infrastructure challenge. That's a whole challenge for the whole-of-government approach to cybersecurity. 365 00:37:20,570 --> 00:37:26,300 We're also seeing newer attacks by governments on private industry. 366 00:37:27,290 --> 00:37:30,950 And that's another issue, because what's the proper response?
367 00:37:31,470 --> 00:37:37,090 You know, America, the NSA, knew that Sony was going to be hacked. 368 00:37:37,100 --> 00:37:40,820 They knew something was being hacked. They were watching Sony being hacked. 369 00:37:41,660 --> 00:37:46,370 Well, what are you supposed to do? Are you supposed to intervene 370 00:37:46,370 --> 00:37:53,300 and blow your sources? If you intervene, are you going too far in protecting a civilian organisation that might not want that protection? 371 00:37:53,660 --> 00:37:57,920 They may not ask for that protection. That's a whole other source of challenges. 372 00:37:58,400 --> 00:38:02,900 And then there's this public-private partnership people keep talking about. 373 00:38:03,260 --> 00:38:07,190 There is no partnership. There is no exchange of information. 374 00:38:07,940 --> 00:38:10,940 There is no collaboration. That needs to happen more, 375 00:38:10,940 --> 00:38:15,950 and that's going to be the real challenge. To me, cybersecurity is more of a human security issue. 376 00:38:16,400 --> 00:38:21,560 How do you protect private individuals, who are more likely to be harmed than governments? 377 00:38:22,640 --> 00:38:26,360 And we have no answer to that. We don't know who you're going to call. 378 00:38:27,410 --> 00:38:32,300 You know, we have this issue where Trump keeps saying, why didn't the DNC turn over their server? 379 00:38:33,080 --> 00:38:40,640 They did, to cybersecurity firms made up of former NSA people who left because they make more money in private industry. 380 00:38:41,600 --> 00:38:52,140 That's just the way the world works now: private industry is probably much more capable of doing a lot of this technological espionage. 381 00:38:53,330 --> 00:38:57,440 And that's a whole big challenge. My friend wrote a great article called Blue Hairs. 382 00:38:57,980 --> 00:39:05,390 It was in War on the Rocks.
Jacquelyn Schneider. What she's talking about is that the people who do technology at a high rate, 383 00:39:05,840 --> 00:39:11,660 who are very good at it, don't fit in government; they don't fit in government structures. 384 00:39:12,230 --> 00:39:17,780 I've heard many a general talk about how a cyber operative must know this system, 385 00:39:17,780 --> 00:39:22,070 that system and that system, and must know this code, and that doesn't work. 386 00:39:22,250 --> 00:39:25,370 We all know how we train students. We know how we develop talent. 387 00:39:25,370 --> 00:39:32,600 Some of the best talent we develop comes from non-traditional sources, and we find it in non-traditional ways. 388 00:39:33,230 --> 00:39:38,960 Yet a lot of what we do on cybersecurity is about replicating traditional ways of learning and methodologies. 389 00:39:39,320 --> 00:39:42,500 That isn't going to lead to some sort of revolution. 390 00:39:44,360 --> 00:39:50,689 So the challenges in practice: we do a lot of work on cyber conflict as a new means of coercion in political disputes. 391 00:39:50,690 --> 00:39:53,930 We need to understand the realities and the limits of innovation. 392 00:39:54,470 --> 00:39:58,760 We talk too much about the possibilities, but not enough about the limits. 393 00:40:00,290 --> 00:40:05,030 I'm a pessimistic person, an American and a Latino from L.A. 394 00:40:05,810 --> 00:40:10,870 I do things differently. I've never trusted the government. I grew up in the Rampart Division era, 395 00:40:10,880 --> 00:40:14,900 you know, the whole idea of the cops planting things on American civilians. 396 00:40:15,200 --> 00:40:19,510 I've always been sceptical in my life, and I think it's important here. 397 00:40:20,070 --> 00:40:26,480 The story we're being told doesn't match the evidence; there are limits to innovation.
398 00:40:27,650 --> 00:40:32,540 There's a woeful need to understand the processes of decision-making and psychological reactions. 399 00:40:32,540 --> 00:40:41,000 So we need to understand more how units make decisions as a collective group, but also the psychological aspects of cybersecurity. 400 00:40:41,690 --> 00:40:45,260 Because it is changing our brains, it's changing how we behave. 401 00:40:45,500 --> 00:40:49,610 It's changing our dependencies. It's becoming an addiction of sorts. 402 00:40:50,570 --> 00:40:57,570 We can't frame this as a war. This is more a challenge in the elements of basic human behaviour. 403 00:40:57,590 --> 00:40:59,420 We haven't done enough psychological work. 404 00:40:59,840 --> 00:41:08,000 Michael Gross and his team are doing good work, but I think others need to do more, and we're doing more work on decision-making and wargames. 405 00:41:09,110 --> 00:41:16,340 But we urge caution. States seeking to leverage cyber actions to achieve decisive effects are likely to be disappointed. 406 00:41:17,660 --> 00:41:24,260 Digital effects have been slow to emerge and tend to amplify rather than replace traditional elements of power. 407 00:41:24,680 --> 00:41:28,640 It's just making the strong states stronger. That's all it's really doing. 408 00:41:29,930 --> 00:41:33,710 The story is not much different for the future. 409 00:41:34,760 --> 00:41:37,190 One, we have a very clear problem with threat inflation. 410 00:41:38,000 --> 00:41:44,840 If we inflate the threat too much and talk too much about the danger of a massive cyber war from Russia or China, 411 00:41:45,320 --> 00:41:50,630 we're not going to prepare our defences for the future. We need to do the basic things first. 412 00:41:51,650 --> 00:41:56,630 We need to understand how we're going to control access, how we're going to maintain privacy, 413 00:41:56,870 --> 00:42:01,670 how we're not going to let Facebook give everyone's data away.
These are current challenges. 414 00:42:02,120 --> 00:42:04,880 Yet the story we tell is much grander and much different. 415 00:42:05,060 --> 00:42:14,150 The reality of the fight on a day-to-day basis is not a domain of novelty or escalation or conflict spirals. 416 00:42:14,750 --> 00:42:19,010 Cybersecurity is a domain of covert operations in international 417 00:42:19,080 --> 00:42:23,370 relations. We have very few people who have done good work on covert operations. 418 00:42:23,790 --> 00:42:30,030 Bob Jervis, Austin Carson, Caramello, but there are very, very few of them. 419 00:42:30,880 --> 00:42:33,780 I think that's where the future of international relations needs to go. 420 00:42:34,020 --> 00:42:39,240 If we're going to understand a revolution in international relations, it's more of a revolution towards covert operations, 421 00:42:39,480 --> 00:42:44,160 away from these more escalatory, dramatic conflict events. 422 00:42:45,660 --> 00:42:54,240 There are a lot of underappreciated strategic risks in cybersecurity: blowback, which means a weapon can be used right back at you. 423 00:42:55,350 --> 00:42:58,589 These are one-shot weapons that can spread. They're untested. 424 00:42:58,590 --> 00:43:02,190 They're risky. We put a lot of promise in cybersecurity, when, 425 00:43:02,190 --> 00:43:05,999 if we actually talk to a lot of cyber operators, they're very dubious 426 00:43:06,000 --> 00:43:08,760 about their own weapons, because they know the limitations of their weapons. 427 00:43:09,180 --> 00:43:16,830 The story the news media tells and the story cyber operators tell are drastically different, and that's a problem. 428 00:43:17,610 --> 00:43:23,280 I think we need empirical research to back that up. Whatever you want to call empirical research, it doesn't really matter: 429 00:43:23,280 --> 00:43:28,230 case studies, data work, psychological experiments, scenarios.
430 00:43:28,470 --> 00:43:37,140 But I want things grounded in the real world. There's too much done in cybersecurity on flights of imagination, in the hypothetical. 431 00:43:37,920 --> 00:43:46,380 All hypotheticals must be grounded in what is plausible, and you can't do research on things that are not plausible. 432 00:43:46,770 --> 00:43:52,080 A lot of the research people seem to do seems to be based on this grand idea of a cyber war, 433 00:43:52,470 --> 00:43:56,610 you know, made up from Battlestar Galactica or, say, the Ghost Fleet book. 434 00:43:56,970 --> 00:44:03,870 That's just not what's happening. We have to understand the process of modern political warfare. 435 00:44:04,710 --> 00:44:10,020 When states like Russia have a view of their status that isn't being met in reality, 436 00:44:10,410 --> 00:44:14,910 they have a very limited ability to change that perception of their status. 437 00:44:15,540 --> 00:44:23,940 And cyber is one way of doing that. Cyber is one way of trying to achieve notice, but it's not a way of achieving impact. 438 00:44:24,960 --> 00:44:29,640 And the other thing, too, is we have seen covert operations before. 439 00:44:29,640 --> 00:44:33,270 We saw reflexive control measures by Russia during the Cold War. 440 00:44:34,080 --> 00:44:37,380 You know, they said America spread HIV, things like that. 441 00:44:37,680 --> 00:44:44,850 We never threatened to bomb them for this, but we do now all the time because of the 2016 election. 442 00:44:45,420 --> 00:44:52,380 And that's a problem. And then I think, in the end, the challenge in the future is more of a personal thing. 443 00:44:52,560 --> 00:44:56,010 It's a human security issue. It's about cyber repression. 444 00:44:56,790 --> 00:45:00,720 We were talking earlier about the idea of cyber being a liberation technology. 445 00:45:00,900 --> 00:45:07,770 For many countries, it's actually the complete opposite of that.
It's actually a way of repressing and controlling your population. 446 00:45:08,310 --> 00:45:13,920 That's where cybersecurity is very effective: not state to state, but state to individual. 447 00:45:14,520 --> 00:45:18,090 So the framework we use to evaluate cybersecurity is entirely wrong. 448 00:45:19,320 --> 00:45:21,930 I shouldn't have written that book, but that's what people wanted me to do. 449 00:45:22,560 --> 00:45:25,830 There's a whole other project that I think is more important, and that's trying 450 00:45:25,830 --> 00:45:31,140 to understand how states are using these actions to control their populations. 451 00:45:31,530 --> 00:45:35,020 And we need to document that. We need to seek to name and shame it. 452 00:45:35,040 --> 00:45:38,010 We need to involve the U.N. and other civil society organisations. 453 00:45:38,280 --> 00:45:45,270 But we're so focussed on this idea of massive cyber war, I think we really miss what is going on. 454 00:45:45,630 --> 00:45:51,810 It's not a revolution of warfare between states. It's a revolution of warfare between state and individual. 455 00:45:52,560 --> 00:45:56,070 And that's where I think the future needs to go and where we need to investigate in the future. 456 00:45:57,210 --> 00:45:59,670 So thank you. I'm open to questions.