Great. Well, thank you so much, I'm just sharing my screen right now. I hope you can all see that. Right? Yes. Great. So as Christopher said, my name is Marina Favaro and I am a research fellow working on emerging technologies and arms control. And today I'll be presenting the findings from the study that I conducted with Heather Williams at King's College London. And this is actually a study that will be published next week, so we're very excited to be sharing these findings with you before anyone else, really.

So the bottom line up front of this study is that it provides a new framework for thinking and talking about emerging technologies and nuclear risk. This study groups 10 technologies into four technology clusters, which are described based on the effects that they'll have. So this study essentially groups technologies that distort, compress, thwart and illuminate in a crisis. And experts assessed that the technologies in cluster one, distort, were the most concerning in terms of nuclear risk, because they're both highly impactful and relatively easy to implement in the next 10 years.

So let's take a step back and go to the puzzle. I don't think it's very helpful to treat emerging technologies as a broad risk category. After all, this is a vague term. It's a catch-all term. And it's used to describe a really heterogeneous group of technologies from multiple operating domains, space, cyber, et cetera, and with different maturity timelines. So instead, what I wanted to do was to create a framework to better understand which technologies create what kinds of risks and how these risks can be mitigated. So here are the questions that guided my research. Essentially, I was looking at which emerging technologies are most likely to escalate a crisis, how policymakers can better understand the ways in which nuclear weapon states might feel these impacts, and which nuclear risk reduction measures could mitigate any potential risks of emerging technologies, while also capitalising on the benefits and opportunities that they present.

So I think these are important questions to ask for a few reasons. First, they help states to make resource allocation decisions: states have limited resources, and emerging technologies is, as we said, this broad category, so they need to decide what to focus on. Secondly, it offers a new means for addressing emerging technologies in collaborative ways. So this could be between nuclear weapon states and non-nuclear weapon states, or between the public sector and the private sector. And it reduces the risks of catastrophic nuclear use, which obviously we are all interested in.
So setting out the scope of the study, I shortlisted 10 technologies for consideration. These are high-powered cyber operations, deepfake technology, AI (for ISR, for example), small satellites, rendezvous and proximity operations in space, kinetic anti-satellite capabilities, satellite jamming and spoofing systems, hypersonics, swarm robotics and directed energy weapons. I promise we'll come back to them; you don't need to memorise the list.

So the function that these technologies are going to affect in this study is crisis stability. And in the simplest terms, a crisis is determined to be stable when no side has an incentive to use nuclear weapons first. The UK is the subject of the study and the timeline is ten years. So adding all this together, the study is looking at the likelihood of emerging technologies to impact a crisis between the UK and another nuclear actor over the next 10 years.

The how of the study is that I used the STREAM method, which is a method that I became familiar with when I worked at RAND. It's essentially used to look at current and potential technologies according to a range of impact and implementation criteria. So what does this mean? Essentially, comparing the impacts of technologies side by side, as we said, can be very challenging. I like to say that it's the policy equivalent of comparing apples with oranges. Now, the STREAM method doesn't necessarily allow us to compare apples with apples, because that's impossible, but it allows us to compare apples and oranges with common parameters. So that's what the study adds to the literature.

And a huge part of the method is the survey, where subject matter experts are presented with a survey that looks like this, with the ten technologies that we spoke about in the columns here. Over here, we have the impact questions. These impact questions got at: does this technology potentially deliver or enable a disarming first strike? Does it increase or decrease decision-making time? Does it create mis- or disinformation? And so those are the types of questions that experts were answering. And then over here, we have the feasibility of implementation questions, which ask which barriers exist to developing or deploying these technologies in the UK. And this includes budgetary, regulatory, ethical, legal and technical barriers, et cetera.

So in total, I received sixty-one completed surveys. This is an example of what a completed survey looks like. It was from an equal mix of policy and technical folks, which was one of the objectives of the study.
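To make the scoring mechanics concrete, here is a minimal sketch of how survey responses like these could be aggregated into a technology-by-question score matrix. The question names, the 1-to-5 scale, and the tiny sample data are illustrative assumptions, not the study's actual questionnaire or code:

```python
import pandas as pd

# Illustrative long-format survey data: one row per (expert, technology, question).
# Question IDs and the 1-5 scale are assumptions for this sketch.
responses = pd.DataFrame({
    "expert":     [1, 1, 2, 2],
    "technology": ["deepfakes", "hypersonics", "deepfakes", "swarm robotics"],
    "question":   ["impact_first_strike", "impact_decision_time",
                   "feasibility_technical", "feasibility_budgetary"],
    "score":      [4, 5, 3, 2],
})

# Pivot to a technology x question matrix, averaging across the experts
# who chose to evaluate each technology. Experts skipped technologies
# outside their expertise, so the matrix can have gaps (NaNs).
score_matrix = responses.pivot_table(index="technology",
                                     columns="question",
                                     values="score",
                                     aggfunc="mean")

# Mean impact and feasibility per technology: the inputs that a simple
# most-to-least-concerning ranking would be built from.
impact = score_matrix.filter(like="impact_").mean(axis=1)
feasibility = score_matrix.filter(like="feasibility_").mean(axis=1)
print(pd.DataFrame({"impact": impact, "feasibility": feasibility}))
```

The point of the matrix form is that each technology ends up with a full profile of scores across common parameters, which is what makes the apples-and-oranges comparison (and, later, the clustering) possible.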
And on average, not all experts evaluated all technologies like this absolute hero did, but most evaluated between three and four technologies, which is in line with the guidance: I only wanted people to evaluate technologies on which they considered themselves an expert.

So as you might imagine, that created a huge amount of data, and arguably too much data for the human brain to look at and make sense of in an unbiased way. So I actually used machine learning to cluster the technologies according to the type of impact they might have on crisis stability. And the addition of machine learning sets the study apart from that which came before it, because when I've used this method in the past, usually the output is a ranking of technologies from most concerning to least concerning. But this relies on working from the average of the impact and implementation scores for each technology, whereas what machine learning does is allow us to group together technologies that were scored similarly across similar questions. So the result is that we can see technologies that have a similar impact on crisis stability and start to characterise these clusters.

Yeah, so the computer actually graphs this with multiple axes, but I've visualised it here in 2D to make it understandable to our human brains. So here's how we can understand this graph. Up in this quadrant, we have highly feasible and highly impactful technologies. So we've got deepfake technology and satellite jamming and spoofing systems in this cluster. In this quadrant, we have highly impactful but less feasible technologies in the UK context. And in this quadrant, we have more feasible but lower impact technologies. And we'll get into this more as I go along.

So just to note that these technology clusters were also found to be statistically significant when I used pairwise testing, which is a statistical significance test. Basically, what it adds to my findings, and how it makes us more confident in them, is that it tells us that technologies in different clusters were always seen to be statistically different from each other, and technologies in the same cluster were seen to be sometimes the same and sometimes different. And that aligns with my thinking on this, because I'm not saying that technologies in the same cluster are the same. I'm just saying that they're more similar to each other than they are to the other technologies, which gives us a way of understanding the findings.
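As a rough illustration of this pipeline, here is a sketch of clustering a score matrix, projecting it to 2D for the quadrant-style plot, and running pairwise tests between technologies. The talk doesn't name the specific algorithms used, so k-means, PCA, and the Mann-Whitney U test below are stand-ins, and the random scores are placeholder data:

```python
import numpy as np
from itertools import combinations
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

# Assumed stand-in for the real data: 10 technologies scored on 12 survey
# questions (mean expert scores on a 1-5 scale).
technologies = [f"tech_{i}" for i in range(10)]
scores = rng.uniform(1, 5, size=(10, 12))

# Group technologies whose score profiles are similar across the same
# questions; this yields clusters rather than a single averaged ranking.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(scores)

# The raw space has one axis per question; PCA squashes it to two
# dimensions so the quadrant-style plot becomes drawable.
coords_2d = PCA(n_components=2).fit_transform(scores)

# Pairwise tests: compare the score distributions of every pair of
# technologies. In the study's finding, pairs drawn from different
# clusters always differed significantly; pairs within a cluster only
# sometimes did.
for i, j in combinations(range(len(technologies)), 2):
    stat, p = mannwhitneyu(scores[i], scores[j])
    same = "same cluster" if labels[i] == labels[j] else "different clusters"
    print(f"{technologies[i]} vs {technologies[j]} ({same}): p = {p:.3f}")
```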
So to give you a brief snapshot of these technology clusters: technology cluster one, which is that high-impact, high-feasibility cluster, encompasses satellite jamming and spoofing systems and deepfake technology. These are technologies that are capable of intercepting data flows and distorting the information landscape. Technology cluster two impacts the speed of conflict and could compress decision-making timelines in a crisis. So this includes hypersonic missiles, swarm robotics, kinetic anti-satellite weapons, high-powered cyber operations and rendezvous and proximity operations. The defining feature of cluster three is its ability to credibly thwart or blunt a nuclear attack, and there's only one technology that ends up falling into this cluster, which is directed energy weapons. And finally, cluster four illuminates, insofar as it provides more accurate and more comprehensive data flows to decision makers. And these were the only technologies that were identified by experts as capable of both strengthening and eroding nuclear command, control and communications, as well as both increasing and reducing decision-making time. So these present both an opportunity and a threat to crisis stability. And the policy paper spends much more time characterising each of these technology clusters and proposes bespoke risk reduction measures that are unique to their applications. But for the sake of time, I'll speed along to the implications of the study.

So, as I mentioned at the beginning, experts assessed that cluster one, distort, is the most concerning for nuclear policymakers, because these technologies are high impact and highly feasible. And indeed, these technologies are very readily accessible to civilians. There are smartphone apps and Telegram bots that create authentic-seeming deepfakes. And even if these don't convince citizens, they're still potentially disruptive. For example, they can be used to embarrass or blackmail decision makers or those with access to classified information. And one thing that analysts talk about in this area that's particularly concerning is that ninety-six per cent of deepfake videos on the internet are nonconsensual pornographic videos of women. So you can deduce that if that were to involve women who are involved in the national security infrastructure, that could be hugely impactful in an ongoing crisis. Deepfakes could also compromise classified data feeds and sow mistrust in the intelligence community's conclusions.

And similarly, you can buy a GPS jammer that plugs into your car. They're illegal, but apparently a lot of truckers use them, and delivery and service vehicles use these to manipulate their driving logs. So these can not only manipulate driving logs, but they can also interfere with emergency services like
999 calls, ambulances, firefighters, police, et cetera. So it's a very concerning thing. And mindful of the time, I'll just say, basically, you can read the report if you would like to see the bespoke risk reduction measures that are unique to each cluster. But one overarching risk reduction measure is that the public sector must cooperate more closely with the private sector, because a lot of these technologies are already being developed, or will primarily be developed, in the private sector by multinational companies that traditionally haven't worked with defence. And this is partially because of the ways that the defence procurement process is changing, and this gives rise to the potential harms of dual-use technologies. So this is a really important finding from the study. And finally, I think this framework can be used in multilateral forums like the NPT to build bridges between nuclear weapon states and non-nuclear weapon states.

So, last slide, I promise: revisiting the puzzle. I shortlisted 10 technologies that are most likely to escalate a crisis. The STREAM method and machine learning were used to group the 10 shortlisted technologies into four technology clusters. And depending on the applications, use cases and behaviours posed by each technology cluster, bespoke risk reduction measures were proposed. And that's all for me. Thank you very much for having me today.

I'll go to Marion's question first. So Marion's asking about potential risk reduction measures that the study found. Because I talked most about cluster one, distort, which has deepfake technologies and satellite jamming and spoofing systems, maybe I'll just say a bit about the risk reduction measures for that cluster specifically. So a lot of people are talking about this, specifically Nina Jankowicz, who just wrote the book How to Lose the Information War. She's a disinformation scholar, and she talks a lot about how we can use deep nudes, these nonconsensual pornography videos that I was talking about, as an entry point into legislating digital replicas of people. So this is another way of saying, essentially, that because it's so clearly a harmful use of this technology, the US Congress has even tried to go this route, or is in the process of trying to go this route. And I propose that the UK also uses this as an entry point into legislating deepfake technologies, because of their potentially harmful impacts.
And as for the protection of space-based assets, such as GPS satellites or satellites that enable early warning or NC3, I talk about how the UK could potentially use the UN General Assembly's proposal, a draft resolution on reducing space threats through responsible behaviours, which is actually a great segue into the next presentation, as a really great forum to talk about the protection of early warning and NC3 satellites.