1 00:00:00,060 --> 00:00:04,500 Joining us today is obviously David Scott. 2 00:00:04,500 --> 00:00:09,450 Right, talking to us about technology and war, for those of you who don't know him. 3 00:00:10,620 --> 00:00:17,400 David is professor of International Security and the director of the Centre for War and Technology at the University of Oxford. 4 00:00:18,060 --> 00:00:23,640 It's great to know, David, that actually war is still in the title of some centres of research in this country. 5 00:00:24,840 --> 00:00:28,380 He's also the conflict theme leader for the Partnership for Conflict, Crime and Security. 6 00:00:29,160 --> 00:00:32,100 For anybody who knows it, that's a research councils partnership, 7 00:00:32,670 --> 00:00:42,030 and he's working on, I think, a research fellowship funded by that agency, and also on how technology is shaping emergent warfare, 8 00:00:42,600 --> 00:00:50,700 relying on a variety of multidisciplinary approaches, the same way we do here: science and technology studies, philosophy and war studies. 9 00:00:51,690 --> 00:00:54,870 He's just finishing some of his projects. 10 00:00:55,080 --> 00:01:01,650 There's one about technological innovation in armed forces transformation, 'Innovation First', 11 00:01:02,070 --> 00:01:07,920 and he's the former editor-in-chief of Defence Studies and European Security, and is an 12 00:01:07,920 --> 00:01:12,600 expert on, we can perhaps say, the whole environment of defence studies. Published this year, 13 00:01:12,690 --> 00:01:20,700 probably about five months ago, I was also looking at the Routledge Handbook of Defence Studies, which will have David's name on the front. 14 00:01:20,760 --> 00:01:24,180 David, thank you very much indeed for coming to talk to us on the subject. Right. 15 00:01:24,240 --> 00:01:29,180 Thank you very much. And thank you all very much for coming today. 16 00:01:29,190 --> 00:01:32,810 It's a full house. And as was said, my name is David. 17 00:01:32,820 --> 00:01:37,620 And what I'm going to talk to you about today is what I refer to as boundless warfare. 18 00:01:37,980 --> 00:01:43,920 And I think boundless warfare is going to be contentious, and I think the way that I'm going to talk about technology is going to be contentious. 19 00:01:43,920 --> 00:01:51,059 But more than anything else, what I'm trying to do is to set out an idea, set out a hypothesis. 20 00:01:51,060 --> 00:01:55,530 I'm not trying to necessarily talk about policy in this case, 21 00:01:55,530 --> 00:01:59,189 although it does have policy implications if you were to follow the argument all the way out. 22 00:01:59,190 --> 00:02:06,540 And I know, in my experience, that some people have a very strong reaction to technological determinism. 23 00:02:07,230 --> 00:02:14,190 And so if you do have a strong reaction to technological determinism, be prepared, because it's coming. 24 00:02:14,310 --> 00:02:23,340 But by and large, I want to start with three questions, which I'm going to come back to at the very end, 25 00:02:23,340 --> 00:02:26,160 but I want to put them out there early. 26 00:02:27,330 --> 00:02:38,400 The first is: if we think about the impact of technology, and I mean modern and emergent technology, though some of it is actually quite old, 27 00:02:38,640 --> 00:02:44,219 the way that we think about everything from UAVs to network-
28 00:02:44,220 --> 00:02:53,340 enabled forces, to interventions in physiological spaces and neurological spaces and things like this. 29 00:02:53,340 --> 00:02:58,379 If we think about all those things, what do they mean for other things? 30 00:02:58,380 --> 00:03:02,280 And so, one of the things I do is ask the question: what does it mean for the state? 31 00:03:02,670 --> 00:03:04,590 So I talk about boundless warfare here. 32 00:03:04,950 --> 00:03:11,790 The state is very much a bounded entity, both in the sense that it's got boundaries and in that it's bounded in lots of other ways. 33 00:03:12,150 --> 00:03:17,820 And we talk about who we are and who they are, and things like this. 34 00:03:19,470 --> 00:03:28,560 So what role does the state have in the way that the future of war may happen? 35 00:03:28,980 --> 00:03:38,790 The second is: what is the changing relationship between the way that we talk about power and what we mean by security? 36 00:03:38,880 --> 00:03:42,060 How is that changing the way that we think about security? 37 00:03:42,060 --> 00:03:47,850 I mean, everything from traditional security and strategic security through to human security and so on. 38 00:03:47,850 --> 00:03:53,490 And all of these are what sociologists, or some sociologists, would refer to as governmentalities. 39 00:03:53,910 --> 00:03:57,810 They tell us what's important and, in some ways, what we should do about it. 40 00:03:58,080 --> 00:04:04,020 And those change over time. And I'm particularly interested in the way that they may change over time. 41 00:04:04,020 --> 00:04:10,310 And I'm particularly interested in arms control as a function of the relationship between power and security. 42 00:04:10,770 --> 00:04:20,430 So, in the way that we talk about the future of arms control, or, as I've argued in the past, the end of arms control. 43 00:04:21,570 --> 00:04:26,820 The third and final is: how does war, or technology, or boundless warfare, 44 00:04:26,820 --> 00:04:29,790 shape the relationship between security and defence? 45 00:04:30,360 --> 00:04:39,389 Because one of the things that's really quite striking, and many of you here will have witnessed this in one way or another, 46 00:04:39,390 --> 00:04:45,930 is that what we mean by national defence, what we mean by what the military does, 47 00:04:45,930 --> 00:04:50,999 what we mean by what the Home Office does when it comes to counter-terrorism, 48 00:04:51,000 --> 00:04:59,970 when it comes to cyber, when it comes to public health, public safety: all these issues in a sense have some kind of security element 49 00:05:00,000 --> 00:05:04,230 to them, and increasingly they have a defence element to them in some way or another. 50 00:05:04,470 --> 00:05:14,160 Maybe that's to defend our infrastructures against, you know, chemical attacks or war or cyber attacks or something like this. 51 00:05:14,490 --> 00:05:22,469 Or maybe it has something to do with just the way that we think about what is the boundary of something like the 52 00:05:22,470 --> 00:05:30,810 Ministry of Defence and what is the boundary of something like the Home Office, and things like that.
53 00:05:30,810 --> 00:05:34,710 And how does something like the Cabinet Office try to manage that boundary? 54 00:05:35,340 --> 00:05:38,370 And that's very specific, you know, in a way. 55 00:05:38,370 --> 00:05:40,649 But you can imagine that that's happening in many places, 56 00:05:40,650 --> 00:05:48,760 that the boundary between what we mean by security and what we mean by defence is, in a sense, blurring. 57 00:05:48,780 --> 00:05:54,360 And so I'm going to talk about all three of these questions and the way that I think about them. 58 00:05:54,570 --> 00:06:01,680 Now, the one thing that I would say is that some people have a strong reaction to this too, but hopefully you'll stay with me. 59 00:06:01,950 --> 00:06:05,339 I'm going to read some and I'm going to talk some. 60 00:06:05,340 --> 00:06:08,549 And I'm going to do things like this: 61 00:06:08,550 --> 00:06:14,640 as an American, it helps me to have a more sophisticated text that I can read, because obviously my words are simple, right? 62 00:06:14,640 --> 00:06:18,150 So, you know, okay. 63 00:06:18,540 --> 00:06:28,949 Okay. So I'm going to start, and I'd be happy to, in a sense, engage with questions as we go along in some way. 64 00:06:28,950 --> 00:06:33,689 But I realise that that may throw some people off in terms of the way that I've tried to tie things together. 65 00:06:33,690 --> 00:06:40,950 So if you do have any questions along the lines of 'what do you mean by that?' or 'could you say that again?' or something like that, 66 00:06:40,950 --> 00:06:44,100 please do ask. 67 00:06:44,880 --> 00:06:47,880 But otherwise, if you could hold your questions to the end. 68 00:06:49,470 --> 00:06:54,570 So: war. What is it good for? 69 00:06:54,810 --> 00:06:58,799 It's in our news. It's in our conversations. 70 00:06:58,800 --> 00:07:04,260 It's in our politics. It's in our gut reactions. It's in our film, television, literature. 71 00:07:04,500 --> 00:07:12,570 It's in our language. You know, war kind of proliferates through our entire essence, really, in some way or another. 72 00:07:13,290 --> 00:07:16,200 War is all around us. We can understand war as pejorative, 73 00:07:16,500 --> 00:07:24,360 in the way that we look at war, say in Syria, as a stain on humanity, as an evil, as an ultimate sign of human chaos. 74 00:07:25,080 --> 00:07:29,459 We can also understand war as human expression, togetherness, 75 00:07:29,460 --> 00:07:33,570 order: you know, something that exacts some kind of order. 76 00:07:34,560 --> 00:07:39,330 War is a feature of our historical and contemporary societies, and will be a feature of our future ones. 77 00:07:40,140 --> 00:07:45,930 And in many ways it goes straight to the heart of what we think of ourselves, 78 00:07:45,930 --> 00:07:51,569 in a way; it comes straight to our hearts, straight to our minds, and to what we think of as being important. 79 00:07:51,570 --> 00:07:58,140 And that's what Chris Hedges was trying to look at in his book: 80 00:07:58,350 --> 00:08:04,320 you know, what is this meaning? He was trying to say, why is war so brutal? 81 00:08:04,680 --> 00:08:12,419 You know, why, when we talk about war, is it so carnal in the way that it deals with the human body in some way or another?
82 00:08:12,420 --> 00:08:16,560 What is it about war that is special? 83 00:08:17,730 --> 00:08:22,620 And he doesn't mean that in a good way. He just means that it sits outside of the normal in some way. 84 00:08:23,370 --> 00:08:29,460 And then he writes this; he starts with this notion that war is important because it gives us meaning. 85 00:08:30,930 --> 00:08:43,230 And what I want to do is try to unpick that in some way, and try to say how technology has the ability to interrupt this meaningfulness of war itself. 86 00:08:44,490 --> 00:08:49,379 Now, what I'm going to start with is the way that René Descartes 87 00:08:49,380 --> 00:08:55,710 talked about the mental and the material. Descartes was particularly, 88 00:08:56,130 --> 00:09:03,990 you know, interested in this notion of Cartesian duality: this idea that, you know, 89 00:09:04,170 --> 00:09:09,420 there's something on the outside and then there's something on the inside, 90 00:09:10,080 --> 00:09:15,150 and they are somehow correlated, kind of co-produced in some way or another. 91 00:09:15,480 --> 00:09:22,950 And he talked quite a bit about that, not only in the terms that I'm talking about, maybe tool use or technology, 92 00:09:23,520 --> 00:09:31,020 but in other ways as well. But this distinction between, in a sense, what is on the inside and what is on the outside: 93 00:09:31,020 --> 00:09:34,830 this is particularly useful now. 94 00:09:36,360 --> 00:09:43,620 I think my travel to this point is characterised by several steps, 95 00:09:43,620 --> 00:09:51,530 in a way. About five years ago I was particularly interested in how technology was impacting on society and politics, 96 00:09:51,700 --> 00:09:59,640 and I was particularly interested in that not necessarily from a war studies perspective or, you know, what we're doing here. 97 00:10:00,430 --> 00:10:02,559 I was interested in it in a very broad way. 98 00:10:02,560 --> 00:10:09,550 So I have a project, for instance, on smart cities and the way that smart cities are interrupting 99 00:10:09,790 --> 00:10:17,200 the way that we do public policy, with a colleague in Singapore. 100 00:10:18,010 --> 00:10:19,209 So it's not only about war. 101 00:10:19,210 --> 00:10:26,650 I'm particularly interested in how that's shaping society. But in particular, the route for me to war was trying to understand 102 00:10:29,020 --> 00:10:35,580 really how the role of war shifts, but not necessarily the operation of it; 103 00:10:36,250 --> 00:10:40,410 you know, the shift in the sense of how and why war is meaningful in some way or another. 104 00:10:40,420 --> 00:10:54,190 Not necessarily in saying that this is the way that we can think about it in the area of access denial and issues like this. 105 00:10:55,300 --> 00:11:02,350 Not necessarily, in a sense, the very tactical, operational elements of how you employ technology, 106 00:11:02,350 --> 00:11:07,390 but really: how is war changing in terms of being meaningful for us? 107 00:11:07,420 --> 00:11:15,940 And so the first step is that I found Christopher Dandeker's work on risk and complexity and the emerging global system.
108 00:11:16,150 --> 00:11:19,300 He wrote a paper called 'New Times for the Military: 109 00:11:19,630 --> 00:11:24,940 Some Sociological Remarks on the Changing Role and Structure of the Armed Forces of the Advanced Societies'. 110 00:11:24,970 --> 00:11:29,770 Now, of course, as Robert and I were talking about just earlier, you know, 111 00:11:30,190 --> 00:11:35,330 Chris was writing this at a time when what we talked about as war meant something else entirely. 112 00:11:35,350 --> 00:11:45,429 You know, I mean, it meant something like what has been referred to as wars of the willing, you know, wars of choice, in a way. 113 00:11:45,430 --> 00:11:48,490 It wasn't about existential defence. 114 00:11:48,910 --> 00:11:55,120 It wasn't about holding on to our ways of life or holding on to this alliance in some way or another. 115 00:11:56,140 --> 00:12:03,880 It was about, you know, intervention, and sometimes, you know, more direct intervention. 116 00:12:04,270 --> 00:12:11,530 And so, but in particular, what he was interested in was the British military, 117 00:12:11,530 --> 00:12:16,420 and he was suggesting that there was a shift from defence to risk. 118 00:12:17,260 --> 00:12:20,290 You know, that in fact, 119 00:12:20,530 --> 00:12:28,360 before, we did defence, and increasingly now what we're doing is risk, and the military's responsibility lies in that. 120 00:12:28,370 --> 00:12:32,590 And again, it goes back to my final question, which is: what's the relationship between security and defence? 121 00:12:33,070 --> 00:12:38,770 Now, Dandeker would have said in that article that actually the reason why we're in this position is 122 00:12:38,770 --> 00:12:44,800 that the existential nature of war is fundamentally changing. 123 00:12:45,160 --> 00:12:50,979 So, okay, maybe with the geopolitical changes and, you know, 124 00:12:50,980 --> 00:12:57,070 Trump's election and what that means for China and what that means for Russia and what that means for little Estonia and things like that, 125 00:12:57,070 --> 00:13:01,330 maybe war means something different to us now. But secondly, 126 00:13:01,330 --> 00:13:10,090 Tim Edmunds in 2006 wrote 'What Are Armed Forces For? The Changing Nature of Military Roles in Europe', which was published in International Affairs. 127 00:13:11,020 --> 00:13:17,409 And again, what he was trying to say is that, you know, 128 00:13:17,410 --> 00:13:21,800 I live in Somerset, and you can drive around the roads of Somerset and you can see pillboxes everywhere. 129 00:13:21,830 --> 00:13:28,000 Right. I mean, that gives you a real sense of national territorial defence in some way or another. 130 00:13:28,360 --> 00:13:31,570 And what Tim Edmunds was pointing to is that, 131 00:13:31,900 --> 00:13:36,220 well, we're not really asking militaries to do that sort of thing any longer. 132 00:13:36,460 --> 00:13:42,640 In a sense, there's been this disconnection between territorial defence and what we're asking our militaries to do, 133 00:13:42,640 --> 00:13:47,350 even to the point where we're asking our militaries to go into Afghanistan and Iraq, 134 00:13:47,350 --> 00:13:52,060 and we're not even asking them to hold territory in the way that we would traditionally have done.
135 00:13:53,170 --> 00:14:03,940 And then we vacillate on that issue. And then finally, Anthony King, who some of you may know; he's at the University of Warwick now. 136 00:14:05,230 --> 00:14:10,750 He wrote a book called The Transformation of Europe's Armed Forces: From the Rhine to Afghanistan. 137 00:14:11,410 --> 00:14:17,610 And again, what he was trying to say is that there's something changing about these armed forces. 138 00:14:17,620 --> 00:14:22,450 You know, the idea that increasingly they were kind of going in different directions. 139 00:14:22,690 --> 00:14:27,880 And he uses sociological ideas of production about the way that militaries do their business. 140 00:14:28,310 --> 00:14:33,310 And he was saying that we'd see this split. His most recent book, The Combat Soldier, does this even more. 141 00:14:33,640 --> 00:14:42,370 And he talks about, you know, this idea that increasingly we expect elite troops to do the production part, 142 00:14:42,610 --> 00:14:50,470 and then we ask others to do the other bits of defence that we don't think of as war any longer, and things like that. 143 00:14:50,500 --> 00:14:57,820 So all this suggested to me that this shift from defence to risk was particularly interesting. 144 00:14:58,150 --> 00:15:02,590 And one of the things that I thought made it all possible was technology. 145 00:15:03,940 --> 00:15:07,020 Of course, you could say that globalisation and things like this also made it possible. 146 00:15:07,030 --> 00:15:14,260 But for me, what was particularly interesting was the role of technology in that change. 147 00:15:15,070 --> 00:15:21,140 Now, thinking about the way that Chris and Tim and Tony talked about war over this time, 148 00:15:21,190 --> 00:15:25,089 maybe they were talking about war in a slightly different age. 149 00:15:25,090 --> 00:15:31,920 And I have to say, before I came here today, I don't think I'd really concretely thought about that era as a different era. 150 00:15:31,930 --> 00:15:39,920 It's just, as I talk, it seems more and more the case. As I read this, it seems that they were in a different era, and already here today 151 00:15:39,940 --> 00:15:45,130 what drives the shift from defence to risk is technology and the rise of what I refer to here as boundless warfare. 152 00:15:45,790 --> 00:15:54,040 Now, the first thing to say is what I mean by technology, and I mean technology writ large. 153 00:15:54,550 --> 00:16:02,200 You know, technology is not just about networks or about processors or about some kind of mechanical thing. 154 00:16:03,820 --> 00:16:09,729 I talk about it as anything that enables. For instance, 155 00:16:09,730 --> 00:16:16,840 if you read Michael Horowitz's book on the diffusion of military power, or something 156 00:16:16,840 --> 00:16:23,049 like that (I can't remember that final word), what he's particularly interested in is why, 157 00:16:23,050 --> 00:16:27,400 you know, why do some states take on something like aircraft carriers and others don't? 158 00:16:28,060 --> 00:16:34,000 You know, why is that the case? Is it about economics, that we can't afford aircraft carriers and invest in submarines instead? 159 00:16:34,750 --> 00:16:41,230 Or is it something else? And he argues that it's a mix of things, as academics usually do.
160 00:16:41,680 --> 00:16:50,560 And one of the things that he points to is the role of the U.S. Marine Corps as being a fantastic 161 00:16:50,980 --> 00:16:58,570 enabler for the U.S. military, one that allows it to do lots of things in a very flexible and agile way. 162 00:16:59,260 --> 00:17:01,540 And what he's particularly interested in is 163 00:17:01,540 --> 00:17:07,660 the notion of the warrior ethos and what that means for the Marine Corps as an enabler in that way. 164 00:17:08,620 --> 00:17:15,210 So does that mean that Michael Horowitz was talking about soldiers as technology? 165 00:17:15,220 --> 00:17:20,320 In a way, I argue that he is, and that they are. 166 00:17:22,060 --> 00:17:28,390 But technology also does other things. You know, it's about geolocation, telling us where things are. 167 00:17:28,570 --> 00:17:32,950 It's about keeping boots off the ground. It's about tapping into your and my communications. 168 00:17:33,880 --> 00:17:41,020 Thank you. It also allows for greater lethality, more pursuit, and the accumulation or the dispersal of power. 169 00:17:42,040 --> 00:17:47,710 Technology is satellites, networks; it's processors, personal computers; it's robots. 170 00:17:48,010 --> 00:17:51,130 All the stuff that we think of readily in terms of technology. 171 00:17:51,400 --> 00:17:56,110 But it's also the results of the material, biological and chemical sciences; you know, 172 00:17:56,110 --> 00:17:59,220 the way that these are changing and converging in some way. 173 00:17:59,230 --> 00:18:01,560 So I was here maybe about eight months ago, 174 00:18:01,570 --> 00:18:07,300 and I was talking about this convergence of science and technology and what that meant for new technologies. 175 00:18:08,170 --> 00:18:12,399 Perhaps even more contentious is the combat soldier. As I said before, Tony King's book 176 00:18:12,400 --> 00:18:17,950 The Combat Soldier is a very good example of how we have something like martial technology. 177 00:18:18,130 --> 00:18:25,360 And this martial technology is constantly developing. And what Tony does in that book is look at, in a sense, 178 00:18:25,360 --> 00:18:30,700 the way that militaries have evolved over time and the role of the combat soldier in that over time. 179 00:18:30,700 --> 00:18:40,870 It's an excellent book, and I recommend it. The concept of the soldier or the warrior, and even what gender theorists would 180 00:18:40,870 --> 00:18:46,149 talk about in terms of something like the notion of the warrior as a man and things like this: 181 00:18:46,150 --> 00:18:54,210 that becomes a technological construct in a way, something that you can employ, something that you can use to get people into the military, 182 00:18:54,220 --> 00:19:01,240 something that you can use to get them to do, in a sense, quite amazing things that an otherwise rational human being would never want to do. 183 00:19:02,030 --> 00:19:11,530 Yeah. I mean, it's quite an amazing discursive technique, you know; it's very, very useful. 184 00:19:12,160 --> 00:19:19,030 So here I want to talk about space, time and force, because I think these three things are going to be the way that we can talk about 185 00:19:19,360 --> 00:19:23,020 boundless warfare as being impactful. 186 00:19:23,980 --> 00:19:28,930 So, space: we can understand this in terms of distance or dispersal or concentration.
187 00:19:29,440 --> 00:19:32,560 It's very open, you know, a simple discussion around space. 188 00:19:32,980 --> 00:19:38,680 But in particular, I'm not really talking about space only in terms of geographical space. 189 00:19:39,100 --> 00:19:47,620 So you could understand it in terms of networks, with the idea that, you know, 190 00:19:47,620 --> 00:19:51,040 nothing is at the centre of a network, and that's why it's a network, 191 00:19:52,300 --> 00:19:58,780 but you can imagine that there are distances within the network that make a difference sometimes. 192 00:19:59,260 --> 00:20:05,770 So there's not, in a sense, a centre and periphery to a network, but there are distances. 193 00:20:05,810 --> 00:20:13,780 So, that's space. Time: we can think of, you know, the relationship between long and short time. Again, that's very, very straightforward. 194 00:20:13,780 --> 00:20:17,319 I think we could get more sophisticated in the way we talk about time, 195 00:20:17,320 --> 00:20:21,760 but I'm not going to do that today unless you want to ask questions. And then, finally, force: 196 00:20:21,880 --> 00:20:26,260 you know, in the way that we talk about kinetic force, mass and acceleration. 197 00:20:27,260 --> 00:20:34,940 All these things become increasingly interesting in the way that we talk about current, contemporary technology. 198 00:20:34,940 --> 00:20:45,950 And I recommend Peter Singer's 2009 book Wired for War as a great example of just that. 199 00:20:47,420 --> 00:20:52,490 So we'll come back to this as we discuss bounded and unbounded war, in a way. 200 00:20:52,520 --> 00:21:00,079 So I use 'boundless warfare' because I make a distinct point of whether bounded warfare can 201 00:21:00,080 --> 00:21:05,830 become unbounded, or whether boundless warfare is something else entirely. Bounded warfare 202 00:21:05,840 --> 00:21:12,400 is what we would ordinarily understand as, say, the Second World War, the First World War, maybe even the Napoleonic Wars or something like that. 203 00:21:12,410 --> 00:21:19,320 We understand that there's some kind of start and some kind of end, in a way. 204 00:21:19,360 --> 00:21:25,099 Now, if we think back to the Thirty Years' War or the Hundred Years' War, we think about the beginning and the end. 205 00:21:25,100 --> 00:21:29,810 But of course, 'the Thirty Years' War' and 'the Hundred Years' War' are not necessarily the best ways to describe those wars, 206 00:21:30,620 --> 00:21:36,320 because the Hundred Years' War lasted longer than a hundred years, and the Thirty Years' War lasted longer than thirty years. 207 00:21:39,200 --> 00:21:45,260 So, you know, historically, this is the way that we think about war. 208 00:21:45,980 --> 00:21:51,800 Surely we only see this bounded like that because we look back from this point in history at the time? 209 00:21:51,830 --> 00:21:54,860 We'll come back to that. Yeah. No, that's good. Yeah, that's good. Yeah. Thank you. 210 00:21:55,160 --> 00:21:57,020 Yeah. What, can we come back to that? Yeah. 211 00:21:57,230 --> 00:22:07,549 But the important thing is that they become historically important for us as bounded activities, 212 00:22:07,550 --> 00:22:11,030 you know, bounded notions and historical activities in time.
213 00:22:11,910 --> 00:22:15,760 And whether or not they were really like that, it's how they mean; 214 00:22:15,920 --> 00:22:20,030 you know, how they're meaningful for us. And that's what I'm particularly interested in: 215 00:22:20,030 --> 00:22:25,820 how meaningful that bounded notion is, not whether or not it was really bounded that way. 216 00:22:27,020 --> 00:22:32,479 So, okay, we can understand technology as applied, as used, as passive, or, you know, 217 00:22:32,480 --> 00:22:44,150 as something that has some kind of agency in some way or another; you know, like a robot, a demining robot, for instance. 218 00:22:44,150 --> 00:22:49,400 It has agency. It's something that you can put into the field, and it can do things, in a way. 219 00:22:49,640 --> 00:22:55,730 Sometimes it can do things through remote control and sometimes it can do things through artificial intelligence. 220 00:22:58,250 --> 00:23:01,520 But the technology does not have to be an agent itself. 221 00:23:01,520 --> 00:23:06,440 So many scholars are looking at the impact of technology on defence in this fashion: UAVs, 222 00:23:06,440 --> 00:23:15,260 the future of air power, processing speeds, big data, network-enabled forces, you know, the way that command and control, C2, is done. 223 00:23:15,710 --> 00:23:23,630 These are all, in a sense, ways in which the human, or the soldier, is becoming increasingly enmeshed in technological systems. 224 00:23:24,500 --> 00:23:33,320 Now, let me say here what I mean by technology: it's much in the way that a former researcher of 225 00:23:33,370 --> 00:23:41,660 mine, Manabrata Guha, talks about in his book on reimagining war in the 21st century. 226 00:23:42,920 --> 00:23:50,059 Largely, what he's doing is trying to understand technology as a system or a way of doing things, and again, 227 00:23:50,060 --> 00:23:57,350 that makes the point towards something like a governmentality. So the ways of understanding, the ways of knowing, the ways of seeing, 228 00:23:58,340 --> 00:24:04,850 the ways of governing: all these things are being impacted by these sorts of systems. 229 00:24:05,540 --> 00:24:08,890 Much in the way that we talk about fake news, for instance, you know, 230 00:24:08,900 --> 00:24:13,430 and what kind of impact that had on elections and other things. 231 00:24:13,430 --> 00:24:19,969 But this raises some very interesting issues around the way that we think about technology, and the way 232 00:24:19,970 --> 00:24:26,330 that Descartes talked about the love of the system and the way that Newton talked about the value of the system. 233 00:24:26,810 --> 00:24:32,719 And of course, you can see John Stuart Mill after that, with utilitarianism being an extension of that. 234 00:24:32,720 --> 00:24:41,690 But this notion of love for the system versus value of the system changes the way that you think about technology. 235 00:24:44,400 --> 00:24:54,120 So I'm going to move around a little bit. So let's take a specific example, for those that have followed me up to this point. 236 00:24:54,130 --> 00:24:56,100 Now, there's a few of you who have followed me up to this point.
237 00:24:56,490 --> 00:25:03,000 You might see this as technological determinism, because I talked about technology having agency in some way. And if it has agency, 238 00:25:03,300 --> 00:25:10,320 then it has this ability to redirect outcomes in some way or another, such that if it weren't there, 239 00:25:12,240 --> 00:25:16,170 human behaviour would not arrive at the same endpoint in some way or another, right? 240 00:25:16,440 --> 00:25:24,480 Maybe it has some kind of constitutionalising element that changes reality as we know it. 241 00:25:25,680 --> 00:25:30,850 So, Ruth Miller's book Snarl. It's a great book; 242 00:25:30,870 --> 00:25:38,070 I can't talk about it enough. Ruth Miller: it's called Snarl: In Defense of Stalled Traffic and Faulty Networks. 243 00:25:39,570 --> 00:25:45,270 Ruth uses an interesting historical example to illustrate the power of technology once applied. 244 00:25:46,230 --> 00:25:51,840 The example is interesting because it illustrates what we refer to as something like dual use: 245 00:25:52,200 --> 00:25:58,529 you know, when we talk about something having both civil and martial qualities, in a way. So: 246 00:25:58,530 --> 00:26:05,580 in the 18th century, the city of Paris was covered in paving stones to cover what would ordinarily be dirty, muddy streets. 247 00:26:06,330 --> 00:26:12,330 Paris was not the first to do this by any means, nor obviously the last. 248 00:26:13,110 --> 00:26:19,470 But in this case, the martial qualities of the paving stones fundamentally changed the system in which prior 249 00:26:20,400 --> 00:26:26,180 political and power relations had once stood. So she shares a quote from a French civil servant of the time. 250 00:26:26,190 --> 00:26:33,870 So again, this is in the 18th century: paving stones, often the most suitable material for the construction of barricades in moments of civil war. 251 00:26:34,770 --> 00:26:40,730 It was this that in June 1848 (sorry, I said 1890 before) 252 00:26:41,220 --> 00:26:46,380 the streets of Paris were covered in a few hours with a series of citadels which cannonballs could hardly demolish. 253 00:26:46,920 --> 00:26:52,650 Much bloodshed would have been spared on both sides if such materials had not existed close to hand. 254 00:26:53,820 --> 00:26:57,420 The government of the French Republic, deeply impressed with the magnitude of this danger, 255 00:26:57,420 --> 00:27:02,549 called the serious attention of the engineers of the Paris roads to the necessity of replacing this ancient 256 00:27:02,550 --> 00:27:08,850 system of stone paving by some other which would not offer materials for the construction of barricades. 257 00:27:09,480 --> 00:27:12,890 So yeah, it's these things, right, 258 00:27:12,960 --> 00:27:15,480 that caused the problem in some way. 259 00:27:16,770 --> 00:27:23,850 So Miller argues that roads are a network technology; I mean, a road is a network, and a road to nowhere is useful to no one. 260 00:27:24,480 --> 00:27:35,340 Right. It's a network technology. The roads that connect must be understood to play a constitutional role, or a constitutionalising role, 261 00:27:35,670 --> 00:27:38,820 a system role, in the way the world as we know it operates. 262 00:27:39,540 --> 00:27:48,420 So she states: our roads are at war regardless of the needs, desires and activities of class or of the human citizen subjects.
263 00:27:48,960 --> 00:27:55,290 You know, they've been built for that purpose. They're at war regardless of whether or not we're using them in that way. 264 00:27:55,740 --> 00:28:00,930 We can think about how the transformation of stone paving into barricades and weapons turns roads, 265 00:28:01,200 --> 00:28:05,370 regardless of the people on them, into physical environments of military activity. 266 00:28:07,340 --> 00:28:12,190 So this is particularly problematic, I would suggest, in the way that she's talking about it. 267 00:28:12,200 --> 00:28:13,129 I mean, it's interesting, 268 00:28:13,130 --> 00:28:19,970 but it's particularly problematic in the way that she's saying that essentially technology, in this case just the paving-stone technology, 269 00:28:19,970 --> 00:28:26,300 becomes something that fundamentally changes the progress of war following it. 270 00:28:27,680 --> 00:28:34,430 Now, of course, if we think about, you know, the introduction of the firearm, for instance, 271 00:28:34,430 --> 00:28:43,670 into war, or heavy infantry or something like that, we can say that obviously it had an impact on war in one way or another. 272 00:28:43,940 --> 00:28:48,620 Does that necessarily change the way that we talk about war as an extension of politics? 273 00:28:49,730 --> 00:28:56,510 Could we talk about not only the character of war as being distinctly different, 274 00:28:56,510 --> 00:29:02,600 but could we even talk about the nature of war as being distinctly technological? 275 00:29:03,910 --> 00:29:07,000 That is, in some way prior to the political. 276 00:29:08,440 --> 00:29:11,139 Yeah. And that's the idea that she's putting forward, 277 00:29:11,140 --> 00:29:17,950 and that I'm relying on here to put forward this notion. So I can see that there's some discomfort 278 00:29:17,950 --> 00:29:26,979 in talking about Clausewitz in this way, but okay. So let's bring this back to the notion of bounded and boundless warfare. 279 00:29:26,980 --> 00:29:30,250 What do we mean by bounded warfare? We can think of it through two lenses. 280 00:29:30,250 --> 00:29:35,380 The first is the political, or the democratic, lens: the relationship between war and states. 281 00:29:36,850 --> 00:29:42,790 And that relationship has been long studied, you know, with the idea that the relationship between them, 282 00:29:43,510 --> 00:29:48,670 the conversation, goes like the chicken and the egg: which came first? Do states make wars, or do wars make states? 283 00:29:49,120 --> 00:29:54,670 And the wars of France, revolutionary, Napoleonic and otherwise, are fundamental to the way that we think about France today. 284 00:29:55,150 --> 00:30:02,590 And these wars sit between two bookends, you know, the way that we can imagine, 285 00:30:03,040 --> 00:30:08,859 the way that we can imagine wars as being wars between two bookends: 286 00:30:08,860 --> 00:30:13,599 they are bounded by beginnings and ends, by epochs or eras, by political narratives. 287 00:30:13,600 --> 00:30:18,130 We understand that the wars against Napoleonic France or Nazi Germany are important for us. 288 00:30:18,160 --> 00:30:27,879 They give us meaning. Yeah. Even to the point where it's very difficult for us to understand narratives that in some way go against that.
289 00:30:27,880 --> 00:30:31,810 You know, George Soros was called a 'Jewish Nazi' today. 290 00:30:32,560 --> 00:30:38,110 Well, for anybody who can remember the Second World War, I mean, how could you, how can you be a Jewish Nazi? 291 00:30:38,380 --> 00:30:41,620 The two worlds just kind of collide in some way or another. 292 00:30:42,880 --> 00:30:47,290 Can we just go back to your paving stones? Sorry. Yeah, the 1848 one. 293 00:30:47,470 --> 00:30:52,870 Yes. Hadn't the French already, by that time, as a result of Napoleon, 294 00:30:54,170 --> 00:30:59,870 designed the capital in such a way that the military could easily attack? 295 00:30:59,930 --> 00:31:03,500 Yes. Yes. On the bridges and things like that. Yes, absolutely. 296 00:31:03,530 --> 00:31:08,870 How does this fit the picture? Well, I mean, I think you're being more explicit 297 00:31:09,110 --> 00:31:10,760 than I was, as I should have been. 298 00:31:11,060 --> 00:31:17,990 In fact, the argument there is that these roads had already been redesigned for very martial purposes. 299 00:31:18,110 --> 00:31:24,230 And what's particularly interesting, and my apologies, I should have been more explicit about this, 300 00:31:24,650 --> 00:31:30,469 is that the issue there becomes that something like the road not only has commercial purposes, 301 00:31:30,470 --> 00:31:37,760 but it has this ability to bite back. And she shows this over and over, this relationship between the two. 302 00:31:38,160 --> 00:31:47,630 And so the key thing is that if we think about this, and we talk about the war on terrorism or the war against terrorism, 303 00:31:47,870 --> 00:31:50,929 and we can talk about the war against drugs 304 00:31:50,930 --> 00:31:54,620 in the case of the United States: do they give us much meaning? 305 00:31:55,640 --> 00:32:01,940 You know, do they? When we look back to the war on terror, will we think about it in the 306 00:32:01,940 --> 00:32:08,120 same way that we thought about defeating Nazi Germany or Napoleon or something like that? 307 00:32:08,480 --> 00:32:11,720 You know, the likelihood is we won't. Yeah. 308 00:32:13,070 --> 00:32:15,750 Maybe because they're not existential in the same way; 309 00:32:15,750 --> 00:32:21,260 terrorism or drugs will not wipe us off the map, exterminate Starbucks, and so on. 310 00:32:21,530 --> 00:32:23,989 That is not to say that they're not politically important, 311 00:32:23,990 --> 00:32:28,970 you know, that we don't need to be protected from terrorist activities and things like that. 312 00:32:30,020 --> 00:32:33,920 But they are not wars, because they do not give us meaning. 313 00:32:34,130 --> 00:32:41,660 If that is the way that we can define wars, that wars give us meaning, then the war on terror doesn't give us meaning in the same way. 314 00:32:42,320 --> 00:32:47,630 That's not to say that they don't have strategies, deployments, tactics, operations, casualties and so on. 315 00:32:48,680 --> 00:32:53,240 So what is meant by boundless warfare? Warfare needs an enemy. 316 00:32:53,780 --> 00:32:58,940 Warfare has a start and an end. Warfare has discernible action and reaction. 317 00:32:59,510 --> 00:33:07,190 War has ideology. These are three boundaries of war that are becoming eclipsed by technology. 318 00:33:07,970 --> 00:33:11,360 Yeah.
So, you know, we need an enemy. We need a start and an end. 319 00:33:12,140 --> 00:33:16,100 We need discernible action and reaction. And we need ideology. 320 00:33:16,100 --> 00:33:28,080 That's four, actually; I miscounted in my notes. So, in 1984, George Orwell imagined war bounded by enemy and boundless by time. 321 00:33:29,520 --> 00:33:35,010 Oceania was always at war with either Eurasia or Eastasia. 322 00:33:36,360 --> 00:33:44,099 The enemy changed, but the action did not. If we hold to Chris Hedges' claim that war is a force that gives us meaning, 323 00:33:44,100 --> 00:33:51,570 then we can see why the government in Oceania, and presumably Eurasia and Eastasia, would have used war as a means of control. 324 00:33:53,100 --> 00:33:58,680 The key thing is, if you read 1984, you realise that it doesn't give any meaning to Winston Smith. 325 00:33:59,550 --> 00:34:08,520 You know, it's lost its meaning. And this is particularly interesting, because what we see in something like Mary Dudziak's book, War Time, 326 00:34:09,210 --> 00:34:14,490 is an example of this notion of peacetime versus wartime and the 327 00:34:14,490 --> 00:34:19,320 vacillation between what you would see in peacetime and what you would see in wartime. 328 00:34:19,360 --> 00:34:30,720 And she's a legal scholar, and she's particularly interested in the way that civil liberties are impacted by peacetime and by wartime. 329 00:34:31,140 --> 00:34:37,550 And what she argues, and she's just looking at the 20th century in the United States, is that, 330 00:34:37,560 --> 00:34:44,100 as you can see, in wartime civil liberties go down and then in peacetime civil liberties go up. 331 00:34:44,460 --> 00:34:48,570 And you can see this vacillation, like a seesaw, between wartime and peacetime. 332 00:34:48,780 --> 00:34:51,270 But her argument is that as you go through the Cold War, 333 00:34:51,300 --> 00:34:57,600 you get introduced to a period in which the notion of peace versus war starts to become confused, in a way. 334 00:34:58,350 --> 00:35:06,690 And then eventually you get to the post-Cold War period, where, largely, we're not even sure what peace and war are any longer. 335 00:35:07,590 --> 00:35:13,850 But needless to say, we don't see the same vacillation. We don't see the same return to what we would see in peacetime. 336 00:35:14,010 --> 00:35:18,360 She's arguing that actually we live in a period of eternal wartime now, 337 00:35:18,990 --> 00:35:27,450 and that's why we can have really quite radical restrictions that perhaps we would not have allowed before. 338 00:35:28,210 --> 00:35:38,490 And also, very recently, the government has gained the ability to look into our communications like never before. 339 00:35:42,530 --> 00:35:45,659 I mean, I'm not saying whether that's a good thing or a bad thing, but what I am saying is 340 00:35:45,660 --> 00:35:52,649 that that's a level of, that's an issue around, civil liberties 341 00:35:52,650 --> 00:35:56,340 that perhaps wouldn't have been allowed at any other time than wartime. 342 00:35:57,870 --> 00:36:01,890 That's Mary's argument more than it is mine, but it's well worth reading. 343 00:36:01,900 --> 00:36:05,040 But again, Mary is fantastic. 344 00:36:05,670 --> 00:36:09,540 I've never met Mary, but her work is fantastic. So what do we know?
345 00:36:10,440 --> 00:36:17,250 So when it comes to something like this, you know, an enemy and a radar, when we're trying to find the enemy: 346 00:36:18,240 --> 00:36:20,880 what if we don't know who is attacking us? 347 00:36:22,380 --> 00:36:28,890 And, you know, if you're thinking about something like cyber situational awareness, the idea of identification 348 00:36:28,890 --> 00:36:35,610 actually comes really quite far down the list of actions, because what you really need to do is to respond. 349 00:36:35,850 --> 00:36:38,940 You know, you need to control the damage. 350 00:36:39,360 --> 00:36:42,870 You need to identify the damage. You need to control the damage. 351 00:36:43,440 --> 00:36:48,270 You need to react in some way. And then, only then, can you start to see who has attacked you. 352 00:36:49,830 --> 00:36:57,790 You know, that's the issue there. And maybe we do have an idea of who did attack us, in a way. 353 00:36:57,840 --> 00:37:02,400 You know, if we're Sony, for instance, maybe we have an idea about who attacked us. 354 00:37:03,030 --> 00:37:10,040 Maybe it was the North Koreans or something like that. Right. But maybe we don't. 355 00:37:10,050 --> 00:37:11,190 You know, in particular, 356 00:37:11,190 --> 00:37:21,360 we could imagine how something like cyber attacks are increasingly difficult: not only are they difficult to pin down, 357 00:37:21,360 --> 00:37:24,540 but the issue of plausible deniability comes online. 358 00:37:25,170 --> 00:37:30,150 And the Russians say, well, we don't know about this Fancy Bears thing, but it really doesn't have anything to do with us. 359 00:37:30,930 --> 00:37:41,150 And the United States says, no, you intervened, and you hacked the DNC accounts, and then you publicised this, 360 00:37:41,160 --> 00:37:42,510 and it had a big impact on our elections. 361 00:37:43,620 --> 00:37:51,960 And so the idea then becomes: maybe there is, you know, maybe there's a person in Whitehall that knows who has done that. 362 00:37:52,710 --> 00:38:00,190 Whether that's politically significant or not, or whether it's used as a matter of fact or truth, is neither here nor there. 363 00:38:00,210 --> 00:38:06,930 And so the issue of the enemy, you know, the other who has attacked us, becomes problematic. 364 00:38:09,470 --> 00:38:14,660 So let us think about that. And so let's return to Clausewitz, if you allow me. 365 00:38:14,690 --> 00:38:20,750 So let us think of this relationship and how we think about war itself as an extension of politics. 366 00:38:21,500 --> 00:38:25,310 So Carl von Clausewitz states that war is a continuation of politics. 367 00:38:25,890 --> 00:38:31,370 Okay. Others, including Christopher Coker, talk about war as a function of human nature. 368 00:38:31,970 --> 00:38:36,740 That sounds very similar to me, in a way. And Christopher Coker's 369 00:38:36,740 --> 00:38:44,140 book Warrior Geeks: great book, and I recommend it, as I've said about all the books today. 370 00:38:44,930 --> 00:38:51,259 In Warrior Geeks, he asks himself the question of whether we ever imagined that robots would do all the fighting for us, 371 00:38:51,260 --> 00:38:59,540 and eventually what we could do is just kind of sit back, let other things do the fighting, and even if they destroyed all of our machines,
372 00:38:59,540 --> 00:39:04,130 by and large, we could just remain quite cosy where we are, in a way. 373 00:39:04,430 --> 00:39:11,960 And he argues that, no, that's not going to happen. And the reason it's not going to happen is because war is part of human nature itself: 374 00:39:12,470 --> 00:39:18,500 the more and more you take humans out of war itself, the more and more war will try to pull humans back into it again. 375 00:39:18,980 --> 00:39:21,150 And that's just the nature of human nature. 376 00:39:21,170 --> 00:39:25,700 So, okay, I don't know that I'm making that argument today, but I think it's an interesting argument. 377 00:39:26,270 --> 00:39:30,230 So we can refer to both, you know, 378 00:39:30,240 --> 00:39:38,450 the notion of war as an extension of politics and as an extension of human nature, as something like this: 379 00:39:38,900 --> 00:39:46,559 at the centre there you get the political human. The human is kind of fundamental to that in some way or another; 380 00:39:46,560 --> 00:39:49,850 it becomes a social function, you know, the political function becomes a social function. 381 00:39:50,210 --> 00:39:59,150 And within this political dynamic, we can think of three determinants of war itself, something that I've already kind of pointed to, in a way. 382 00:39:59,180 --> 00:40:07,310 One is the way that we think about information. Another is the way that we think about citizenship, or individual relationships to the state. 383 00:40:08,510 --> 00:40:19,120 And finally, there is the notion of decision-making. So, for Clausewitz, the political function exists within what he refers to as the Trinity. 384 00:40:19,170 --> 00:40:28,470 I suspect that you know this relationship between the state and society and the military and so on. 385 00:40:29,190 --> 00:40:35,969 And it can be seen in particular in the notion that war must remain subject to the action of what he refers 386 00:40:35,970 --> 00:40:40,860 to as a superior intelligence, and has a political purpose in some way or another. 387 00:40:41,250 --> 00:40:51,300 And that's why, even though we're particularly worried about people having the ability to, say, 3D-print handguns, we're not overly worried about it. 388 00:40:51,330 --> 00:40:53,370 I mean, we're worried about it, but we realise that 389 00:40:53,370 --> 00:40:58,230 not everybody has an interest in going home, printing their 3D handgun, and going out and killing people. 390 00:40:58,650 --> 00:41:04,110 There's some kind of political element there that actually restricts people from being murderers. 391 00:41:04,380 --> 00:41:08,970 Right. I mean, there's a political something there 392 00:41:08,970 --> 00:41:13,920 that stops people from just being self-destructive or destructive towards someone. 393 00:41:15,020 --> 00:41:20,329 So maybe that's the case. Maybe politics really does make a difference. 394 00:41:20,330 --> 00:41:27,920 So maybe I'm not overturning Clausewitz. But herein lies a problem for us, and for the paving stones in particular. 395 00:41:28,430 --> 00:41:37,840 What role does technology have in political purpose? Surely a rioter has to choose to pick up the rock in order to throw it, right? 396 00:41:37,850 --> 00:41:41,030 You know, I mean, that's a political, that's a human action, if you will.
397 00:41:42,380 --> 00:41:51,290 You know, it's a human action to throw a rock at the police, or an RPG or something, you know, I mean, 398 00:41:51,290 --> 00:42:02,749 to take it from street violence toward war. The political purpose is the factor, 399 00:42:02,750 --> 00:42:05,480 the driver, that leads the rioter to make the choice, 400 00:42:05,930 --> 00:42:15,050 you know, to be an agent of violence. The conception of violence, or warfare in our case, requires a political human, 401 00:42:15,320 --> 00:42:21,470 you know, the political activism of another. So at the moment, that stays right within what we would imagine Clausewitz would tell us. 402 00:42:23,090 --> 00:42:26,750 But, in particular, consider this. 403 00:42:27,260 --> 00:42:33,440 Let's assume that our conception of the political human, or at least the agency that he or she possesses, is limited; 404 00:42:34,010 --> 00:42:41,030 that actually there's a limit to how much is going to be explained by politics. In this conception of the political human, 405 00:42:41,030 --> 00:42:46,579 we can only have technical objects, right? We have the human, who's the key agent or actor. 406 00:42:46,580 --> 00:42:52,610 And then we have these technological objects. And the political human, you know, cannot see well enough, 407 00:42:52,610 --> 00:42:56,210 so what does he do? He gets an object, glasses, you know, to overcome blurry vision. 408 00:42:56,620 --> 00:42:59,630 He's taken on the tool. 409 00:43:01,600 --> 00:43:09,770 So: this is interesting for us, because we realise that the political and the human has to stop somewhere. 410 00:43:10,550 --> 00:43:15,110 You know, in my case it's the glasses or the watch or the thumb or something like that. 411 00:43:16,160 --> 00:43:22,040 And so in this conception of the political human, we only have technical objects, not technical agents. 412 00:43:22,640 --> 00:43:28,910 The technical agents of war, like roads, like IT networks, like UAVs, are themselves enablers or agents. 413 00:43:30,140 --> 00:43:33,680 And this reminds me of Laurie Calhoun's book, We Kill Because We Can. 414 00:43:34,610 --> 00:43:37,940 And it's an interesting book. 415 00:43:38,970 --> 00:43:42,620 I mean that in the British sense of 'interesting', as opposed to the American sense. 416 00:43:43,310 --> 00:43:50,660 It's an interesting book. And the argument largely is almost a technological determinist argument: that we have this 417 00:43:50,660 --> 00:43:57,350 ability to hover and to kill people at will, through the use of drones, anywhere in the world, 418 00:43:57,560 --> 00:44:02,960 and because we have this ability, we will continue to see threats in that way. 419 00:44:03,890 --> 00:44:10,190 We see threats in that way because we have the drones; it's not that we see the threats first and then develop the drones. 420 00:44:10,550 --> 00:44:14,030 And so. But it's interesting. 421 00:44:14,180 --> 00:44:21,079 A controversial book, I'd say, but less so. Information, as it is, comes into being when data 422 00:44:21,080 --> 00:44:25,190 is processed, you know, when it's given meaning, when it's appropriated in some way, 423 00:44:25,910 --> 00:44:30,410 and its relationship to bounded warfare is quite pronounced. 424 00:44:30,410 --> 00:44:36,230 You know, this idea that information became fundamental to the way that we did war in Afghanistan and Iraq.
425 00:44:36,620 --> 00:44:42,469 Information was fundamental to the way that we looked at PowerPoint presentations in the morning, or the way that 426 00:44:42,470 --> 00:44:50,990 intelligence work was done, or the way that we understood the human terrain. Information was going to be what the army delivered. 427 00:44:51,440 --> 00:44:56,390 Right. The revolution in military affairs: it was an information revolution. 428 00:44:58,370 --> 00:45:03,190 Citizenship, as it is, determines the political; you know, what is important. 429 00:45:03,200 --> 00:45:06,319 You know, it sustains the political. We know who we are. 430 00:45:06,320 --> 00:45:07,580 We know what is sacred. 431 00:45:07,700 --> 00:45:16,939 Then there's this notion of decision-making, and we can think about decision-making much in the way that 432 00:45:16,940 --> 00:45:24,140 Hew Strachan talks about strategy: in the sense of making sense of the political, you know, of what is to be done in some way. 433 00:45:24,830 --> 00:45:30,050 So boundless warfare challenges a political theory of warfare, and thus Clausewitz's ideas, 434 00:45:30,890 --> 00:45:36,410 in terms of trying to posit something like a technological theory of war. 435 00:45:37,760 --> 00:45:43,160 Now, what if we think about warfare not as a political behaviour but as a technological behaviour? 436 00:45:43,790 --> 00:45:47,570 Where does this leave the human? Can we transpose 437 00:45:47,720 --> 00:45:51,860 the political human with the technological human? 438 00:45:53,360 --> 00:45:59,690 What do we even mean by this? Because it may be that politics is technology and technology is politics, but what do we mean by this? 439 00:46:00,110 --> 00:46:08,360 We don't mean cyborgs or robots, you know, this idea that we take a 100% human and we add a bit of kit to them and they become a cyborg, in a way. 440 00:46:08,360 --> 00:46:12,680 We mean something else. Let's go back to Ruth Miller. 441 00:46:13,700 --> 00:46:17,450 She charts how English metaphors began to change from the 19th to the 20th century. 442 00:46:17,450 --> 00:46:25,790 And I have to say, if you've read Thomas Rid's book on the history of cybernetics, you will have the same ideas. 443 00:46:26,420 --> 00:46:30,680 So she indicates how, before, we had biological metaphors for human behaviour. 444 00:46:30,680 --> 00:46:43,400 And if you read Renaissance and Enlightenment philosophy, you'll see many more biological references in the way that we as societies were said to work. 445 00:46:45,230 --> 00:46:50,870 But she says that as we go more and more towards the middle of the 20th century, our language begins to change. 446 00:46:51,320 --> 00:46:57,110 And more and more, it starts to be shaped not by biological systems but by information systems. 447 00:46:57,860 --> 00:47:07,280 So the organism, the human in our case, was no longer being understood as a single point or single agent, but instead was being understood as a message. 448 00:47:08,240 --> 00:47:12,290 So the way that we thought about town planning, the way that we thought about public administration, 449 00:47:12,440 --> 00:47:23,839 the way that we thought about doing big policy, grand policy, became much more about obscuring the human and recasting the human as communication. 450 00:47:23,840 --> 00:47:28,640 In other words, human value becomes increasingly roped into the system of communication.
451 00:47:29,660 --> 00:47:34,129 It becomes something different; it becomes nothing biological at all. 452 00:47:34,130 --> 00:47:41,660 It becomes an information system. Others have talked about this in terms of the impact that networks had on a wide range of discourses. 453 00:47:41,840 --> 00:47:47,870 Again, Thomas Rid's book is a great example: if anything, information theory allowed us to think about how the system works together, 454 00:47:48,080 --> 00:47:52,520 about how you can do big policy, and these things come together. 455 00:47:52,880 --> 00:47:58,760 And we're talking about policy on a scale we've never had to before. With this increase of information, 456 00:47:59,960 --> 00:48:05,660 with the way that we think about millions and millions of people living in London or somewhere like that, 457 00:48:05,660 --> 00:48:11,090 all of a sudden the way that we do things just fundamentally has to change. 458 00:48:11,090 --> 00:48:20,570 And it changes as information grows. A great example of that is César Hidalgo's book Why Information Grows. 459 00:48:21,830 --> 00:48:27,830 In fact, Ruth Miller illustrates how they continued to view information theory through a subjective humanist perspective. 460 00:48:28,340 --> 00:48:35,210 In many ways it was about an information system, about the cybernetic system, the networked system. 461 00:48:35,570 --> 00:48:43,550 But she argues that the human was increasingly part of that, in the way that we can talk about the battlefield as a space 462 00:48:43,820 --> 00:48:50,300 but still have these individual reference points, which are the soldiers within it, in some way or another. 463 00:48:51,140 --> 00:48:56,540 As much as we would like to talk about it as being networked, the argument is 464 00:48:56,540 --> 00:49:02,030 that by and large we still talk about network-enabled or network-centric warfare. 465 00:49:02,420 --> 00:49:05,180 The human is still a fundamental part of that in some way or another, 466 00:49:05,180 --> 00:49:11,210 which is one of the big critiques of network-centric warfare. 467 00:49:12,390 --> 00:49:19,080 So what does this mean for the human, and what does this mean in terms of the political frameworks? 468 00:49:19,860 --> 00:49:26,430 She argues they were transforming the machine into a human, if a digitised human, and the physical network into a communications network. 469 00:49:26,820 --> 00:49:29,730 They were obliterating the machine rhetorically, 470 00:49:29,940 --> 00:49:34,739 just as the machine was taking on a greater and greater role in the city or on the battlefield. 471 00:49:34,740 --> 00:49:42,840 They were essentially re-positing the human in it, even as we understood that, 472 00:49:42,840 --> 00:49:49,710 as we went from lines and lines of men with guns facing one another and converging, 473 00:49:50,070 --> 00:49:57,300 and as more and more the human, in a sense, came behind what was fundamentally a tank battle or an aerial battle, 474 00:49:57,720 --> 00:50:01,560 we had big machinery, in a sense, change the dynamics of war itself.
475 00:50:02,220 --> 00:50:08,910 But the way that we thought about war and the way that we thought about politics became much more about the human. 476 00:50:09,150 --> 00:50:13,650 She has this very interesting notion of traffic, 477 00:50:14,400 --> 00:50:19,890 of the tradition before traffic as a word was applied to cars, before we had cars. 478 00:50:20,700 --> 00:50:24,030 The word traffic used to mean people: 479 00:50:24,630 --> 00:50:32,190 there's too much traffic on this pavement. Then, once we had cars, it started to be applied to cars. 480 00:50:32,460 --> 00:50:35,430 And the reason it started being applied to cars is that you had too much traffic, 481 00:50:35,670 --> 00:50:41,790 so you needed to move the human traffic out to the sides, onto the pavements, off the streets and the roads, and let cars drive on them. 482 00:50:42,240 --> 00:50:46,890 But then, all of a sudden, cars started doing the same thing that humans used to do: 483 00:50:47,070 --> 00:50:51,860 they found it difficult to use the road network. And the argument there, 484 00:50:51,990 --> 00:51:05,400 this notion, is that what used to be a human category, traffic, became applied to the machines themselves, to the network itself, in a way. 485 00:51:06,120 --> 00:51:09,210 But fundamentally, we still understand traffic as something affecting us. 486 00:51:09,570 --> 00:51:15,420 It's a human agent: people get in their cars and they decide to drive, in some way or another. 487 00:51:15,780 --> 00:51:19,200 Forget about the car; the car is neither here nor there. It's tied to the road. 488 00:51:19,410 --> 00:51:23,969 Forget about the road; the idea is that people decided to get in the car and drive on that road. 489 00:51:23,970 --> 00:51:31,140 And she's arguing: wait a second, the car and the road fundamentally change your options and reconstitute 490 00:51:31,350 --> 00:51:38,820 reality for you in some way or another. Maybe that says something about Laurie Calhoun's argument that we kill because we can. 491 00:51:39,510 --> 00:51:46,320 If we have the technology, it repurposes our reality in some way; it resets our reality. 492 00:51:48,120 --> 00:51:52,649 So, I realise I'm going over time, so I'm going to try to move forward. 493 00:51:52,650 --> 00:51:56,190 Okay, let's talk about known knowns. 494 00:51:57,570 --> 00:52:02,340 Remember the known knowns, the known unknowns and the unknown unknowns. Okay: known knowns. 495 00:52:02,790 --> 00:52:05,880 We live in a world where networks continually take over our world. 496 00:52:06,990 --> 00:52:12,570 They are seen as being applicable to everything, the Internet of Things and so on. 497 00:52:13,590 --> 00:52:21,629 Does anybody have an Alexa yet? I mean, as an example of that, hopefully all of you know Amazon's Alexa, 498 00:52:21,630 --> 00:52:27,600 what I'm referring to; it would actually be great if most of you didn't. But anyway, networks 499 00:52:27,600 --> 00:52:31,469 are seen as being absolutely about everything: not just information technology networks, 500 00:52:31,470 --> 00:52:37,620 but road networks, as we've discussed, and also community networks, body networks, 501 00:52:37,620 --> 00:52:43,140 everything that measures society. Everything is being, in a sense, talked about in terms of networks.
502 00:52:45,010 --> 00:52:49,360 Complexity is key to our understanding of emergent warfare, and the way that we talk 503 00:52:49,360 --> 00:52:55,419 about data science, and how that fits into a notion of complexity and nonlinear 504 00:52:55,420 --> 00:53:00,040 causality, is the way that we talk about everything from the impact of pharmaceuticals 505 00:53:00,040 --> 00:53:04,270 to the way that we get smart cities working, and so on. 506 00:53:04,930 --> 00:53:08,890 We talk about them through a networked discourse. 507 00:53:09,640 --> 00:53:16,330 The applied technologies that make networks possible are fibre optics, mobile batteries, processor size and speed, 508 00:53:16,690 --> 00:53:24,070 low-orbit satellites and, fundamentally, us: we're a fundamental part of that network. 509 00:53:24,820 --> 00:53:28,780 We have become mechanised ourselves in the way that we seek to network our tasks. 510 00:53:29,200 --> 00:53:38,950 We tap in. We add on. We hack. If you interview small children who play on iPads or Xboxes or something like that, 511 00:53:39,460 --> 00:53:44,110 you realise that they call everything a glitch. The bus hasn't got here yet? 512 00:53:44,200 --> 00:53:54,940 There's a glitch. The bus is not here yet. It's a very interesting transfer of what is an IT language into, well, material reality. 513 00:53:55,590 --> 00:53:59,050 It's a glitch that I'm running slightly over time. 514 00:54:00,700 --> 00:54:06,000 So what are the known unknowns? We know that, in a sense, a lot of things are changing; 515 00:54:06,100 --> 00:54:12,490 the impact of networks is being felt in the way that we do normal things. 516 00:54:14,350 --> 00:54:21,070 And Moore's law tells us that the number of transistors on a dense integrated circuit will double roughly every two years. 517 00:54:21,370 --> 00:54:23,649 It seemed like Moore's law was going to come to an end, 518 00:54:23,650 --> 00:54:30,970 and all of a sudden we now have new materials, material sciences, that allow us to keep Moore's law going even further. 519 00:54:31,780 --> 00:54:38,230 History has shown that Moore's law is apt at anticipating processing power. 520 00:54:38,620 --> 00:54:42,970 But what is particularly interesting is that even though we realise that processing power is important, 521 00:54:43,270 --> 00:54:50,140 we still can't form a vision of what that processing power is going to mean for the future of our technology in any way. 522 00:54:51,040 --> 00:54:55,600 We can't say: let's talk about Moore's law as it exists in 20 years' time. 523 00:54:56,710 --> 00:55:00,580 What is the technology? What are we going to be able to do with that kind of processing power? 524 00:55:01,900 --> 00:55:06,040 The ability for us to forecast in that way is what's difficult. 525 00:55:08,420 --> 00:55:12,300 Christopher Coker illustrates this in his book Future War. 526 00:55:12,320 --> 00:55:16,910 He talks about science fiction as a rich source of how we might use technology in the 527 00:55:16,910 --> 00:55:22,460 future, and about how science fiction tells us where this type of processing power is going to be applied.
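[A minimal sketch of the doubling arithmetic behind Moore's law as just described; the baseline transistor count is an assumed, illustrative figure, not a quoted one.]

```python
# Moore's law as rough arithmetic: transistor counts double about every two years.
# The baseline of ~1e10 transistors is an illustrative assumption.

def projected_transistors(base_count: float, years_ahead: float,
                          doubling_period: float = 2.0) -> float:
    """Project a transistor count forward, one doubling per period."""
    return base_count * 2 ** (years_ahead / doubling_period)

today = 1e10  # assumed present-day count for a dense integrated circuit
for years in (2, 10, 20):
    print(f"{years:>2} years out: ~{projected_transistors(today, years):.1e} transistors")

# Twenty years at a two-year doubling is 2**10, roughly a 1000x increase.
# The speaker's point stands apart from the arithmetic: we can compute the scale,
# but we cannot foresee what such processing power will mean for future technology.
```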
528 00:55:22,820 --> 00:55:30,890 Christopher Coker is telling you to go and look at science fiction, where processor speed is going to allow greater artificial intelligence, 529 00:55:31,250 --> 00:55:35,770 whether it's like us or whether it's like a robot hoover, 530 00:55:36,590 --> 00:55:41,660 which in turn relies on autonomous decision-making and/or algorithmic decision-making. 531 00:55:42,290 --> 00:55:52,010 Changes in genetics, material sciences and macro-engineering also open up a world of distinctly bespoke, tailored, altered applications. 532 00:55:52,880 --> 00:55:57,050 Gene therapy is one thing, but gene warfare is quite another. 533 00:55:58,850 --> 00:56:02,300 It feels as if we are at the birth of a new kind of molecular science, 534 00:56:02,330 --> 00:56:09,680 sciences that will allow us to interact with the physical world in a totally different way. 535 00:56:10,760 --> 00:56:16,819 Any of you who have experienced VR will think: oh, this is amazing, virtual reality, 536 00:56:16,820 --> 00:56:23,200 an amazing way to experience things that we never thought we could experience before, 537 00:56:23,210 --> 00:56:29,030 the way it sucks you into it in some way or another. 538 00:56:29,330 --> 00:56:31,730 I'm veering off towards The Matrix, so I'll leave that. 539 00:56:32,630 --> 00:56:40,080 Let's talk about the unknown unknowns, where we are at a loss to anticipate those technologies whose science has yet to be discovered or engineered. 540 00:56:40,100 --> 00:56:47,180 We know that techno-scientific changes have a tendency to support existing networks and, fundamentally, eventually to overcome them. 541 00:56:48,350 --> 00:56:53,569 So we understand that on the roads the human traffic was a problem, 542 00:56:53,570 --> 00:56:58,010 so we pushed it out to the sides to let cars go faster; then cars became a problem, 543 00:56:58,250 --> 00:57:02,959 so we started looking at alternatives to the road network. And so on: 544 00:57:02,960 --> 00:57:08,060 what we do is we establish networks and then we try to transgress networks, 545 00:57:08,060 --> 00:57:12,020 and then those networks get transgressed, and then those networks get transgressed again. 546 00:57:12,290 --> 00:57:15,740 Maybe that's a function of human nature in some way or another. 547 00:57:16,540 --> 00:57:20,420 Drones are a typical example of that, as you know: 548 00:57:20,570 --> 00:57:24,050 how do we defeat space and time in a way that makes a difference? 549 00:57:24,710 --> 00:57:28,760 Finally, let's go to something that Žižek refers to as unknown knowns. 550 00:57:30,290 --> 00:57:36,350 We know that there are lots of things happening out there, quite amazing things happening out there. 551 00:57:36,650 --> 00:57:41,150 We know that the Chinese military are doing some quite amazing things, and sometimes 552 00:57:41,150 --> 00:57:44,720 we even know that they refer to them as something like the assassin's mace. 553 00:57:45,670 --> 00:57:52,160 We have ideas of what we think the assassin's mace is, but we're not 100% sure. 554 00:57:53,390 --> 00:57:58,940 So we know that it exists. We know that over 4 million people in the United States have security clearance. 555 00:58:00,410 --> 00:58:04,700 That's a lot of people in the United States who know things.
556 00:58:05,360 --> 00:58:10,160 And sometimes, as we've seen, that comes back to haunt us, with Edward Snowden and so on. 557 00:58:10,550 --> 00:58:14,240 Right. So this has an impact on the way that we talk about it now. 558 00:58:14,600 --> 00:58:17,120 I'm not going to go into all of this, but what's particularly interesting is 559 00:58:17,120 --> 00:58:25,640 that Ruth Miller has all these things to say about what we talk about in terms of vitality, liberty and mobility. 560 00:58:26,300 --> 00:58:30,830 We're giving things life, vitality, that never had life before. 561 00:58:31,340 --> 00:58:39,620 In many ways that may be artificial intelligence, but maybe in some cases it's shaking a bottle of fizzy water: 562 00:58:40,370 --> 00:58:43,580 you're adding information to that bottle. 563 00:58:44,390 --> 00:58:53,990 You can imagine it being read in different ways, written in different ways. And then mobility, this ability to go beyond our boundaries. 564 00:58:53,990 --> 00:59:03,500 Fundamentally, what's important in mobility is not the ability to get there, to cross boundaries, to move fast and so on. 565 00:59:03,500 --> 00:59:13,190 More importantly, what's really fundamental is the ability to stay there: permanence. 566 00:59:13,310 --> 00:59:21,590 Stuxnet was fantastic not because it zoomed out of Israel and the United States and so on and hit something. 567 00:59:21,980 --> 00:59:25,490 Stuxnet was important because it just sat down. 568 00:59:26,860 --> 00:59:36,040 You just had Stuxnet on a USB stick, and it sat there for ages, and we were just waiting for someone to plug in that USB stick with Stuxnet on it. 569 00:59:36,430 --> 00:59:43,240 Stuxnet was important for shutting down those Iranian centrifuges 570 00:59:44,170 --> 00:59:53,740 not because it was being shot at the centrifuges in some kinetic way, but because it had the ability to just stay there, to hang out. 571 00:59:54,130 --> 00:59:56,590 And of course, that's what drone operators say. What's important is 572 00:59:56,620 --> 01:00:01,479 that you can sit there for days and watch someone's behaviour and be able to really 573 01:00:01,480 --> 01:00:06,670 verify the things that you've never been able to verify in the same way before. 574 01:00:10,260 --> 01:00:18,030 Okay. So what does this mean for our questions? Well, the role of the state is interesting, because 575 01:00:18,240 --> 01:00:21,360 something that's essentially bounded, 576 01:00:22,590 --> 01:00:28,200 something like the state, is encountering something that's increasingly unbounded. 577 01:00:28,860 --> 01:00:33,020 So we could think about something like the power of production: it's being distributed. 578 01:00:33,660 --> 01:00:39,149 The capability of additive manufacturing, or 3D printing, is quite amazing, 579 01:00:39,150 --> 01:00:42,590 and how that is impacting different companies is quite amazing. 580 01:00:42,600 --> 01:00:47,459 Lockheed Martin says: no, we're not going to make things any more, we're going to print things. 581 01:00:47,460 --> 01:00:50,760 Actually, you're going to buy the designs from us and you're going to print them yourselves.
582 01:00:51,390 --> 01:00:56,520 We don't want to be in the material-making business any longer. And here I've brought some examples. 583 01:00:57,420 --> 01:01:03,569 That's the way that value chains are being rethought as totally different business models, because companies realise 584 01:01:03,570 --> 01:01:10,230 that the power of production is becoming greater while at the same time information is becoming cheaper. 585 01:01:12,860 --> 01:01:14,429 And that's interesting. 586 01:01:14,430 --> 01:01:20,300 Many of you will have seen The Matrix films, and you remember that one of the characters doesn't know how to fly a helicopter. 587 01:01:20,310 --> 01:01:25,620 So she asks: could you please download how to fly this helicopter? I think it's in The Matrix. 588 01:01:25,620 --> 01:01:30,609 Can you please download it? And then the download buffers; the speed in the Matrix 589 01:01:30,610 --> 01:01:36,120 is slow, but it's almost there, and finally it downloads, and now she knows how to fly 590 01:01:36,580 --> 01:01:41,250 this helicopter. 591 01:01:41,970 --> 01:01:45,300 That's quite amazing. Same thing in Rogue One. You saw that? 592 01:01:45,480 --> 01:01:48,780 The robot, I mean, how does he know where to go? 593 01:01:48,810 --> 01:01:55,620 He goes in and he plugs in and he downloads it, and away he goes. So information is becoming increasingly cheaper as it expands. 594 01:01:57,810 --> 01:02:00,780 Those two things have a big impact on the way that we talk about, in a sense, 595 01:02:00,780 --> 01:02:05,850 the power of the state to be able to make decisions about research and development, 596 01:02:06,120 --> 01:02:10,840 about the way that economies grow or don't grow, and things like this. 597 01:02:10,870 --> 01:02:15,540 So in many ways maybe that's talking about globalisation from a different perspective. 598 01:02:15,540 --> 01:02:18,990 It's not about international trade agreements, but about this 599 01:02:19,410 --> 01:02:24,420 notion of information and production power coming from a different direction. 600 01:02:24,690 --> 01:02:30,600 Now, the reason I mention that is because it has a big impact on the way that we talk about power and security, or arms control. 601 01:02:31,620 --> 01:02:39,540 The way that we talk about cyber, the way that we talk about biochemical weapons, the way that we talk about robotics is being fundamentally challenged. 602 01:02:41,150 --> 01:02:45,210 And then finally, there's this relationship between security and defence: 603 01:02:45,390 --> 01:02:51,180 what role do militaries have within this? Militaries complain that the convergence of security and defence makes their job 604 01:02:51,180 --> 01:02:57,930 more difficult, because it thins out what they can do well with issues that are shared by other agencies and departments. 605 01:02:58,380 --> 01:03:08,310 Not only that, but it forces them to take on logics that don't put them at the centre, and that's particularly difficult for them. 606 01:03:08,760 --> 01:03:16,919 So, fundamentally, it poses a problem, as I've suggested here, 607 01:03:16,920 --> 01:03:21,959 for all of us, in terms of coming to terms with what technology means for society and, 608 01:03:21,960 --> 01:03:25,750 fundamentally, what it means for the future of war in the way that I've explained it. 609 01:03:25,770 --> 01:03:26,520 Thank you very much.