1 00:00:04,690 --> 00:00:10,870 Hello, everyone, and welcome to the latest and the last in this term of our conversations that have been arranged 2 00:00:10,870 --> 00:00:16,150 by the Oxford Martin School at Oxford University on the theme of building back better. 3 00:00:16,150 --> 00:00:21,440 So some of the challenges ahead as, hopefully sooner rather than later, we come out of 4 00:00:21,440 --> 00:00:30,970 the pandemic. Before introducing today's two speakers, or conversation participants, 5 00:00:30,970 --> 00:00:32,680 let me welcome the audience. 6 00:00:32,680 --> 00:00:41,000 Let me point out that on the bottom right of your screens, you'll see something that says "ask a question". Please do ask a question. 7 00:00:41,000 --> 00:00:46,570 What we'll do is we'll have a conversation for about 40 minutes and then have 20 minutes of questions at the end. 8 00:00:46,570 --> 00:00:50,440 And not only can you ask a question, you can also vote on the questions. 9 00:00:50,440 --> 00:00:55,840 So if someone else has asked a question you're particularly keen for me to put, then please do vote. 10 00:00:55,840 --> 00:01:03,610 That makes my job very much easier. We do record these conversations, 11 00:01:03,610 --> 00:01:08,110 so if you ask a question, we may mention your name. 12 00:01:08,110 --> 00:01:15,850 So this afternoon we're going to be talking about cyber security, and particularly the issues around new and emerging technologies. 13 00:01:15,850 --> 00:01:20,620 And I'm delighted to welcome two people who are going to explore this topic. 14 00:01:20,620 --> 00:01:30,130 The first is Professor Sadie Creese. Sadie is Professor of Cybersecurity in the Department of Computer Science here in Oxford. 15 00:01:30,130 --> 00:01:39,130 Sadie has worked with the Martin School for a long time, in particular in our Global Cyber Security Capacity Centre, and Sadie has done many things. 16 00:01:39,130 --> 00:01:45,310 She's worked in industry at QinetiQ.
She was at the University of Warwick before moving to Oxford. 17 00:01:45,310 --> 00:01:51,280 And with Sadie is Jamie Saunders. Jamie is a strategic security consultant. 18 00:01:51,280 --> 00:02:02,410 He's been involved in the cyber studies programme at the Martin School, is an Oxford Martin Fellow and is also a visiting professor at UCL, and has worked 19 00:02:02,410 --> 00:02:09,790 with many parts of UK government over the years, again around cyber security. 20 00:02:09,790 --> 00:02:18,250 So welcome, both of you. One of the things we're going to talk about is a report that you both, with colleagues, have authored with 21 00:02:18,250 --> 00:02:27,340 the World Economic Forum on the challenges to cyber security that new and emerging technologies are raising. 22 00:02:27,340 --> 00:02:34,210 But before we do that, perhaps we could just talk a little bit about cyber security in general. 23 00:02:34,210 --> 00:02:42,760 And I guess when I think of cyber security, I think of my own personal cyber security, whether my computer can get hacked. 24 00:02:42,760 --> 00:02:50,020 I worry about the companies that I deal with, so cyber security in the commercial sphere. 25 00:02:50,020 --> 00:02:51,400 And then I worry about the government, 26 00:02:51,400 --> 00:03:04,150 so the degree to which government functions, and all the things that we rely on for the country to run, may be susceptible to cyber attack. 27 00:03:04,150 --> 00:03:08,720 Sadie, I come to you first: where are we with cyber security at the moment? 28 00:03:08,720 --> 00:03:16,720 What are the new and emerging threats? How sophisticated are we with our countermeasures? 29 00:03:16,720 --> 00:03:22,240 Yes, well, of course, one size doesn't fit all. 30 00:03:22,240 --> 00:03:28,510 If I was to reflect upon where we are in the United Kingdom, we're very lucky. 31 00:03:28,510 --> 00:03:31,030 We have quite a capacity.
32 00:03:31,030 --> 00:03:42,240 We have a great workforce, people that we can call upon, in terms of how we deliver it and how we think about the big picture challenges. 33 00:03:42,240 --> 00:03:47,960 Well, the first thing to say is that cybersecurity is all about really accepting insecurity. 34 00:03:47,960 --> 00:03:54,910 There's no such thing as 100 percent security. It's really about understanding risk and managing it in such a way that you're exposed 35 00:03:54,910 --> 00:03:58,810 to enough risk to do what you want to do and take the opportunities you want to take, 36 00:03:58,810 --> 00:04:02,590 whether that's in your personal life or in your business life. 37 00:04:02,590 --> 00:04:11,170 In terms of where we are and how it's evolved historically, the way in which cyber assets are exposed has changed. 38 00:04:11,170 --> 00:04:20,410 So you can imagine 20 years ago we might have been concerned with protecting hardware, computers and some data or some programmes on them. 39 00:04:20,410 --> 00:04:24,940 And then that fast evolved as our usage of cyberspace evolved too. 40 00:04:24,940 --> 00:04:29,570 Now we worry about protecting our health, because our health care depends so much on it. 41 00:04:29,570 --> 00:04:37,690 We worry about our lights staying on, because all of our energy transmission systems are dependent on cyberspace. 42 00:04:37,690 --> 00:04:44,470 We worry about accessing money and food, and this is what we call critical infrastructure, in actual fact. 43 00:04:44,470 --> 00:04:49,990 And that's true around the world, pretty much, depending on how you use cyberspace. 44 00:04:49,990 --> 00:05:00,400 And I guess it wouldn't be lost on people that in recent years we began worrying about protecting democracies, and freedom from influence to choose your 45 00:05:00,400 --> 00:05:13,480 destinies, because of the way in which people started influencing and communicating and operating online to really change the populace's thinking.
46 00:05:13,480 --> 00:05:15,470 So it really evolves with how we use it. 47 00:05:15,470 --> 00:05:22,810 So as humanity is going online, that places new requirements in terms of what we want to do in terms of cybersecurity. 48 00:05:22,810 --> 00:05:29,080 We think about that in terms of protecting the availability of stuff, making sure that only the people that should see the stuff, 49 00:05:29,080 --> 00:05:33,970 the authorised people, do see the stuff, and making sure that it's free from sabotage, 50 00:05:33,970 --> 00:05:37,630 that it has integrity. And that gets interpreted in many different ways, 51 00:05:37,630 --> 00:05:45,310 depending upon the kinds of assets and services that you really require to protect. 52 00:05:45,310 --> 00:05:54,460 And as you can imagine, the job is really big. And understanding where you put your limited resource is what we mean by being risk-based. 53 00:05:54,460 --> 00:05:57,850 Let me say something here. Yeah, I mean, 54 00:05:57,850 --> 00:06:07,090 I'd just throw out something to add to what Sadie said, because I think there's been a change of emphasis around cybersecurity, certainly in the last 10 years. 55 00:06:07,090 --> 00:06:14,810 In the corporate world in particular, a lot of the focus used to be on confidentiality, privacy of information. 56 00:06:14,810 --> 00:06:20,290 I think for individuals, privacy of information is kind of what we thought about. What's actually changed, 57 00:06:20,290 --> 00:06:24,580 and I think the pandemic has reinforced this, is the dependence that we have on 58 00:06:24,580 --> 00:06:29,980 IT for an extraordinarily wide range of day-to-day activities. 59 00:06:29,980 --> 00:06:38,500 So the concern is now much more about our fundamental resilience and the threat of disruption, loss of critical services, et cetera.
60 00:06:38,500 --> 00:06:44,860 And then to Sadie's earlier point, I think we're now just beginning to think, well, hang on a minute, 61 00:06:44,860 --> 00:06:50,350 can we trust the information that is being served up to us through the digital environment? 62 00:06:50,350 --> 00:06:54,610 So issues of fake news, issues of the integrity of information, 63 00:06:54,610 --> 00:06:59,590 I think are getting greater prominence, and businesses are just beginning to think about: 64 00:06:59,590 --> 00:07:10,220 what does that mean for us? And I guess as that attack surface evolves, what we've also seen is an evolution of the threat ecosystem. 65 00:07:10,220 --> 00:07:17,230 So those that would attack us in cyberspace, that's evolved over recent years too. I just 66 00:07:17,230 --> 00:07:22,090 remember it was about 20 years ago, when I was working on the business side of this, 67 00:07:22,090 --> 00:07:31,480 and we used to see large ransomware attacks even then on very large organisations, who would pay large ransoms. 68 00:07:31,480 --> 00:07:39,190 And of course, ransomware attacks have seen a massive surge in the last couple of years in the press. 69 00:07:39,190 --> 00:07:40,480 But they've always been there. 70 00:07:40,480 --> 00:07:51,050 And so I should say a ransomware attack is essentially where somebody hacks into somebody's systems and essentially holds their data to ransom. 71 00:07:51,050 --> 00:07:55,180 They do that by encrypting it and then selling back the decryption key. 72 00:07:55,180 --> 00:08:05,110 That's the kind of traditional ransomware attack, and many organisations find themselves in the position where they do pay the ransom to get the key. 73 00:08:05,110 --> 00:08:09,200 Sometimes they have the benefit of not needing to, for whatever reason. 74 00:08:09,200 --> 00:08:12,760 But yeah, so we've seen an evolution of the threat ecosystem.
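[Editor's illustration] The traditional ransomware mechanic Sadie describes, encrypt the victim's data in place, then sell back the key, can be sketched in a few lines. This is a deliberately toy example using repeating-key XOR rather than a real cipher (actual ransomware uses strong encryption such as AES), and every name and value here is hypothetical.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Repeating-key XOR: applying the same key a second time restores the original.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The attacker holds the only copy of the key...
key = secrets.token_bytes(32)

victim_data = b"Q3 payroll records"          # hypothetical file contents
ciphertext = xor_bytes(victim_data, key)     # the data is now unreadable in place
assert ciphertext != victim_data

# ...and only hands it over once the ransom is paid,
# letting the victim decrypt and recover the data.
recovered = xor_bytes(ciphertext, key)
assert recovered == victim_data
```

With a real cipher in place of the toy XOR, the scrambled data is effectively unrecoverable without the attacker-held key, which is what gives the ransom its leverage.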
75 00:08:12,760 --> 00:08:23,980 That's quite important, because what happens is, as the weaponry used to conduct attacks proliferates in the global supply chain, 76 00:08:23,980 --> 00:08:28,420 then that becomes cheaper and it gets de-skilled, so that more people can use it. 77 00:08:28,420 --> 00:08:36,010 What we see is a filtering out of attack weaponry, if you like, tools that enable many more people to use them. 78 00:08:36,010 --> 00:08:40,920 So the barrier of entry to conduct a cyber attack has come down. 79 00:08:40,920 --> 00:08:48,820 We have also seen, of course, more and more people generating the skills and knowledge to be able to attack in cyberspace. 80 00:08:48,820 --> 00:08:58,720 And we've seen, by virtue of the Internet and dark websites, ways of maintaining a black market of supply chain activity. 81 00:08:58,720 --> 00:09:06,730 So now what you have is a whole number of different people that can work together to conduct a cyber attack, and people can purchase their skills. 82 00:09:06,730 --> 00:09:10,960 So it's a fairly evolved ecosystem, and things are changing. 83 00:09:10,960 --> 00:09:15,100 We've also seen a significant movement of organised crime into cyberspace. 84 00:09:15,100 --> 00:09:25,030 So old crimes, conducted against cyber assets or against other assets but using cyberspace, because they can monetise it so easily; 85 00:09:25,030 --> 00:09:29,990 that's attracted a much greater influx of threat actors. 86 00:09:29,990 --> 00:09:38,260 So that's changed, too. And consequently, what's happening is our attack surface is growing, because we are using more and more 87 00:09:38,260 --> 00:09:44,710 new technology, and that unfortunately often comes with vulnerability.
88 00:09:44,710 --> 00:09:54,400 And the interest of threat actors in attacking us is growing, because they can make the gains that they want to make, and they're actually scaling up, 89 00:09:54,400 --> 00:09:59,770 too, which is kind of creating a perfect storm, in a way. 90 00:09:59,770 --> 00:10:07,250 Can I just ask, when you say the attack surface, as we use more and more different types of devices that are connected to the Internet, 91 00:10:07,250 --> 00:10:14,290 do you mean that there are more ways that a hostile actor can attack us? Usually. 92 00:10:14,290 --> 00:10:23,740 So usually new tech will come with inherent vulnerabilities, not necessarily designed in on purpose, 93 00:10:23,740 --> 00:10:28,180 it's just sometimes the way in which technology is developed. 94 00:10:28,180 --> 00:10:34,000 If somebody were to send something against it that was unexpected, it doesn't always deal with that safely. 95 00:10:34,000 --> 00:10:40,720 And sometimes it can be tricked into running someone else's programmes, and that enables them to take control of boxes. 96 00:10:40,720 --> 00:10:47,020 But it isn't just about technology. It's also about the way in which we glue it together and in which we orchestrate it. 97 00:10:47,020 --> 00:10:54,880 And there's something about the way in which we incorporate these new technologies into our lives that can open up new attack surfaces. 98 00:10:54,880 --> 00:10:58,780 And sometimes it's in us, in the way we use things as human beings. 99 00:10:58,780 --> 00:11:04,990 In our cybersecurity world, we see this system as having the humans inside it. 100 00:11:04,990 --> 00:11:10,000 They're not just on the outside watching passively; they're actually taking part. 101 00:11:10,000 --> 00:11:16,720 So we're part of that system, and sometimes, unintentionally, we are the weakness. 102 00:11:16,720 --> 00:11:26,380 Other times, of course, that's with intent.
And if you were to classify the hostile agents, or the motivation behind the hostile agents, 103 00:11:26,380 --> 00:11:34,330 then some of the groups that do ransomware and things like that, some are out there just to extort money. 104 00:11:34,330 --> 00:11:39,760 Others make money by stealing secrets, presumably. And then you have others. 105 00:11:39,760 --> 00:11:43,030 And you mentioned, and I hadn't really realised that it came within cybersecurity, 106 00:11:43,030 --> 00:11:49,960 the systematic misinformation on social media that one gets that tries to influence a political process. 107 00:11:49,960 --> 00:11:57,590 Is there a sort of typology of the different types of threats out there in cybersecurity? 108 00:11:57,590 --> 00:12:08,560 There is, and that's by actor. And they all have kind of strengths and weaknesses. 109 00:12:08,560 --> 00:12:12,460 I think the important thing is to think very broadly about this. 110 00:12:12,460 --> 00:12:14,260 I mean, ultimately, 111 00:12:14,260 --> 00:12:24,580 cyber security is about the risks that are inherent in the dependence that we have on IT. And some of that is that someone might hack our data. 112 00:12:24,580 --> 00:12:32,200 But some of that is that, you know, there could be misinformation. In terms of the sorts of actors there are, 113 00:12:32,200 --> 00:12:40,660 there's a lot of conversation about cyber criminals, and they are criminals who will make money by whatever means they can. 114 00:12:40,660 --> 00:12:50,770 And cyber crime is a particularly attractive activity because the rewards are high and the risk of getting caught and punished is low. 115 00:12:50,770 --> 00:12:58,850 So when it comes to cyber crime, really the game that we're in is how can we increase the cost to them.
116 00:12:58,850 --> 00:13:05,540 And that's very difficult to do, especially when they are operating globally 117 00:13:05,540 --> 00:13:14,210 but we are often tied to our own nation state. When it comes to nation state threats, 118 00:13:14,210 --> 00:13:22,550 again, I mean, any advanced country worth its salt has made some kind of investment in cyber. 119 00:13:22,550 --> 00:13:29,790 And the real issue is what do we mean by a hostile state actor, and what do we think 120 00:13:29,790 --> 00:13:36,500 are acceptable normative behaviours in terms of what a state should and shouldn't do, 121 00:13:36,500 --> 00:13:48,740 in terms of how it uses that level of power. We often talk also about hacktivists, and teenagers just having a laugh, and so on. 122 00:13:48,740 --> 00:13:55,580 The other thing I think it's important to bear in mind: just as we need a broad definition of cyber security, 123 00:13:55,580 --> 00:14:05,690 we also need a broad definition of cyber crime, because any crime which is exploiting the characteristics of cyberspace is of concern. 124 00:14:05,690 --> 00:14:11,570 And that's as much about child abuse, incitement to violence, radicalisation. 125 00:14:11,570 --> 00:14:17,210 Most social woes have some kind of digital component, 126 00:14:17,210 --> 00:14:22,910 and we need to think about all of that when we think about cyber crime and cyber security. 127 00:14:22,910 --> 00:14:31,640 I suspect many of the audience will have been reading in the last couple of weeks about the Microsoft hack, and maybe, going a little bit further back, 128 00:14:31,640 --> 00:14:42,750 the SolarWinds hack, which struck me as particularly insidious in that they have malware lurking on systems for a long time. 129 00:14:42,750 --> 00:14:49,740 So is this a new type of threat, or is it something that's been around a long time and just more visible in the last six months?
130 00:14:49,740 --> 00:14:55,010 Yeah, it's been around a number of years now. 131 00:14:55,010 --> 00:15:04,220 We refer to anything that persists on a system for some time as, generally speaking, an advanced persistent threat. 132 00:15:04,220 --> 00:15:08,000 We've seen examples of attacks like that for years. 133 00:15:08,000 --> 00:15:18,750 I believe the first one that really made it into the public eye was the Stuxnet attack, which was on a uranium enrichment facility. 134 00:15:18,750 --> 00:15:28,340 And they are insidious. And actually, one of the most worrying characteristics is not simply what we know to have happened, 135 00:15:28,340 --> 00:15:36,620 whether you're talking about the Exchange server attacks, or whether you're talking about getting 136 00:15:36,620 --> 00:15:42,980 malware into patches that are distributed through a supply chain to everyone using it. 137 00:15:42,980 --> 00:15:52,070 What's really concerning is actually what else might have happened, and the difficulty for organisations, or indeed I guess individuals 138 00:15:52,070 --> 00:16:00,410 if it were to happen to them, is really deciding when to call time on the investigations that you have to make. 139 00:16:00,410 --> 00:16:05,330 You can imagine when something like this gets announced and it's in your system. 140 00:16:05,330 --> 00:16:14,240 You might know about some evidence: so we know this data has been essentially stolen, although we've still got it, 141 00:16:14,240 --> 00:16:19,940 but we've seen it for sale somewhere, or somebody has told us they've hacked us and they've asked us for money. 142 00:16:19,940 --> 00:16:28,730 However you find out, there's an immediate harm that you have to deal with, but then you have to ask yourself the question: what else happened?
143 00:16:28,730 --> 00:16:33,680 And if they were there for years, you really have to conduct a fairly detailed investigation, 144 00:16:33,680 --> 00:16:38,510 trying to identify where these attackers had presence on your systems 145 00:16:38,510 --> 00:16:43,580 and where they moved to, because they generally will pivot around: what have they accessed, 146 00:16:43,580 --> 00:16:51,500 what data have they processed, what have they learnt about you and your people, and what may they have done with that kind of knowledge, 147 00:16:51,500 --> 00:16:58,260 which, by the way, is often sold on and used in other attacks later on down the line. 148 00:16:58,260 --> 00:17:07,080 As much as we know about the Microsoft hack and the SolarWinds hack, was this a criminal enterprise or a state-backed enterprise? 149 00:17:07,080 --> 00:17:18,970 Is it clear yet? Um. Well, I think there are lots of different claims around in the press. 150 00:17:18,970 --> 00:17:22,050 Do we know certainly who were the perpetrators of these? 151 00:17:22,050 --> 00:17:34,780 I think we've got pretty strong suspicions about the SolarWinds one and, without going into accusations that that was state-backed, it was an organisation 152 00:17:34,780 --> 00:17:42,580 that was highly sophisticated. OK, I can take that to mean assertions of state backing. Now, about highly 153 00:17:42,580 --> 00:17:50,110 sophisticated: there is a particular vulnerability with the big platforms that we have, 154 00:17:50,110 --> 00:17:55,960 and we'll talk about this in a moment, this extraordinary connectedness and increasing connectedness. 155 00:17:55,960 --> 00:18:02,680 But this is accompanied by a relatively small number of very big companies being involved, 156 00:18:02,680 --> 00:18:06,960 and we all to some degree use a limited number of them. 157 00:18:06,960 --> 00:18:12,100 Is that a sort of lack of resilience in the system, and what can be done about it?
158 00:18:12,100 --> 00:18:21,640 You have identified exactly the issue. The hacks that you're talking about are not particularly sophisticated. 159 00:18:21,640 --> 00:18:33,820 I mean, technically they require you to have very sophisticated knowledge, but it doesn't take a whole nation a number of years to craft 160 00:18:33,820 --> 00:18:40,090 those. We call them exploits, the actual pieces of malicious code. 161 00:18:40,090 --> 00:18:45,730 It will take a long period of time to work out what to craft, and then you make it. 162 00:18:45,730 --> 00:18:49,420 But the real vulnerability that we face is actually this, 163 00:18:49,420 --> 00:19:02,020 this what looks like a form of systemic risk: a lot of dependency on a few building blocks in our digital infrastructures. 164 00:19:02,020 --> 00:19:10,810 And that means if one of those building blocks gets taken out, then that risk can propagate, as will the harm, across the systems. 165 00:19:10,810 --> 00:19:21,820 And in the research that we conducted with the World Economic Forum, this is one of the things that we were really drawing people's attention to: 166 00:19:21,820 --> 00:19:28,210 that not only is risk aggregating, but it's potentially hidden. 167 00:19:28,210 --> 00:19:29,920 And organisations, 168 00:19:29,920 --> 00:19:37,690 people and countries are not entirely aware of exactly where these common dependencies are, because they're often further down the supply chain. 169 00:19:37,690 --> 00:19:41,890 They don't necessarily have visibility; it's not totally transparent. 170 00:19:41,890 --> 00:19:49,540 So if you don't understand that this risk is aggregating, then there's not much you can do. 171 00:19:49,540 --> 00:19:57,880 I mean, we can recommend people attempt to build in resilience and prepare for bad things happening.
172 00:19:57,880 --> 00:20:09,340 But you'd be far better equipped if you could at least prepare and have early warning signs as this risk was becoming realised, 173 00:20:09,340 --> 00:20:19,240 if you like, and we knew where it was. So I do think we need the right balance in this, because it's not a black and white issue. 174 00:20:19,240 --> 00:20:27,640 And there are big advantages to the availability of commodity computing, cloud computing and so on, 175 00:20:27,640 --> 00:20:41,080 because you are embedding security functions in extremely capable organisations, so there's a real security gain from that kind of aggregation. 176 00:20:41,080 --> 00:20:49,000 But clearly it does bring into question issues of monoculture, for want of a better term. 177 00:20:49,000 --> 00:20:57,670 And therefore it's right that there's a conversation with the providers of those systems that says, OK, 178 00:20:57,670 --> 00:21:06,880 how can we have the benefits of aggregation while addressing some of the resilience issues which inevitably arise? 179 00:21:06,880 --> 00:21:11,230 But I certainly wouldn't be saying that aggregation is a bad thing. 180 00:21:11,230 --> 00:21:21,160 It's how do you manage the risk in an environment where there's a high level of aggregation and dependence on a small number of suppliers. 181 00:21:21,160 --> 00:21:26,710 It certainly feels like this is a space where we need to do some more policy development work. 182 00:21:26,710 --> 00:21:34,150 We need to think about oversight mechanisms, incentivisation of markets, and what kinds of behaviour and 183 00:21:34,150 --> 00:21:42,130 standards of care should be being adopted by these organisations that we are so critically dependent upon.
184 00:21:42,130 --> 00:21:48,370 And we certainly need to be doing more thinking about how we can ensure that they're 185 00:21:48,370 --> 00:21:53,980 doing what we need them to do in order to secure these things that we're 186 00:21:53,980 --> 00:22:02,680 very dependent upon. Perhaps we could talk a little bit 187 00:22:02,680 --> 00:22:07,030 more about your report with the World Economic Forum. And members of the audience, 188 00:22:07,030 --> 00:22:11,830 we've put a link to it, which should be in a green bar at the bottom of your screen. 189 00:22:11,830 --> 00:22:18,310 And you begin the report by talking about the evolving dynamics of cyberspace. 190 00:22:18,310 --> 00:22:23,000 And I was particularly struck by a quotation from it. 191 00:22:23,000 --> 00:22:29,740 You talk about the increasing complexity, and many feel that this is so complex, with increasingly sophisticated 192 00:22:29,740 --> 00:22:38,110 characteristics, that our role will change so that we become observers of a system increasingly outside our control. 193 00:22:38,110 --> 00:22:43,420 Now, you go on to talk about some of the things that can be done, but could you just elaborate a little bit on that? 194 00:22:43,420 --> 00:22:55,870 Because when I first read it, it was a bit apocalyptic. So, yeah, well, 195 00:22:55,870 --> 00:23:02,480 it's probably actually, when you think about it, quite intuitive. 196 00:23:02,480 --> 00:23:07,520 Really, it's the observation that what we're doing is we're creating systems of many moving 197 00:23:07,520 --> 00:23:12,650 parts, and we're not just talking about the Internet infrastructure, 198 00:23:12,650 --> 00:23:17,540 we're talking about all of the systems that that gives rise to and supports. 199 00:23:17,540 --> 00:23:28,310 We mean ecosystems of businesses, government services, individuals, the tech, the processes, the routines, the products.
200 00:23:28,310 --> 00:23:34,070 And all of these are coming together in various moving parts, so the dynamics are getting quite complex. 201 00:23:34,070 --> 00:23:40,760 Some of the relationships last a long time; some are fast, created and dissipated. 202 00:23:40,760 --> 00:23:46,130 Some of these supply chains are very, very long end to end. 203 00:23:46,130 --> 00:23:52,100 In fact, they're multidimensional. They're hard to even think about in a two-dimensional or three-dimensional space. 204 00:23:52,100 --> 00:24:00,650 And all of this is being driven by these new kinds of technologies and the processes that are created from them. 205 00:24:00,650 --> 00:24:07,580 And if you think about it, what's happening is groups of people, not just single human beings, 206 00:24:07,580 --> 00:24:12,440 groups of organisations, are coming together to engineer parts of it. 207 00:24:12,440 --> 00:24:18,090 And we're creating it in such a way that you can almost glue it all together. 208 00:24:18,090 --> 00:24:27,290 And the consequence of that is something that is very complex and changing, so dynamically complex. 209 00:24:27,290 --> 00:24:39,350 And it's already the case, if we were really honest with ourselves, that if you were to turn round to the CTO of a very large organisation and ask them, 210 00:24:39,350 --> 00:24:45,320 do they really understand all the constituent parts that form their information infrastructure, 211 00:24:45,320 --> 00:24:49,550 even without thinking about how they relate to what the people are doing with it, 212 00:24:49,550 --> 00:24:57,020 they would say, we probably couldn't write that down and get that right. It's very hard for humans to hold that in their heads. 213 00:24:57,020 --> 00:25:05,990 It's very hard for us to tool it.
And so if you think about that in a global way, and the way in which we're instrumenting the planet, and the 214 00:25:05,990 --> 00:25:12,110 way in which we're now generating new knowledge and processing the data from instruments on the planet, 215 00:25:12,110 --> 00:25:20,210 you can see already that what we're really doing as humans is creating tools to enable us to make some sense of this, 216 00:25:20,210 --> 00:25:27,680 to spot some connections, if you like, to hear the weak signals and find the intelligence elsewhere in the universe. 217 00:25:27,680 --> 00:25:31,220 But at the same time, we're kind of giving birth to the cyber universe, 218 00:25:31,220 --> 00:25:36,770 which is much greater than any of these lenses that we have on any one part of it. 219 00:25:36,770 --> 00:25:42,260 And we're probably already there. It's just expanding at the moment. 220 00:25:42,260 --> 00:25:50,390 So the knee-jerk reaction to that would be, well, what we need is some rules and laws and standards. 221 00:25:50,390 --> 00:25:57,950 But there are dangers to that as well: just because cyberspace is so ubiquitous, if you put in rules and standards, 222 00:25:57,950 --> 00:26:04,200 if you put someone in charge, there's a potential for the misuse of that authority. 223 00:26:04,200 --> 00:26:08,720 Can I make a slightly different point and then come back to that? 224 00:26:08,720 --> 00:26:14,640 Of course. And this, I think, takes us back to the pandemic and building back better. 225 00:26:14,640 --> 00:26:22,460 I think one of the things that Sadie has said is that 226 00:26:22,460 --> 00:26:30,530 resilience and understanding have, in a sense, taken second place to issues around convenience and efficiency, 227 00:26:30,530 --> 00:26:39,290 which one could argue is a feature of global supply chains, and that hit us very, very hard when the pandemic struck.
228 00:26:39,290 --> 00:26:48,350 So it seems to me, when we're thinking about cyber space and cyber security, and lessons that we draw from this report and from this pandemic, 229 00:26:48,350 --> 00:26:52,550 we need to give a greater emphasis to resilience. 230 00:26:52,550 --> 00:27:02,690 We need to give a greater emphasis to really understanding the risks that we're dealing with, and a degree of simplification, 231 00:27:02,690 --> 00:27:09,470 because simplification does lead to greater resilience and predictability. 232 00:27:09,470 --> 00:27:12,350 I think standards actually have a role to play in that. 233 00:27:12,350 --> 00:27:21,260 And I agree that if the standards are too prescriptive, then you can end up undermining innovation and so on. 234 00:27:21,260 --> 00:27:34,520 But I think principle-based standards, outcome-based standards and so on are all part of the answer, in terms of both enabling innovation 235 00:27:34,520 --> 00:27:41,420 but also making sure that we're thinking about these resilience-type issues. That is 236 00:27:41,420 --> 00:27:47,070 very important, and actually, in the context of the expanding cyber universe, 237 00:27:47,070 --> 00:27:52,770 it is really about understanding the science and how you deliver the resilience. 238 00:27:52,770 --> 00:27:58,590 Of course it is. That's the implication. If we're dealing with something so complex, 239 00:27:58,590 --> 00:28:06,030 we stop trying to pretend we can understand all of it and engineer for everything, and focus on how we get what we need from this. 240 00:28:06,030 --> 00:28:12,030 So take all the opportunities it provides, and deliver resilience. 241 00:28:12,030 --> 00:28:19,140 It's like a form of science, in a way.
We need to remind ourselves that science is a model that explains things for which we 242 00:28:19,140 --> 00:28:24,990 have evidence, and it changes over time because we get new evidence. And in a way, 243 00:28:24,990 --> 00:28:32,550 the pursuit of resilience really is that in a nutshell. I would say it's incredibly important. 244 00:28:32,550 --> 00:28:39,750 And if you think about the work that we do at the Capacity Centre, I mean, the model 245 00:28:39,750 --> 00:28:47,090 for benchmarking national capacity has been used by over 85 countries now in the world. 246 00:28:47,090 --> 00:28:51,030 And that's incredibly important in the context of the development agenda, 247 00:28:51,030 --> 00:29:03,510 because we know that helping some of the poorest nations develop their economies is going to hinge upon their ability to harness cyberspace, 248 00:29:03,510 --> 00:29:11,280 whether that's enabling very tiny farming enterprises to do business and to bank online, 249 00:29:11,280 --> 00:29:19,230 or whether that's really about accessing energy and facilities in order to manufacture, that kind of thing. 250 00:29:19,230 --> 00:29:24,570 And one of the new pieces of work that you and I are involved with, actually, in the context of the Capacity 251 00:29:24,570 --> 00:29:34,030 Centre, is really trying to help explain the importance of cybersecurity to the sustainable development goals; 252 00:29:34,030 --> 00:29:42,530 that's one of the things that we've been working on this year, because we know it's incredibly important to delivering on these kinds of visions. 253 00:29:42,530 --> 00:29:48,720 So on the one hand, there's understanding how we might be introducing new risks that need to be managed, 254 00:29:48,720 --> 00:29:53,650 and that we need to ensure that we are resilient in the face of, in terms of new technology. 255 00:29:53,650 --> 00:29:58,470 And on the other hand, we've got: how can we make sure that the world can
256 00:29:58,470 --> 00:30:08,010 Do what it needs to do to address some of these big issues around climate change or food security and well-being in general. And of course, 257 00:30:08,010 --> 00:30:15,270 one of the great opportunities in achieving sustainable development is leapfrogging technology, 258 00:30:15,270 --> 00:30:19,920 much of which is digital, which, if it's compromised, will undermine that. 259 00:30:19,920 --> 00:30:26,010 And you touch on four particular areas of emerging technology, 260 00:30:26,010 --> 00:30:31,780 one of which I think we've been discussing for the last 10 minutes or so, which is increasing complexity. 261 00:30:31,780 --> 00:30:41,550 If I can ask you about the other three: Jamie, could I come to you on artificial intelligence and machine learning? 262 00:30:41,550 --> 00:30:53,430 What are the threats that the spread of AI raises, and can it also be part of the solution for improving cyber security? 263 00:30:53,430 --> 00:30:59,880 I mean, the first thing I would say is that artificial intelligence and machine learning are 264 00:30:59,880 --> 00:31:06,150 already being used quite extensively to help cybersecurity professionals do their work. 265 00:31:06,150 --> 00:31:12,330 So, you know, there are a lot of really positive things there. 266 00:31:12,330 --> 00:31:21,610 But it's also true that artificial intelligence and machine learning are being used by attackers to improve their ability to attack. 267 00:31:21,610 --> 00:31:31,680 So there's a kind of cat-and-mouse game there, and it's actually quite hard to tell where the balance of advantage is going to be. 268 00:31:31,680 --> 00:31:38,400 And we have a go at that in the report, and I think essentially conclude we don't know.
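That cat-and-mouse dynamic can be made concrete with a toy sketch. The example below is ours, not the report's, and every data point in it is invented: it shows "data poisoning", where an attacker who can inject a few mislabelled training points into a very simple nearest-centroid detector drags the model's notion of "normal" towards their own activity.

```python
# Toy illustration (not from the report; all data invented) of poisoning the
# training set of a nearest-centroid classifier.

def centroid(points):
    """Mean of a list of 2-D points."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def classify(x, benign_c, malicious_c):
    """Label x by whichever centroid is closer (squared distance)."""
    def d2(p, c):
        return (p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2
    return "benign" if d2(x, benign_c) <= d2(x, malicious_c) else "malicious"

# Clean training data: benign activity clusters low, malicious clusters high.
benign = [(1.0, 1.0), (1.5, 0.8), (0.8, 1.2), (1.2, 1.4)]
malicious = [(4.0, 4.0), (4.5, 3.8), (3.8, 4.2)]

sample = (2.6, 2.6)  # a borderline event to classify
print(classify(sample, centroid(benign), centroid(malicious)))  # -> malicious

# An attacker who can inject "benign"-labelled points drags that centroid
# towards their own activity, so the same event now looks harmless.
poisoned_benign = benign + [(3.5, 3.5), (3.6, 3.4), (3.4, 3.6)]
print(classify(sample, centroid(poisoned_benign), centroid(malicious)))  # -> benign
```

Real detectors are far more sophisticated, but the lesson scales: the training data is part of what must be defended, not just the code.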
269 00:31:38,400 --> 00:31:43,200 But I think there's a broader issue here, which says if you're thinking about a world 270 00:31:43,200 --> 00:31:50,790 which is much more dependent on what we call algorithmically curated processes, 271 00:31:50,790 --> 00:31:57,630 then you have to think about, well, what are we actually defending? Because you're not just defending hardware and software. 272 00:31:57,630 --> 00:32:04,530 You're also defending the data that is going into these systems, and the algorithm itself, 273 00:32:04,530 --> 00:32:14,130 which you don't necessarily understand because it's a heuristic algorithm rather than a deterministic algorithm. 274 00:32:14,130 --> 00:32:20,910 And so we need to think much more broadly about, well, it's back to Sadie's term, the attack surface. 275 00:32:20,910 --> 00:32:30,600 So what are we trying to defend? And I think perhaps the most significant conclusion from this part of the report was that we're 276 00:32:30,600 --> 00:32:38,700 not yet building these systems thinking deeply about how we can build them as secure as possible, 277 00:32:38,700 --> 00:32:46,320 and also how we can operate them, from a whole-lifecycle point of view, as securely as possible. 278 00:32:46,320 --> 00:32:54,870 And what we need are clearer principles and approaches to developing and operating these systems so that we can do it securely. 279 00:32:54,870 --> 00:33:01,860 And indeed, that's going to be an area of future research as a follow-on from this report. 280 00:33:01,860 --> 00:33:06,300 I thought a very relevant phrase you used there, and actually in several places in the 281 00:33:06,300 --> 00:33:11,340 report, is building security in from the base upwards when you're designing a system, 282 00:33:11,340 --> 00:33:19,890 just having it there from the very beginning. If I could go on to the next topic you looked at, which is around quantum computing.
283 00:33:19,890 --> 00:33:26,400 And I've got to ask you just to explain briefly what quantum computing is, if that's possible in a few seconds. 284 00:33:26,400 --> 00:33:31,050 And then I'm going to play devil's advocate and ask a question for you to shoot me down. 285 00:33:31,050 --> 00:33:35,790 So: quantum computing. Yes, really exciting, but sufficiently far on the horizon that 286 00:33:35,790 --> 00:33:42,340 it's not something we need worry about now. So let me put that to both of you. 287 00:33:42,340 --> 00:33:46,380 So the one sentence on what quantum computing is: it's a completely different way 288 00:33:46,380 --> 00:33:52,110 of doing computation, where things can essentially be in more than one state. 289 00:33:52,110 --> 00:33:59,070 And so historically, when we think about how we compute the answer to a problem, we work through the possibilities, 290 00:33:59,070 --> 00:34:04,440 computing through a series of steps: if it's this, then that; if it's not, then this; because a bit is either a zero or a one. 291 00:34:04,440 --> 00:34:08,760 And the thing about quantum is you can be zero and one at the same time. 292 00:34:08,760 --> 00:34:11,730 That's a simplistic way of communicating it. 293 00:34:11,730 --> 00:34:17,670 But as you can imagine, if you can compute as if things are in many different states at once, 294 00:34:17,670 --> 00:34:27,510 then your computational power, and the kinds of problems that become tractable to solve, grow dramatically. 295 00:34:27,510 --> 00:34:33,990 The reason why we were considering it in the context of the report is that, although it might feel like it's very, very far away, 296 00:34:33,990 --> 00:34:40,800 there is significant investment being put into the ability to create quantum computing, 297 00:34:40,800 --> 00:34:49,480 to be world leaders in being able to deliver quantum capabilities.
298 00:34:49,480 --> 00:34:53,830 That could be for many reasons, not least because of the kinds of problems that will then become tractable, 299 00:34:53,830 --> 00:35:01,480 the new kinds of solutions and insights that we'll have, and the new kinds of programmes that will be far more sophisticated, 300 00:35:01,480 --> 00:35:07,840 we expect; but also, importantly, from the cybersecurity perspective, 301 00:35:07,840 --> 00:35:14,170 because, as I think we began this discussion by reminding everyone, there's no such thing as 100 percent security. 302 00:35:14,170 --> 00:35:18,070 And if you think about how we deliver confidentiality and encrypt data, 303 00:35:18,070 --> 00:35:27,790 that's a good example: in actual fact, we don't deliver cryptography that is 100 percent secure. 304 00:35:27,790 --> 00:35:37,000 We encrypt our data in such a way that it would be nearly impossible to conceive of someone having 305 00:35:37,000 --> 00:35:45,740 the computational power to break the encryption and to calculate the primitives that create the encryption keys. 306 00:35:45,740 --> 00:35:56,170 So essentially, the concern is that if a quantum computer comes along, lots of our old crypto will be very, very easy to decrypt 307 00:35:56,170 --> 00:36:02,680 all of a sudden, whereas at the moment we rely on the fact that it would be very, very hard to do so. 308 00:36:02,680 --> 00:36:05,920 We're considering it because, one, there's lots of investment; 309 00:36:05,920 --> 00:36:10,730 we know that there's capability building, and it could speed up because of that investment. 310 00:36:10,730 --> 00:36:20,140 And we know that there's at least one security mechanism that would be vulnerable if an attacker had access to this kind of computational power.
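A toy sketch of that asymmetry (our illustration, not from the report): multiplying two primes is instant, while recovering them by classical trial division is brute-force work that grows steeply with the size of the numbers, and this gap is exactly what Shor's algorithm on a sufficiently large quantum computer would remove. The primes here are tiny stand-ins for the 1024-bit-plus primes real systems use.

```python
# Toy illustration (ours, not the report's) of the computational asymmetry
# that much of today's public-key encryption rests on.

def trial_division_factor(n):
    """Recover the smallest prime factor of n by brute force: the hard direction."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

p, q = 10007, 10009              # tiny stand-ins for real, enormous primes
n = p * q                        # the easy direction: one multiplication
assert trial_division_factor(n) == p

# Each extra bit of key length roughly doubles the classical search space,
# which is why "nearly impossible" is a statement about available computing
# power, not an absolute guarantee.
print(f"trial divisions needed, roughly sqrt(n): {int(n ** 0.5):,}")
```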
311 00:36:20,140 --> 00:36:28,510 And by the way, any kind of security control whose reliability depends upon it being very, 312 00:36:28,510 --> 00:36:36,640 very hard to compute something would be vulnerable to an attacker who has quantum computing power. 313 00:36:36,640 --> 00:36:44,920 Jamie? What I was going to add is that we're building and designing systems now that will still be alive in 30 years' time. 314 00:36:44,920 --> 00:36:50,260 Exactly. Industrial control systems, or systems in space. 315 00:36:50,260 --> 00:36:57,400 And so we'd better think now about how we can design them in a way that, if quantum computing does become a reality, 316 00:36:57,400 --> 00:36:59,590 and I think it will in that time period, 317 00:36:59,590 --> 00:37:09,640 we can at least easily swap out the cryptographic algorithms that we've got and replace them with something that is quantum secure. 318 00:37:09,640 --> 00:37:16,240 So, Charles, you go ahead. I was going to say the good news is we've had some foresight. As we say in the report, 319 00:37:16,240 --> 00:37:23,650 there has been activity amongst the crypto community for some years developing new kinds 320 00:37:23,650 --> 00:37:30,880 of encryption algorithms that would be considered quantum safe and secure. And indeed, 321 00:37:30,880 --> 00:37:34,600 at Oxford we have one of those hubs, and there are activities going on all around the world. 322 00:37:34,600 --> 00:37:45,430 So on that particular component, there is thought going into how you would generate new kinds of encryption. 323 00:37:45,430 --> 00:37:53,530 But as we know from the report, if people read it, that's just one part of the problem. Even on that little building block alone, 324 00:37:53,530 --> 00:37:58,060 just because somebody has produced a way of developing an encryption key, 325 00:37:58,060 --> 00:38:04,160 you still have the practical challenge of rolling that out across all of your data.
326 00:38:04,160 --> 00:38:09,400 And as Jamie just said, some of this will have been present in your systems for some years 327 00:38:09,400 --> 00:38:14,470 as legacy: understanding what you've already encrypted and where it sits, 328 00:38:14,470 --> 00:38:21,090 knowing that you've got to re-encrypt it with a new key to keep it secure, deciding whether you need to do that, 329 00:38:21,090 --> 00:38:26,670 what of the stuff that you've been creating needs this level of protection, 330 00:38:26,670 --> 00:38:32,500 and what you can rest easy about not changing. And just think, as human beings: 331 00:38:32,500 --> 00:38:41,500 is anyone really aware of where they've put their data recently, the data about what they've been doing in their personal lives? 332 00:38:41,500 --> 00:38:47,440 And you can imagine, if you're a business, a large corporation, it's incredibly challenging. 333 00:38:47,440 --> 00:38:58,030 So in the end, what tends to happen is a focus on the stuff that people really know they care about, and getting that sorted out first; and 334 00:38:58,030 --> 00:39:06,340 then lots of things often will naturally fall by the wayside and be considered no longer necessary for that level of protection. 335 00:39:06,340 --> 00:39:12,310 And there's another side of quantum technology, in quantum communication, where you can, at least in theory, 336 00:39:12,310 --> 00:39:19,120 have a communication system that's impossible to eavesdrop on without being detected. 337 00:39:19,120 --> 00:39:26,800 Do you see that as being an important element of cybersecurity in the future? 338 00:39:26,800 --> 00:39:35,330 I can jump in. Yeah, well, I think we're talking about quantum key distribution. Yes, 339 00:39:35,330 --> 00:39:41,560 lasers and light and so on, et cetera. Yes. So we've definitely made progress on that, actually. 340 00:39:41,560 --> 00:39:51,250 It's fantastic, isn't it?
Kind of harnessing physics as part of the security argument for one of our security mechanisms. 341 00:39:51,250 --> 00:39:56,230 But there are always kind of operational limitations as to where you can use that. 342 00:39:56,230 --> 00:40:07,090 In this particular example we're talking about, you have to have line of sight, or a really good fibre-optic cable that isn't lossy, 343 00:40:07,090 --> 00:40:14,410 to have really reliable communications between the two elements that are using this kind of mechanism to talk. 344 00:40:14,410 --> 00:40:20,660 So it'll always be useful for a kind of backbone infrastructure where you can guarantee that. 345 00:40:20,660 --> 00:40:28,330 It would be very easy for someone to cut a fibre-optic cable, or block the light out, and do a denial-of-service attack on you. 346 00:40:28,330 --> 00:40:35,740 So here you're using it to create some kind of confidentiality and secrecy, 347 00:40:35,740 --> 00:40:44,620 but sometimes that mechanism can be so brittle that all someone needs to do is come and step in the middle, 348 00:40:44,620 --> 00:40:52,330 and you've kind of lost it. Your mechanisms will have a failsafe: if they can't have reliable communications, then that channel is gone, 349 00:40:52,330 --> 00:40:59,380 and you have to turn to another means of communication. But yes, there will always be a role for different solutions; 350 00:40:59,380 --> 00:41:05,800 in fact, there always has been. But they come at different costs.
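The brittleness Sadie describes can be seen in a highly idealised BB84-style simulation (our sketch, not from the report; real QKD must also handle photon loss, noise and authentication). Measuring a photon in the wrong basis randomises its state, so an intercept-and-resend eavesdropper leaves a telltale error rate when the endpoints compare a sample of their key, while an attacker who simply cuts the channel gets denial of service instead.

```python
# Idealised toy BB84 simulation (our illustration, not from the report).
import random

def bb84(n_bits, eavesdrop, rng):
    """Return the observed error rate on the positions where bases matched."""
    alice_bits = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]
    bob_bases = [rng.choice("+x") for _ in range(n_bits)]

    photons = []
    for bit, basis in zip(alice_bits, alice_bases):
        if eavesdrop:
            eve_basis = rng.choice("+x")
            if eve_basis != basis:
                bit = rng.randint(0, 1)  # wrong-basis measurement randomises
            basis = eve_basis            # Eve resends in her own basis
        photons.append((bit, basis))

    # Bob measures; where his basis differs from the photon's, the result is random.
    bob_bits = [bit if basis == b else rng.randint(0, 1)
                for (bit, basis), b in zip(photons, bob_bases)]

    # Keep only positions where Alice's and Bob's bases matched, then compare.
    kept = [i for i in range(n_bits) if alice_bases[i] == bob_bases[i]]
    errors = sum(alice_bits[i] != bob_bits[i] for i in kept)
    return errors / len(kept)

rng = random.Random(1)
print("error rate, quiet channel:", bb84(2000, False, rng))  # 0.0
print("error rate, intercepted:  ", bb84(2000, True, rng))   # roughly 0.25
```

A nonzero error rate on the compared sample reveals the eavesdropper, which is the physics-backed guarantee; the flip side is that any disruption of the channel forces the endpoints onto some other means of communication.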
351 00:41:05,800 --> 00:41:10,810 And that, actually, in a way, it doesn't sound very direct, 352 00:41:10,810 --> 00:41:19,790 but it is a direct pivot to some of the observations we made in the report around business leaders really needing to have the tools and 353 00:41:19,790 --> 00:41:24,970 knowledge to enable them to understand what these risks are, and to lead in the face 354 00:41:24,970 --> 00:41:31,690 of incidents, to help ensure that their organisations, and the ecosystems they are part of, 355 00:41:31,690 --> 00:41:39,730 remain resilient. Really getting your head around what kinds of security needs you have, 356 00:41:39,730 --> 00:41:43,610 and then how you line up these kinds of approaches and these tools to deliver them: 357 00:41:43,610 --> 00:41:50,170 that can be really challenging. Time for me to ask one last question before going to questions from the audience. 358 00:41:50,170 --> 00:41:54,370 And I do encourage the audience to ask questions and in particular to vote. 359 00:41:54,370 --> 00:42:00,490 And this is the fourth of the areas of emerging technologies you looked at. 360 00:42:00,490 --> 00:42:06,700 I wasn't surprised by the first three, but I wouldn't have guessed the whole issue about digital identity. 361 00:42:06,700 --> 00:42:16,370 So, Jamie, might you just say why that is important, and what the cyber security threats and opportunities are around that area? 362 00:42:16,370 --> 00:42:24,530 Yeah, I mean, the first thing to say is about managing digital identities well. A digital identity, 363 00:42:24,530 --> 00:42:31,820 in other words: just like we have a passport that says who we are, which we can wave in front of a border officer, 364 00:42:31,820 --> 00:42:38,390 a digital identity tells someone you have a digital transaction with it.
365 00:42:38,390 --> 00:42:51,200 It tells them who you are; or rather, it shows that you are entitled to the service that you're asking for, which is a slightly different thing. 366 00:42:51,200 --> 00:42:56,690 If one can trust that someone is who they say they are, 367 00:42:56,690 --> 00:43:06,290 or that they are authorised to receive a particular service, or to enter a bar, or whatever it might be, then obviously that contributes to security. 368 00:43:06,290 --> 00:43:16,700 So I think most people would say digital identity is an essential component of security in this ubiquitously connected world. 369 00:43:16,700 --> 00:43:25,100 The issue, as most people would recognise, and especially people who are taking part from the UK, 370 00:43:25,100 --> 00:43:33,080 is that the whole question of identity and identity cards and so on is politically very, very difficult. 371 00:43:33,080 --> 00:43:41,510 And so different countries have got different cultures in terms of how digital identities should be managed. 372 00:43:41,510 --> 00:43:47,450 And every single country I talk to is convinced that its approach is the right approach. 373 00:43:47,450 --> 00:43:55,430 And that's fine until you ask the question: well, what if people want to undertake digital transactions that cross borders? 374 00:43:55,430 --> 00:44:03,110 How can I be confident that the digital identity provided under a very different approach 375 00:44:03,110 --> 00:44:13,100 in Country A is sufficiently strong for me to be able to safely provide a service in Country B? 376 00:44:13,100 --> 00:44:19,670 And every time I talk to people about how they are ensuring that this will all be interoperable, they say that's easy: 377 00:44:19,670 --> 00:44:24,350 we're going to persuade everyone else to do it the way that we want to do it. 378 00:44:24,350 --> 00:44:28,910 Well, that's not going to happen.
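The distinction Jamie draws, proving an entitlement rather than revealing who you are, can be sketched minimally. This is our illustration, not the report's: the shared-secret HMAC below stands in for the public-key credential schemes real systems use, and it only works because issuer and verifier agree on the same format, which is precisely the interoperability problem.

```python
# Minimal sketch (ours, not the report's) of an attribute credential: an
# issuer signs a claim such as "over 18", and a relying party verifies it
# without learning anything else about the holder. A shared HMAC key is a
# stand-in for real public-key signatures.
import hashlib
import hmac
import json

ISSUER_KEY = b"demo-key"  # placeholder secret for the sketch only

def issue_claim(attributes):
    """Country A's issuer signs an attribute claim, e.g. {'over_18': True}."""
    payload = json.dumps(attributes, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def verify_claim(claim):
    """Country B's service accepts the claim only if the signature checks out."""
    expected = hmac.new(ISSUER_KEY, claim["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, claim["tag"])

claim = issue_claim({"over_18": True})
assert verify_claim(claim)

# A tampered claim, granting extra entitlements, is rejected.
tampered = dict(claim, payload='{"over_18": true, "role": "admin"}')
assert not verify_claim(tampered)
```

Both sides must agree on the claim format and the verification scheme before any of this works, which is why the report's call for shared basic standards matters.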
And so essentially what the report is saying is 379 00:44:28,910 --> 00:44:33,830 that we need to have some sort of international conversation about what are the 380 00:44:33,830 --> 00:44:39,920 basic standards that we can all agree on that will enable interoperability. 381 00:44:39,920 --> 00:44:48,950 And the reason that this is important is that that issue affects almost every other aspect of security that we've talked about today, 382 00:44:48,950 --> 00:45:00,200 because the last thing that we want is for the security tail to wag the global Internet dog, if I may use that expression. 383 00:45:00,200 --> 00:45:11,690 And it's very easy for security concerns to undermine the very free and open Internet that we're actually all here trying to protect. 384 00:45:11,690 --> 00:45:20,040 Thank you. So let me go on and ask some questions. The question with the most votes is from Maria Kang. 385 00:45:20,040 --> 00:45:26,990 I'm going to paraphrase it slightly, but she essentially asks: 386 00:45:26,990 --> 00:45:33,390 should we be downscaling the growth of new technologies? Is there a risk that we have a runaway train? 387 00:45:33,390 --> 00:45:44,310 So are we perhaps moving too fast towards a multi-connected world, and should we put the brakes on for a little bit? 388 00:45:44,310 --> 00:45:55,370 Oh, well, I realise, having heard lots of talk about security and threats, 389 00:45:55,370 --> 00:45:58,370 et cetera, that I can see where that question would stem from. 390 00:45:58,370 --> 00:46:06,320 But there's little doubt about the value of the connected world and the data that it's providing. 391 00:46:06,320 --> 00:46:13,480 If you imagine a trade-off of benefits, 392 00:46:13,480 --> 00:46:20,590 I just can't see that we want to do anything other than build resilience into this highly connected world, 393 00:46:20,590 --> 00:46:25,300 with all the benefits that it comes with. I mean, I appreciate that.
394 00:46:25,300 --> 00:46:29,710 A bit like Jamie was saying that every country feels its identity system is the right one, 395 00:46:29,710 --> 00:46:34,360 every generation feels that the way things were when they grew up was the right one; 396 00:46:34,360 --> 00:46:44,310 that's how I am with mine. However, it's quite clear that the medical breakthroughs, the opportunities, 397 00:46:44,310 --> 00:46:53,480 people connecting around the world with other people like them, facing the same kinds of challenges or dreaming the same dreams: 398 00:46:53,480 --> 00:47:03,000 the upside has been so great that I think it's really on us to develop the capacity to ensure 399 00:47:03,000 --> 00:47:10,920 that we're resilient, and that we can build with the right values and adopt the right principles to see this one through. 400 00:47:10,920 --> 00:47:17,580 For my part, I think we should put the brakes on a little. 401 00:47:17,580 --> 00:47:25,770 And I think if one looks at the evolution of technology, certainly over the last few decades and probably the last few centuries, 402 00:47:25,770 --> 00:47:34,740 there were probably some things which we regret having rolled out quite so fast; DDT and the whole Silent Spring issue, for example. 403 00:47:34,740 --> 00:47:40,380 And it seems to me right that there should be mechanisms for society, 404 00:47:40,380 --> 00:47:46,530 for governments, to step back and say: are we really sure this is a sensible thing to do? 405 00:47:46,530 --> 00:47:53,670 Um, but I wouldn't want to stop or reverse the roll-out of technology, 406 00:47:53,670 --> 00:47:57,210 and I think we need to do it with our eyes open. That's exactly it. 407 00:47:57,210 --> 00:48:01,980 And in fact, I think we're in agreement, and it may not even be about speed. 408 00:48:01,980 --> 00:48:08,940 I just think that we have to do it with the right kind of oversight and insight, and do it responsibly.
409 00:48:08,940 --> 00:48:20,130 And if that means slowing down a bit so that we have the means to do the right thing, then so be it; but that shouldn't mean avoiding the opportunities at all. 410 00:48:20,130 --> 00:48:22,530 So that leads on to the next question. 411 00:48:22,530 --> 00:48:30,540 And I'm going to bring together two questions, one by David Bloom and the other by James Reid Darbee, who both point out, 412 00:48:30,540 --> 00:48:39,600 first of all, the importance for innovation in cyberspace for economic growth, and that growth can happen very rapidly: 413 00:48:39,600 --> 00:48:47,640 what's a tiny start-up today is a unicorn tomorrow and a trillion-dollar company in two years' time. 414 00:48:47,640 --> 00:48:54,270 And they both ask questions about what type of mechanisms you would like to see that 415 00:48:54,270 --> 00:49:02,830 ensure that they build in security from the start, without stifling innovation. 416 00:49:02,830 --> 00:49:11,470 Well, let me take that one. Yeah, I mean, 417 00:49:11,470 --> 00:49:16,570 there are two very, very different ways of looking at that. 418 00:49:16,570 --> 00:49:22,840 I mean, one is that by raising awareness and so on, 419 00:49:22,840 --> 00:49:29,290 it's important that everyone who is involved in technology is thinking about cybersecurity. 420 00:49:29,290 --> 00:49:37,750 And in a way, you know, we've kind of got that. I mean, that is a change that has happened over the last few years. 421 00:49:37,750 --> 00:49:47,200 But I think the more fundamental point is that we seem to treat digital technology in a much more kind of laissez-faire way than we do, 422 00:49:47,200 --> 00:49:56,610 for example, genomics or biotechnology. 423 00:49:56,610 --> 00:50:03,250 And it seems to me that there ought to be some sort of process of saying, you know, you're not actually going to be able to deploy like that; 424 00:50:03,250 --> 00:50:09,420 why isn't there sufficient regulation that says that, actually?
425 00:50:09,420 --> 00:50:20,550 Wildly deploying fundamentally insecure products, you know: regulation, without being heavy-handed, won't let you do that. 426 00:50:20,550 --> 00:50:28,830 And so if you haven't planned in security early, then that's going to hit you hard and slow down your growth later on. 427 00:50:28,830 --> 00:50:33,360 So it's not saying no one should do anything until they've done security; 428 00:50:33,360 --> 00:50:37,710 it's just saying: you as a business need to realise that if you're going to grow, 429 00:50:37,710 --> 00:50:41,640 there will be a point where you have to answer these security questions. 430 00:50:41,640 --> 00:50:45,870 And our advice to you is the earlier you answer those, 431 00:50:45,870 --> 00:50:53,790 the quicker you'll actually be able to put the brakes on your prototype car, rather than building a car and then thinking, 432 00:50:53,790 --> 00:51:01,830 how can we put brakes on later? Yeah, I would agree with that. 433 00:51:01,830 --> 00:51:09,960 And I think that there's some serious, creative and deep thinking that needs to be given to this space, 434 00:51:09,960 --> 00:51:17,310 because there's going to be a role for some regulation, there's going to be a role for incentivising markets, 435 00:51:17,310 --> 00:51:23,340 there might be a role for incentivising consumers and users, and for understanding how we 436 00:51:23,340 --> 00:51:31,860 balance out trying to make it so that there's a market and financial upside for the innovators 437 00:51:31,860 --> 00:51:38,730 that will be greater if they're doing things that actually protect us from the harms we're trying to avoid. 438 00:51:38,730 --> 00:51:43,650 That will take some deep thinking, because we have seen, haven't we, that there has been a market failure.
439 00:51:43,650 --> 00:51:56,220 If we leave it to just the technology developers: we've known how to avoid many of the software errors that get exploited by attacks for years, 440 00:51:56,220 --> 00:52:05,670 but the paradigms, the ways of writing that code, haven't been taken up, and the business models have changed. 441 00:52:05,670 --> 00:52:14,580 Now we work in a world of app stores, downloading different pieces of software from who knows where, or being totally reliant on the app store 442 00:52:14,580 --> 00:52:18,780 having made sure there is no malicious intent inside it. 443 00:52:18,780 --> 00:52:23,760 So the world is shifting, and it feels like you couldn't, for example, 444 00:52:23,760 --> 00:52:32,160 solely rely on regulation, because anyone that's in the role of standard-setting knows that those things take years to evolve. 445 00:52:32,160 --> 00:52:39,330 And if you look at the pace of change of technology and the way we're using it, there's no way the standards can keep up with that. 446 00:52:39,330 --> 00:52:46,050 They can catch up with it, but there's always going to be a window of time where they have not yet. 447 00:52:46,050 --> 00:52:54,000 And so you couldn't solely rely on compliance with standards to remove the attack surface. 448 00:52:54,000 --> 00:53:00,360 You will also need something around attitudes and behaviours in organisations. 449 00:53:00,360 --> 00:53:09,150 And you'll need to try and make it so that if you protect people from the harms you want to avoid, then you're going to flourish. 450 00:53:09,150 --> 00:53:15,990 And in the UK we've seen buying power being used by government: 451 00:53:15,990 --> 00:53:22,620 they've set up schemes where they will only have organisations in the supply chain 452 00:53:22,620 --> 00:53:31,080 that are implementing cybersecurity controls to a certain level of integrity.
453 00:53:31,080 --> 00:53:39,900 So big, big users of third-party services buying in such a way that it actually uses the mechanisms, 454 00:53:39,900 --> 00:53:45,960 the economics of supply and demand, to try and make a success out of some of these more innovative solutions. 455 00:53:45,960 --> 00:53:54,270 But I actually happen to think that this requires a very concerted effort to look at what might work and might not work, 456 00:53:54,270 --> 00:54:01,950 and not solely from the technology out: there needs to be a technology angle, but there also needs to be a harms angle. 457 00:54:01,950 --> 00:54:11,040 Yes: what will it actually need to be resilient to in terms of threats, and what are the harms that we're trying to avoid or to 458 00:54:11,040 --> 00:54:19,920 mitigate? And then to use that to help define where we want to put our investment, our grey cells and our financial investments. 459 00:54:19,920 --> 00:54:23,640 What are the kinds of standards? Are they standards of care? 460 00:54:23,640 --> 00:54:28,710 Are they technology standards? Are they something to do with the way we regulate markets? 461 00:54:28,710 --> 00:54:33,900 And what can we do to try and promote those organisations and technologies and 462 00:54:33,900 --> 00:54:38,890 ecosystems that are going to deliver more cyber resilience and protect more people? 463 00:54:38,890 --> 00:54:47,580 Is there something we can do about that? So in this piece of work there are, of course, you know, 464 00:54:47,580 --> 00:54:55,390 positive incentives, and negative incentives around better enforcement, but also around legal liability. 465 00:54:55,390 --> 00:55:07,940 Do you think it would help if software companies, platforms, etc. had greater legal liability for the harms they expose their customers to? 466 00:55:07,940 --> 00:55:18,190 So who would like to start that one off? I mean, I've heard the argument.
467 00:55:18,190 --> 00:55:28,690 And again, I don't think it's black and white. I think generally, when it comes to digital technology, 468 00:55:28,690 --> 00:55:39,280 governments, and the law, if you like, have been more hands-off than they have in other forms of commerce. 469 00:55:39,280 --> 00:55:46,930 To put it another way: tech companies have got away with quite a lot, 470 00:55:46,930 --> 00:55:54,100 and that applies to competition law, as we know from the debates in the US, as much as it does to security issues. 471 00:55:54,100 --> 00:56:01,600 And I think the balance does need to shift a bit, but it mustn't shift kind of too far the other way. 472 00:56:01,600 --> 00:56:10,870 So it seems to me unreasonable to make a supplier of software liable for all the foolish things that a user of that software might do. 473 00:56:10,870 --> 00:56:16,750 But at the other end, they can't have a completely kind of hands-off, 474 00:56:16,750 --> 00:56:20,320 we-have-no-responsibility-at-all attitude. 475 00:56:20,320 --> 00:56:30,040 So to me, there does need to be a bit of a change in emphasis, but not a kind of dramatic swing, at least not for now. 476 00:56:30,040 --> 00:56:34,890 I think the car is actually 477 00:56:34,890 --> 00:56:44,460 a good metaphor, because we do expect cars to be inherently safe; we certainly expect the structure to have integrity. 478 00:56:44,460 --> 00:56:52,410 In certain parts of the world we now have laws where you have to sell them with seatbelts and safety features. 479 00:56:52,410 --> 00:56:57,840 But the car manufacturers are not inherently liable for the way in which that car gets driven. 480 00:56:57,840 --> 00:57:05,580 So I agree with Jamie that there is still some health and safety around the vehicle in the first place.
481 00:57:05,580 --> 00:57:12,330 And actually there's product recall, when features are found to be wanting, and there 482 00:57:12,330 --> 00:57:18,840 are ways of taking remedial action and demanding that remedial action is taken. 483 00:57:18,840 --> 00:57:22,800 And we probably haven't reached that point enough with the tech sector. 484 00:57:22,800 --> 00:57:26,280 So I would totally agree with Jamie: there's a little bit of work to be done, 485 00:57:26,280 --> 00:57:31,170 but it's not the case that you would absolve the drivers of the cars from responsibility for their actions either. 486 00:57:31,170 --> 00:57:39,630 Forgive me for interrupting, Sadie, because we're coming towards the end and I want to slip in one last question. This is from John M. 487 00:57:39,630 --> 00:57:47,880 We have seen in very real terms how communications platforms have been weaponised and used for very effective disinformation campaigns. 488 00:57:47,880 --> 00:57:54,030 Does the panel have any thoughts on how governments or society can counter or mitigate this threat? 489 00:57:54,030 --> 00:57:58,330 And I'm really sorry, I'm going to give you about 30 seconds to answer that. Sadie, 490 00:57:58,330 --> 00:58:09,040 would you like to go first? Well, there are multiple elements to that. 491 00:58:09,040 --> 00:58:14,530 We in the cyber security space look at how we maintain integrity and ensure that somebody 492 00:58:14,530 --> 00:58:22,030 hasn't managed to subvert the system so that it influences people in ways that were not intended; 493 00:58:22,030 --> 00:58:33,530 but that would not in any way, shape or form deal with the issue of how people are susceptible to messaging, which is something in our biology. 494 00:58:33,530 --> 00:58:45,560 Jamie? Yeah, I think part of this is the conversation with the platform owners, to say, you know, you do have some accountability. 495 00:58:45,560 --> 00:58:51,460 I think the platform owners have accepted that.
And I think the platform owners are doing a huge amount. 496 00:58:51,460 --> 00:59:01,330 But, as Sadie says, ultimately this is a societal problem, and one shouldn't expect technology to be the whole of the problem, 497 00:59:01,330 --> 00:59:04,690 and neither can it be the whole of the solution. Thank you. 498 00:59:04,690 --> 00:59:11,500 And I thought that was one of the strong points that came out of your report: that, yes, there is a lot that can be done technologically, 499 00:59:11,500 --> 00:59:17,230 but a lot requires both government action and also changes in norms, 500 00:59:17,230 --> 00:59:24,670 changes in the way that we view, and the way we interact in, the digital world. 501 00:59:24,670 --> 00:59:30,430 I suspect we could go on talking about this fascinating topic for several more hours, but we have to stop here. 502 00:59:30,430 --> 00:59:36,250 So let me thank you both very much for taking part in the conversation this afternoon. 503 00:59:36,250 --> 00:59:41,980 I'd like to thank the audience, and apologise for not getting through all the questions. 504 00:59:41,980 --> 00:59:49,660 We're now going to take a break until after Easter. We'll put on our website details of the next series of conversations. 505 00:59:49,660 --> 00:59:56,320 They will be around the recent report that was produced by Her Majesty's Treasury, 506 00:59:56,320 --> 01:00:03,080 the UK's economics ministry, on the Economics of Biodiversity: 507 01:00:03,080 --> 01:00:08,770 this is the Dasgupta Review, and we'll have Partha Dasgupta talking, and a couple of other people as well. 508 01:00:08,770 --> 01:00:13,400 So the details will be on our website shortly. 509 01:00:13,400 --> 01:00:15,138 Sadie, Jamie, thank you again.