1 00:00:02,000 --> 00:00:05,500 ARI: Hello, I'm Ari. [CLAUDINE: And I'm Claudine] Welcome to Proving the Negative. 2 00:00:05,500 --> 00:00:09,500 ARI: We're a podcast all about exploring the different sides of cybersecurity, 3 00:00:09,500 --> 00:00:13,790 from political to computer science, international relations to mathematics. 4 00:00:13,790 --> 00:00:16,070 Join us as we talk to our friends about the work they do. 5 00:00:17,570 --> 00:00:21,500 Really short intro[duction] today: we have Andrew, the director of our CDT, 6 00:00:21,500 --> 00:00:24,709 Louise, who used to be a [CDT] student and has become a researcher. 7 00:00:24,710 --> 00:00:29,210 Kevin is the Chief Executive Officer of a company called Cyjax. They do threat 8 00:00:29,210 --> 00:00:33,559 intelligence, but more importantly, he is a member of our academic board! 9 00:00:33,560 --> 00:00:36,420 Just before we dive in this week, we're going to talk a lot today about 10 00:00:36,420 --> 00:00:39,200 different disciplines - what that means is subject areas. 11 00:00:39,200 --> 00:00:43,790 My discipline is Computer Science. Claudine has many! The disciplines 12 00:00:43,790 --> 00:00:47,000 she comes from are Political Science and Law. For the purpose of this 13 00:00:47,000 --> 00:00:51,770 discussion, interdisciplinary [work] is the integration of knowledge and methods. 14 00:00:51,770 --> 00:00:55,500 Multidisciplinary [work] - people from different subject areas working together; 15 00:00:55,500 --> 00:00:59,059 they draw from each other's knowledge, not necessarily integration, 16 00:00:59,060 --> 00:01:02,700 just being around each other. We mix up these words, but that's a 17 00:01:02,700 --> 00:01:05,599 general guide as to what the specific differences are. 18 00:01:05,600 --> 00:01:09,890 We hope you have almost as much fun listening to this as we did recording it! 19 00:01:09,890 --> 00:01:15,130 Without any further ado, let's get this show on the road. 
LOUISE: I'm Louise, I 20 00:01:15,130 --> 00:01:22,720 started in the cyber security CDT in 2014 looking at present[ing] threat and 21 00:01:22,720 --> 00:01:27,730 monitoring data to security practitioners in security operations centres. 22 00:01:27,730 --> 00:01:32,350 I was looking at alternatives to visual techniques, in particular using sound. 23 00:01:32,350 --> 00:01:37,690 I've moved on to being a postdoctoral researcher where I look at systemic 24 00:01:37,690 --> 00:01:42,040 cyber security risks and modelling interdependencies between organisations, 25 00:01:42,040 --> 00:01:46,540 how different attacker strategies might play against different defender strategies. 26 00:01:46,540 --> 00:01:50,000 I've also been looking at security for some of the alternative systems 27 00:01:50,000 --> 00:01:52,989 structures that are starting to be used more and more, 28 00:01:52,990 --> 00:01:56,070 in particular distributed ledgers like blockchain. 29 00:01:56,070 --> 00:02:02,590 I define cyber security as protecting computing systems at the core, 30 00:02:05,000 --> 00:02:10,410 protecting confidentiality, integrity and availability of data and systems. 31 00:02:10,420 --> 00:02:15,000 There's a lot of complexity behind that, there's a lot of different types of 32 00:02:15,000 --> 00:02:18,579 systems, there's a lot of connections and dependencies between systems, 33 00:02:18,580 --> 00:02:23,000 and there's a lot of different stages to protect. It sounds simple, about 34 00:02:23,000 --> 00:02:26,350 protecting computer systems, but there's a lot of complexity behind it. 35 00:02:26,350 --> 00:02:30,940 KEVIN: I'm Kevin McMahon. I'm a member of the CDT Advisory Board. 36 00:02:30,940 --> 00:02:37,390 In 2012 I founded a cyber threat intelligence company. 
37 00:02:37,390 --> 00:02:41,750 And since then, we've been helping UK government organisations and private 38 00:02:41,750 --> 00:02:45,640 sector companies to develop their cybersecurity capabilities internally 39 00:02:46,500 --> 00:02:49,830 and better understand the risk that they are facing 40 00:02:50,830 --> 00:02:54,370 as a result of problems that are presented by cyber security. 41 00:02:54,370 --> 00:03:00,460 One of the biggest challenges we have is develop[ing] recruitment pipelines. 42 00:03:00,460 --> 00:03:09,850 With a skills shortage, it's difficult to recruit and keep staff with the necessary skills. 43 00:03:09,850 --> 00:03:12,850 There's a bit of a race on at the moment from some of the larger, 44 00:03:12,850 --> 00:03:21,010 better funded organisations to snap up as much of the talent as they can. 45 00:03:21,010 --> 00:03:26,140 Cyber security has been about protecting devices within organisations. 46 00:03:26,140 --> 00:03:30,200 More recently we've started to understand and embrace the wider 47 00:03:30,200 --> 00:03:34,209 scope of the problem and how it impacts every area of the business. 48 00:03:34,210 --> 00:03:39,500 Cybersecurity can present itself in physical risk to employees and assets, 49 00:03:39,500 --> 00:03:44,050 setting benchmarks and protecting the brand reputation of organisations. 50 00:03:46,200 --> 00:03:50,950 Cyber security relates to [challenges] within business and within technology, 51 00:03:53,200 --> 00:03:58,749 which certainly touches upon the multi-disciplinary work we are here to discuss. 52 00:03:58,750 --> 00:04:04,150 ANDREW: Hi, I'm Andrew Martin, Professor of Systems Security. 53 00:04:04,150 --> 00:04:08,800 I'm Founder and Director of the Centre for Doctoral Training in Cyber Security. 54 00:04:08,800 --> 00:04:17,410 I first encountered computer security near the beginning of my career, in 1990. 55 00:04:17,410 --> 00:04:24,720 [I] look[ed at] technical aspects of how we make computers secure. 
56 00:04:24,720 --> 00:04:30,840 I started teaching that stuff to Master's students after the Millennium. 57 00:04:30,840 --> 00:04:34,200 It was immediately very clear that you could have all the good 58 00:04:34,200 --> 00:04:36,540 technology in the world and if people didn't use it right or 59 00:04:36,540 --> 00:04:40,000 didn't understand it or had their own priorities that were at odds with 60 00:04:40,000 --> 00:04:42,809 what you were trying to do with the technology, 61 00:04:42,810 --> 00:04:47,600 you didn't end up with a good security outcome. I quickly recruited someone 62 00:04:47,600 --> 00:04:50,850 to teach the course that became known as 'People and Security'. 63 00:04:50,850 --> 00:04:53,069 Things have rather snowballed from there. 64 00:04:53,069 --> 00:04:56,800 We've tried very much to keep a multidisciplinary approach 65 00:04:56,800 --> 00:04:58,290 to security across what we do in the Computer Science Department 66 00:05:01,000 --> 00:05:03,690 and more widely across the university. That became 'Cyber Security Oxford'. 67 00:05:07,690 --> 00:05:11,160 Then the opportunity came along to set up the Centre for Doctoral Training; 68 00:05:12,000 --> 00:05:17,190 the priority was that we would recruit students from across the disciplines, 69 00:05:17,190 --> 00:05:20,800 maybe a majority from Computer Science, but plenty with other 70 00:05:20,800 --> 00:05:27,030 backgrounds in Law or in Social Policy or in Business or in Economics. 71 00:05:28,030 --> 00:05:36,569 They would probably specialise in topics close to their original discipline, 72 00:05:36,570 --> 00:05:41,190 but they would have the experience of learning together and across disciplines, 73 00:05:41,190 --> 00:05:47,730 bring[ing] diverse insights together to solve cyber security problems. 74 00:05:47,730 --> 00:05:50,940 We knew that was going to be tough, but it seems to have worked. 
75 00:05:50,940 --> 00:05:54,809 One of my goals was that, whilst nobody could learn everything, 76 00:05:54,810 --> 00:06:00,600 I hope that those graduat[ing] from the CDT know what they don't know and 77 00:06:00,600 --> 00:06:04,830 know that there are more perspectives than the one that they specialise in. 78 00:06:04,830 --> 00:06:09,710 It's been really good to see lots of students growing in confidence, and in 79 00:06:09,710 --> 00:06:14,099 understanding security from a very rounded perspective. That excites me! 80 00:06:14,100 --> 00:06:17,430 So that's still my priority. Looking for new ways to do that, 81 00:06:17,430 --> 00:06:23,000 even though I myself stick to looking at how to make technology more secure. 82 00:06:23,000 --> 00:06:26,460 As regards defining cyber security, it's something I've asked each new cohort 83 00:06:28,800 --> 00:06:32,789 of students to do as they came into the Centre for Doctoral Training. 84 00:06:32,790 --> 00:06:37,530 It's been quite interesting to see how it's evolved. In the very first cohort, 85 00:06:37,530 --> 00:06:42,740 students seemed to split into groups with a very technical view, 86 00:06:42,740 --> 00:06:45,299 (they started talking about networks and servers) 87 00:06:45,300 --> 00:06:52,500 whereas others with social backgrounds started talking about people. 88 00:06:52,500 --> 00:06:56,820 By the fifth and sixth cohorts, people had a much more rounded view. 89 00:06:56,820 --> 00:07:02,500 I think cyber security has become understood as something that has to 90 00:07:02,500 --> 00:07:08,040 do with how we harness technology to build security in the space, in the real 91 00:07:08,800 --> 00:07:13,699 world context that is cyber space, because somehow cyber is real and 92 00:07:13,700 --> 00:07:14,580 it's where we live even though it's also virtual. 93 00:07:17,580 --> 00:07:23,629 That's a paradox, I suppose, but it's the place where cyber security sits today. 
94 00:07:23,630 --> 00:07:28,310 CLAUDINE: Claudine, part of the 2018 CDT cohort. 95 00:07:28,310 --> 00:07:35,600 [I prioritise] how individuals experience harm related to content they 96 00:07:35,600 --> 00:07:41,270 interact with and consume, specifically more mundane types of content, 97 00:07:41,270 --> 00:07:46,220 the everyday content that could create unseen harm, 98 00:07:46,220 --> 00:07:55,790 small, incremental harms we experience from everyday social media use. 99 00:07:55,790 --> 00:08:03,920 I take a slightly broader view of cyber security than the traditional definition, 100 00:08:03,920 --> 00:08:09,200 given how integrated humans are in the use of technology, particularly 101 00:08:09,200 --> 00:08:15,300 around how much we use our devices with one another and with cyber space. 102 00:08:15,300 --> 00:08:16,430 As Andrew said, 'cyber' is real at this point! 103 00:08:18,400 --> 00:08:24,080 I take that slightly broader view and look at mitigation strategies to protect 104 00:08:24,200 --> 00:08:29,120 individuals from psychological and emotional harm online, 105 00:08:29,120 --> 00:08:37,160 that providers of user services can implement to protect users from harm, 106 00:08:37,160 --> 00:08:42,700 [strategies can] be technical, guidelines or community standards. 107 00:08:42,700 --> 00:08:46,450 ARI: I'm Arianna, CDT 2016, we've done intros on the podcast before. 108 00:08:46,450 --> 00:08:51,400 Where did I start? In a dark room with lots of flashing lights, lots of cables. 109 00:08:51,400 --> 00:08:55,330 So a bit of a network monkey - at the moment, it's quite different. 110 00:08:55,330 --> 00:09:00,490 I have worked with medical researchers and people with rare conditions. 111 00:09:00,490 --> 00:09:14,560 I've looked at how we build systems - how do you architect choice? 
112 00:09:14,560 --> 00:09:19,900 So my priorities now, rather than the bare bones and the cables, switches 113 00:09:19,900 --> 00:09:23,050 and routers, it's more about the responsibility of the (cyber sec) expert 114 00:09:25,050 --> 00:09:29,439 in creating a culture where people feel safe to disclose mistakes they make 115 00:09:29,440 --> 00:09:34,630 or report something without fear of getting punished. 116 00:09:34,630 --> 00:09:36,700 How do you respond when something happens? 117 00:09:36,700 --> 00:09:41,599 How do you pull everyone together, leverag[ing] the expertise they have 118 00:09:41,600 --> 00:09:44,300 so that you're not carrying the full weight of cyber security 119 00:09:44,300 --> 00:09:46,570 across an organisation? Because that's really heavy. You'll burn out, and as 120 00:09:48,700 --> 00:09:50,920 Kevin was talking about, we do lose people. It's not always to a company... 121 00:09:52,920 --> 00:09:56,760 How would I define cyber security? I go right back to 'cybernetics'. 122 00:09:56,760 --> 00:10:01,800 Cybernetics is a system that's made of lots of other systems. It's complex. 123 00:10:01,800 --> 00:10:05,679 You have different moving pieces. There's this idea of feedback loops, 124 00:10:05,680 --> 00:10:09,200 you have a rule, take action and there's some outcome. 125 00:10:09,200 --> 00:10:11,499 Then that outcome feeds into the next steps. 126 00:10:11,500 --> 00:10:15,000 Cybersecurity is strategic. You've got to have leadership and 127 00:10:15,000 --> 00:10:17,620 you've got to be able to measure and demonstrate value. 
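[EDITOR'S NOTE: The feedback loop Ari describes - a rule, an action, an outcome, and the outcome feeding into the next step - can be sketched in a few lines of Python. This is purely an illustrative toy, not anything from the discussion; the alert rule, threshold and step values are invented for the example.]

```python
# Illustrative cybernetic feedback loop (hypothetical values throughout):
# a rule triggers an action, and the action's outcome changes the rule
# that the next iteration will apply.

def run_feedback_loop(observations, threshold=0.5, step=0.1):
    """Raise an alert when an observation exceeds the threshold (the rule),
    then nudge the threshold upward after each alert (the outcome feeding
    back into the next step), damping how often future alerts fire."""
    alerts = []
    for value in observations:
        if value > threshold:      # the rule
            alerts.append(value)   # the action
            threshold += step      # the outcome feeds into the next step
    return alerts, threshold

# Usage: two of the four observations trigger alerts, and each alert
# raises the bar for the next one.
alerts, final_threshold = run_feedback_loop([0.4, 0.6, 0.3, 0.9])
```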
128 00:10:17,620 --> 00:10:21,600 There's an expectation that cyber security people need to have loads of 129 00:10:21,600 --> 00:10:26,200 deep areas of expertise, and I think it's more about - how do you facilitate, 130 00:10:26,200 --> 00:10:30,900 bring in other people's expertise and then take responsibility as the expert 131 00:10:30,900 --> 00:10:35,199 to do the technical work, do the deep dive and come up with these plans? 132 00:10:35,200 --> 00:10:39,639 Over the series, we've seen that interdisciplinary work is difficult. 133 00:10:39,640 --> 00:10:44,170 How do you think this translates to cyber security as a sector? 134 00:10:44,170 --> 00:10:50,400 KEVIN: One of the biggest challenges is [that] we are typically looking for 135 00:10:50,400 --> 00:10:54,789 solutions to what we believe is a predominantly technical problem. 136 00:10:54,790 --> 00:11:01,570 A point you just made - cyber security is a strategic problem within organisations, 137 00:11:01,570 --> 00:11:07,000 and there are certain things that practitioners need to be aware of 138 00:11:07,000 --> 00:11:09,510 and need to be able to manage and deal with within organisations. 139 00:11:10,600 --> 00:11:12,070 Business strategy and security, first and foremost, will define your solution. 140 00:11:16,840 --> 00:11:20,800 Practitioners need an understanding of what business strategy is, why it is 141 00:11:20,800 --> 00:11:25,359 important and why it's critical that it is led from the top, 142 00:11:26,360 --> 00:11:31,059 to define solutions that you're going to implement across the organisation. 143 00:11:31,060 --> 00:11:37,600 Technology is going to grow more complex as we continue to develop 144 00:11:37,600 --> 00:11:41,999 at the breakneck pace that we are. The third challenge for how this 145 00:11:42,000 --> 00:11:46,690 translates to cyber as a sector is risk. Risk drives everything we do. 
146 00:11:46,690 --> 00:11:53,560 We've come a long way since 2005, when these problems took centre stage. 147 00:11:53,560 --> 00:11:58,270 We're now considering more than just putting a firewall in place. 148 00:11:58,270 --> 00:12:03,790 It presents a challenging environment for cybersecurity practitioners, 149 00:12:03,790 --> 00:12:06,910 and it's something that we have to figure out how best to address. 150 00:12:06,910 --> 00:12:11,500 ANDREW: Building on that, it's taken a long time to get real understanding of 151 00:12:11,500 --> 00:12:15,819 how to build technology that people can use right, and use to do their job. 152 00:12:15,820 --> 00:12:20,500 Cyber security throws into sharp relief any shortcomings we have in 153 00:12:20,500 --> 00:12:25,840 understanding how technology works, and might work when deployed, 154 00:12:25,840 --> 00:12:32,090 but also how people are going to use it, rely on it and work around [it]. 155 00:12:32,090 --> 00:12:35,440 You may think you've built a system that's going to be used in one way. 156 00:12:35,440 --> 00:12:39,340 People might well use it in another. In the absence of security problems, 157 00:12:39,340 --> 00:12:43,570 that's not really an issue, as long as people are able to get the job done. 158 00:12:43,570 --> 00:12:47,300 But if the workaround introduces a point of weakness that somebody else 159 00:12:47,300 --> 00:12:51,609 might exploit, then suddenly all bets are off. 160 00:12:51,610 --> 00:12:55,270 One of the fascinating things about cyber security is it really shines a light 161 00:12:55,270 --> 00:13:00,040 on the gap between what we think we're doing and what we're actually doing. 162 00:13:00,040 --> 00:13:03,200 That makes it quite a challenge for a lot of disciplines, because 163 00:13:03,200 --> 00:13:05,529 that means thinking about how the technology works, 164 00:13:05,530 --> 00:13:09,520 thinking about whether law is lined up with what we need to enforce. 
165 00:13:09,520 --> 00:13:14,500 Whether social practice and people's understanding, comfort, trust, 166 00:13:14,500 --> 00:13:19,270 happiness in their work... are lined up with what we think is going on. 167 00:13:19,270 --> 00:13:24,040 KEVIN: The technologies that are being created to make things as 168 00:13:24,040 --> 00:13:28,630 easy as possible (work that cyber security practitioners have to do), 169 00:13:28,630 --> 00:13:33,370 you tell everyone, 'Just point and click and we'll do the rest for you', 170 00:13:33,370 --> 00:13:36,840 [but] if they use [those technologies] outside of [the intended] scope of use, 171 00:13:36,840 --> 00:13:40,120 it starts to introduce other risks and other potential problems as well. 172 00:13:40,120 --> 00:13:41,880 And it's an awful lot to keep track of. 173 00:13:41,880 --> 00:13:48,130 ARI: How do we prioritise? Louise, I know you were us[ing] sound... 174 00:13:48,130 --> 00:13:57,010 Overload and overwhelm are common. What comes first, what do we focus on? 175 00:13:57,010 --> 00:14:02,350 [Teaching] people to know what they don't know, and the perspectives to bring. 176 00:14:02,350 --> 00:14:07,840 LOUISE: [We talked] to practitioners about their needs and the issues they face. 177 00:14:07,840 --> 00:14:12,640 We did lots of these interviews, lots of requirements gathering, 178 00:14:12,640 --> 00:14:16,010 and that's something we'd learned through the interdisciplinary course, 179 00:14:16,010 --> 00:14:20,530 working closely with people in fields like Human Computer Interaction. 180 00:14:20,530 --> 00:14:23,345 That's a really valuable thing that we can learn, and something 181 00:14:23,345 --> 00:14:25,900 that it's really important to get out there in industry. 
182 00:14:25,900 --> 00:14:31,600 In lots of cases it is - we can look to continue growing and facilitating 183 00:14:31,600 --> 00:14:35,230 these kinds of engagements between different groups, which teach people: 184 00:14:35,230 --> 00:14:40,840 this is a gap that could be filled by considering [some other] perspective. 185 00:14:40,840 --> 00:14:48,070 KEVIN: We have to think about developing accessible cyber security 186 00:14:48,070 --> 00:14:50,050 technologies for practitioners. They're not all going to be tech whizzes. 187 00:14:50,050 --> 00:14:53,100 Some of them are going to be more focused in and around strategy, which 188 00:14:53,100 --> 00:14:57,989 is absolutely imperative to help inform everyone else within that space. 189 00:14:57,990 --> 00:15:03,270 The earlier we consider interdisciplinary requirements, 190 00:15:03,270 --> 00:15:06,819 the better prepared we can make students, as they are 191 00:15:06,819 --> 00:15:09,120 coming through the various different schooling systems 192 00:15:09,120 --> 00:15:14,670 and getting into universities where they figure out where to specialise. 193 00:15:14,670 --> 00:15:17,800 ANDREW: We definitely struggle with interdisciplinary work. 194 00:15:17,800 --> 00:15:22,079 There are 21st century challenges that need an interdisciplinary approach. 195 00:15:22,080 --> 00:15:25,470 Cyber security is one. Climate change is another. 196 00:15:25,470 --> 00:15:30,600 Climate change needs somebody to understand the physics of the climate 197 00:15:30,600 --> 00:15:33,930 and the economics of changing energy sources. 198 00:15:33,930 --> 00:15:41,280 [It needs] other disciplines [to grasp the] impacts of climate change upon agriculture, societies... 199 00:15:41,280 --> 00:15:45,190 in order to make sense of where the best policy interventions are. 200 00:15:45,190 --> 00:15:50,190 The same applies in cyber security: we need [a] grounded understanding. 
201 00:15:50,500 --> 00:15:57,630 Universities are still set up on a 19th century model of [walled] faculties. 202 00:15:59,000 --> 00:16:03,200 Students from one faculty [cannot study] in another, particularly in the UK. 203 00:16:03,200 --> 00:16:05,700 In Oxford, we don't just gather people in faculties. 204 00:16:05,700 --> 00:16:09,000 We also gather them in a variety of other places, with the goal of 205 00:16:09,000 --> 00:16:14,010 making sure that people don't sit in silos and ignore every other discipline. 206 00:16:14,010 --> 00:16:19,110 That doesn't make interdisciplinary work easy - people still get rewarded 207 00:16:19,110 --> 00:16:23,280 for publishing in traditional journals that are in a particular subject. 208 00:16:23,280 --> 00:16:28,350 They get career points for being in the centre of that subject very often. 209 00:16:28,350 --> 00:16:33,800 You don't get rewarded very readily for working across the disciplines, 210 00:16:33,800 --> 00:16:35,700 because it's very easy for work that crosses disciplines to look shallow 211 00:16:37,700 --> 00:16:41,249 from the perspective of people who are in the core of the discipline. 212 00:16:42,000 --> 00:16:49,770 You need to integrate insights across disciplines to solve difficult problems. 213 00:16:49,770 --> 00:16:55,560 LOUISE: I've been trying to publish in cyber security for eight years, 214 00:16:55,560 --> 00:17:00,630 and it is feeling more open to interdisciplinary work. 215 00:17:00,630 --> 00:17:04,760 It feels like something that's slowly improving; there are more venues 216 00:17:04,760 --> 00:17:06,930 that include the interdisciplinary fields I have been working in. 217 00:17:06,930 --> 00:17:14,070 It sometimes feels like, if you're not firmly in one discipline, it's a challenge. 218 00:17:14,070 --> 00:17:17,000 ANDREW: We knew that was going to be a problem for the CDT. 219 00:17:17,000 --> 00:17:19,649 I think the students have stepped up to the challenge incredibly well. 
220 00:17:19,650 --> 00:17:24,060 But that doesn't mean we've solved it. Institutional structures are so deeply 221 00:17:24,060 --> 00:17:27,210 ingrained that even if you push the boundaries for a while, 222 00:17:27,400 --> 00:17:31,019 it's very easy for them to snap back as soon as you stop. 223 00:17:31,020 --> 00:17:32,940 So we've got to keep pushing. 224 00:17:32,940 --> 00:17:37,300 KEVIN: I really like the approach that Oxford is taking to this, in using other 225 00:17:37,300 --> 00:17:42,209 disciplines to analyse and understand where we need to put some focus. 226 00:17:42,210 --> 00:17:47,140 Perhaps that is a discipline that can be opened up to encourage students 227 00:17:47,140 --> 00:17:49,889 to come in and figure out how we solve these problems, 228 00:17:49,890 --> 00:17:55,400 not just the technical problems typically synonymous with cybersecurity. 229 00:17:55,400 --> 00:17:59,610 There's a lot of work that can be done with the skill sets that you have; 230 00:17:59,700 --> 00:18:05,819 you can lead [thinking on] education for cybersecurity practitioners. 231 00:18:05,820 --> 00:18:10,760 CLAUDINE: A challenge I personally noticed, coming from outside 232 00:18:10,760 --> 00:18:13,050 [a discipline not typically considered as a core cyber security discipline], 233 00:18:16,000 --> 00:18:21,719 is having a shared language across disciplines - 234 00:18:21,720 --> 00:18:25,300 law, policy, psychology... which feed into cybersecurity. 235 00:18:25,300 --> 00:18:29,280 Slightly different definitions of very similar terms can lead to confusion. 236 00:18:29,500 --> 00:18:35,069 A common [cyber security] vocabulary across disciplines is something we, 237 00:18:35,070 --> 00:18:37,960 both industry and academia, need to work on, 238 00:18:37,960 --> 00:18:41,909 to be more consistent and ease communication. 239 00:18:41,910 --> 00:18:45,010 ARI: We have talked about how cyber security presents challenges. 
240 00:18:45,010 --> 00:18:46,400 What is the goal? 241 00:18:46,400 --> 00:18:52,229 KEVIN: To reduce the risk of a successful compromise [to] 242 00:18:52,230 --> 00:18:55,709 your business, technology, or [the] people who rely on that technology. 243 00:18:55,710 --> 00:19:03,370 We cannot stop these things from happening. Technology evolves daily. 244 00:19:03,370 --> 00:19:08,549 We are never going to be able to create a 100% secure system. 245 00:19:08,550 --> 00:19:13,290 We have to face up to it - cyber security is about the reduction of risk. 246 00:19:13,290 --> 00:19:18,400 It's about minimising the impact that these events could possibly have. 247 00:19:18,400 --> 00:19:21,540 We do that by learning from what we've already seen, events that have 248 00:19:23,000 --> 00:19:27,410 taken place, looking at impact across the organisation and end users. 249 00:19:33,900 --> 00:19:35,250 LOUISE: [Cyber security] should be there to let us reap the benefits that the 250 00:19:37,100 --> 00:19:41,309 processes and systems that we use every day have the potential to bring. 251 00:19:41,310 --> 00:19:46,980 There's potential to make more efficient cities or improve medicine 252 00:19:46,980 --> 00:19:52,680 and help solve global issues like food shortages. 253 00:19:52,680 --> 00:19:57,450 We need security in place to let that happen. If we don't get the security 254 00:19:57,450 --> 00:20:01,890 that underlies all that right, there's more potential for things to go very 255 00:20:01,890 --> 00:20:03,610 badly wrong, with a bigger impact. 256 00:20:03,610 --> 00:20:09,629 We're starting to see harms that we didn't previously have the potential to create. 257 00:20:09,630 --> 00:20:13,800 As we start to link up transport and manufacturing systems, there's more 258 00:20:13,800 --> 00:20:18,479 potential to create physical harms and harms that badly impact people. 
259 00:20:18,480 --> 00:20:22,000 An important goal of cyber security is getting it right so that 260 00:20:22,000 --> 00:20:24,719 we can have that benefit that we're trying to build. 261 00:20:24,720 --> 00:20:27,700 KEVIN: Getting it right is very difficult in the current climate, with the 262 00:20:27,700 --> 00:20:30,930 skills shortage that we have. As a cyber threat intelligence company, 263 00:20:31,000 --> 00:20:34,300 we are continually monitoring and looking at the emerging threats 264 00:20:34,300 --> 00:20:37,529 in the space, not just to the technology, but for business as well. 265 00:20:37,530 --> 00:20:42,360 We ourselves are often looking to recruit skilled individuals. 266 00:20:42,360 --> 00:20:46,800 One of the more recent scams that we have witnessed is people taking 267 00:20:46,800 --> 00:20:48,909 advantage of this skills shortage. We are starting to see 'Skills-as-a-Service' 268 00:20:48,910 --> 00:20:54,540 emerging, so almost the cloud version of cyber security skills. 269 00:20:54,540 --> 00:20:59,100 But the problems that we face there are pretty much in and around trust. 270 00:20:59,100 --> 00:21:04,470 We've witnessed a number of cases where developers provided for hire... 271 00:21:04,470 --> 00:21:11,010 Developers take these roles and use the access to compromise organisations, 272 00:21:11,010 --> 00:21:13,800 planting malware and planting ransomware in some cases. 273 00:21:13,800 --> 00:21:15,600 ANDREW: To look at the goal of cybersecurity, you've got to look at 274 00:21:16,600 --> 00:21:19,649 the overall development of a technological society. 275 00:21:19,650 --> 00:21:21,900 We are connecting more and more people together. 276 00:21:21,900 --> 00:21:24,500 We're connecting more and more things together 277 00:21:24,500 --> 00:21:27,569 with the goal of making us happier, feeding more people, 278 00:21:27,570 --> 00:21:32,820 using fewer fossil fuels... other things that seem to be good for society. 
279 00:21:32,820 --> 00:21:35,800 The downside is, the more connections you make, the more 280 00:21:35,800 --> 00:21:40,560 opportunity there is for anything from mischief to straightforward theft 281 00:21:40,560 --> 00:21:44,129 and, scarily, somewhere in the background, to existential risk. 282 00:21:44,130 --> 00:21:47,500 (Whether it's to an organisation or even to a country) 283 00:21:47,500 --> 00:21:51,209 The goal, somehow, is to enable us to have those technological benefits - 284 00:21:51,210 --> 00:21:56,760 being able to get a job on the other side of the planet is probably good! - 285 00:21:56,760 --> 00:22:04,590 but without introducing small scale harms and existential risks. 286 00:22:04,590 --> 00:22:07,680 Sometimes you look at technologies and the way people use them, 287 00:22:07,680 --> 00:22:10,100 and you wonder whether we get scarily close to 288 00:22:10,100 --> 00:22:14,209 scenarios better suited to movie plots than the real world. 289 00:22:14,210 --> 00:22:17,300 The goal is to keep us out of the movie plot world and in the world 290 00:22:17,300 --> 00:22:21,799 that makes people happier, healthier, wealthier and so on. 291 00:22:21,800 --> 00:22:27,200 ARI: I like this vision, but there are practicalities and realities where we 292 00:22:27,200 --> 00:22:30,439 don't enable everyone and we are not feeding everyone. 293 00:22:30,440 --> 00:22:35,210 At the end of the day, people come to work to do a job. 294 00:22:35,210 --> 00:22:41,150 One of the roles of cyber security experts is to translate. 295 00:22:41,150 --> 00:22:43,360 They have to figure out what motivates people. 296 00:22:43,360 --> 00:22:46,200 Kevin, I think you're right. It has to come from the top. 297 00:22:46,200 --> 00:22:49,399 That's where the money is and that's where the pull has to be. 298 00:22:49,400 --> 00:22:53,520 But I read reports that tell me boards don't understand. 
299 00:22:53,520 --> 00:22:55,520 They think [security is] important but don't know where to start. 300 00:22:55,520 --> 00:22:57,800 There are toolkits for how you communicate with the board 301 00:22:57,800 --> 00:23:00,619 to get that imperative for cyber security work. 302 00:23:00,620 --> 00:23:05,311 Perhaps it is about taking the ego out of it. We have a rock star mentality, 303 00:23:05,311 --> 00:23:08,060 we're in cyber and it's very shiny, it's very new. 304 00:23:08,060 --> 00:23:12,530 We're going to make the UK the most secure, the best place to do business. 305 00:23:12,530 --> 00:23:18,950 We need as many eyes out there as possible [as a security method]. 306 00:23:18,950 --> 00:23:22,800 How do we get more eyes? How do we get people reporting [instead of] 307 00:23:22,800 --> 00:23:26,479 creating their own workarounds? Hacking works both ways - 308 00:23:26,480 --> 00:23:30,440 it can be bad, but it can be useful: if something [is] in your way, 309 00:23:30,440 --> 00:23:34,520 you figure out a way to work the system that makes your job easier. 310 00:23:34,520 --> 00:23:40,640 KEVIN: The board level is still reactionary in the UK (and abroad). 311 00:23:40,640 --> 00:23:47,000 They sit there and think 'It's happening out there, but not to us...' 312 00:23:47,000 --> 00:23:48,830 '... so why should we make the investments?' 313 00:23:48,830 --> 00:23:52,000 And they are considerable investments that we're talking about. 314 00:23:52,000 --> 00:23:55,069 Companies are asking 'Why should we make these investments to do this? 315 00:23:55,070 --> 00:24:00,890 Can we offload this to [a] third party and get rid of the responsibility?' 316 00:24:00,890 --> 00:24:05,800 Unfortunately, we see organisational boards taking that approach. 317 00:24:05,800 --> 00:24:08,989 ANDREW: That brings us back to interdisciplinary study. 
318 00:24:08,990 --> 00:24:12,000 Whilst there are some things that we can clearly do that will improve 319 00:24:12,000 --> 00:24:15,469 our cyber security, the evidence base is lacking overall. 320 00:24:15,470 --> 00:24:21,710 Senior management at the university is very exercised about this subject. 321 00:24:21,710 --> 00:24:27,590 If I want to give them advice about [making] the university a safer place, 322 00:24:27,590 --> 00:24:31,580 I could tell them to triple the budget of the central cyber security unit. 323 00:24:31,580 --> 00:24:38,540 But deciding which [advice] is good and bad is not backed by evidence. 324 00:24:38,540 --> 00:24:41,200 Perhaps because we've got an immature science, 325 00:24:41,200 --> 00:24:44,209 perhaps because the question is just too hard. 326 00:24:44,210 --> 00:24:49,900 E.g., a lot of people are told that password safes are a good idea. 327 00:24:49,900 --> 00:24:54,999 Media articles are conflicted on whether the password manager 328 00:24:55,000 --> 00:24:57,620 built into your browser is a good idea or not. 329 00:24:57,620 --> 00:25:03,740 There are arguments both ways. As a scientist who understands the issues, 330 00:25:03,740 --> 00:25:10,000 I'm not sure I could tell you, hand on heart, which will actually reduce your 331 00:25:10,000 --> 00:25:14,329 risk more, because there are a lot of contextual factors that go with it. 332 00:25:14,330 --> 00:25:17,586 As a scientist, that's a problem. 333 00:25:17,586 --> 00:25:23,000 KEVIN: I think we've teased out a better answer... the goal has to be to 334 00:25:23,000 --> 00:25:27,860 educate everyone about the risks so that they can act and respond accordingly. 335 00:25:27,860 --> 00:25:30,500 You have to educate everyone across the business. 336 00:25:30,500 --> 00:25:32,390 Janitors have email addresses; they need to be educated about the risks of 337 00:25:34,200 --> 00:25:37,099 using those - clicking on the links that could potentially target them.
338 00:25:37,100 --> 00:25:43,910 This slips through the net - you are only as strong as your weakest link. 339 00:25:43,910 --> 00:25:48,950 ARI: We use 'educate the users' to absolve ourselves from responsibility. 340 00:25:48,950 --> 00:25:53,100 The cleaner needs to clean. Ultimately, it is the organisation's responsibility 341 00:25:53,100 --> 00:25:56,780 to follow up on that, have some measurement for what's going on, 342 00:25:56,780 --> 00:26:01,200 and rather than punishing this person if they get caught by a phishing attempt, 343 00:26:01,200 --> 00:26:03,440 instead, it's saying 'Oh, OK, right, well because we've had training, you...' 344 00:26:06,490 --> 00:26:09,979 '... know what phishing looks like, [now that] you've done it' - no shame. 345 00:26:09,980 --> 00:26:14,510 One of the issues with cyber security training is that it's completely irrelevant. 346 00:26:14,510 --> 00:26:16,900 You cannot have a one-size-fits-all approach because 347 00:26:16,900 --> 00:26:18,799 your HR bod [Human Resources person] is not going to have the same 348 00:26:18,800 --> 00:26:21,859 motivations as a junior software developer. 349 00:26:21,860 --> 00:26:25,000 There are just different people who have different priorities. 350 00:26:25,000 --> 00:26:29,810 That's one of the fundamental complex issues we have to tussle with. 351 00:26:31,000 --> 00:26:33,000 KEVIN: [If a] user gets phished [clicks a bad link, allowing unauthorised access], it's 352 00:26:33,000 --> 00:26:34,040 not their fault. That, in itself, is an example of a failure of the 353 00:26:37,400 --> 00:26:40,549 organisation to properly prepare a security strategy. 354 00:26:40,550 --> 00:26:44,550 If the end users are being successfully phished, [the org. is] not providing 355 00:26:44,550 --> 00:26:48,709 them with the right information to understand that it's happening. 356 00:26:48,710 --> 00:26:50,090 ANDREW: I was going to say exactly the same.
If your business relies on 357 00:26:50,090 --> 00:26:56,030 the cleaner not clicking on the wrong link, then there's something wrong. 358 00:26:56,030 --> 00:27:03,050 Your business [also] relies on the cleaner not putting bleach in a kitchen dispenser. 359 00:27:03,050 --> 00:27:06,800 We know how to manage that risk - most of the time! 360 00:27:06,800 --> 00:27:11,850 But the cleaner shouldn't be expected to do tasks that belong to techies. 361 00:27:11,850 --> 00:27:14,360 We have got to be clear on that: as you've all said, we can't just assume 362 00:27:15,650 --> 00:27:20,299 that training people is the solution. We've got to use the technology right 363 00:27:20,300 --> 00:27:21,600 and not simply assume that giving everyone a multi-purpose, 364 00:27:24,600 --> 00:27:27,180 multi-program device that connects to every other device on the Internet 365 00:27:27,180 --> 00:27:33,690 is a safe or straightforward thing to do. ARI: Or feasible, or cheap! 366 00:27:33,690 --> 00:27:38,010 CLAUDINE: Just one thing to add: a goal of cyber security [should be] to shift 367 00:27:38,010 --> 00:27:40,110 focus somewhat and put more emphasis on the end user. 368 00:27:40,800 --> 00:27:43,999 Organisations and businesses are the key focus for 369 00:27:44,000 --> 00:27:46,409 a substantial amount of the cybersecurity sector. 370 00:27:46,410 --> 00:27:51,360 But traditionally, the impact on end users (when there is a breach) 371 00:27:51,360 --> 00:27:55,110 tends to be an afterthought or a secondary harm, i.e., not prioritised.
372 00:27:58,110 --> 00:28:01,590 Perhaps shifting focus more to the impact on end users, as well as 373 00:28:02,500 --> 00:28:07,170 the organisations tasked with protecting or controlling their data, 374 00:28:07,170 --> 00:28:13,740 would help shift perspective to a more interdisciplinary and holistic view, 375 00:28:13,740 --> 00:28:19,290 [providing] better insights into avoid[ing] situations like the cleaner, 376 00:28:19,290 --> 00:28:24,900 or someone who simply doesn't think (in the context of their job or life) 377 00:28:24,900 --> 00:28:31,300 about the security ramifications of the tech they use and have access to. 378 00:28:31,300 --> 00:28:32,250 ANDREW: One more thought there: cyber security awareness training 379 00:28:36,250 --> 00:28:39,479 [is everywhere] in business and even in the university, I think. 380 00:28:39,480 --> 00:28:42,630 But if that's any use at all, it can only be the baseline. 381 00:28:42,630 --> 00:28:46,700 What we actually need is for people who make decisions to have a deeper 382 00:28:46,700 --> 00:28:50,799 understanding of the implications of the decisions they make. 383 00:28:50,800 --> 00:28:54,800 Cyber security in decision-making, in [the] design of new facilities and 384 00:28:54,800 --> 00:28:57,210 services and products and so on, needs to follow. The interdisciplinary 385 00:28:59,350 --> 00:29:03,389 approach needs to go much deeper into the training of every profession, 386 00:29:03,390 --> 00:29:07,000 not merely the 'don't click on links' kind of advice. 387 00:29:07,000 --> 00:29:10,200 LOUISE: It's very important that we look at where this is all going, where 388 00:29:10,700 --> 00:29:15,630 this is going to be in 3-5 years, when we're thinking about skills, 389 00:29:15,800 --> 00:29:18,479 engagement and which disciplines we need to be working with. 390 00:29:18,480 --> 00:29:21,790 What we build and create now is what's going to be used 391 00:29:21,790 --> 00:29:23,670 when the landscape will be different.
We're talking about [a] 392 00:29:24,670 --> 00:29:28,440 need to train cyber security people to engage in the right way, ensure that 393 00:29:28,600 --> 00:29:32,729 the organisation is doing the right thing and employees aren't relied on... 394 00:29:32,730 --> 00:29:37,080 This all moves so fast, and in terms of research, industry and policy, 395 00:29:37,080 --> 00:29:40,500 [we need to be forward-looking about] 396 00:29:40,500 --> 00:29:44,340 the training we do now and what engagement we facilitate. 397 00:29:44,340 --> 00:29:48,340 Join us next week for another fascinating conversation. 398 00:29:48,340 --> 00:29:51,510 In the meantime, you can tweet at us @HelloPTNPod 399 00:29:51,510 --> 00:29:55,770 and you can subscribe on Apple Podcasts or wherever you listen to podcasts. 400 00:29:55,770 --> 00:30:01,650 The title there is PTNPod. See you next week. ARI: Bye! 401 00:30:01,650 --> 00:30:06,840 CLAUDINE: This has been a podcast from the Centre for Doctoral Training in Cybersecurity, University of Oxford, 402 00:30:06,840 --> 00:30:11,733 funded by the Engineering and Physical Sciences Research Council.