1 00:00:01,990 --> 00:00:05,500 ARI: Hello, I'm Ari. [CLAUDINE: And I'm Claudine] Welcome to Proving the Negative. 2 00:00:05,500 --> 00:00:09,500 ARI: We're a podcast all about exploring the different sides of cybersecurity, 3 00:00:09,500 --> 00:00:13,790 from political to computer science, international relations to mathematics. 4 00:00:13,790 --> 00:00:16,070 Join us as we talk to our friends about the work they do. 5 00:00:16,070 --> 00:00:17,500 *Sound bites from interviews* 6 00:00:17,500 --> 00:00:22,900 HARMONIE: There's a shift of attention towards cyber security, it comes at a cost. 7 00:00:22,900 --> 00:00:30,040 BERNARD: There's a more complex value chain that now sits behind 'product value'. 8 00:00:30,040 --> 00:00:34,170 JOHN: You want the smartest room of people, to deal with complex problems. 9 00:00:34,170 --> 00:00:43,950 AWAIS: This is something that is truly an international effort. 10 00:00:43,950 --> 00:00:49,290 CHRIS: The latest cyber strategy starts to expand and tackle tough problems, 11 00:00:49,290 --> 00:00:53,070 building on what we've already done before. 12 00:00:53,070 --> 00:00:56,710 ARI: Hello everyone. Welcome back to PTNPod, it's great to have you with us. 13 00:00:56,710 --> 00:00:59,720 This episode is just a smorgasbord of delights. 14 00:00:59,730 --> 00:01:03,400 This is a special episode. We got ourselves invited to a conference, 15 00:01:03,400 --> 00:01:08,190 celebrating Academic Centres of Excellence in Cyber Security Research (ACE-CSR). 16 00:01:08,190 --> 00:01:11,700 At this conference, there's lots of people interested in cyber security, 17 00:01:11,700 --> 00:01:17,500 doing work across the board: cyber physical systems (water, power grids), 18 00:01:17,500 --> 00:01:20,500 we've got a bit of international relations and politics, 19 00:01:20,500 --> 00:01:22,990 [and] how we're teaching, how we're educating. 20 00:01:22,990 --> 00:01:26,580 What we're trying to do in the UK is build an ecosystem. 
21 00:01:26,580 --> 00:01:30,700 This ecosystem will have students and newbies, people building careers, 22 00:01:30,700 --> 00:01:34,300 people who have been around for a little while (i.e., before 'cyber' existed). 23 00:01:34,300 --> 00:01:45,300 We want all this expertise to make the UK a safe place to be and do business. 24 00:01:45,300 --> 00:01:49,140 We're going to start with an introduction to the UK Cyber Strategy. 25 00:01:49,140 --> 00:01:52,480 Now, this is very exciting because it points us in the right direction. 26 00:01:52,480 --> 00:01:58,500 We'll be talking to industry, policy and academic people to understand value. 27 00:01:58,500 --> 00:02:04,170 The question here is 'what kind of bang are we getting for the taxpayer buck?' 28 00:02:04,170 --> 00:02:08,100 We will start with [the] big picture, [then] move on to value and 29 00:02:08,100 --> 00:02:13,200 what value looks like in the work we do. We'll talk to someone who reminds us... 30 00:02:13,200 --> 00:02:20,070 ...there's more to life than cyber security. 31 00:02:20,070 --> 00:02:25,200 You'll hear ideas we've talked about in other episodes (e.g., the skills gap), 32 00:02:25,200 --> 00:02:31,890 as well as new projects, like the Cyber Security Body of Knowledge (CyBOK). 33 00:02:31,890 --> 00:02:34,703 We wondered why no students were coming to talk to us... 34 00:02:34,703 --> 00:02:40,140 We were in a room set aside from the main conference... we found 'em! 35 00:02:40,140 --> 00:02:43,230 We found where the food was, and the free coffee. 36 00:02:43,230 --> 00:02:47,250 At the end of the episode we interview students who are doing great stuff. 37 00:02:47,250 --> 00:02:50,790 Without any further ado, let's rock and roll. 38 00:02:50,790 --> 00:02:53,000 CHRIS: I'm Chris Ensor, one of the Deputy Directors of the 39 00:02:53,000 --> 00:02:55,830 National Cyber Security Centre (NCSC), responsible for cyber growth. 40 00:02:55,830 --> 00:02:59,900 We all rely on technology today. 
It's just how we live and ultimately 41 00:02:59,900 --> 00:03:02,970 we need it to be safe and secure so people have confidence in it. 42 00:03:02,970 --> 00:03:05,000 That's the job of the National Cyber Security Centre, 43 00:03:05,000 --> 00:03:07,850 how do we make the UK a safe place to live and work online? 44 00:03:07,850 --> 00:03:12,450 The priority for me is how do we make security easy for everybody? 45 00:03:12,450 --> 00:03:17,250 It shouldn't be hard. People should be able to use stuff securely by default. 46 00:03:17,250 --> 00:03:21,120 For me it's all about how we make that happen - how we build in security. 47 00:03:21,120 --> 00:03:24,690 How do we make the companies who build software, 48 00:03:24,690 --> 00:03:30,980 build it in a way that doesn't make people more vulnerable to cyber attack? 49 00:03:30,980 --> 00:03:37,040 We rely on technology, and that isn't going to change any time soon. 50 00:03:37,040 --> 00:03:41,340 We want that technology to be secure and robust against any sort of cyber attack. 51 00:03:41,340 --> 00:03:45,830 [In] the long term? Will cyber security be a discipline into the future? 52 00:03:45,830 --> 00:03:50,330 I don't know. But certainly we want cyber security built in. 53 00:03:50,330 --> 00:03:55,250 Over time you'll see the standalone discipline - security in the corner - [become] 54 00:03:55,250 --> 00:04:00,530 part of everyday life so people don't have to worry - it's already there. 55 00:04:00,530 --> 00:04:06,860 The Cyber Strategy begins to tackle longer term problems: the use of regulation, 56 00:04:06,860 --> 00:04:09,830 [and] the need to upskill and look at the education system. 57 00:04:09,830 --> 00:04:16,550 We've learned a lot from the last two strategies and this one builds on it and 58 00:04:16,550 --> 00:04:20,210 starts to expand and tackle some of the really tough problems, 59 00:04:20,210 --> 00:04:25,910 building on what we've already done before. ARI: Chris Ensor, from the NCSC. 
60 00:04:25,910 --> 00:04:33,860 Talking about strategy is really helpful because it gives us [a] roadmap. 61 00:04:33,860 --> 00:04:39,530 The place we're trying to go is the UK being a safe place, to work and live. 62 00:04:39,530 --> 00:04:48,860 This has to be the same for everyone. The government's responsible for taxpayer money, 63 00:04:48,860 --> 00:04:53,060 so the directions that they set have to be resilient. It's got to 'bounce back', 64 00:04:53,060 --> 00:04:59,940 in case anything happens. This long term roadmap is what the strategy is for. 65 00:04:59,940 --> 00:05:05,940 BERNARD: My name's Bernard Parsons, I'm the CEO of a cyber security company called Becrypt. 66 00:05:05,940 --> 00:05:10,800 We're a UK developer of security products and services based in London. 67 00:05:10,800 --> 00:05:16,440 The main focus for us at the moment is around endpoint security. 68 00:05:16,440 --> 00:05:21,180 The endpoint is where a lot of cyber attacks gain a foothold. 69 00:05:21,180 --> 00:05:32,910 'Endpoint' refers to company devices that users interact with to do their job, 70 00:05:32,910 --> 00:05:39,270 typical examples being laptops and desktops, tablets, mobile devices. 71 00:05:39,270 --> 00:05:45,150 But actually it can be viewed more widely than that. 72 00:05:45,150 --> 00:05:50,100 There's some work we've been supporting recently where actually 73 00:05:50,100 --> 00:05:55,350 the term they're using is 'enterprise connected devices'. 74 00:05:55,350 --> 00:06:00,420 So it gets even broader and starts to pull in things like printers and scanners. 75 00:06:00,420 --> 00:06:06,960 All of these devices have an increasing amount of intelligence within them, 76 00:06:06,960 --> 00:06:14,610 very often what comes with that is increasing vulnerability. 77 00:06:14,610 --> 00:06:22,320 Most of what we do is around securing desktops within enterprise environments, 78 00:06:22,320 --> 00:06:32,010 but it does extend much more broadly now. 
We did some work with government... 79 00:06:32,010 --> 00:06:39,690 [making] endpoints more resilient to attacks and resistant to cyber threats. 80 00:06:39,690 --> 00:06:47,100 [Delivering value for money is] about reducing endpoint risk within organisations. 81 00:06:47,100 --> 00:06:50,550 You can come up with the best security in the world, 82 00:06:50,550 --> 00:06:56,130 but if you can't deliver in a way that's [usable], 83 00:06:56,130 --> 00:07:02,610 and [cost-efficient], you're not going to get off the starting line. 84 00:07:02,610 --> 00:07:09,960 It's all about being able to deliver a convincing value proposition. 85 00:07:09,960 --> 00:07:14,490 It's interesting to think about the big picture, because it's quite cross-cutting. 86 00:07:14,490 --> 00:07:22,230 It's difficult to think of cyber as separate, and that's reflected in the UK Cyber Strategy. 87 00:07:22,230 --> 00:07:27,150 Now that is the 'Cyber' strategy as opposed to 'Cyber Security'. 88 00:07:27,150 --> 00:07:36,540 It cuts across technology, over into socio-economic and environmental aspects. 89 00:07:36,540 --> 00:07:43,110 The 'right' amount of cyber resilience for your organisation, 90 00:07:43,110 --> 00:07:51,240 is increasingly important: managing risk as part of what you want to do. 91 00:07:51,240 --> 00:08:00,960 Managing cyber-related risk in a way that is appropriate for you, 92 00:08:00,960 --> 00:08:08,400 is about getting the right controls and processes in place [to] balance 93 00:08:08,400 --> 00:08:17,460 operational efficiency and [cyber-related] risk management. 94 00:08:17,460 --> 00:08:21,960 ARI: I asked what value looked like and this is what we talked about. 95 00:08:21,960 --> 00:08:29,880 BERNARD: It depends [on] scope. I could approach answering [this] where I'm 96 00:08:29,880 --> 00:08:34,020 only going to be concerned [with] proving to you what the value of my product is. 
97 00:08:34,020 --> 00:08:37,980 It's a much bigger conversation than that, because any one 98 00:08:37,980 --> 00:08:43,350 product is part of a much bigger environment and set of systems. 99 00:08:43,350 --> 00:08:50,010 If I tackle [this] at product level, it is a big problem for cyber security. 100 00:08:50,010 --> 00:08:57,000 As a sector, how do we demonstrate the value of what we're building? 101 00:08:57,000 --> 00:09:04,500 Historically, what one looked towards to answer that question was product assurance. 102 00:09:04,500 --> 00:09:11,970 [Getting] some independent entity to certify my product against [a] standard. 103 00:09:11,970 --> 00:09:15,750 Product certification that gives someone a tick in the box, 104 00:09:15,750 --> 00:09:24,240 and we're happy. That has been problematic. [Some] background... 105 00:09:24,240 --> 00:09:30,390 It used to work with very well defined categories of product. 106 00:09:30,390 --> 00:09:34,350 In the early days, what Becrypt was all about was building encryption products. 107 00:09:34,350 --> 00:09:41,370 Very easy to define a set of standards and criteria: 'to encrypt this...' 108 00:09:41,370 --> 00:09:48,540 '... this is what we want.' Algorithms, how you manage credentials and so on. 109 00:09:48,540 --> 00:09:53,640 You could get a certification for that - something of value. 110 00:09:53,640 --> 00:09:59,910 But then, how that artefact sometimes got used by the consumer 111 00:09:59,910 --> 00:10:09,960 was problematic. People would use [it] as a proxy for informed risk management. 112 00:10:09,960 --> 00:10:12,630 So, somebody told me this product is good, 113 00:10:12,630 --> 00:10:18,690 I don't need to think about what I'm storing, how long I'm storing for, and so on. 114 00:10:18,690 --> 00:10:24,420 Product certifications have their challenges, even where easily applied. 
115 00:10:24,420 --> 00:10:28,320 But as technology has evolved and become more diverse, 116 00:10:28,320 --> 00:10:36,510 traditional product certification has become strained - it's difficult to create 117 00:10:36,510 --> 00:10:44,310 criteria for what technology should and shouldn't do for [what] we have today. 118 00:10:44,310 --> 00:10:49,560 That's why you're seeing a lot of work taking place, 119 00:10:49,560 --> 00:10:53,220 where they're trying to shift the model more to thinking about 120 00:10:53,220 --> 00:11:04,110 principles appropriate to classes of product, rather than prescriptive criteria. 121 00:11:04,110 --> 00:11:13,800 More outcomes-based, e.g., how a particular manufacturer might build a product, 122 00:11:13,800 --> 00:11:23,520 but also what surrounds that. The processes of the organisation itself. 123 00:11:23,520 --> 00:11:30,810 There's a more complex value chain that sits behind the question of product value. 124 00:11:30,810 --> 00:11:37,820 It's an evolving space and we're supporting some of [that] work. 125 00:11:37,820 --> 00:11:45,540 It's a whole different conversation at the economic level, where 126 00:11:45,540 --> 00:11:52,110 we sell products. Following on from the difficulty in providing assurance, 127 00:11:52,110 --> 00:12:00,240 what that leads to within cyber, is an increased amount of information asymmetry. 128 00:12:00,240 --> 00:12:06,390 The cyber market is dominated by companies spend[ing] more on marketing than R&D. 129 00:12:06,390 --> 00:12:13,110 That leads to a situation where it's difficult for buyers to tell good [from] bad. 130 00:12:13,110 --> 00:12:19,350 That drives companies that want to invest in stronger controls out of the market. 131 00:12:19,350 --> 00:12:25,440 That drives down the average value of products being delivered in the market. 
132 00:12:25,440 --> 00:12:33,450 The net result is - investment from companies within sectors that 133 00:12:33,450 --> 00:12:39,450 need to demonstrate security and resilience, without a net increase in resilience. 134 00:12:39,450 --> 00:12:43,770 ARI: Dr Bernard Parsons MBE. CEO of Becrypt. 135 00:12:43,770 --> 00:12:50,130 Talking about organisational security is important in reducing the risk that 136 00:12:50,130 --> 00:12:53,520 printers [and devices like them] pose to a business. 137 00:12:53,520 --> 00:13:01,740 If a hacker was going to try [anything, these devices would be on their] list, 138 00:13:01,740 --> 00:13:06,390 trying to find holes, weaknesses. The way we do business is changing. 139 00:13:06,390 --> 00:13:12,540 The technology is developing very fast, it's clever, and it's sucking up a lot of data. 140 00:13:12,540 --> 00:13:16,350 Oversight and quality checking has never been more important. 141 00:13:16,350 --> 00:13:19,620 Rather than having tick boxes (which ARE helpful), 142 00:13:19,620 --> 00:13:26,460 the first step is having [an] expert suggest where your priorities should be. 143 00:13:26,460 --> 00:13:29,100 What's the number one thing to address? 144 00:13:29,100 --> 00:13:35,850 You can also ask yourself questions: What do we want this technology to do? 145 00:13:35,850 --> 00:13:39,090 How do we expect this thing to happen? 146 00:13:39,090 --> 00:13:44,260 What don't we want? What is not acceptable? HARMONIE: My name is Harmonie Toros. 147 00:13:44,260 --> 00:13:48,260 I am a Reader in International Conflict Analysis at the University of Kent. 148 00:13:48,260 --> 00:13:52,660 I research security negotiations and violence. 
149 00:13:52,660 --> 00:14:02,260 My priority is to ensure that research into (critical) security studies 150 00:14:02,260 --> 00:14:07,630 is learned from by the (academic and policy) cyber security community. 151 00:14:07,630 --> 00:14:13,750 It's important we understand that a lot of the terms we use 152 00:14:13,750 --> 00:14:20,890 can be very complicated and have upsides and downsides. 153 00:14:20,890 --> 00:14:28,390 The main problem I am addressing is that, because (cyber security is) a new field, 154 00:14:28,390 --> 00:14:35,380 there is a tendency [to see] security as a positive. It usually is, I would agree. 155 00:14:35,890 --> 00:14:40,600 [But it is not challenged, or looked at in a critical way] Also, resilience. 156 00:14:40,600 --> 00:14:43,270 I've had many conversations here, 157 00:14:43,270 --> 00:14:48,880 [asking], do you see the (political) difference between security and resilience? 158 00:14:48,880 --> 00:14:54,340 Security is [a] State's (and company's) responsibility to secure people. 159 00:14:54,340 --> 00:14:59,950 Resilience is [a] person's responsibility. There's a shift there. 160 00:14:59,950 --> 00:15:06,190 These are important questions [with] legal and political implications. 161 00:15:06,190 --> 00:15:14,950 My goal [is] to transfer reflections and critical engagement from security studies to cyber security. 162 00:15:14,950 --> 00:15:20,020 I come from a terrorism studies background. That's what I studied for a long time. 163 00:15:20,020 --> 00:15:24,790 One of my main concerns is - however many governments would decide that terrorism 164 00:15:24,790 --> 00:15:28,990 was either the greatest, or one of the greatest, threats to national security, 165 00:15:28,990 --> 00:15:30,880 I could find very little evidence. 166 00:15:30,880 --> 00:15:35,770 If a country like the UK tells me that terrorism is the greatest threat, 167 00:15:35,770 --> 00:15:40,210 I can't understand how that is the case. There's no metric. 
168 00:15:40,210 --> 00:15:43,870 However, it was there. 169 00:15:43,870 --> 00:15:46,960 It was there in all the documents. It was there in the money being poured into it. 170 00:15:46,960 --> 00:15:53,320 We're moving away from terrorism, but there's a shift towards cyber security. 171 00:15:53,320 --> 00:16:01,390 We need to understand that that comes at a cost. A cost of funding going to 172 00:16:01,390 --> 00:16:06,790 mental health, education, social services, safe spaces... 173 00:16:06,790 --> 00:16:12,310 We need to know that if we're moving attention towards cyber security, 175 00:16:12,310 --> 00:16:15,130 that it is really a threat to national security. 176 00:16:15,130 --> 00:16:20,890 How is it a threat to national security? How much money do we have to put in? 177 00:16:20,890 --> 00:16:26,080 We need to make sure that security questions are looked at outside of security. 178 00:16:26,080 --> 00:16:31,330 Of course, within the security world, cyber security is very important. 179 00:16:31,330 --> 00:16:34,400 But we have to look at it in a broader, societal way. 180 00:16:34,400 --> 00:16:40,840 What are the other problems we are facing? What other questions do we [face]? 181 00:16:40,840 --> 00:16:45,190 If you only look at cyber security within the security silo, it seems essential. 182 00:16:45,190 --> 00:16:51,580 [More broadly], you understand that if we're focussing here, we're not focussing there. 183 00:16:51,580 --> 00:16:55,960 I'm going to end with a little anecdote. 184 00:16:55,960 --> 00:17:00,220 In conferences I'm often asked 'what is the greatest security threat?' 185 00:17:00,220 --> 00:17:08,260 The greatest security threat, statistically, to me is my husband. 186 00:17:08,260 --> 00:17:11,500 I am more likely to be killed by my husband than by anybody else. 187 00:17:11,500 --> 00:17:15,130 And the greatest security threat to most men in the room [is] themselves. 
188 00:17:15,130 --> 00:17:20,530 We need to think about security in that way. 189 00:17:20,530 --> 00:17:28,720 We need to think about other ways in which people are hurt and suffer. 190 00:17:28,720 --> 00:17:37,480 ARI: Dr Harmonie Toros, Deputy Director of the Institute of Cyber Security for Society, 191 00:17:37,480 --> 00:17:41,680 University of Kent. In this conversation, we talk[ed] about how the words we use matter. 192 00:17:41,680 --> 00:17:49,210 The policies that direct public funding are steeped in assumptions. 193 00:17:49,210 --> 00:17:55,540 We need to be specific about what needs addressing. 194 00:17:55,540 --> 00:18:00,100 Who is at risk? What can we do about it? What should the bare minimum be? 195 00:18:00,100 --> 00:18:06,580 Cyber security [is] fairly new and there is some risk of it being overhyped. 196 00:18:06,580 --> 00:18:10,540 Rather than re-inventing a shiny new wheel, we could just use the ones we have. 197 00:18:10,540 --> 00:18:14,770 Take stock. Be critical. And only promise things that can realistically happen. 198 00:18:14,770 --> 00:18:20,140 [That's] why I enjoyed this conversation so much! 199 00:18:20,140 --> 00:18:25,000 The takeaway for me is that the world is much bigger than I am, 200 00:18:25,000 --> 00:18:35,050 having a micro focus is a problem because you oversimplify or ignore problems. 201 00:18:35,050 --> 00:18:42,490 FURRAH: I'm Furrah, I'm the Programme Manager for RITICS, a programme funded by EPSRC & NCSC. 202 00:18:42,490 --> 00:18:50,140 It's coordinated at Imperial College London, a hub for security research and engagement. 203 00:18:50,140 --> 00:18:58,780 We give money to people to answer [research] questions on cyber-physical systems. 204 00:18:58,780 --> 00:19:02,920 Cyber physical systems [are] attached to our everyday life. 
205 00:19:02,920 --> 00:19:08,680 A colleague gave birth - computer systems in the hospital were hacked, 206 00:19:08,680 --> 00:19:15,070 she was administered the wrong amount of medication and became deaf. 207 00:19:15,070 --> 00:19:17,180 You can see the implications... [of cyber-physical risk causing harm] 208 00:19:17,180 --> 00:19:24,070 It's important for us to make [systems] more secure and do more research. 209 00:19:24,070 --> 00:19:28,870 Cyber is involved in everything and anything, from ships to planes to boats. 210 00:19:28,870 --> 00:19:32,770 So why is it important in cyber-physical systems? Because we're interconnected. 211 00:19:32,770 --> 00:19:36,700 We're connected to each other. Everything runs off each other. 212 00:19:36,700 --> 00:19:39,850 It doesn't rely on one source. You need multiple things. 213 00:19:39,850 --> 00:19:43,660 If you're flying a plane, you've got a security system trying to keep it all safe, 214 00:19:43,660 --> 00:19:49,270 make sure no one can hack into [it]. You need systems on the plane to run. 215 00:19:49,270 --> 00:19:52,420 And then when you're in the airports, to receive the correct information. 216 00:19:52,420 --> 00:19:57,250 It's not just one branch, it's connecting to one another. You need lots of skills. 217 00:19:57,250 --> 00:20:02,770 I have a degree in Forensics, a Master's in Security and Resilience... 218 00:20:02,770 --> 00:20:07,390 I have skills: communication, networking, administration. 219 00:20:07,390 --> 00:20:12,610 You need to be able to talk to people, mingling, know what to ask, when to ask. 220 00:20:12,610 --> 00:20:18,580 You need to know how to navigate people within academia [and] government. 221 00:20:18,580 --> 00:20:21,970 People in industry are more formal whereas people in academia [are] 222 00:20:21,970 --> 00:20:29,620 a little more relaxed. You need to be able to assess, adjust and be observant. 223 00:20:29,620 --> 00:20:31,870 Not everyone has those skills. 
224 00:20:31,870 --> 00:20:41,020 ARI: Furrah Hussain, Programme Manager for RITICS, the Research Institute in Trustworthy, Interconnected Cyber-Physical Systems. 225 00:20:41,020 --> 00:20:44,710 Furrah talks about how interconnected cyber-physical systems are. 226 00:20:44,710 --> 00:20:49,960 We'll talk about [these] systems again later. A couple of examples, 227 00:20:49,960 --> 00:20:53,890 water and energy systems - anything that's also physical, 228 00:20:53,890 --> 00:20:55,570 not just in the computer. 229 00:20:55,570 --> 00:21:02,920 She also talked about skills useful in a government-funded cyber security research programme. 230 00:21:02,920 --> 00:21:11,200 You don't need to be super technical to appreciate the complexity of systems. 231 00:21:11,200 --> 00:21:16,060 There are different kinds of technicality and different areas of expertise. 232 00:21:16,060 --> 00:21:20,290 It's important that we don't overlook the kind of skills that help get her job done: 233 00:21:20,290 --> 00:21:26,050 communication, networking... 234 00:21:26,050 --> 00:21:29,590 Academic Centres of Excellence, they're only one piece of the puzzle. 235 00:21:29,590 --> 00:21:33,970 There [are] all sorts of networks and hubs of expertise; people sharing ideas. 236 00:21:33,970 --> 00:21:37,380 It's not just about having rockstar researchers. 237 00:21:37,380 --> 00:21:43,500 FABIO: I'm Fabio (King's College London). I work at the intersection of AI and system security. 238 00:21:43,500 --> 00:21:47,730 AI is really the most viable solution for protecting systems in the long term, 239 00:21:47,730 --> 00:21:51,600 especially given the scalability of threats appearing in the wild and 240 00:21:51,600 --> 00:21:55,830 the fact that attackers typically have more resources than defenders. 
241 00:21:55,830 --> 00:21:57,570 The main challenge in this 242 00:21:57,570 --> 00:22:07,980 is bridging the semantic gap between experts designing systems and people [using them], 243 00:22:07,980 --> 00:22:09,870 like security analysts. 244 00:22:09,870 --> 00:22:18,030 In the research community, we have methods and systems understood by experts. 245 00:22:18,030 --> 00:22:23,820 The very important thing is to create solutions that can be understood and 246 00:22:23,820 --> 00:22:29,820 used by real world analysts [who] may not know much about [AI/machine learning]. 247 00:22:29,820 --> 00:22:33,750 Whenever I design a new project, 248 00:22:33,750 --> 00:22:47,190 I ask myself how it can be useful, how it can help while being practical. 249 00:22:47,190 --> 00:22:51,240 [Being practically useful to the people] in the wild I engage with. 250 00:22:51,240 --> 00:22:58,110 I [discuss these topics with other academics] and industry about [their] needs. 251 00:22:58,110 --> 00:23:02,520 I have an ongoing collaboration with Avast and I have been talking with NCC Group. 252 00:23:02,520 --> 00:23:05,760 We talk different languages, even while we talk about research. 253 00:23:05,760 --> 00:23:08,280 Research in academia means publishing papers, 254 00:23:08,280 --> 00:23:16,230 in industry it's more about the mission of the company, making systems efficient. 255 00:23:16,230 --> 00:23:19,560 I tend to talk a lot to industry people, and want to learn from them. 256 00:23:19,560 --> 00:23:24,390 It's important that we keep bridging this communication gap. 257 00:23:24,390 --> 00:23:31,800 Cyber security fits everywhere. It's about creating safe spaces for people. 258 00:23:31,800 --> 00:23:38,970 My research focuses on system security [which] happens behind the curtain. 259 00:23:39,420 --> 00:23:43,080 It protects people, keeps their data safe, their money safe. 260 00:23:43,080 --> 00:23:49,560 Cyber security should really create a better and more peaceful society. 
261 00:23:49,560 --> 00:23:54,870 ARI: Dr Fabio Pierazzi, Lecturer in Computer Science at King's College London. 262 00:23:54,870 --> 00:23:59,940 When Fabio is talking about bridging a semantic gap, we're coming back to words. 263 00:23:59,940 --> 00:24:05,790 Everyone speaks a different language, it's worth making sure we're on the same page. 264 00:24:05,790 --> 00:24:09,300 He's also talking about making sure that research is informed by the needs of 265 00:24:09,300 --> 00:24:13,350 the people who are going to use whatever it is you build. 266 00:24:13,350 --> 00:24:16,800 Translational work is important, particularly in cyber security. 267 00:24:16,800 --> 00:24:23,040 Being able to speak to people in their own words, or maybe a shared language. 268 00:24:23,040 --> 00:24:28,500 Translating, to make sure we're all pulling in the same direction... 269 00:24:28,500 --> 00:24:33,150 Without this work, you build things that aren't helpful, useful or usable, 270 00:24:33,150 --> 00:24:39,060 and worst case, it's a waste of taxpayer money! 271 00:24:39,060 --> 00:24:44,880 JOHN: I'm John, I work in the NCSC in a team called the Sociotechnical Security Group. 272 00:24:44,880 --> 00:24:48,660 We undertake multi-disciplinary research into cyber security problems. 273 00:24:48,660 --> 00:24:54,900 Trying to take on board different academic disciplines (psychology, economics...) 274 00:24:54,900 --> 00:25:00,870 [Looking] at problems [with] a multi-disciplinary perspective to understand them, 275 00:25:00,870 --> 00:25:07,470 rotate them around, and see if we can address or manage them. 276 00:25:07,470 --> 00:25:12,210 It's a range of problems we look at, it goes back to the NCSC mission: 277 00:25:12,210 --> 00:25:16,560 How do we keep the UK safe online? From that, there's a whole raft of [ideas]. 278 00:25:16,560 --> 00:25:20,400 So, for example, we care about assurance. We care about cyber incidents. 
279 00:25:20,400 --> 00:25:26,160 We want to understand what happens, how we can manage them, learn from them, 280 00:25:26,160 --> 00:25:29,220 and how we can try and put (mitigations) in place. 281 00:25:29,220 --> 00:25:34,020 If I'm looking at economics, I might apply that to ransomware, 282 00:25:34,020 --> 00:25:39,390 [to see] what are the implications? To do that we go out to a broad network, 283 00:25:39,390 --> 00:25:47,280 beyond the team - academia, government, UK allies and commercial partners. 284 00:25:47,280 --> 00:25:50,730 Trying to expand the room, because we want the smartest 285 00:25:50,730 --> 00:25:54,000 room of people to be able to deal with some of the most complex problems. 286 00:25:54,000 --> 00:25:57,390 To do that, we have to go beyond our boundaries to draw in the biggest, 287 00:25:57,390 --> 00:26:03,540 smartest room to tackle long term systemic problems. 288 00:26:03,540 --> 00:26:07,800 We're always looking for 'lenses', different disciplines. 289 00:26:07,800 --> 00:26:14,220 The last two years I've been looking at economics, building a network of expertise 290 00:26:14,220 --> 00:26:17,070 and then trying to develop a set of research questions, 291 00:26:17,070 --> 00:26:20,940 which is going to tackle and feed back into the priorities we care about. 292 00:26:20,940 --> 00:26:23,460 I'm now looking at a new sort of space, 293 00:26:23,460 --> 00:26:28,830 how [we can] bring in international dimensions, international relations. 294 00:26:28,830 --> 00:26:33,540 It's about trying to identify what's new in terms of technology, 295 00:26:33,540 --> 00:26:39,600 [and] academic disciplines that can help us develop [a] more nuanced understanding 296 00:26:39,600 --> 00:26:43,350 I think 'problems' is the wrong word. There are opportunities as well as problems. 297 00:26:43,350 --> 00:26:49,500 Part of it is trying to frame it in that way because it goes beyond safety into... 
298 00:26:49,500 --> 00:26:54,730 how could you think positive? How can people live their best lives online? 299 00:26:54,730 --> 00:26:59,700 We're increasingly living online and living blended, distributed existences. 300 00:26:59,700 --> 00:27:03,720 And part of it is about - how do we do that best? 301 00:27:03,720 --> 00:27:09,900 How can we have safe, happy communities but also live individual lives where 302 00:27:09,900 --> 00:27:15,300 the challenges posed by technology can be addressed and thought about in advance? 303 00:27:15,300 --> 00:27:21,820 There are bigger questions about people that seek to cause problems, 304 00:27:21,820 --> 00:27:25,230 part of the NCSC's mission is about protection. 305 00:27:25,230 --> 00:27:32,820 Whether that's nation states, criminal gangs or unintended consequences. 306 00:27:32,820 --> 00:27:37,200 You don't need bad actors, you can have unintended consequences. 307 00:27:37,200 --> 00:27:44,010 The root of what we do in this group is systems thinking. 308 00:27:44,010 --> 00:27:49,680 The more complex systems and interactions you have from systems of systems, 309 00:27:49,680 --> 00:27:54,510 the more unexpected outcomes... technology is a big Factor X in all that. 310 00:27:54,510 --> 00:28:01,650 ARI: I couldn't resist asking John about systems of systems. We've come across 311 00:28:01,650 --> 00:28:07,410 the idea that a big system made up of smaller systems is complex. 312 00:28:07,410 --> 00:28:12,510 I wanted to ask how you manage all of that work [and] complexity. 313 00:28:12,510 --> 00:28:14,880 JOHN: Quite often... 314 00:28:14,880 --> 00:28:19,320 you're not looking at something which is static, [it] is dynamic and changing. 315 00:28:19,320 --> 00:28:21,900 Even if you get the snapshot right... 
316 00:28:21,900 --> 00:28:27,150 The spiel I use when I have my slides, which you'll be thankful I don't have here, 317 00:28:27,150 --> 00:28:29,790 is the idea that it takes a village to raise a child. 318 00:28:29,790 --> 00:28:36,210 From a multi-disciplinary perspective, what goes into raising that child? 319 00:28:36,210 --> 00:28:38,820 You could talk about the parents, the school, 320 00:28:38,820 --> 00:28:44,670 the village, the child's own psychological development. 321 00:28:44,670 --> 00:28:50,250 You could go into attachment theory... 322 00:28:50,250 --> 00:28:56,250 Beyond that, you'd look at anthropology and other disciplines, educational theory. 323 00:28:56,250 --> 00:28:59,970 You could keep on going. At some point you have to bound the system. 324 00:28:59,970 --> 00:29:04,980 There's a quote from George Box: all models are wrong, some are useful! 325 00:29:04,980 --> 00:29:11,520 We're never going to get it right because it's changing. 326 00:29:11,520 --> 00:29:15,990 From our perspective, we're not looking at that model for [a] child. 327 00:29:15,990 --> 00:29:20,880 It takes an ecosystem to produce a cyber incident. 328 00:29:20,880 --> 00:29:24,270 These things don't just happen by themselves. 329 00:29:24,270 --> 00:29:28,620 It's not the case that 1) hardware, 2) software or 3) people did something wrong. 330 00:29:28,620 --> 00:29:33,330 You've got to try to understand the broader ecosystem. 331 00:29:33,330 --> 00:29:38,130 You could look at social networks, hardware, policies, supply chains... 332 00:29:38,130 --> 00:29:43,890 Assurance, social engineering... behind each of those, issues and subdisciplines. 333 00:29:43,890 --> 00:29:49,190 How can we reach out, draw on expertise and bring it together? 334 00:29:49,190 --> 00:29:53,030 One of the things we're doing at the moment is looking at ransomware. 335 00:29:53,030 --> 00:29:59,930 We work with economists to understand an illegal marketplace like ransomware.
336 00:29:59,930 --> 00:30:05,510 We have [ideas] about how we prevent market failures using economic theory. 337 00:30:05,510 --> 00:30:08,790 What happens if you flip that on its head? We want to use economic theory to try 338 00:30:08,790 --> 00:30:12,560 and understand how we can degrade and prevent a marketplace from working. 339 00:30:12,560 --> 00:30:14,810 That's great, but how do we test that? 340 00:30:14,810 --> 00:30:19,040 You can't get data on marketplaces [but] you can get historical datasets. 341 00:30:19,040 --> 00:30:26,090 [Medieval historians are showing us] how legal marketplaces were disrupted by the Catholic Church. 342 00:30:26,090 --> 00:30:31,700 If you take the structure of those marketplaces, 343 00:30:31,700 --> 00:30:35,840 it maps across very strongly to the marketplace for ransomware. 344 00:30:35,840 --> 00:30:37,730 The point is, you've got triangulation. 345 00:30:37,730 --> 00:30:43,610 This is telling you how you can use that understanding from the past, 346 00:30:43,610 --> 00:30:47,270 using economists, using technical expertise, 347 00:30:47,270 --> 00:30:51,140 [and] historians to bring these things together to try and identify where we 348 00:30:51,140 --> 00:30:55,280 should be focussing our energies in trying to disrupt a threat to the UK. 349 00:30:55,280 --> 00:30:59,510 ARI: That was John from the Sociotechnical Security Group at the NCSC. 350 00:30:59,510 --> 00:31:08,690 He is talking about sociotechnical work [solving] real, large-scale problems. 351 00:31:08,690 --> 00:31:12,380 This is important because we can't just simplify things. 352 00:31:12,380 --> 00:31:15,770 A lot of the time it's not about everybody understanding the problem. 353 00:31:15,770 --> 00:31:22,610 It's about technical experts digging into their areas of expertise and then 354 00:31:22,610 --> 00:31:27,110 returning to a level where we can understand what that means in real terms.
355 00:31:27,110 --> 00:31:34,280 We develop an approach together, or we don't understand this problem at all. 356 00:31:34,280 --> 00:31:37,580 I love the idea that you don't want to be the smartest person in the room; 357 00:31:37,580 --> 00:31:41,870 you want the entire room to be smart. People have different approaches, 358 00:31:41,870 --> 00:31:47,300 and that completely resonated with me. 359 00:31:47,300 --> 00:31:54,140 Innovation doesn't happen in a vacuum. 360 00:31:54,140 --> 00:31:58,580 It's much more about creating space for this work to happen. 361 00:31:58,580 --> 00:32:06,350 AWAIS: I'm Awais Rashid, Professor of Cyber Security, University of Bristol. I lead the Cyber Security Body of Knowledge (CyBOK) project. 362 00:32:06,350 --> 00:32:10,570 ANDREW: I'm Andrew Martin, Professor of Systems Security, University of Oxford. 363 00:32:10,570 --> 00:32:15,630 I'm a member of the Executive Board of CyBOK. STEVE: I'm Professor Steve Schneider. 364 00:32:15,630 --> 00:32:17,200 I'm at the University of Surrey. 365 00:32:17,200 --> 00:32:23,850 I'm Director of the Surrey Centre for Cyber Security, and I'm a (CyBOK) Executive Board member. 366 00:32:23,850 --> 00:32:34,000 AWAIS: CyBOK was born out of a need. Most mature disciplines have bodies of knowledge. 367 00:32:34,000 --> 00:32:37,440 Cyber security as a field has grown in maturity, 368 00:32:37,440 --> 00:32:44,580 [but it did not have a body of knowledge that educators and trainers could use]. 369 00:32:44,580 --> 00:32:51,030 How do we know what good looks like when trying to train people in this area? 370 00:32:51,030 --> 00:32:57,150 Given [the] varying quality [of available resources], is there an authoritative source? 371 00:32:57,150 --> 00:33:02,040 The challenges are articulated in the aims that we set ourselves.
372 00:33:02,040 --> 00:33:06,990 The project was funded by the UK's National Cyber Security Programme, 373 00:33:06,990 --> 00:33:11,610 but UK cyber security doesn't live in isolation from the rest of the world. 374 00:33:11,610 --> 00:33:16,620 There are a number of excellent experts around the planet. 375 00:33:16,620 --> 00:33:22,170 The biggest challenge has been to make sure that this is truly 376 00:33:22,170 --> 00:33:27,130 an international effort where experts from around the world come together to 377 00:33:27,130 --> 00:33:33,910 provide input [and we maintain the rigour of that process]. 378 00:33:33,910 --> 00:33:39,750 It's not just members of the editorial team or executive board making decisions. 379 00:33:39,750 --> 00:33:46,770 The community has to be involved - these people are experts in the field. 380 00:33:46,770 --> 00:33:49,440 I'm going to borrow from someone else who said this to me: 381 00:33:49,440 --> 00:33:55,590 the author and reviewer list reads like a Who's Who of cyber security. 382 00:33:55,590 --> 00:33:59,370 That credibility comes from the expertise of the people, 383 00:33:59,370 --> 00:34:05,490 the rigour [of the] reviewing and editing, 384 00:34:05,490 --> 00:34:10,950 and the wider community input that comes in. 385 00:34:10,950 --> 00:34:13,560 There are a number of key things that have happened. 386 00:34:13,560 --> 00:34:21,990 In the UK we have a certification programme with the National Cyber Security Centre. 387 00:34:21,990 --> 00:34:28,620 If there are certified degrees in the UK in cyber security, they are based on CyBOK. 388 00:34:28,620 --> 00:34:31,800 So we are actually using it in education and training materials. 389 00:34:31,800 --> 00:34:38,610 It's also being used as a guide to design new courses. 390 00:34:38,610 --> 00:34:44,730 It allows you to compare where the focus of different programmes lies. 391 00:34:44,730 --> 00:34:51,810 There isn't a single [type of] expert.
The people you need to build cryptographic mechanisms 392 00:34:51,810 --> 00:34:55,320 are very different from the people who work in a security operations centre. 393 00:34:55,320 --> 00:35:01,680 Both are equally valuable, but the knowledge and skills [are] different. 394 00:35:01,680 --> 00:35:07,020 It allows you to see whether programmes are delivering relevant knowledge or not. 395 00:35:07,020 --> 00:35:13,830 STEVE: There are a number of activities that we've got going on at the moment. 396 00:35:13,830 --> 00:35:16,950 We're running a call for small projects. 397 00:35:16,950 --> 00:35:22,350 We've had two very successful runs already and have built up a good body of 398 00:35:22,350 --> 00:35:28,070 supplementary material that's used to support CyBOK. 399 00:35:28,070 --> 00:35:35,100 We're looking for more small projects for the community to develop. 400 00:35:35,100 --> 00:35:43,770 We're also looking to recruit industry champions 401 00:35:43,770 --> 00:35:54,780 [who] will work with the CyBOK team and industry to embed CyBOK in industry. 402 00:35:54,780 --> 00:36:03,150 We're developing knowledge guides (emerging knowledge) and topic guides, 403 00:36:03,150 --> 00:36:09,000 where particular topics are embedded [throughout] CyBOK. 404 00:36:09,000 --> 00:36:10,590 It's about bringing them together. 405 00:36:10,590 --> 00:36:19,050 We're looking at security of AI systems and the use of AI in cyber security, 406 00:36:19,050 --> 00:36:24,420 which appears in various places in CyBOK. We wanted to bring that together. 407 00:36:24,420 --> 00:36:29,100 ANDREW: Discipline[s] change and develop over time. That's true of cyber security. 408 00:36:29,100 --> 00:36:38,970 We want CyBOK to remain relevant; it mustn't become something that nobody looks at. 409 00:36:38,970 --> 00:36:45,930 Nor do we want it to be a straitjacket for people, or something that doesn't keep up with the field.
410 00:36:45,930 --> 00:36:53,010 Since we published the first edition, we've had a process for change requests. 411 00:36:53,010 --> 00:37:02,040 People can spot errors or typos; they may want to comment... 412 00:37:02,040 --> 00:37:07,440 maybe propose something that we've missed and really needs to be added. 413 00:37:07,440 --> 00:37:12,870 We're also aware that people are busy and may not always get around to telling us. 414 00:37:12,870 --> 00:37:20,370 We're about to proactively go out and look at a number of the knowledge areas 415 00:37:20,370 --> 00:37:27,690 and ask whether they're still a good summary, or need a bit of an update, 416 00:37:27,690 --> 00:37:30,840 a bit of a reorganisation of material, perhaps. 417 00:37:30,840 --> 00:37:38,790 We'll get experts to do that and to keep us on our toes. 418 00:37:38,790 --> 00:37:45,120 Periodically, we expect to do a bigger review of the overall scope of CyBOK 419 00:37:45,120 --> 00:37:51,930 [so] it remains aligned with the community's thinking on cyber security. 420 00:37:51,930 --> 00:37:55,230 ARI: I asked who this community was, and really liked what they had to say. 421 00:37:55,230 --> 00:37:59,790 AWAIS: I would say this is my personal view, not the CyBOK project's view... 422 00:37:59,790 --> 00:38:06,390 Our community is any professional who wants to learn about the topic. 423 00:38:06,390 --> 00:38:14,280 Ultimately CyBOK is used to build education and training programmes for them. 424 00:38:14,280 --> 00:38:19,020 CyBOK is a way for employers to see whether they have the knowledge. 425 00:38:19,020 --> 00:38:26,850 But at the heart of it we talk about the cyber security workforce gap. 426 00:38:26,850 --> 00:38:29,880 They are our primary community. 427 00:38:29,880 --> 00:38:37,410 STEVE: The beneficiary of all this, of increasing our capability, is the nation.
428 00:38:37,410 --> 00:38:42,030 We are talking about the resilience and the 429 00:38:42,030 --> 00:38:49,230 ability to maintain infrastructure and the whole of society, 430 00:38:49,230 --> 00:38:53,760 making sure that we have the right skills 431 00:38:53,760 --> 00:38:59,880 so that the systems will work in the way that they should. 432 00:38:59,880 --> 00:39:04,410 ANDREW: Part of our objective has been not to have a boundary to the community, 433 00:39:04,410 --> 00:39:10,590 so CyBOK resources are freely available for download. 434 00:39:10,590 --> 00:39:19,260 That means we get people telling us that they're using CyBOK across the world. 435 00:39:19,260 --> 00:39:26,010 And so the community is boundless. ARI: CyBOK brought together documents, people, 436 00:39:26,010 --> 00:39:30,510 and a lot of areas of expertise, to map out different topics within cyber security. 437 00:39:30,510 --> 00:39:36,300 This work is giving people who create teaching materials and education programmes 438 00:39:36,300 --> 00:39:41,430 content to work with [and] they don't track you when you access their website! 439 00:39:41,430 --> 00:39:48,150 Cyber security as a field is being taken more seriously in the UK. 440 00:39:48,150 --> 00:39:54,150 We're seeing standards being mapped out, professional bodies forming around CyBOK. 441 00:39:54,150 --> 00:39:57,930 It was a real privilege to get to talk to them. ANDREW: I'm Andrew Hood. 442 00:39:57,930 --> 00:40:00,090 I'm a Lecturer in Cyber Security at Cardiff University. 443 00:40:00,090 --> 00:40:04,050 One of the problems we're trying to solve is recruitment within cyber, 444 00:40:04,050 --> 00:40:10,710 especially within academia, finding talent to be able to work on our research. 445 00:40:10,710 --> 00:40:13,440 I come from industry and I've also been a Youth Offending Officer. 446 00:40:13,440 --> 00:40:19,740 I've worked a lot with young people - one of the things we tend to miss is talent.
447 00:40:19,740 --> 00:40:25,110 When I started at universities, I noticed that (enthusiastic) young people came in 448 00:40:25,110 --> 00:40:29,160 but nobody was taking them seriously. At Warwick and at Cardiff, 449 00:40:29,160 --> 00:40:34,980 I started a programme - rather than trying to recruit a research assistant, 450 00:40:34,980 --> 00:40:42,390 I would take that role, break it up into a pool of hours [and] recruit students. 451 00:40:42,390 --> 00:40:47,970 We can use the infrastructures universities already have to allow us to do this. 452 00:40:47,970 --> 00:40:55,770 I end[ed] up [with] a team from different departments, not just Computer Science. 453 00:40:55,770 --> 00:40:59,400 They work on the research together. 454 00:40:59,400 --> 00:41:05,490 They've been very successful. At Warwick, I had 26 undergraduates. We're starting at 455 00:41:05,490 --> 00:41:10,350 Cardiff with 6 students working on a project funded with the Alan Turing Institute. 456 00:41:10,350 --> 00:41:14,010 The priority is to build up our skillset within the industry, 457 00:41:14,010 --> 00:41:21,840 within UK-based companies. We have a huge shortage of technical skills; 458 00:41:21,840 --> 00:41:25,980 we're really struggling to find people who have backgrounds in hardware, 459 00:41:25,980 --> 00:41:32,820 cyber-resilient systems... that starts at university, identifying talent early, 460 00:41:32,820 --> 00:41:37,980 nurturing it and growing it before releasing it out into the UK space. 461 00:41:37,980 --> 00:41:44,430 We have a responsibility as academics to support industry. This is one 462 00:41:44,430 --> 00:41:50,550 of the ways we can do it, by helping prepare the next generation of experts. 463 00:41:50,550 --> 00:41:57,060 Cyber security is in everything. 464 00:41:57,060 --> 00:42:03,000 I think it's vitally important that we start investing in our young people.
465 00:42:03,000 --> 00:42:07,710 I firmly believe that you can only really learn cyber by doing cyber; 466 00:42:07,710 --> 00:42:12,000 what we learn most in industry comes when we're actually working on it, 467 00:42:12,000 --> 00:42:16,230 not necessarily sitting through an online course. 468 00:42:16,230 --> 00:42:20,790 All my teaching is lab based, it's all technical, it's all hands on. 469 00:42:20,790 --> 00:42:24,840 If I want you to read lecture slides, you can read them in your own time. 470 00:42:24,840 --> 00:42:27,910 There's no benefit in that from a teaching perspective. 471 00:42:27,910 --> 00:42:37,260 Benefit comes from you working with other students, solving real-life problems. 472 00:42:37,260 --> 00:42:42,570 That's something that universities can really embrace. 473 00:42:42,570 --> 00:42:48,450 Cardiff built an entire new building to allow us to be flexible in this space. 474 00:42:48,450 --> 00:42:49,500 And I think that's the future. 475 00:42:49,500 --> 00:42:56,550 If you are running a degree on a lecture-slide basis, 476 00:42:56,550 --> 00:42:59,940 then you are failing your students and you are not preparing them properly. 477 00:42:59,940 --> 00:43:05,880 We've developed a Masters in Cyber Security, linked with PricewaterhouseCoopers. 478 00:43:05,880 --> 00:43:09,120 PwC are keen for us to develop a technical Masters. 479 00:43:09,120 --> 00:43:17,340 One of the complaints from industry at the moment is that Masters programmes... 480 00:43:17,340 --> 00:43:20,770 they're no longer preparing somebody to be a Master of the subject. 481 00:43:20,770 --> 00:43:24,600 PricewaterhouseCoopers was keen for us to develop a technical Masters. 482 00:43:24,600 --> 00:43:29,460 The Welsh Government is supporting us, providing fully funded bursaries. 483 00:43:29,460 --> 00:43:36,830 We've looked at every module, [asking] how can we teach this in a physical way?
484 00:43:36,830 --> 00:43:41,750 We've created a whole lab devoted to teaching forensics. 485 00:43:41,750 --> 00:43:47,990 We've adapted all our labs to allow students to explore the subject matter. 486 00:43:47,990 --> 00:43:52,550 ARI: That was Andrew Hood, Lecturer in Cyber Security, Cardiff University. 487 00:43:52,550 --> 00:43:58,730 The way that Andrew talks about training undergraduates as researchers, 488 00:43:58,730 --> 00:44:04,460 encouraging them to start exploring ideas and building confidence in themselves, 489 00:44:04,460 --> 00:44:11,120 is really important, because universities nourish independence and ideas. 490 00:44:11,120 --> 00:44:16,220 They're part of the pipeline that feeds a trained workforce into various sectors. 491 00:44:16,220 --> 00:44:24,160 A trained cyber security workforce starts to address the sector's skills gap. 492 00:44:24,160 --> 00:44:27,790 MATTHEW: My name is Matthew Boakes. I'm a PhD student at the University of Kent. 493 00:44:27,790 --> 00:44:33,070 My primary focus is biometrics on mobile devices and how we can test them; 494 00:44:33,070 --> 00:44:37,360 the environments and scenarios available for mobile devices are broader than, 495 00:44:37,360 --> 00:44:43,120 say, airport gate systems, where it's much more static. 496 00:44:43,120 --> 00:44:50,080 Something that's come up more in recent years is ethical use of biometric systems. 497 00:44:50,080 --> 00:44:52,090 Some recent examples: over the pandemic, 498 00:44:52,090 --> 00:45:00,310 facial recognition technology to track citizens leaving the house; in Ukraine, 499 00:45:00,310 --> 00:45:03,700 using facial recognition technology as part of their strategy, 500 00:45:03,700 --> 00:45:08,920 using an organisation that isn't so ethical, [with] pictures scraped from social media. 501 00:45:08,920 --> 00:45:12,820 So there's this whole debate about whether this is ethical. 502 00:45:12,820 --> 00:45:17,290 [The company] was fined by the Information Commissioner's Office in the UK.
503 00:45:17,290 --> 00:45:23,200 [There's a] whole drive towards a passwordless society. 504 00:45:23,200 --> 00:45:26,200 If you've heard of the FIDO Alliance (https://fidoalliance.org), 505 00:45:26,200 --> 00:45:30,460 that's their main aim: to work out how we can go 'passwordless'. 506 00:45:30,460 --> 00:45:35,390 Both Microsoft and Apple are driving this. 507 00:45:35,390 --> 00:45:42,250 The use of biometric sensors in your smartphone, more and more in laptops... 508 00:45:42,250 --> 00:45:47,620 That's where I see biometrics - playing a part in driving this passwordless world. 509 00:45:47,620 --> 00:45:51,730 Biometrics, something we carry on us all the time, can't really be changed. 510 00:45:51,730 --> 00:45:53,890 That's where I see it fitting into the big picture. 511 00:45:53,890 --> 00:46:00,280 ARI: Matthew's talking to us about biometric systems (biometrics: eye scans, etc.) 512 00:46:00,280 --> 00:46:05,230 He's talking about the systems themselves, [and] their ethical use. 513 00:46:05,230 --> 00:46:11,110 This is important because, as biometrics are embedded into laptops etc., 514 00:46:11,110 --> 00:46:15,820 people are getting less of a choice about sharing quite personal data. 515 00:46:15,820 --> 00:46:18,730 I've seen people needing to access a (payment) till using their thumbprint. 516 00:46:18,730 --> 00:46:24,610 [Is that the] difference between being hired or not getting the job at all? 517 00:46:24,610 --> 00:46:31,270 People building new biometrics need to be considering ethical implications. 518 00:46:31,270 --> 00:46:37,150 We're not saying that they need all the answers, just thinking about imbalances, 519 00:46:37,150 --> 00:46:43,270 perhaps even drawing colleagues in who do specialise in this kind of space... 520 00:46:43,270 --> 00:46:46,370 MARIOS: Hello. My name is Marios Samanis, a PhD student at the University of Bristol.
521 00:46:46,370 --> 00:46:53,140 I'm interested in the security of industrial control systems, attacks, 522 00:46:53,140 --> 00:46:57,070 and cascading effects of cyber attacks on interconnected critical infrastructures. 523 00:46:57,070 --> 00:47:06,250 How can we disrupt one system and see effects in another? 524 00:47:06,250 --> 00:47:14,140 It's important to have a lab environment to be able to do practical research, test 525 00:47:14,140 --> 00:47:20,530 attacks or defences and be able to program such systems, policies and so on. 526 00:47:20,530 --> 00:47:24,970 I'm interested in digital twins, which is something that we [are working on]. 527 00:47:24,970 --> 00:47:27,640 This is what I think is important. 528 00:47:27,640 --> 00:47:34,900 Critical national infrastructure is very important; [systems were] separated in the past, 529 00:47:34,900 --> 00:47:43,240 but now interconnected with each other, the internet, the [power] grid and so on. 530 00:47:43,240 --> 00:47:47,680 The attack vectors are much greater, the vulnerabilities are much bigger. 531 00:47:47,680 --> 00:47:54,040 We need a security plan in order to secure these infrastructures. 532 00:47:54,040 --> 00:48:00,880 ARI: Marios is talking about critical national infrastructures (CNI). 533 00:48:00,880 --> 00:48:04,930 We're more at risk now because these systems are getting connected to each other, 534 00:48:04,930 --> 00:48:07,930 which can be a problem because these systems are different to one another. 535 00:48:07,930 --> 00:48:12,130 So when you stick them together, you might get holes in communication and weak spots. 536 00:48:12,130 --> 00:48:16,150 They're also connected to the Internet (so we can look at data in real time?). 537 00:48:16,150 --> 00:48:23,620 These systems need to be protected from disruption, not just being turned off. 538 00:48:23,620 --> 00:48:27,460 MARIA: Hi, my name is Maria Sameen.
539 00:48:27,460 --> 00:48:34,690 I'm a doctoral student at the University of Bristol under the CDT programme. 540 00:48:34,690 --> 00:48:39,200 [I look at] TIPS at scale - trust, identity, privacy and security. 541 00:48:39,200 --> 00:48:46,450 Cyber security is segregated and not considered fundamental in organisations. 542 00:48:46,450 --> 00:48:51,730 I believe that everything is connected. 543 00:48:51,730 --> 00:49:00,130 We need cyber security to be thought of alongside regulatory practices, social science, 544 00:49:00,130 --> 00:49:05,590 [and] how users perceive security. Cyber security in the future shouldn't be a buzzword. 545 00:49:05,590 --> 00:49:11,950 Users are becoming more susceptible to different privacy 546 00:49:11,950 --> 00:49:19,090 harms, exploits etc. We are already in the era of smart homes, 547 00:49:19,090 --> 00:49:24,700 24-hour surveillance systems and smart cities (surveillance outside our homes 548 00:49:24,700 --> 00:49:30,100 as well as inside them); we need to take steps now so we can have a better future. 549 00:49:30,100 --> 00:49:34,840 ARI: Maria was talking about how cyber security should have multidisciplinary aspects. 550 00:49:34,840 --> 00:49:38,290 This is something we've seen in a few of the conversations today. 551 00:49:38,290 --> 00:49:45,250 This is a good way to think about cyber security, especially as a starting point. 552 00:49:45,250 --> 00:49:51,430 If we set out expecting to think bigger picture, being open to new ideas and inputs, 553 00:49:51,430 --> 00:49:55,390 perhaps we'll have fewer nasty surprises along the way. 554 00:49:55,390 --> 00:50:00,130 PRIYANKA: My name is Priyanka. I'm doing a PhD in Cyber Security at the University of Bristol. 555 00:50:00,130 --> 00:50:08,890 Every organisation has a security operations centre (SOC) with different teams: 556 00:50:08,890 --> 00:50:15,910 security monitoring, the red team, incident response and threat hunting. 557 00:50:15,910 --> 00:50:19,600 Most literature covers the SOC.
558 00:50:19,600 --> 00:50:26,800 [It covers] the challenges, how a SOC performs and operates. There's less literature around threat hunting. 559 00:50:26,800 --> 00:50:33,550 My research is to understand how we can help threat hunters perform better. 560 00:50:33,550 --> 00:50:39,460 Because I worked in [a] SOC in the past, I understand there are lots of challenges. 561 00:50:39,460 --> 00:50:45,130 I understand what they go through - I got an opportunity to do research, 562 00:50:45,130 --> 00:50:49,780 and my first priority is to help them. Everything is digital now. 563 00:50:49,780 --> 00:50:58,230 Everyone is using machine learning, AI - cyber security is really important. 564 00:50:58,230 --> 00:51:01,450 And so I think cyber security will be... it's going to be a blast! 565 00:51:01,450 --> 00:51:05,830 ARI: Priyanka was talking about helping threat hunters do their job. 566 00:51:05,830 --> 00:51:08,770 Threat hunting is a really important approach to dealing with cyber attacks 567 00:51:08,770 --> 00:51:13,030 because, instead of retrospective or passive work after things have happened, 568 00:51:13,030 --> 00:51:15,610 these teams face real challenges in real time. 569 00:51:15,610 --> 00:51:21,610 As an ex-threat hunter, Priyanka has grounded her research in first-hand experience: 570 00:51:21,610 --> 00:51:28,420 by understanding the challenges these people face, we can reduce obstacles! 571 00:51:28,420 --> 00:51:33,370 On that note, we're going to wrap up this episode. 572 00:51:33,370 --> 00:51:37,060 Hopefully this demonstrates the value of cyber security work. 573 00:51:37,060 --> 00:51:41,200 A big thank you to the NCSC for hosting the event and inviting us. 574 00:51:41,200 --> 00:51:45,490 A massive thank you to everybody who's taken part. We leave you on this note. 575 00:51:45,490 --> 00:51:51,250 We protect things together or we don't protect them at all. 576 00:51:51,250 --> 00:51:54,640 What does this look like in real life?
577 00:51:54,640 --> 00:52:00,850 Understand your context, know who you're working with, understand what you want to achieve. 578 00:52:00,850 --> 00:52:04,160 Thanks for listening, everyone. See you next time. 579 00:52:04,160 --> 00:52:09,000 CLAUDINE: You can tweet at us @HelloPTNPod and subscribe 580 00:52:09,000 --> 00:52:11,990 on Apple Podcasts or wherever you listen to podcasts. 581 00:52:11,990 --> 00:52:15,440 The title there is PTNPod. See you next week. ARI: Bye! 582 00:52:15,440 --> 00:52:17,800 This has been a podcast from the Centre for Doctoral Training in Cyber Security, University of Oxford. 583 00:52:23,010 --> 00:52:26,010 Funded by the Engineering and Physical Sciences Research Council.