1 00:00:01,980 --> 00:00:05,610 Hello, I'm Ari. [CLAUDINE: And I'm Claudine] Welcome to Proving the Negative.
2 00:00:05,610 --> 00:00:09,490 ARI: We're a podcast all about exploring the different sides of cybersecurity,
3 00:00:09,490 --> 00:00:13,770 from political science to computer science, international relations to mathematics.
4 00:00:13,770 --> 00:00:17,250 Join us as we talk to our friends about the work they do.
5 00:00:17,250 --> 00:00:18,540 [KLAUDIA] My name is Klaudia Krawiecka,
6 00:00:18,540 --> 00:00:23,220 I'm a CDT cyber security student at the University of Oxford.
7 00:00:23,220 --> 00:00:28,738 I work on introducing novel behavioural biometric systems based on
8 00:00:28,738 --> 00:00:32,220 naturally occurring interactions with objects in smart environments:
9 00:00:32,220 --> 00:00:36,090 Internet of Things (IoT) environments and augmented reality environments.
10 00:00:36,090 --> 00:00:42,480 So, profiling people based on how they interact with devices around them.
11 00:00:42,480 --> 00:00:47,500 The biometric that I propose leverages existing sensors to both identify and
12 00:00:47,500 --> 00:00:53,430 authenticate users without requiring any hardware modification of existing
13 00:00:53,430 --> 00:00:57,000 smart home devices. It doesn't require any specific gestures,
14 00:00:57,000 --> 00:00:58,530 [you] just interact with the environment.
15 00:00:58,530 --> 00:01:02,600 The system is designed to reduce the need for phone-based authentication,
16 00:01:02,600 --> 00:01:05,370 on which current smart home systems rely.
17 00:01:05,370 --> 00:01:10,000 I conducted a real-world experiment that involved 38 participants
18 00:01:10,000 --> 00:01:15,150 in both a smart office environment and a private smart household.
19 00:01:15,150 --> 00:01:20,750 I used this experiment not only to collect the data for identification purposes,
20 00:01:20,750 --> 00:01:22,890 but also to study mimicry attacks.
21 00:01:22,890 --> 00:01:25,800 What's interesting about this (and novel) is that it operates in three modes:
22 00:01:25,800 --> 00:01:30,900 the first uses only the sensors embedded in devices that the user interacts with.
23 00:01:30,900 --> 00:01:37,800 The second mode uses only co-located sensors (sensors on nearby devices).
24 00:01:37,800 --> 00:01:42,120 Lastly, it operates in a combined mode that is a mixture of both.
25 00:01:42,120 --> 00:01:45,000 So that's in a nutshell what I'm working on currently.
26 00:01:45,000 --> 00:01:50,040 [CLAUDINE] Could you explain what authentication is?
27 00:01:50,040 --> 00:01:53,510 [KLAUDIA] The system can perform both identification and authentication.
28 00:01:53,510 --> 00:02:01,770 Identification is when we classify users into individual profiles, one per person.
29 00:02:01,770 --> 00:02:06,930 Can the system recognise whether it's the mother or father or a child interacting?
30 00:02:06,930 --> 00:02:12,210 Authentication is - do they get permission to operate certain devices?
31 00:02:12,210 --> 00:02:16,230 E.g., financial transactions would not be available to children.
32 00:02:16,230 --> 00:02:21,090 The system, after identifying who is currently interacting, should make this decision.
33 00:02:21,090 --> 00:02:24,190 That's the distinction between the two [concepts] I'm looking at:
34 00:02:24,190 --> 00:02:31,350 authentication (who can do what) and identification (do I recognise you?).
35 00:02:31,350 --> 00:02:33,240 [ARI] What is it that you're curious about?
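[A minimal sketch, in Python, of the identification/authentication split Klaudia describes. The profiles, toy features and permission table below are invented purely for illustration; they are not her actual system.]

# Illustrative sketch only: profiles, features and permissions are invented.
import math

PROFILES = {                     # per-person behavioural templates (toy features:
    "parent": [0.8, 1.2],        # e.g. mean interaction force, mean duration)
    "child":  [0.3, 0.5],
}
PERMISSIONS = {"parent": {"lights", "heating", "payments"}, "child": {"lights"}}

def identify(features):
    """Identification: which stored profile is this interaction closest to?"""
    return min(PROFILES, key=lambda person: math.dist(PROFILES[person], features))

def authenticate(person, action):
    """Authentication: is the identified person allowed to perform this action?"""
    return action in PERMISSIONS.get(person, set())

who = identify([0.75, 1.1])                  # -> "parent"
print(who, authenticate(who, "payments"))    # -> parent True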
36 00:02:33,240 --> 00:02:37,780 [KLAUDIA] We have lots of different smart devices on the market.
37 00:02:37,780 --> 00:02:43,970 We have plenty of vendors. One of the things that's annoying about setting up
38 00:02:43,970 --> 00:02:48,200 your own environment is the cross-platform interaction (or lack thereof):
39 00:02:48,200 --> 00:02:51,120 each device from a different vendor will have a different app,
40 00:02:51,120 --> 00:02:56,710 you operate these devices through the app, and each authenticates very differently.
41 00:02:56,710 --> 00:03:00,210 It is a lot of hassle to use all of this securely.
42 00:03:00,210 --> 00:03:04,750 There's also a problem with permissions, because you have cross-app interaction.
43 00:03:04,750 --> 00:03:05,640 [An operation on one device]
44 00:03:05,640 --> 00:03:09,540 may permit something on another device from another vendor. It's tricky.
45 00:03:09,540 --> 00:03:17,010 I'm trying to develop a method that will work for all of them and look at usability aspects.
46 00:03:17,010 --> 00:03:21,000 [ARI] Why do we want to think about usability? Why is that important to you?
47 00:03:21,000 --> 00:03:26,500 [KLAUDIA] Asking people for passwords, PIN codes or gestures... it's a hassle for
48 00:03:26,500 --> 00:03:31,980 the user. What I'm proposing and looking at is - can we make it more usable,
49 00:03:31,980 --> 00:03:37,230 more natural, for users to be identified and authenticated by devices in an environment?
50 00:03:37,230 --> 00:03:43,170 That's why I started doing this research. I was curious - can we make it usable?
51 00:03:43,170 --> 00:03:47,500 We have a lot of amazing security platforms and systems and tools, but
52 00:03:47,500 --> 00:03:51,500 users are just not able to use them. So can we do something that will be easy to
53 00:03:51,500 --> 00:03:58,750 use, and will be used by the user? There's so much good research, but it's unusable.
54 00:03:58,750 --> 00:04:01,680 That's my motivation, and that's where my curiosity lies.
55 00:04:01,680 --> 00:04:04,980 [CLAUDINE] I'm curious how, over the last few years, your ideas around what
56 00:04:04,980 --> 00:04:11,450 usability means have changed as you've gone through all of these challenges
57 00:04:11,450 --> 00:04:13,770 and dealt with all of these practical problems.
58 00:04:13,770 --> 00:04:18,200 [KLAUDIA] I come from an applied engineering (software engineering) background.
59 00:04:18,200 --> 00:04:22,500 Until my Master's research, usability wasn't something I was thinking about.
60 00:04:22,500 --> 00:04:26,070 I was like, "OK, if I develop this system and it works, fine!"
61 00:04:26,070 --> 00:04:30,690 I was working on securing password databases on cloud servers.
62 00:04:30,690 --> 00:04:34,000 I was collaborating with Intel; it was a new technology at the time.
63 00:04:34,000 --> 00:04:36,370 It was hardware-based protection.
64 00:04:36,370 --> 00:04:42,120 We asked - how do we convey (to users) which platforms are using our solution?
65 00:04:42,120 --> 00:04:46,200 We came up with a web browser plug-in to connect [to a server] and perform
66 00:04:46,200 --> 00:04:52,000 something called 'remote attestation' to verify whether the server is running the
67 00:04:52,000 --> 00:04:57,180 secure part that will protect their data.
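[A rough, heavily simplified Python sketch of the idea behind that remote attestation check: the server reports a measurement of the code it runs, and the client verifies it is authentic and matches the code it trusts. The shared demo key, label and helper names below are stand-ins invented for illustration; this is not the actual plug-in or Intel's protocol.]

# Toy illustration of attestation only: a demo HMAC key replaces the real
# attestation service's certificate chain, and the "measurement" is just a hash.
import hmac, hashlib

EXPECTED = hashlib.sha256(b"password-protection enclave v1").hexdigest()
DEMO_KEY = b"demo-verification-key"

def server_report():
    """What an attesting server might return: its code measurement plus a MAC over it."""
    tag = hmac.new(DEMO_KEY, EXPECTED.encode(), hashlib.sha256).hexdigest()
    return EXPECTED, tag

def client_verifies(measurement, tag):
    """The plug-in side: is the report authentic, and is it the code we trust?"""
    good_tag = hmac.new(DEMO_KEY, measurement.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, good_tag) and measurement == EXPECTED

print(client_verifies(*server_report()))     # -> True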
For this study we had almost 100 users coming
68 00:04:57,180 --> 00:05:01,630 to our office trying to use the plug-in, trying to spot whether the website was
69 00:05:01,630 --> 00:05:06,500 actually using our solution or just trying to convince them [that it was secure].
70 00:05:06,500 --> 00:05:10,290 Also known as... 'phishing attacks'. We asked users afterwards,
71 00:05:10,290 --> 00:05:14,000 "What do you think about the solution? Would you use the solution?"
72 00:05:14,000 --> 00:05:20,140 Many people said that it was a very interesting project, very easy to use.
73 00:05:20,140 --> 00:05:25,000 Some of them also said, "I don't have that many things to protect".
74 00:05:25,000 --> 00:05:29,050 And once you follow up with them and you ask, "What about medical records...?"
75 00:05:29,050 --> 00:05:34,000 then they start thinking about how inconvenient the password feels for them
76 00:05:34,000 --> 00:05:36,000 and how they reuse passwords across different platforms.
77 00:05:36,000 --> 00:05:39,270 They said [Klaudia's solution] was a lot of new things to learn and it didn't seem
78 00:05:39,270 --> 00:05:42,300 seamless to them. It seemed like something they would need to spend time on,
79 00:05:42,300 --> 00:05:47,830 reading and researching, and that's why they wouldn't transition [from passwords].
80 00:05:47,830 --> 00:05:52,090 So then I realised that things have to happen in the background.
81 00:05:52,090 --> 00:05:57,600 It has to be easy to use, intuitive, because otherwise we waste
82 00:05:57,600 --> 00:06:01,400 the potential of the amazing systems we have. In the end, we are doing this for
83 00:06:01,400 --> 00:06:07,480 users, not for engineers. This is why this aspect became interesting to me.
84 00:06:07,480 --> 00:06:13,000 [ARI] Security shouldn't get in the way. [KLAUDIA] Yeah, exactly. That's the thing.
85 00:06:13,000 --> 00:06:16,200 It shouldn't get in the way. It should be there for users,
86 00:06:16,200 --> 00:06:18,880 because that's what we want: we want to protect people and their data.
87 00:06:18,880 --> 00:06:22,450 [ARI] What's the biggest challenge in your research, and how have you approached it?
88 00:06:22,450 --> 00:06:28,060 [KLAUDIA] Let's go back to the end of 2019, maybe the beginning of 2020.
89 00:06:28,060 --> 00:06:32,660 When you do any research that connects to users and requires them to come into
90 00:06:32,660 --> 00:06:36,520 the lab so you can gather samples... it became very tricky in the pandemic.
91 00:06:36,520 --> 00:06:45,070 The biggest challenge was how to reach users remotely - in their own houses.
92 00:06:45,070 --> 00:06:50,140 If you invite [research participants] to a lab, it's an artificial setting.
93 00:06:50,140 --> 00:06:55,180 In behavioural biometrics you want people to behave naturally,
94 00:06:55,180 --> 00:06:58,900 which, in a lab, they will not. With the pandemic, and knowing that we
95 00:06:58,900 --> 00:07:02,560 wanted to gather realistic samples,
96 00:07:02,560 --> 00:07:09,500 I came up with a remote testbed for the participants.
97 00:07:09,500 --> 00:07:13,490 We developed this plug-and-play experimentation set-up and sent it
98 00:07:13,490 --> 00:07:17,230 to people in private households so that they could connect all the smart devices.
99 00:07:17,230 --> 00:07:24,340 They would get instructions (2-3 pages), a video... we decided to progress this way.
100 00:07:24,340 --> 00:07:28,150 You have different aspects that you need to consider, e.g., safety.
101 00:07:28,150 --> 00:07:35,890 There were wires - people could trip. On university premises, we were insured,
102 00:07:35,890 --> 00:07:42,340 but outside - what then? We came up with a remote experimentation framework.
103 00:07:42,340 --> 00:07:47,710 The system was called "Plug and play: a framework for remote experimentation in cyber security".
104 00:07:47,710 --> 00:07:53,170 The biggest challenge that I had resulted in a very interesting research problem.
105 00:07:53,170 --> 00:07:57,000 It allowed us to reach people we normally couldn't reach:
106 00:07:57,000 --> 00:07:59,890 people with disabilities, for instance, who couldn't come to the lab.
107 00:07:59,890 --> 00:08:04,500 This was the biggest challenge, but it was also, in a way, a blessing, because
108 00:08:04,500 --> 00:08:09,040 we started thinking about all the other aspects and how to address them.
109 00:08:09,040 --> 00:08:17,350 In the first testing household we sent the devices to... we asked participants to stick
110 00:08:17,350 --> 00:08:22,360 devices to surfaces where they might, in the future, put a smart device.
111 00:08:22,360 --> 00:08:27,048 We sent [adhesive] stickers, but once the devices got hot (because of
112 00:08:27,048 --> 00:08:29,750 all the processing they were doing), they were falling off the walls.
113 00:08:29,750 --> 00:08:35,000 This was something we didn't think of earlier. It was the first test household,
114 00:08:35,000 --> 00:08:41,560 and we wanted to collect data from them (due to the cost of shipping and the procedures).
115 00:08:41,560 --> 00:08:48,640 We couldn't. Devices were falling down - people didn't put them in the same place!
116 00:08:48,640 --> 00:08:53,950 For us it was important that all users had devices in the same position/placement.
117 00:08:53,950 --> 00:08:56,320 And then you get the data, you look at it [and there's something odd].
118 00:08:56,320 --> 00:09:01,520 Fortunately, we asked for pictures of the set-up. Different days, different places!
119 00:09:01,520 --> 00:09:04,000 When we asked why, [participants] said the devices were falling down,
120 00:09:04,000 --> 00:09:06,880 so they had to put them somewhere else. We had to discard that data.
121 00:09:06,880 --> 00:09:09,750 [ARI] What has been the biggest surprise in your research?
122 00:09:09,750 --> 00:09:11,770 [KLAUDIA] Huh. That's a really good question.
123 00:09:11,770 --> 00:09:17,840 Tied to this issue of the pandemic and the inability to work with people directly
124 00:09:17,840 --> 00:09:23,300 is how challenging the process of experimentation can be when you do something
125 00:09:23,300 --> 00:09:27,160 outside the typical frame. When we were developing this methodology,
126 00:09:27,160 --> 00:09:31,500 we had to wait for approval for several months because it had to go through
127 00:09:31,500 --> 00:09:35,020 all the safety committees. That can delay your research.
128 00:09:35,020 --> 00:09:41,610 I was surprised to see no solution for problems like this. No guidelines.
129 00:09:41,610 --> 00:09:45,750 This is not a very technical paper, but it's one of my proudest achievements.
130 00:09:45,750 --> 00:09:49,540 It starts this discussion - we should improve the methods of experimentation.
131 00:09:49,540 --> 00:09:55,000 We should improve formal procedures and take safety into account.
132 00:09:55,000 --> 00:10:00,450 You need to be able to experiment with the user where they are.
133 00:10:00,450 --> 00:10:03,500 Depending on your research work, you might take it outside the lab,
134 00:10:03,500 --> 00:10:08,640 because in certain cases simulations are not reflective of natural behaviour.
135 00:10:08,640 --> 00:10:12,500 I was surprised there was no formal method of dealing with
136 00:10:12,500 --> 00:10:18,000 outside-of-premises experimentation. If you want to collect natural behavioural
137 00:10:18,000 --> 00:10:26,500 samples from users, gathering them in the lab is not representative of real behaviour.
138 00:10:26,500 --> 00:10:30,150 People behave very differently, depending on where they are.
139 00:10:30,150 --> 00:10:36,540 Are they familiar with the space? Is someone observing them? In a lab they
140 00:10:36,540 --> 00:10:41,730 would be on their best behaviour, which is not what they would do normally.
141 00:10:41,730 --> 00:10:49,740 What I learned is that you can do amazing experiments and collect data...
142 00:10:49,740 --> 00:10:54,060 but in the end, they may not represent what is actually happening.
143 00:10:54,060 --> 00:10:57,120 We talk about the usability of systems for end-users,
144 00:10:57,120 --> 00:11:02,700 but there's no usability in terms of how you experiment with users.
145 00:11:02,700 --> 00:11:09,120 If you're going to give them the flexibility to organise things in the way they want to,
146 00:11:09,120 --> 00:11:16,000 make it usable, make it experimentation-friendly for the user.
147 00:11:16,000 --> 00:11:20,500 There is a bigger issue of making system interfaces usable.
148 00:11:20,500 --> 00:11:24,750 So this is not the main thing I look at in my research, but it is certainly
149 00:11:24,750 --> 00:11:29,520 something that I pay attention to and discuss in every paper I write now.
150 00:11:29,520 --> 00:11:35,750 My supervisor says that I'm the usability person in our group because I always
151 00:11:35,750 --> 00:11:41,390 think about it whenever we design experiments. I think it's important.
152 00:11:41,390 --> 00:11:48,000 My [research] group is very technical. We are all from a computer science or
153 00:11:48,000 --> 00:11:53,410 software engineering background, so it's easy not to think about [usability].
154 00:11:53,410 --> 00:11:59,970 Then again, if you have smart devices or wearable devices (part of the IoT,
155 00:11:59,970 --> 00:12:06,350 which is something that people interact with on a daily basis), there are ways to
156 00:12:06,350 --> 00:12:08,580 secure these devices (or their data) in the environment they are in, without
157 00:12:08,580 --> 00:12:17,940 interfering or asking [people] to understand something too technical.
158 00:12:17,940 --> 00:12:22,260 [ARI] You've mentioned two different kinds of attacks so far. Phishing attacks:
159 00:12:22,260 --> 00:12:25,500 we don't click on links in emails because they might take us somewhere
160 00:12:25,500 --> 00:12:28,200 we're not meant to go, and might put viruses on our machines.
161 00:12:28,200 --> 00:12:34,250 [ARI] Tell us about [mimicry attacks] and how they relate to your work.
162 00:12:34,250 --> 00:12:38,500 [KLAUDIA] Mimicry attacks are [copy-cat] attacks - the attacker tries to mimic
163 00:12:38,500 --> 00:12:44,500 the behaviour (e.g., gestures) of the victim, the person who is able to
164 00:12:44,500 --> 00:12:50,250 carry out financial transactions or access data [i.e., a user with 'privileges'].
165 00:12:50,250 --> 00:12:55,140 [To defend against this group of attacks] you look at how your system defends
166 00:12:55,140 --> 00:13:04,320 and protects the user from someone copying their gestures/interactions.
167 00:13:04,320 --> 00:13:10,700 In my work, I look at two kinds of this attack (it's the same attack in both -
168 00:13:10,700 --> 00:13:15,150 the attacker mimics the victim - but the way they observe the victim is different).
169 00:13:15,150 --> 00:13:22,050 So, two flavours of this attack: video-based and in-person observation.
170 00:13:22,050 --> 00:13:28,890 Video-based: we assume the attacker has video footage.
171 00:13:28,890 --> 00:13:36,500 E.g., the attacker got access to a video recording from a local camera.
172 00:13:36,500 --> 00:13:42,000 We allowed our participants (the attacking group) to watch the video
173 00:13:42,000 --> 00:13:48,200 and then try to mimic the behaviour of the victim in the video footage.
174 00:13:48,200 --> 00:13:52,000 In-person observation: let's imagine a smart-office environment. You have a
175 00:13:52,000 --> 00:13:55,380 lot of colleagues, but you have a manager who can do things you can't.
176 00:13:55,380 --> 00:13:59,490 Or you want access to another user's history on a smart printer.
177 00:13:59,490 --> 00:14:03,360 You are a malicious colleague. You want to use the privileges of someone else.
178 00:14:03,360 --> 00:14:09,520 You can easily follow the person because it won't be that suspicious.
179 00:14:09,520 --> 00:14:15,000 You might be in the kitchen or next to the printer, and you can observe how the
180 00:14:15,000 --> 00:14:18,810 person interacts, how much force they put into the interaction...
181 00:14:18,810 --> 00:14:24,200 Some of our participants looked at the time the person took to interact with
182 00:14:24,200 --> 00:14:28,440 certain devices, or the strength and speed of certain interaction types.
183 00:14:28,440 --> 00:14:32,530 Mimicry attacks are the attacks in which the attacker tries to simulate
184 00:14:32,530 --> 00:14:34,620 the behaviour of the victim to fool the system.
185 00:14:34,620 --> 00:14:46,770 [CLAUDINE] Which strategies for mimicry attacks were more successful?
186 00:14:46,770 --> 00:14:52,410 [KLAUDIA] We found, in the smart office, that video attacks were successful.
187 00:14:52,410 --> 00:14:56,200 [Attackers] could observe the victim and replay the recording.
188 00:14:56,200 --> 00:15:00,534 What plays a key role here is the time of observation.
189 00:15:00,534 --> 00:15:00,540 [The security system Klaudia was testing used nearby (co-located) objects which
190 00:15:00,540 --> 00:15:08,200 took different measurements to make sure the right person was gesturing].
191 00:15:08,200 --> 00:15:13,200 What I'm saying is, [mimicry attacks are] more difficult [with several sensors].
192 00:15:13,200 --> 00:15:18,500 In person, [attackers] could observe the victim from different angles
193 00:15:18,500 --> 00:15:22,270 and different perspectives; video attackers couldn't.
194 00:15:22,270 --> 00:15:28,000 The key factor [in successful attacks] was the time of observation.
195 00:15:28,000 --> 00:15:30,000 [ARI] What was the experiment? What did you collect?
196 00:15:30,000 --> 00:15:32,500 [KLAUDIA] There were two types of experiments.
197 00:15:32,500 --> 00:15:37,300 1. General interactions in a private household. We sent devices, and people
198 00:15:37,300 --> 00:15:42,500 placed the devices wherever they wanted.
They had instructions, and we sent
199 00:15:42,500 --> 00:15:48,000 a mobile device which connected to the internet, sending data to our server.
200 00:15:48,000 --> 00:15:52,000 We collected data in real time so we could quickly give feedback to participants.
201 00:15:52,000 --> 00:15:55,990 We asked everyone to do 20 interactions per device.
202 00:15:55,990 --> 00:16:00,280 We gave them the flexibility to do it over a day, over a few days...
203 00:16:00,280 --> 00:16:02,880 The majority of them chose to do [all the interactions] on the same day.
204 00:16:02,880 --> 00:16:09,000 (In future work, I want to see how behaviour changes over time!)
205 00:16:09,000 --> 00:16:17,000 2. The smart office environment (that's the attack scenario): we asked everyone to
206 00:16:17,000 --> 00:16:22,750 do 20 rounds [of interaction] individually for baseline behaviour, to
207 00:16:22,750 --> 00:16:27,300 see - if they are not trying to mimic the user, can they still fool the system?
208 00:16:27,300 --> 00:16:31,500 One victim was chosen randomly, and the remaining 12 participants were divided
209 00:16:31,500 --> 00:16:36,000 randomly into two groups (6 in each): either observing the victim in person,
210 00:16:36,000 --> 00:16:40,180 or getting footage of the victim.
211 00:16:40,180 --> 00:16:44,140 [ARI] What kind of device was it that people were interacting with?
212 00:16:44,140 --> 00:16:48,780 [KLAUDIA] We needed good sensor-data resolution and sampling frequency.
213 00:16:48,780 --> 00:16:54,490 [Smart device vendors could not provide products with high enough resolution]
214 00:16:54,490 --> 00:16:57,500 [We wanted to know which sensors were most common]
215 00:16:57,500 --> 00:17:02,320 We identified the most popular sensors: accelerometer, gyroscope and microphone.
216 00:17:02,320 --> 00:17:06,460 We attached them to Raspberry Pi devices (small computers),
217 00:17:06,460 --> 00:17:12,000 emulating a smart device. I did a lot of soldering and all the fun parts!
218 00:17:12,000 --> 00:17:15,500 [CLAUDINE] How did the in-person observation work?
219 00:17:15,500 --> 00:17:19,500 [KLAUDIA] They decided to set up the experiment in the office kitchen, and
220 00:17:19,500 --> 00:17:25,360 we asked the victim to do her 20 rounds so [attackers] could watch.
221 00:17:25,360 --> 00:17:32,800 We invited attackers one by one to observe her interactions with devices.
222 00:17:32,800 --> 00:17:37,360 Each attacker got to observe one full round of her interaction.
223 00:17:37,360 --> 00:17:46,720 Attackers were not allowed to speak. They were not allowed to interfere.
224 00:17:46,720 --> 00:17:55,810 [CLAUDINE] If you had unlimited resources, what would you solve?
225 00:17:55,810 --> 00:18:03,070 [KLAUDIA] Two years ago, we looked at authentication for augmented reality.
226 00:18:03,070 --> 00:18:07,100 With unlimited resources, and taking into account the metaverse coming up...
227 00:18:07,100 --> 00:18:12,250 people moving their services and social interactions to a virtual environment...
228 00:18:12,250 --> 00:18:16,720 this is something I would like to look deeper into. How can we authenticate?
229 00:18:16,720 --> 00:18:21,800 How can we ensure the security of users in that environment, or while they are
230 00:18:21,800 --> 00:18:26,110 using virtual reality or augmented reality glasses?
231 00:18:26,110 --> 00:18:31,500 A lot of what I'm doing now for IoT devices can be mapped to techniques
232 00:18:31,500 --> 00:18:37,300 used in augmented reality environments for how you interact with 3D objects.
233 00:18:37,300 --> 00:18:49,120 [ARI] Thinking about new ways that attackers might steal identities...
234 00:18:49,120 --> 00:18:51,637 how would you bring creativity into this kind of work?
235 00:18:51,637 --> 00:18:56,860 [KLAUDIA] There are a few points you can look at securing (many points!).
236 00:18:56,860 --> 00:19:00,680 One would be the biometrics of your interactions with the object.
237 00:19:00,680 --> 00:19:05,600 So, continuous authentication - is the person currently using the glasses the legitimate user?
238 00:19:05,600 --> 00:19:08,650 Imagine that you wear the glasses for, let's say, half of the day.
239 00:19:08,650 --> 00:19:12,980 You share the glasses with your partner and they can access all your data.
240 00:19:12,980 --> 00:19:15,910 They can potentially see something you saw earlier in the day.
241 00:19:15,910 --> 00:19:21,030 That seems like a huge privacy concern; we have to make sure the glasses can
242 00:19:21,030 --> 00:19:25,100 recognise the user. Especially because this is still very expensive technology,
243 00:19:25,100 --> 00:19:27,190 so it will be shared, at least in the beginning.
244 00:19:27,190 --> 00:19:31,330 There is huge potential in looking at that and trying to solve this problem.
245 00:19:31,330 --> 00:19:35,230 How to protect users from internal threats - family members, for instance,
246 00:19:35,230 --> 00:19:39,500 or people they share equipment with. These things are with you all the time,
247 00:19:39,500 --> 00:19:43,800 they record you. It's not like having a smart watch. Of course, with a smart
248 00:19:43,800 --> 00:19:47,380 watch, you can tell where a person was and what activities they were doing. There are
249 00:19:47,380 --> 00:19:51,100 plenty of things you can infer from that, but [with] VR glasses you'll have
250 00:19:51,100 --> 00:19:55,630 plenty of different types of data that can be misused. Creativity should focus on
251 00:19:55,630 --> 00:19:59,590 securing that... Another thing [to consider] is interface compromise.
252 00:19:59,590 --> 00:20:07,430 Let's say you are driving. A commercial pops up, obscuring your view.
253 00:20:07,430 --> 00:20:13,440 This is another area that people discuss - the safety implications.
254 00:20:13,440 --> 00:20:17,800 There is a lot of creativity to be brought into virtual environments,
255 00:20:17,800 --> 00:20:20,030 because there are many things that can go wrong!
256 00:20:20,030 --> 00:20:23,840 [CLAUDINE] Have you thought about using that environment to make users
257 00:20:23,840 --> 00:20:28,040 aware of areas where they might be vulnerable to [mimicry] attacks?
258 00:20:28,040 --> 00:20:32,800 [KLAUDIA] We will live in very interesting times soon - smart devices
259 00:20:32,800 --> 00:20:38,510 all around you, and wearables or augmented/virtual reality glasses
260 00:20:38,510 --> 00:20:43,520 that could be used to record the person performing certain interactions
261 00:20:43,520 --> 00:20:49,340 and using the interface to guide you. This is very interesting.
262 00:20:49,340 --> 00:20:53,750 I might start looking at this. There was a paper a few years ago on a person
263 00:20:53,750 --> 00:21:02,000 using glasses to record someone typing their password in a cafe.
264 00:21:02,000 --> 00:21:05,310 Something we should think about is recognising whether users are
265 00:21:05,310 --> 00:21:14,960 performing critical operations, or whether they are in [restricted] areas.
266 00:21:14,960 --> 00:21:22,400 [ARI] Do you have any tips for keeping up to speed with cybersecurity?
267 00:21:22,400 --> 00:21:30,000 [KLAUDIA] One thing I find useful is the mailing list security[at]fosad.org
268 00:21:30,000 --> 00:21:39,120 There are interesting talks and research on different topics (machine learning, IoT...).
269 00:21:39,120 --> 00:21:47,000 There are invitations to summer schools, workshops, reading groups...
270 00:21:47,000 --> 00:21:52,420 For [everyone] interested in keeping up with conferences, papers, talks...
271 00:21:52,420 --> 00:21:56,730 I highly recommend signing up to this mailing list.
272 00:21:56,730 --> 00:22:02,240 [ARI] Do you have anything we can tell our listeners about?
273 00:22:02,240 --> 00:22:07,000 [KLAUDIA] I'm co-organising a research session at the largest
274 00:22:07,000 --> 00:22:10,500 European Women in Tech summit, which will happen in Warsaw, Poland.
275 00:22:10,500 --> 00:22:13,100 It's called the "Perspektywy Women in Tech Conference" [June 7-8 2022].
276 00:22:13,100 --> 00:22:16,500 It's really exciting because it's in Poland, my home country.
277 00:22:16,500 --> 00:22:22,940 There is a huge cyber security community coming, meeting, mentoring.
278 00:22:22,940 --> 00:22:26,800 There will be cyber security organisations. This is free of charge, and we
279 00:22:26,800 --> 00:22:29,600 have a lot of scholarships for students and researchers to attend.
280 00:22:29,600 --> 00:22:32,800 It's a hybrid conference, so you can attend online.
281 00:22:32,800 --> 00:22:36,400 It's a very interesting place to meet like-minded people.
282 00:22:36,400 --> 00:22:39,020 Join us next week for another fascinating conversation.
283 00:22:39,020 --> 00:22:43,430 In the meantime, you can tweet at us @HelloPTNPod
284 00:22:43,430 --> 00:22:47,690 and you can subscribe on Apple Podcasts or wherever you listen to podcasts.
285 00:22:47,690 --> 00:22:53,550 The title there is PTNPod. See you next week. ARI: Bye!
286 00:22:53,550 --> 00:22:58,710 CLAUDINE: This has been a podcast from the Centre for Doctoral Training in Cybersecurity, University of Oxford.
287 00:22:58,710 --> 00:23:03,616 Funded by the Engineering and Physical Sciences Research Council.