1
00:00:01,990 --> 00:00:05,750
ARI: Hello, I'm Ari. [CLAUDINE: And I'm Claudine] Welcome to Proving the Negative.

2
00:00:05,750 --> 00:00:09,490
ARI: We're a podcast all about exploring the different sides of cyber security,

3
00:00:09,490 --> 00:00:13,780
from political to computer science, international relations to mathematics.

4
00:00:13,780 --> 00:00:17,240
Join us as we talk to our friends about the work they do.

5
00:00:17,450 --> 00:00:21,999
CLAUDINE: How's it going, Ari? ARI: I'm excited for our top 5 moments.

6
00:00:22,000 --> 00:00:26,500
Anirudh, Klaudia, Harmonie, Sean, Mary...

7
00:00:26,500 --> 00:00:28,520
I'm going to kick off with Anirudh.

8
00:00:28,670 --> 00:00:33,344
ANIRUDH: Digital resilience means children have digital and mental tools

9
00:00:33,345 --> 00:00:37,550
to cope with the challenges that they're facing.

10
00:00:37,550 --> 00:00:42,409
Browser plug-ins, add-ons for your phone to prevent data sharing...

11
00:00:42,410 --> 00:00:48,070
The mental tools are important as well. If you find a photo has appeared online

12
00:00:48,070 --> 00:00:50,199
which you didn't want to be online, how do you cope with that?

13
00:00:50,200 --> 00:00:53,690
We taught children to navigate the physical world;

14
00:00:53,750 --> 00:00:57,290
now we have to teach them how to navigate the digital world.

15
00:00:57,740 --> 00:01:01,670
Long-term consequences need research.

16
00:01:02,000 --> 00:01:06,100
For a child growing up using Facebook, what are the consequences

17
00:01:06,100 --> 00:01:08,569
once that child has become an adult?

18
00:01:08,570 --> 00:01:11,300
How does this person engage with other people?

19
00:01:11,300 --> 00:01:16,519
How does it impact their relationship with their kids or people around them?

20
00:01:16,520 --> 00:01:24,139
ARI: In this clip, I enjoyed [how] Anirudh talks about long-term impacts:

21
00:01:24,140 --> 00:01:30,470
we don't know what they are [and are ill-equipped] to map [them] out.

22
00:01:30,770 --> 00:01:35,900
Big social media platforms have all the data; researchers don't.

23
00:01:36,200 --> 00:01:42,050
Because there's no conversation happening, we're missing out.

24
00:01:42,440 --> 00:01:49,160
CLAUDINE: I liked the zooming out he did - he went into detail,

25
00:01:49,340 --> 00:01:52,800
but he also pulled out big-picture concerns.

26
00:01:52,800 --> 00:01:55,669
ARI: Kids are a vulnerable population.

27
00:01:55,670 --> 00:01:59,540
Part of security work is understanding how you protect people from harm.

28
00:01:59,870 --> 00:02:03,649
I'll add kids to [my] pool of vulnerable groups.

29
00:02:03,650 --> 00:02:09,979
CLAUDINE: Also, we're not just thinking about [whether we] might harm users,

30
00:02:09,980 --> 00:02:13,190
[but rather] what those harms might be. ARI: I love that. The long-term view.

31
00:02:13,280 --> 00:02:15,080
CLAUDINE: Next up, Klaudia's interview.

32
00:02:15,380 --> 00:02:23,240
What I liked (I thought it was cool) was her research process.

33
00:02:23,250 --> 00:02:27,409
So just listen to the clip here. KLAUDIA: We assume the attacker has footage.

34
00:02:27,410 --> 00:02:33,260
Someone recorded it or the attacker got access to a local video recording.

35
00:02:33,260 --> 00:02:36,859
And the user was interacting with the smart environment and devices.

36
00:02:36,860 --> 00:02:43,010
We allowed our participants (the attacking group) to watch the video

37
00:02:43,130 --> 00:02:49,160
and then try to mimic the behaviour of the victim on the video footage.
38
00:02:49,700 --> 00:02:52,789
In-person observation: let's imagine a smart office environment.

39
00:02:52,790 --> 00:03:00,649
You have colleagues, but you have a manager who can do things you can't.

40
00:03:00,650 --> 00:03:04,490
You are a malicious colleague. You want to use the privileges of someone else...

41
00:03:05,750 --> 00:03:12,560
CLAUDINE: First - the research design. They went into a real environment;

42
00:03:12,680 --> 00:03:16,160
it was neat, novel and innovative in the way that they deployed it.

43
00:03:16,310 --> 00:03:22,440
ARI: Particularly where sensors kept popping off the walls...

44
00:03:22,650 --> 00:03:24,830
CLAUDINE: They dealt with the real-world problems...

45
00:03:24,830 --> 00:03:28,800
ARI: They found out that the sensors were popping off because they asked.

46
00:03:28,800 --> 00:03:31,399
If they hadn't listened to [people taking part in the smart home study],

47
00:03:31,400 --> 00:03:36,170
they would have said: 'smart homes are weird, look at this weird data'.

48
00:03:38,500 --> 00:03:41,540
All right. Let me take you through my second snippet.

49
00:03:42,080 --> 00:03:45,150
This is Dr Harmonie Toros from last week's conference.

50
00:03:45,150 --> 00:03:52,909
HARMONIE: There is a tendency [to see] security as a positive. It usually is...

51
00:03:52,910 --> 00:03:57,620
[But it is not challenged, or looked at in a critical way] Also, resilience.

52
00:03:58,620 --> 00:04:00,110
I've had many conversations here,

53
00:04:00,260 --> 00:04:05,680
[asking], do you see the (political) difference between security and resilience?

54
00:04:05,870 --> 00:04:11,030
Security is [the] State's (and company's) responsibility to secure people.

55
00:04:11,330 --> 00:04:16,760
Resilience is [the] person's responsibility. There's a shift there.

56
00:04:16,970 --> 00:04:22,910
These are important questions [with] legal and political implications.

57
00:04:24,380 --> 00:04:27,500
ARI: This interview rocked my world because at a cyber security conference,

58
00:04:27,500 --> 00:04:30,260
we had someone saying, you know, there's more to life.

59
00:04:30,260 --> 00:04:35,930
I enjoyed the idea that security is the responsibility of people with power.

60
00:04:36,290 --> 00:04:39,350
And again, Anirudh talked about being resilient,

61
00:04:39,380 --> 00:04:45,170
[equipping] kids to have digital resilience - if pictures are shared,

62
00:04:45,170 --> 00:04:49,520
how do they deal with that? How do they keep living their life despite [this]?

63
00:04:49,940 --> 00:04:54,020
CLAUDINE: Resilience is an interesting one. I view it from a different angle.

64
00:04:54,230 --> 00:04:59,630
It's easy to slide into the idea that users are responsible for themselves,

65
00:04:59,870 --> 00:05:04,610
[and that] it's not the platform's or government's responsibility to protect users.

66
00:05:04,910 --> 00:05:12,770
If users don't have the means to be resilient, it's an impossible goal.

67
00:05:13,220 --> 00:05:16,970
I would like to talk about Sean. SEAN: I think cyber security has

68
00:05:17,050 --> 00:05:20,050
to revolve around information technology.

69
00:05:20,170 --> 00:05:23,170
What most of us are talking about is computers.

70
00:05:23,290 --> 00:05:26,470
If you want to introduce books etc. into that notion, I think you can.

71
00:05:26,560 --> 00:05:29,630
Cyber security is more than just computer security or computer science.

72
00:05:29,650 --> 00:05:32,320
It's about taking a step back and looking at the bigger picture.
73
00:05:32,560 --> 00:05:36,370
When we're talking about security, we are trying to secure computers.

74
00:05:36,370 --> 00:05:40,090
We're securing these computers because they mean something to us.

75
00:05:40,300 --> 00:05:44,650
That's why cyber security is inseparable from social and psychological aspects.

76
00:05:44,830 --> 00:05:49,600
Ultimately, it's human beings who value and use these devices.

77
00:05:49,750 --> 00:05:55,120
We have to learn about the interaction between human and device behaviours.

78
00:05:56,210 --> 00:06:00,950
CLAUDINE: One of the things I loved overall was the use of maths.

79
00:06:00,980 --> 00:06:04,219
I find it fascinating because I don't do maths and I don't really understand it.

80
00:06:06,220 --> 00:06:11,420
It's interesting to hear it applied in such a real, specific and relatable context:

81
00:06:11,870 --> 00:06:17,479
trust and reputation online. It was inclusive, I think - that was what I liked.

82
00:06:17,750 --> 00:06:22,459
We tend to get bogged down in very specific definitions around systems,

83
00:06:22,460 --> 00:06:28,130
information and infrastructure, not accounting for the human element.

84
00:06:28,220 --> 00:06:31,790
ARI: Sean does maths, Sean great words. CLAUDINE: Sean do words good. [laughter]

85
00:06:33,970 --> 00:06:38,110
ARI: Okay. I'm going to talk about... Oh, no, WE'RE going to talk about Mary.

86
00:06:40,000 --> 00:06:44,170
MARY: When [the robot Curiosity] landed, it started sending Twitter updates.

87
00:06:44,200 --> 00:06:50,950
That was probably a human typing at NASA and not the robot on Mars.

88
00:06:50,950 --> 00:06:58,780
It got me thinking: we're going to be sending robots [to space] ahead of us.

89
00:06:59,110 --> 00:07:04,719
Alexa is a robot. And I wondered, what [is] the place for speech technology?

90
00:07:04,720 --> 00:07:09,140
Could we train a robot to describe to us the things that it finds?

91
00:07:09,160 --> 00:07:14,350
How would we make those communications secure?

92
00:07:16,500 --> 00:07:17,770
CLAUDINE: The way we selected these moments:

93
00:07:17,800 --> 00:07:22,450
we did not talk to each other... until right before we started recording.

94
00:07:22,810 --> 00:07:29,970
I wish we had a deeper explanation but... robots in space.

95
00:07:31,370 --> 00:07:34,550
ARI: From last episode, the sound bite was 'we get there together, or not at all':

96
00:07:34,580 --> 00:07:39,260
everyone does X and it all counts! There's a great Swiss cheese analogy.

97
00:07:39,260 --> 00:07:45,280
Instead of one protective layer [with] holes, you have multiple layers.

98
00:07:45,280 --> 00:07:46,639
Each has holes, but in different places.

99
00:07:46,640 --> 00:07:51,679
Eventually you're more protected because you've got more coverage.

100
00:07:51,680 --> 00:07:55,130
More barriers in the way of bad things happening.

101
00:07:55,130 --> 00:07:59,510
Security is useless if it is not relevant and useful, because of what tends

102
00:07:59,510 --> 00:08:03,260
to happen: hackers aren't just baddies who try to get into your systems.

103
00:08:03,500 --> 00:08:08,330
People create their own hacks to get around things that are bothering them.

104
00:08:11,720 --> 00:08:14,480
CLAUDINE: And accessibility. ARI: It has to be accessible as well.

105
00:08:17,200 --> 00:08:19,190
As a parting gift for anyone listening,

106
00:08:19,760 --> 00:08:22,820
how can they think about security in their own way?

107
00:08:23,000 --> 00:08:23,930
CLAUDINE: Security by design.
108
00:08:25,100 --> 00:08:29,300
[Even when] designing a project which you think has no particular security implications,

109
00:08:29,300 --> 00:08:31,939
there are almost always security implications.

110
00:08:31,940 --> 00:08:35,810
ARI: Think about confidentiality, integrity and availability.

111
00:08:36,170 --> 00:08:42,010
Confidentiality: How safe and secret do you want your data/important thing [to be]?

112
00:08:42,010 --> 00:08:47,270
Integrity: How do we make sure it's correct, as much as possible?

113
00:08:47,800 --> 00:08:54,050
Availability: How do we make sure people who need access can get it?

114
00:08:54,560 --> 00:08:58,300
CLAUDINE: The idea is not to scare people. Just be mindful

115
00:08:58,300 --> 00:09:02,089
that cyber security is something that you should keep in mind

116
00:09:02,090 --> 00:09:05,510
when you're doing anything online or using any kind of technology.

117
00:09:05,600 --> 00:09:09,980
It shouldn't take over your life or make you scared - just bear it in mind.

118
00:09:11,400 --> 00:09:13,330
ARI: What are you doing next, Claudine?

119
00:09:13,330 --> 00:09:17,190
CLAUDINE: More research! I will have to figure out what to do next...

120
00:09:17,190 --> 00:09:20,250
and that is future Claudine's problem. And you? What are you doing?

121
00:09:22,540 --> 00:09:23,670
ARI: More cyber security. Because I just can't get enough. I'm going to America.

122
00:09:29,900 --> 00:09:32,090
A few thank yous before we sign off...

123
00:09:32,630 --> 00:09:36,980
Big thanks to the Cyber Security CDT at the University of Oxford.

124
00:09:37,010 --> 00:09:40,200
CLAUDINE: Thank you to everybody who participated, talked to us, emailed...

125
00:09:40,200 --> 00:09:41,989
ARI: A big thank you to you for listening,

126
00:09:41,990 --> 00:09:45,979
it's really great to have you. That is all we have time for today.

127
00:09:47,100 --> 00:09:51,380
We hope you've enjoyed Proving the Negative: Swanning About in Cyber Security.

128
00:09:52,000 --> 00:09:58,200
CLAUDINE: You can tweet at us @HelloPTNPod and you can subscribe

129
00:09:58,200 --> 00:10:01,140
on Apple Podcasts or wherever you listen to podcasts. The title is PTNPod.

130
00:10:05,900 --> 00:10:11,019
This has been a podcast from the Centre for Doctoral Training in Cyber Security, University of Oxford.

131
00:10:11,020 --> 00:10:14,020
Funded by the Engineering and Physical Sciences Research Council.