1 00:00:01,980 --> 00:00:05,610 ARI: Hello, I'm Ari. [CLAUDINE: And I'm Claudine] Welcome to Proving the Negative. 2 00:00:05,610 --> 00:00:09,490 ARI: We're a podcast all about exploring the different sides of cybersecurity, 3 00:00:09,490 --> 00:00:13,770 from political science to computer science, international relations to mathematics. 4 00:00:13,770 --> 00:00:17,770 Join us as we talk to our friends about the work they do. 5 00:00:17,770 --> 00:00:19,050 ARI: Would you like to start with yours? 6 00:00:19,050 --> 00:00:22,000 CLAUDINE: Why don't we start with yours? ARI: OK. 7 00:00:22,000 --> 00:00:26,300 We're doing something a bit different - this podcast is all about engagement, 8 00:00:26,300 --> 00:00:31,610 and it's all about being easy for people to understand. 9 00:00:31,610 --> 00:00:35,480 We've got key words here and there that might throw you for a loop. 10 00:00:35,480 --> 00:00:39,980 These intros are to chat about why we're talking about what we're talking about. 11 00:00:39,980 --> 00:00:44,240 Today we're talking about my research. CLAUDINE: Since we are... what is it? 12 00:00:44,240 --> 00:00:48,170 ARI: I lived in the US for a while. Before I went, I needed my immunisation record 13 00:00:48,170 --> 00:00:53,540 (the jabs I'd had). I went to the doctor's and they didn't have any records for me. 14 00:00:53,540 --> 00:01:00,110 The people who were meant to look after my information didn't have it. 15 00:01:00,110 --> 00:01:07,160 This happened ages ago, and I'm still salty! It's affected how I think about privacy. 16 00:01:07,160 --> 00:01:12,890 How can I share data? How can I hide it from people who don't need to see it? 17 00:01:12,890 --> 00:01:14,270 Power to the people, really. 18 00:01:14,270 --> 00:01:23,900 The way that we try and control how data is used doesn't really work. 19 00:01:23,900 --> 00:01:29,780 Part of the reason is - it's inflexible. Paper forms don't translate to online spaces. 20 00:01:29,780 --> 00:01:34,430 CLAUDINE: Power to the people is always good. Any terms you want to define? 21 00:01:34,430 --> 00:01:41,930 ARI: Privacy is controlling what is known about you, in a reasonable way. 22 00:01:41,930 --> 00:01:48,920 You can get a bit paranoid. Back in the day, US lawyers (Warren and Brandeis) 23 00:01:48,920 --> 00:01:55,580 were part of an elite group who got annoyed by the paparazzi taking photos. 24 00:01:55,580 --> 00:02:01,130 They wrote about the right to privacy, the right to control what is known. 25 00:02:01,130 --> 00:02:06,470 There's a history of different takes people have on it. I like to think about choice. 26 00:02:06,470 --> 00:02:11,870 I talk about informed consent. Cookie [notices] online are really annoying... 27 00:02:11,870 --> 00:02:15,860 Apparently these are consent notices and it's all about having informed consent. 28 00:02:15,860 --> 00:02:21,940 CLAUDINE: We rarely think about what it means to be informed before agreeing. 29 00:02:21,940 --> 00:02:26,390 ARI: Yeah, informed consent is meant to help keep people safe. 30 00:02:26,390 --> 00:02:33,680 The way we do it, it gets in the way. We need to do better, make it a conversation. 31 00:02:33,680 --> 00:02:41,060 What we'll talk about is part of a much wider conversation about data use. 32 00:02:41,060 --> 00:02:46,390 CLAUDINE: Let's go ahead and get into the interview that you and I recorded. 33 00:02:46,390 --> 00:02:51,730 Welcome, Ari. [ARI: Hello]. Would you give me and our listeners your overview?
34 00:02:51,730 --> 00:02:57,850 ARI: In security, there are some misconceptions that we have about people. 35 00:02:57,850 --> 00:03:04,450 You may hear 'users are the problem/weakest link'. 36 00:03:04,450 --> 00:03:08,950 This places blame on the people who have the least power to do anything. 37 00:03:08,950 --> 00:03:19,030 This is something I'm trying to find evidence either for or against. 38 00:03:19,030 --> 00:03:23,860 I've been looking at how we architect choice - how we offer people options. 39 00:03:23,860 --> 00:03:28,030 I've been collaborating with a medical online research platform and I've been 40 00:03:28,030 --> 00:03:32,470 looking at how they architect choice for the people who take part in their study. 41 00:03:32,470 --> 00:03:38,830 This is important - in research, informed consent is used to protect people. 42 00:03:38,830 --> 00:03:43,550 You offer a description of the work you want to do to a potential participant. 43 00:03:43,550 --> 00:03:49,840 They say yes or no. [If] they say yes, you can then collect data for your research. 44 00:03:49,840 --> 00:03:53,680 Now, what I did in my work - there were three parts to it. 45 00:03:53,680 --> 00:04:02,890 1. I interviewed people taking part in, and the research team behind, a research study. 46 00:04:02,890 --> 00:04:06,940 It's an online platform. They collect data on rare genetic conditions. 47 00:04:06,940 --> 00:04:11,110 And what I found was that the research team was excellent, very engaged, 48 00:04:11,110 --> 00:04:16,450 they had participant interests at heart and wanted to protect, help and support. 49 00:04:16,450 --> 00:04:23,290 They had a group they asked for advice... Participants didn't know what was going on, 50 00:04:23,290 --> 00:04:28,210 they had a lot of assumptions around how researchers would use their data. 51 00:04:28,210 --> 00:04:34,810 They said, 'yes, I'll take part', but they didn't expect to hear anything back. 52 00:04:34,810 --> 00:04:40,300 2. I asked participants what they wanted to hear [back]. And they told me. 53 00:04:40,300 --> 00:04:44,980 3. I worked with software developers and the research team to improve feedback. 54 00:04:44,980 --> 00:04:49,480 We looked at whether this changed how people shared data. 55 00:04:49,480 --> 00:04:53,140 We measured the number of questionnaires that the study asked 56 00:04:53,140 --> 00:04:57,280 people to fill out against the number of questionnaires they received back. 57 00:04:57,280 --> 00:05:04,030 Drop-off in participation was reduced, which suggests that if you listen to people, 58 00:05:04,030 --> 00:05:08,230 if you respond to what they want, they're more likely to stick with you, 59 00:05:08,230 --> 00:05:12,190 which is important when you're talking about how you share data, 60 00:05:12,190 --> 00:05:16,870 how you operate online, how you build systems and how you architect choice. 61 00:05:16,870 --> 00:05:24,670 CLAUDINE: Could you expand on what it means for consent to be informed? 62 00:05:24,670 --> 00:05:30,580 ARI: Informed consent practices are data protection practices. After World War II 63 00:05:30,580 --> 00:05:34,060 (terrible things had happened to people in the name of medical research), 64 00:05:34,060 --> 00:05:37,930 there was a series of trials in which war criminals were held to account. 65 00:05:37,930 --> 00:05:43,360 Informed consent was how medical researchers could show agreement to 66 00:05:43,360 --> 00:05:50,260 the potential risks and what's involved - that participants had knowingly agreed.
67 00:05:50,260 --> 00:05:55,690 [Consent] is freely given so you're not coercing people; informed and voluntary. 68 00:05:55,690 --> 00:06:01,090 It's important you haven't forced the decision. Informed consent has problems. 69 00:06:01,090 --> 00:06:04,540 Sometimes, once you give your consent, you can't actually change it. 70 00:06:04,540 --> 00:06:11,170 People are overloaded with technical information. When you look online... 71 00:06:11,170 --> 00:06:17,410 pre-existing problems are compounded. People unfamiliar with digital technology 72 00:06:17,410 --> 00:06:25,120 will find it hard to use. It adds up. If you look at consent online - cookie notices. 73 00:06:25,120 --> 00:06:32,170 I come across them every day on every website and they're really annoying. 74 00:06:32,170 --> 00:06:38,650 They overload me with information, and I've never once been able to revisit a choice. 75 00:06:38,650 --> 00:06:44,020 I just want to look at kitten videos. Calling this informed consent doesn't fit. 76 00:06:44,020 --> 00:06:48,400 It doesn't work. Informed consent is broken online. 77 00:06:48,400 --> 00:06:53,560 I'm thinking of how we can do better. There's a theory called Dynamic Consent. 78 00:06:53,560 --> 00:06:57,250 It's flexible and you can change your mind over time. It's a theory. 79 00:06:57,250 --> 00:07:01,330 There's not a lot of work that supports it or refutes it. 80 00:07:01,330 --> 00:07:06,730 The people I work with have implemented Dynamic Consent. 81 00:07:06,730 --> 00:07:10,390 I tried to do a service evaluation. I wanted to explore something new. 82 00:07:10,390 --> 00:07:19,030 The system we have isn't working. Consent is a safeguard, and there's room to improve. 83 00:07:19,030 --> 00:07:22,870 CLAUDINE: What drew you to that topic specifically? 84 00:07:22,870 --> 00:07:28,510 ARI: We have a right to privacy in the home. What does the home look like online? 85 00:07:28,510 --> 00:07:31,990 Where do the boundaries sit? Solove, Nissenbaum... 86 00:07:31,990 --> 00:07:40,150 There are people figuring out how to model privacy when building technology. 87 00:07:40,150 --> 00:07:42,850 I'm someone who works with computers a lot, 88 00:07:42,850 --> 00:07:49,260 and I'm frustrated by [having] to give someone my email just to get a receipt. 89 00:07:49,260 --> 00:07:53,920 That shouldn't be the case. CLAUDINE: Based on the interviews you conducted, 90 00:07:53,920 --> 00:08:01,690 could you go into what people thought about privacy and eroding digital privacy? 91 00:08:01,690 --> 00:08:05,050 ARI: It's not like we ever had total privacy, then had less and less of it 92 00:08:05,050 --> 00:08:10,900 and now have none. It's more like we've normalised a lack of privacy. 93 00:08:10,900 --> 00:08:13,930 There's a great quote that talks about how (in the early Internet), 94 00:08:13,930 --> 00:08:21,190 had browsers in 1995 declined cookies as a default option, most 95 00:08:21,190 --> 00:08:26,890 websites would have responded with designs that didn't seek personal data. 96 00:08:26,890 --> 00:08:34,210 The way we've [built] has led to this point. It's going to take energy to change. 97 00:08:34,210 --> 00:08:38,950 Once you've done theoretical, academic work, you need some kind of output.
98 00:08:38,950 --> 00:08:47,230 I've got something I could give a researcher building an online study; 99 00:08:47,230 --> 00:08:52,150 they could use that to think about what data they want from participants and 100 00:08:52,150 --> 00:08:57,970 how they demonstrate that data is being used for the purposes of research. 101 00:08:57,970 --> 00:09:04,390 Currently in a research study, I can use a template that has 'share' or 'not share'. 102 00:09:04,390 --> 00:09:11,800 [If someone agrees to share] I don't have to tell them anything after that point. 103 00:09:11,800 --> 00:09:14,900 There are other options. There are various choices you can offer people. 104 00:09:14,900 --> 00:09:20,470 E.g., are you happy for me to share your data with colleagues/pharma companies? 105 00:09:20,470 --> 00:09:25,540 There are different levels that people would like to have choice over. 106 00:09:25,540 --> 00:09:34,660 People trusted researchers, saying they'd revoke consent if a researcher did a bad 107 00:09:34,660 --> 00:09:39,580 thing or if they felt that what they had contributed towards was of no benefit. 108 00:09:39,580 --> 00:09:44,110 Those are two conditions under which someone might take back consent - but no one did. 109 00:09:44,110 --> 00:09:50,350 There's implicit trust in researchers as experts - if we can do better, we should. 110 00:09:50,350 --> 00:09:56,530 CLAUDINE: It is a little bit frightening that there is such implicit trust, 111 00:09:56,530 --> 00:10:02,350 given that the research world has not been immune from ethical scandals. 112 00:10:02,350 --> 00:10:08,560 What has the biggest surprise been in conducting your research? 113 00:10:08,560 --> 00:10:16,360 ARI: The people I interviewed thought they also had duties: to be truthful, 114 00:10:16,360 --> 00:10:22,180 to be accurate, to provide 'perfect' information to support research. 115 00:10:22,180 --> 00:10:25,810 It's not that people don't take this seriously, or that they don't want to engage. 116 00:10:25,810 --> 00:10:31,570 There are so many underlying assumptions or decisions being made. 117 00:10:31,570 --> 00:10:38,830 We don't see them. Privacy is a way for people to express choices. 118 00:10:38,830 --> 00:10:44,410 We researchers, and experts in other fields (people who make software, 119 00:10:44,410 --> 00:10:49,330 people who make policy decisions), seem to assume that people are stupid. 120 00:10:49,330 --> 00:10:54,030 We do not take time to communicate in terms people can understand. 121 00:10:54,030 --> 00:10:58,030 People have jobs, they have time restrictions, energy constraints; 122 00:10:58,030 --> 00:11:03,460 it's the expert's job (i.e., MY job) to communicate what I want to do, 123 00:11:03,460 --> 00:11:06,210 so it's a truly informed choice. 124 00:11:06,210 --> 00:11:17,860 People expected so much, but they never voiced that. 125 00:11:17,860 --> 00:11:21,400 CLAUDINE: What I'm hearing from you is that the notion 126 00:11:21,400 --> 00:11:25,930 of transparency goes hand-in-hand with the notion of privacy? ARI: Exactly. 127 00:11:25,930 --> 00:11:29,920 One of the duties that people thought researchers had was to protect data - 128 00:11:29,920 --> 00:11:35,020 people thought researchers should make them feel safe, 129 00:11:35,020 --> 00:11:39,910 combining safety and security when protecting their personal information. 130 00:11:39,910 --> 00:11:46,540 They talked about confidentiality and anonymity - security was important.
131 00:11:46,540 --> 00:11:50,200 Most of all, it was that the researchers had been considerate - 132 00:11:50,200 --> 00:11:59,650 the consideration and protection of data includes security measures (access control). 133 00:11:59,650 --> 00:12:06,100 It was also about the researcher as a person - that they were honest/reliable. 134 00:12:06,100 --> 00:12:10,870 In ~2015, a hospital made an agreement with a big tech company. 135 00:12:10,870 --> 00:12:18,190 For anyone who knows it, it's the Royal Free and Google... 136 00:12:18,190 --> 00:12:21,760 What happened was they decided to work on an application together. 137 00:12:21,760 --> 00:12:27,820 It would look at symptoms and predict serious conditions requiring treatment. 138 00:12:27,820 --> 00:12:33,910 They didn't have a data agreement in place. A data agreement says 139 00:12:33,910 --> 00:12:37,750 'as a hospital we'll provide this', 'as a tech company 140 00:12:37,750 --> 00:12:44,170 we'll do that' - we have everything agreed before anything (bad) happens. 141 00:12:44,170 --> 00:12:49,000 What [developers] generally do is test on information that is not live, 142 00:12:49,000 --> 00:12:54,970 not being used for something critical, like care in an emergency ward. 143 00:12:54,970 --> 00:12:59,110 The app that they made was tested on live patient data (no consent). 144 00:12:59,110 --> 00:13:03,800 The UK Information Commissioner's Office is an office that looks at situations 145 00:13:03,800 --> 00:13:08,020 and makes a ruling on whether or not the way that data was used was legal. 146 00:13:08,020 --> 00:13:11,410 And they said, 'no, this is bad'. 147 00:13:11,410 --> 00:13:18,690 Doctors have a duty of care to patients. I think about duty of care in security. 148 00:13:18,690 --> 00:13:23,173 I have a duty to care for data that people entrust to me; 149 00:13:23,173 --> 00:13:27,630 if I damage that trust, it's on me to show that I am trustworthy. 150 00:13:27,630 --> 00:13:35,270 It's not on the people giving me data to do anything; placing trust is up to them. 151 00:13:35,270 --> 00:13:39,880 I've seen ethics processes; they can take a while, they can be frustrating. 152 00:13:39,880 --> 00:13:45,700 Ultimately, the reason we do this is so that we don't cause harm. 153 00:13:45,700 --> 00:13:47,590 We shouldn't be wanting to take shortcuts. 154 00:13:47,590 --> 00:13:54,850 I make ethical work easier for researchers by giving them a roadmap. 155 00:13:54,850 --> 00:14:01,420 You don't need to be an expert in ethics. What do you want from participants? 156 00:14:01,420 --> 00:14:04,130 Research data? They also have perspectives of their own. 157 00:14:04,130 --> 00:14:11,620 They might be able to help or provide ideas. Look for input and filter it later. 158 00:14:11,620 --> 00:14:17,110 Once you've done inclusive work, you need responsive work (give feedback) 159 00:14:17,110 --> 00:14:24,130 to show that you're worth the trust that people have put in you as a researcher. 160 00:14:24,130 --> 00:14:27,100 Which is a bit touchy-feely, but very important. 161 00:14:27,100 --> 00:14:33,810 I would like to make it easier for others to demonstrate trustworthiness as well. 162 00:14:33,810 --> 00:14:44,310 CLAUDINE: Developing that roadmap, did you hit any speed bumps? 163 00:14:44,310 --> 00:14:47,790 ARI: Oh, yeah, totally. It was going to be a long, very involved experiment. 164 00:14:47,790 --> 00:14:52,710 We didn't have time, so we just did it all at once. The work lacks granularity.
165 00:14:52,710 --> 00:14:58,890 Then the pandemic hit - there are lots of different ways this work can branch out. 166 00:14:58,890 --> 00:15:03,420 I did what I could and hopefully other people will take it on. 167 00:15:03,420 --> 00:15:08,460 The lessons I've learnt from this I will take on to whatever I do next. 168 00:15:08,460 --> 00:15:13,590 CLAUDINE: What do you think the most important lessons are? ARI: Listen. 169 00:15:13,590 --> 00:15:17,760 I, as the expert researcher, build a study for people to take part in. 170 00:15:17,760 --> 00:15:24,210 I have to think about how I'm including people, how I'm listening and feeding back. 171 00:15:24,210 --> 00:15:31,260 Weave informed consent into the process for a relationship over time. 172 00:15:31,260 --> 00:15:33,720 That can be quite intensive as well, resource-wise. But if this isn't in place, 173 00:15:33,720 --> 00:15:42,100 people can never express what they want to - something that might help. 174 00:15:42,100 --> 00:15:49,470 My collaborators have this patient forum. People came up to them: 175 00:15:49,470 --> 00:15:56,400 'Have you ever thought about mental health, not just the physical parts?' 176 00:15:56,400 --> 00:16:03,720 My collaborators learned from participants as well as doing their work. 177 00:16:03,720 --> 00:16:08,820 [It] makes your data collection much more valuable, having that context. 178 00:16:08,820 --> 00:16:11,910 Not only do you receive the information about symptoms, 179 00:16:11,910 --> 00:16:17,830 you also get information about people's lives and living conditions. 180 00:16:17,830 --> 00:16:23,130 CLAUDINE: It's easy to get bogged down in technical minutiae, 181 00:16:23,130 --> 00:16:29,850 and forget that a lot of the research we're doing is for people, for users, 182 00:16:29,850 --> 00:16:35,190 and we need to listen to them and pay attention to their needs as opposed to, 183 00:16:35,190 --> 00:16:39,570 'THIS is what you need'. ARI: It can be a lot, but I've been a software developer, 184 00:16:39,570 --> 00:16:41,970 and I know you can apply this to that kind of work. 185 00:16:41,970 --> 00:16:48,930 E.g., when you build something, do user testing. Identify who's going to use this 186 00:16:48,930 --> 00:16:55,650 and invite a few people in to have a go. You can troubleshoot and identify issues earlier 187 00:16:55,650 --> 00:17:00,660 that would be more expensive to fix later. CLAUDINE: Yeah, everybody wins. 188 00:17:00,660 --> 00:17:05,160 It's [potentially] more laborious, but benefits everyone in the long run. 189 00:17:05,160 --> 00:17:10,800 If you had infinite resources, whatever resources might mean, 190 00:17:10,800 --> 00:17:17,100 what is the one big question you would like to investigate or solve? 191 00:17:17,100 --> 00:17:21,000 ARI: It's all about culture. How do we build cultures in which people 192 00:17:21,000 --> 00:17:25,300 feel comfortable bringing ideas, bringing mistakes, their experiences? 193 00:17:25,300 --> 00:17:29,900 It's very much about leadership. As a cybersecurity person, I have to lead 194 00:17:29,900 --> 00:17:33,660 the work that I do because I'm not an expert in lots of different subjects, 195 00:17:33,660 --> 00:17:36,960 but my job is to bring people who are experts into the group. 196 00:17:36,960 --> 00:17:40,950 The expert that so many of us miss out is the user. 197 00:17:40,950 --> 00:17:48,510 They have such a wealth of information and knowledge and experience.
198 00:17:48,510 --> 00:17:53,190 My collaborators leveraged that experience and it worked out for them. 199 00:17:53,190 --> 00:17:57,870 It was valuable. If I had all the resources, I would scale that up. 200 00:17:57,870 --> 00:18:01,950 How do we communicate effectively? How do we build those relationships? 201 00:18:01,950 --> 00:18:07,800 I don't expect every person in a study I'm running to call me every evening. 202 00:18:07,800 --> 00:18:11,820 It's just about having that architecture in place, having that system in place, 203 00:18:11,820 --> 00:18:18,120 where people can tell you information and feel that you value them. 204 00:18:18,120 --> 00:18:22,590 As a result, they're more likely to come forward with helpful information. 205 00:18:22,590 --> 00:18:27,400 That's pretty abstract, I realise. In terms of cybersecurity experts, it's about 206 00:18:27,400 --> 00:18:33,670 fostering leadership and knowing who to include as part of initial conversations. 207 00:18:33,670 --> 00:18:35,130 For people who are not cybersecurity experts - how do you equip them 208 00:18:35,130 --> 00:18:43,590 to be a bit braver? How do you equip them with tools they can then use? 209 00:18:43,590 --> 00:18:51,730 It's about advocacy. What do people need to be security advocates elsewhere? 210 00:18:51,730 --> 00:18:57,240 Bit of a step away from my research - it's not about me being 'the expert'. 211 00:18:57,240 --> 00:19:03,630 It's about helping other people figure out how security is relevant and useful. 212 00:19:03,630 --> 00:19:07,330 And then, weaving that into the work they do. 213 00:19:07,330 --> 00:19:12,600 CLAUDINE: What cybersecurity resources would you recommend? 214 00:19:12,600 --> 00:19:16,920 ARI: I only have Twitter, so I'm @schulite. 215 00:19:16,920 --> 00:19:21,000 I get to see the thoughts of people I wouldn't normally have access to. 216 00:19:21,000 --> 00:19:23,700 If you are not someone who likes to use social media, 217 00:19:23,700 --> 00:19:27,390 The Register is really good (https://www.theregister.com/security). 218 00:19:27,390 --> 00:19:32,990 You can keep up to date with tech news, and you can branch out from there. 219 00:19:32,990 --> 00:19:36,200 CLAUDINE: In case you missed it, Ari's Twitter handle is @schulite - follow her, 220 00:19:36,200 --> 00:19:41,750 she posts very interesting things. ARI: I do. CLAUDINE: Yes, you do. *laughter* 221 00:19:41,750 --> 00:19:48,920 In the meantime, you can tweet at us @HelloPTNPod 222 00:19:48,920 --> 00:19:53,210 and you can subscribe on Apple Podcasts or wherever you listen to podcasts. 223 00:19:53,210 --> 00:19:59,070 The title there is PTNPod. See you next week. [ARI: Bye!] 224 00:19:59,070 --> 00:20:04,260 This has been a podcast from the Centre for Doctoral Training in Cybersecurity, University of Oxford. 225 00:20:04,260 --> 00:20:09,130 Funded by the Engineering and Physical Sciences Research Council.