ARI: Hello, I'm Ari. [CLAUDINE: And I'm Claudine.] Welcome to Proving the Negative. We're a podcast all about exploring the different sides of cybersecurity, from politics to computer science, international relations to mathematics. Join us as we talk to our friends about the work they do.

JULIA: I wrote down in big notes here, "trigger warning", because I start any talk on this subject with a notice reminding people that it can be disturbing whether or not you've experienced [intimate partner violence, IPV], but especially if you have experienced it. Remember that you can stop listening at any point and return to it later, and remember to take care of yourself.

Hi. So, first of all, thank you so much for having me on your podcast. I'm a PhD student at the Oxford Internet Institute and the Cyber Security CDT. I'm researching how people abuse technology for coercion and control in intimate relationships: cybersecurity in the context of intimate partner violence. I look at the ways in which people abuse technology to stalk their partners, or to coerce or control them through image-based sexual abuse or surveillance, and at how technology producers and the people who maintain technology can anticipate and respond to these forms of abuse. More broadly, I look at how feminist theories and practices can inform technology design. Some of the solutions I'm exploring draw on participatory methods and on a concept called abusability, which expands security to think about abuse.

ARI: Perhaps we could start with abusability. Could you tell us more?

JULIA: Abusability is a concept that comes from Ashkan Soltani, who used to be at the Federal Trade Commission in the US. He coined it in contrast with conventional concepts like security or usability. Usability is about how easy it is for a user to navigate a technology and use it, how intuitive the design is for accomplishing whatever their ends are. Security is about how to stop people from penetrating the system, exfiltrating data, or using it in a way that wasn't intended (causing risk or harm). Abusability shifts that: it asks not just how someone might hack a system, but how someone might use a system broadly in the way it was intended, using features as they were designed [to be used], but in a way that causes harm to someone. For example, think about the app "Find My Friends".
Conventional security might have thought about how someone completely external to the system could hack in and get everyone's locations. But abusability would think about how someone might use that app to spy on friends or family: coercing them to share their location, and then using that knowledge to monitor them constantly, make them feel watched, and control where they can and can't go. It flips usability on its head, because we tend to think that technology should be as easy to use as possible, but in this case we want technology to be less easy to use, for certain people and for certain ends.

ARI: How have you approached the challenges within your research?

JULIA: I think that's such a great question; thinking through the challenges in your research in conversation with others is really valuable to do. One big challenge that I've been grappling with constantly, and actually keep returning to even when I've thought I'd moved away from it, is that in the basic framing of a lot of technology research (thinking of us being in the cybersecurity programme), the research questions tend to be "How can we make technology better?" I've taken on that thinking through the concept of abusability, which is very much about "How can we design better technology?" But a lot of complex social problems can't necessarily be solved by technology, and so it's about striking a balance. There are fundamental problems in how technology is being designed, and it contributes to the problem, yet it is hard to talk about new solutions without saying "We need more tech". The biggest changes to the broader problems of domestic violence and intimate partner violence need to come in how we allocate resources: allocating better resources towards care systems (domestic violence advocates, shelters, better housing in general) so that people aren't stuck in abusive relationships because they depend on the relationship for housing.

ARI: Beautifully put. What is it that you are curious about?

JULIA: What I've been really curious about in the past few months [are the] care practices of advocates who support survivors of domestic violence on the front lines, in places like domestic violence shelters. I've also been speaking to people from the traditional technology space (e.g., software engineers) who have taken this on, supporting survivors;
to people in the digital privacy rights space who've taken this on; and even to people in hacking collectives who've decided to make this their focus (people who aren't at all in the traditional domestic violence space). Something really interesting is happening at the intersection of those different forms of expertise. On the one hand, there is the cybersecurity expertise of people who think about systematically modelling and responding to technology threats and building safer systems. On the other hand, there is the care expertise of people who think about trauma, healing from trauma, and protecting yourself from the vicarious trauma that comes from working with people who are traumatised. That group is suddenly having to learn about VPNs and password managers, just because it's increasingly a part of what they're seeing in their jobs, and the tech people are having to learn about trauma and about care. It's really hard to disentangle those two kinds of knowledge: when you're helping survivors, you need both. The security that comes out of that is really, really different to the conventional way we think about cybersecurity.

One thing that's come up again and again in my interviews is that local advocates will often build relationships with local [experts], for example tech support at their Best Buy or local mechanics: reaching out to them and explaining domestic violence, the kinds of situations survivors are in and their specific needs, so that if they have a survivor who is experiencing stalking, they're able to refer them to a mechanic who can search their car for trackers. Mechanics know how a car works and where it's easy to hide something. But just having that knowledge isn't enough to help a survivor, because if you don't have that training and understanding, you might tell a survivor that they're being crazy, or treat them in ways that can retraumatise them. Building those relationships and those networks of care in the local community is a really interesting way of doing cybersecurity or digital security.

CLAUDINE: How have you seen dynamics change in the context of the pandemic?

JULIA: The pandemic affected people living in abusive relationships, particularly those cohabiting with their abusers, by trapping them in their homes, while at the same time a lot of the services that support them (e.g., counselling and advocacy) were forced to close and then struggled to move online.
In situations where there is technology abuse, people's devices may be being monitored, so how do you provide services? That's a problem that domestic abuse advocates have been at the forefront of, but it actually affects every single online service. [When] you're setting up online counselling, you need to be thinking about the people you're providing that counselling for. Similarly for court systems that have moved online: there was a case in the US, caught on camera, where a survivor who was trying to give testimony was being intimidated off camera by an abuser. Accounting for that domestic abuse situation has to be a part of security practices. It will always be the case that some people are trapped in a domestic situation with an abuser, and that is a really important thing for online services to consider. People have found creative ways to approach [this]: incorporating some level of checking at the beginning of an online conversation ("Are you alone in the room? Do you trust this device?"), and offering as many different service options as possible, e.g., an email option, a text option, a phone option... It's a mentally straining situation for both advocates and survivors to be in, knowing that some people just won't have a safe space, but you still need to provide those services.

ARI: Is this a particular threat to be modelled, or does the way that we think about risk and how we're designing these systems need a bit of an overhaul?

JULIA: I've used the language of threat modelling in my work a lot, and I've found it an intuitive way to explain what's happening in terms that technology designers and security experts will understand: whenever we design a technology or a service (and also, generally, in everyday situations), we have certain models of threats in our minds, and intimate partner violence just doesn't get considered. Thinking about that threat model, understanding it and incorporating it, is an important part of the solution, although threat modelling as a process can have downsides as well. Becky Kazansky has written a really interesting paper about threat modelling with activists and other communities that are targeted by state surveillance.
I've done participatory threat modelling with different communities, including activists, and what she describes is that it can be an overwhelmingly negative process: thinking through every single thing that can go wrong and every single thing that's threatened, without pairing it with positive values or the positive things you're aiming for, is psychologically really challenging. There are modes of being that are quite anxious, and that is part of the way security is as a field. Always thinking about risk isn't the best thing to do if you're trying to create change in the world, or to heal from trauma. So finding ways in which the IPV threat model is considered and incorporated, but which don't require those who are most at risk to be constantly dwelling on and explaining those risks, is really important.

CLAUDINE: In your intro, you said that you apply feminist theory to your work. Can you give us an overview of what that means?

JULIA: What kicked off my thinking about threat modelling was a set of feminist critiques of international relations theory and security studies, of a binary in the way we understand the world[s] of the personal and the political. It's quite an old feminist slogan that the personal is political, and breaking down that binary changes how we think about the world. Cynthia Enloe talks about how we think of international relations as something that happens at the level of the UN or nation states: usually men in suits debating the end of a war, invasions of one country by another, or trade and economic relationships. We wouldn't necessarily think of international relations as something that happens at a plantation in the Bahamas owned by a multinational corporation, among the workers who work there, or at a camp of sex workers in South Korea outside a US military base, and the ways that power is exercised there. We wouldn't think of that as security or international relations [IR], yet [Enloe] chose that as a way to study IR, rather than diplomatic negotiations. A similar thing happens in cybersecurity when we think of it as something that happens within businesses, militaries defending their assets, or states spying on each other or penetrating each other's networks, but [we] don't think of cybersecurity as something that happens at the level of teenagers sexting each other and sharing those images non-consensually.
Similar moves toward separation, toward binaries, happen there. I remember that at the beginning of my time at the CDT, I would ask people whether they thought non-consensually shared images were a cybersecurity issue, and they would say it's a privacy issue, or not really answer the question. I've learned since then that there is a whole community of privacy engineering that cares about privacy a lot, and this isn't to discount them. But 'privacy' can be used as code for "this isn't serious security research, this is an 'other' thing", which is a problem. In practice, this means that in cases of image-based sexual abuse (non-consensual pornography), the response will be "You shouldn't have shared that image in the first place", rather than "How do we build better, more secure technology that gives you more control over your images?", which is what we do for companies. We would never tell a company "Don't share so much sensitive data"; we think about how we can build better technology for them.

Feminism is a theoretical approach, an activist movement... it means a lot of things to different people. One thing that has started happening in recent years, which I think is helpful, is to think of feminisms rather than a singular, unified feminism. That draws on ideas of plurality or intersectionality: depending on the other identity factors you have, and depending on where you come from and stand in the world, what feminism means to you and the feminism you practice will be different. That's also true for other academic theories or political movements; words like liberalism or socialism mean different things to different people.

ARI: It's wonderful to get input from people who are not 'us'. As researchers, there's a contract with society; we have privileges not afforded to everyone. By privilege [I mean that] if I ask someone about their personal life or their cybersecurity habits, they're more likely to tell me because I'm a researcher and I've got [ethics] approval. What are participatory methods, and why do we use them?

JULIA: Participatory approaches to research come from different areas: they're traditionally quite common in feminist research, they also came out of activism in South America, and there are Scandinavian worker-based approaches to participatory methods.
Traditional research has binaries of the expert researcher and the non-expert other: the expert collects data about their subjects and writes it up, or, if it's a subject that doesn't involve humans, like computer science, it's just expert study that goes on to inform public policy decisions. Participatory methods erase that hierarchy, or at least mitigate it a little, balancing the relationship between researchers and participants by involving participants in as many different stages of research as possible. That can involve reaching out to people who are outside of research to frame your research questions; after data is collected, it can include reaching out to your participants and involving them in the data analysis process (asking, "What kinds of patterns do you see here?"). It can involve joining your participants' initiatives to help them improve their lives through education, or, if they're campaigning on a certain cause, thinking about how your research can better support that cause. There's a risk in being too idealistic, though, because thinking of the researcher's role, you can't fully erase those differences and hierarchies.

In my own research, we were running digital privacy and security workshops with different groups: the general public, different activist groups, a group representing survivors of image-based sexual abuse, environmental groups... In those workshops, we tried to make them two-way, creating a safe, fun community space rather than just teaching digital privacy and security. We first asked people about their threat models: what they were worried about, what they wanted to defend against. In asking that, we were learning about their threat models. I mentioned the psychological burden that threat modelling can impose, so we would follow with a tech support session, with [experts] on hand to help people take action on the threats or concerns they saw in their lives. We very much wanted those support people to support people: rather than being there as an expert to tell you what you need to do, they help you access resources, sit with you to download Tor, and look up issues with you.

At later stages we've been able to be even more participatory. We're collaborating with 'The Voice of Domestic Workers', and we're planning a kind of data walk-through where some of the people who were in the workshops will help us analyse the data.
We'll walk through some of the data together and hear their responses and thoughts.

ARI: Do you have any tips for keeping up to speed with cybersecurity?

JULIA: I've had good experiences with online conferences and webinars. I joke about whether future generations will look back at this moment and see it as the 'golden age' of the webinar (they probably will not). But there is a lot of amazing and interesting work being put online now, where you can hear conversations from people who have very different life experiences or approaches, and their thoughts on online security and privacy; for example, RightsCon or USENIX PEPR (Privacy Engineering Practice and Respect).

ARI: Where can our listeners find out more?

JULIA: I've published a paper on participatory threat modelling, which goes deeper into participatory approaches to threat modelling. And, to Claudine's earlier question, I've also looked at IPV, specifically at survivors who are in lockdown with their abusers, and at lessons from that threat model for building better digital security practices.

CLAUDINE: Join us next week for another fascinating conversation. In the meantime, you can tweet at us @HelloPTNPod, and you can subscribe on Apple Podcasts or wherever you listen to podcasts; the title there is PTNPod. See you next week.

ARI: Bye!

CLAUDINE: This has been a podcast from the Centre for Doctoral Training in Cybersecurity, University of Oxford, funded by the Engineering and Physical Sciences Research Council.