ARI: Hello, I'm Ari. [CLAUDINE: And I'm Claudine] Welcome to Proving the Negative. We're a podcast all about exploring the different sides of cybersecurity, from political to computer science, international relations to mathematics. Join us as we talk to our friends about the work they do. All right Claudine, this week we're doing [you... oh no, phrasing!!].

CLAUDINE: Oh, please leave that in.

ARI: Hello everyone, welcome back to PTNPod - Proving the Negative, Swanning About in Cyber Security. We're going to intro, just like we did last time. Claudine, why do you do the work that you do, in the grand scheme of things?

CLAUDINE: The Online Safety Bill's been in the news quite a bit recently. Its purpose is to protect users from harm they might encounter on social media. The new and controversial bit of the Bill is referred to as 'legal but harmful' content. This is content on social media that is not illegal, but might still be harmful to users. The Bill is trying to deal with illegal content, but it also imposes a duty of care. Ari, you mentioned a 'duty of care' in your episode... this is slightly different. [The Bill] imposes a duty of care on social media providers and search engines, or at least the very large ones, to protect their users from harm from legal content.

ARI: We talk about qualitative and quantitative data - why is this contentious?

CLAUDINE: Quantitative data tends to be seen [in STEM fields] as better (e.g., statistics). Qualitative data is a little bit different. It tends to be done on smaller samples. (You shouldn't impose any preconceived notions...!) I tend to think that the best approach is mixed methods. There are situations in which quantitative data is absolutely appropriate, and cases where it's not. In those situations, you need to use qualitative analysis.

ARI: We will also talk about dark patterns, or deceptive design.

CLAUDINE: They're meant to nudge you to take a specific action... when you're shopping online and it's hard to remove something from your cart, or you can't find the unsubscribe option in an email that you're getting for a newsletter.

CLAUDINE: Hi, I'm Claudine. I'm based in the Human Centred Computing Group at Oxford. I study the effect of social media use on well-being. When we use social media, we have emotional responses to content we see. We might respond a specific way to certain types of content.
One aspect that tends to be overlooked is the effect of individual context: socio-economic background, your gender identity, ethnicity, where you grew up, your native language... all those different factors, that's your background context. Then there's individual context that varies depending on your mood, personality... If you're feeling bad because you've had a professional letdown, and then on Twitter a colleague has posted about a success in their professional life, you might feel bad in that moment. But there's nothing objectively wrong with that content. You might feel completely differently about it at a different point in time.

I study mundane social media content that is not considered illegal or harmful. Individual differences impact how we respond to social media content. I try to figure out how people develop strategies to address those emotions and what user controls we can provide to make that experience more seamless. Something I hear a lot is [that], by looking at it from this angle, I'm putting burden and stress on the user to manage their own experience. That's fair. But there are cases in which it is the responsibility of the platform to manage and to moderate. When it comes to personal and individual preferences, users should have the appropriate tools at their disposal.
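[Note: the episode doesn't describe what such tools would look like in practice. As a rough, hypothetical illustration of a finer-grained user control than permanent blocking, the sketch below pairs a muted topic with an expiry time, so content can be hidden temporarily while someone is in a vulnerable mood. The MuteRule structure, the keyword matching and all names are assumptions made for illustration, not anything described in the episode.]

```python
# Hypothetical sketch: temporary, topic-level muting instead of permanent blocking.
# Nothing here comes from the episode; the rule structure and matching are assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class MuteRule:
    topic_keywords: set            # e.g. {"promotion", "award"}
    expires_at: datetime           # the mute lapses automatically
    reason: str = ""               # e.g. "feeling low after a professional setback"


def is_muted(post_text: str, rules: list, now: datetime) -> bool:
    """Return True if the post matches any still-active mute rule."""
    words = {w.strip(".,!?") for w in post_text.lower().split()}
    return any(now < rule.expires_at and rule.topic_keywords & words for rule in rules)


# Usage: hide career-success posts for 48 hours instead of unfollowing a colleague.
rules = [MuteRule({"promotion", "award"}, datetime.now() + timedelta(hours=48))]
print(is_muted("So thrilled to announce my promotion!", rules, datetime.now()))  # True
```

A control along these lines would let the user, rather than the platform, decide what counts as unwelcome content and for how long, which is the distinction Claudine draws between platform moderation and personal preference.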
ARI: What is it that you're curious about? This seems like a really interesting mix of social and technical understanding.

CLAUDINE: Good question. I think I just side-stepped into it accidentally. I became interested in social media and impacts that aren't often talked about. We talk a lot about 'big bad harms': exploitation, harassment, cyberbullying. A lot of negative experiences that I was having had nothing to do with that. A lot of it had to do with feeling bad after reading the news, or feeling stressed because other people seemed to be doing better than I was. On a selfish level, I started wondering how we could manage this and help others manage slow and incremental harms that build up over time.

ARI: What do you measure?

CLAUDINE: That's a perpetual struggle! There's no one way to do it. I try to take as broad a view as I can, parsing out individual characteristics that could influence whether someone is likely to [experience harm] around (COVID-19) news content. I'm working on a project about the negative impact of social media on people who have post-traumatic stress disorder and how they cope with that. I'm working on a mobile experience sampling method, which allows us to collect data in real time from users as they're using social media.

ARI: I don't know anything about PTSD.

CLAUDINE: For people who don't know, PTSD - post-traumatic stress disorder - is a condition that can occur in someone who has experienced a traumatic event or a series of traumatic events. It can lead to long-term, potentially debilitating psychological effects that can be triggered by something in their environment that reminds them of the event (or some aspect of the events). To be diagnosed with PTSD, there are a number of criteria that someone would have to fulfil. If someone has developed PTSD from a very serious car accident, the experience is the physical experience of having been in the car accident. Witnessing news footage of a serious car accident, which has a negative psychological impact, would not qualify someone for PTSD. You could (clinically) experience PTSD if the event happened to a loved one, and first responders can develop 'vicarious' PTSD from repeated exposure to events.

During the pandemic, I ran a national survey about COVID-19 social media use (exchanging news or information, or looking at content about COVID-19). The PCL-5 is a standard screening questionnaire for PTSD (not a definitive diagnostic tool, I should make that clear). I asked [people] to fill out that [survey]. What we found was that there were types of social media use more closely associated with PTSD symptoms - symptoms that would have put somebody above a certain score threshold, likely to have PTSD and to be referred for further examination. We screened out individuals who were first responders, had had COVID-19, or had a close friend or family member who had been seriously ill from COVID-19. Active use was more clicking, posting and interacting with social media. People who used social media passively (scrolling and browsing) showed higher levels of post-traumatic stress symptoms.

There hasn't been a lot of research on social media triggers. Existing controls - filtering or blocking content - are not very fine-tuned. This is not specific to people who have experienced trauma or are dealing with PTSD. This seems to be a case study that could be representative of other groups.
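[Note: for readers unfamiliar with the PCL-5, it has 20 items, each rated 0 ("not at all") to 4 ("extremely"), giving a total score of 0-80; a total somewhere around 31-33 is commonly used as a provisional cut-point for probable PTSD warranting further assessment, though the exact threshold varies by study and population. The sketch below only illustrates that screening step; the cutoff choice, function names and toy data are assumptions, not the study's actual analysis.]

```python
# Illustrative only: scoring PCL-5 responses against a provisional cutoff.
# The cutoff value and the toy respondents are assumptions, not study data.
PCL5_ITEMS = 20          # items rated 0 (not at all) to 4 (extremely)
PROVISIONAL_CUTOFF = 33  # commonly cited 31-33 range; varies by study and population


def pcl5_total(item_scores: list) -> int:
    """Sum the 20 item ratings, validating the response set first."""
    if len(item_scores) != PCL5_ITEMS or not all(0 <= s <= 4 for s in item_scores):
        raise ValueError("expected 20 item scores in the range 0-4")
    return sum(item_scores)


def above_threshold(item_scores: list) -> bool:
    """A screening flag only - not a diagnosis, as the episode stresses."""
    return pcl5_total(item_scores) >= PROVISIONAL_CUTOFF


# Toy comparison of a passive and an active user (invented numbers for illustration).
respondents = [("passive", [2] * 20), ("active", [1] * 20)]
for style, scores in respondents:
    print(style, pcl5_total(scores), above_threshold(scores))
# passive 40 True
# active 20 False
```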
ARI: Social media giants don't have a history of collaborating with researchers. There was a study that manipulated people's news feeds to see how it would change their emotions - they looked at how they could impact how people felt. It didn't have any ethical oversight. Hopefully, we're doing better. What does your work look like?

CLAUDINE: Mobile experience sampling isn't done through the platforms directly. We are developing an app that users will download on their phones. It will prompt them throughout the day. The app runs in the background of their phone, and then a few times a day they'll get a pop-up while they're using social media. They'll be asked to complete a questionnaire about the type of content that they're looking at, how it's making them feel and what else they're doing - for example, are they using social media on a break or while they are working? The reason we ask about this is, again, it goes back to context. It's critical to understanding how and why people react the way they do. [Social media] use doesn't happen in a vacuum, right? It happens in a physical/cyber environment, and context is hard to discern. We will ask specifically about the other things that they're doing and some of the other things happening in their life throughout the course of the study. We will collect data in the morning and the evening. We will ask questions about any significant changes that day, and if they see something on social media that elicits a particularly strong emotion, they can press a button to bring up a survey at that time. At the end of several weeks, the idea is to see if there are any strong correlations between environmental factors and emotional responses.

We are getting around the social media platforms by doing it this way. And as you rightly pointed out, we do need ethical review for a study like this. There are certainly some major data protection implications. We need to be mindful of that while we're designing the study. We are ensuring security, privacy and (GDPR) compliance by design.
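[Note: to make the experience-sampling design a little more concrete, here is a minimal sketch of the kind of schedule described above: a few randomly timed prompts per day that only fire while a social media app is in the foreground, fixed morning and evening diary prompts, and a button for self-initiated reports after a strong emotional reaction. The class names, fields and probabilities are hypothetical; the actual study app will differ.]

```python
# Minimal sketch of a signal-contingent experience-sampling schedule.
# Assumes a hypothetical app that can tell when social media is in the foreground;
# all structures and names are illustrative, not the study's real code.
import random
from dataclasses import dataclass, field
from datetime import datetime

MAX_RANDOM_PROMPTS_PER_DAY = 3   # "a few times a day"
DIARY_HOURS = (8, 21)            # fixed morning and evening diary prompts


@dataclass
class Report:
    timestamp: datetime
    trigger: str                 # "random", "diary" or "self_initiated"
    content_type: str = ""       # what the participant was looking at
    feeling: str = ""            # momentary emotional response
    situation: str = ""          # context, e.g. "on a break" or "while working"


@dataclass
class Scheduler:
    prompts_today: int = 0       # reset of the daily counter omitted for brevity
    reports: list = field(default_factory=list)

    def maybe_prompt(self, now: datetime, social_media_open: bool) -> bool:
        """Fire a random prompt during social media use, or a diary prompt at set hours."""
        if social_media_open and self.prompts_today < MAX_RANDOM_PROMPTS_PER_DAY:
            if random.random() < 0.1:        # small chance per check spreads prompts out
                self.prompts_today += 1
                self.reports.append(Report(now, "random"))
                return True
        if now.hour in DIARY_HOURS and not any(
            r.trigger == "diary"
            and r.timestamp.date() == now.date()
            and r.timestamp.hour == now.hour
            for r in self.reports
        ):
            self.reports.append(Report(now, "diary"))
            return True
        return False

    def self_report(self, now: datetime) -> None:
        """The 'press a button' path for a particularly strong emotional response."""
        self.reports.append(Report(now, "self_initiated"))
```

At the end of the study, these timestamped reports and their context fields would be the raw material for the correlations between environmental factors and emotional responses that Claudine mentions.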
ARI: How have you approached the biggest challenge in your research?

CLAUDINE: Since I don't come from a technical background, it can be daunting to go for projects that have a strong technical component, collaborating with people who have a software engineering or computer science background. The biggest challenge that I have is more of a technical one. I'm at a stage in my training where I know enough to get myself into trouble, but I don't know enough to get myself out of trouble. Trying to go it alone is a bad idea - if you don't collaborate and seek help, you're just going to be in for a world of hurt.

ARI: When we talked to Mary, she talked about having confidence in what you want to do; you can use that to hang other people's expertise off of. You direct your work and draw in the techies to support you. Are technical people more confident that they know what is right/correct? You have to explore questions in your work, and the answer presents itself.

CLAUDINE: I found that to be an interesting difference. I come from a background where it's all about shades of grey in a lot of cases, right? Politics is the art of the possible, and law is very much... not about what is objectively right as much as what you can argue or what you can prove. That's a different mindset to what I've experienced of people around me who have a clear-cut idea about what they want to do and the approaches to take. I find that sometimes it can take a little bit of work to get on the same wavelength, and that was a challenge that I had to learn to adapt to.

ARI: When you say 'qualitative' data or methods, why is that different? Why is that something that presents a challenge?

CLAUDINE: In the research I do, a lot of it is exploratory, which sounds weird because you'd think that social media is something that's been studied to death. On the scale of things, it's actually a fairly new field. Social media has been around for 20 years or so and has only been widespread since Facebook. I was a teenager before I got my first Facebook page (well into high school). The areas of it that have been studied cluster around certain topics and around specific sub-populations (e.g., body image/eating disorders in teenage girls). When you're doing qualitative analysis in certain areas of social media, you do occasionally end up with these spaces where there's just nothing. There hasn't been much research around PTSD and social media triggers. Other areas and sub-populations really haven't been studied all that much. You can end up navigating in the dark, a little bit!
You're starting off with some research questions and assumptions, and if you have a grounded approach - collecting data before extracting ideas - you're not going in with any preconceived notions.

ARI: It sounds like you're comfortable with uncertainty.

CLAUDINE: I occasionally have stress dreams about it, but other than that...

ARI: About social media in particular, or computer scientists?!

CLAUDINE: All of the above... *laughter*

ARI: I can't think of anything that lets me filter what I see. In addition to the actual content, you've also got advertising. That is content you have zero control over... What's been the biggest surprise?

CLAUDINE: It's been an accumulation of little ones: being OK with uncertainty in a situation where you feel like the minority. I don't have a technical background and I am in a more technical field. General uncertainty in terms of thinking, when I'm doing research or starting out. It still happens to me very regularly that I look at something and go, "Oh... I don't know how to do X, how to code this... but I don't have a choice... this is the only framework that exists to do X." Teaching myself to do it, I think, "Hmm, do I actually... do I actually know what I'm doing here?" That's always a little bit of a struggle for me. It's not the biggest single challenge, but it is a constant to manage, because if you don't, it can become paralysing. If you allow yourself to be too uncertain and too insecure, nothing gets done. On the other hand, if you don't have a healthy level of self-reflection, you can end up doing something that is not sufficiently rigorous or appropriate. It's figuring out where the balance of self-doubt and self-assurance is. I'm always, even now (and I'm at a fairly late stage at this point), trying to be OK with that level of uncertainty and risk, and knowing that sometimes you just don't know. It is kind of a daily struggle. I'm just fascinated by the 'squishy' bits. I'm interested in how life shapes the way we think about, and interact with, technology.

ARI: How do the squishy bits relate to cybersecurity? Why are they important?

CLAUDINE: If you look at a really basic notion that cybersecurity has to do with protecting the security of information systems infrastructure, this isn't strictly speaking cybersecurity.
However, there has been a shift, largely due to the advent of social media and connected digital devices: humans are increasingly permanently connected to technology - in some cases, 24 hours a day, 7 days a week. There's been an expansion of cybersecurity to include more and more of a human-centred focus. When we look at the individual and the way an individual experiences social media, for instance, that is more what I would consider cyber safety: the general psychological and emotional safety of being online. That's becoming an increasingly important aspect of being safe in general and of being secure in general. Cybersecurity is going to include cyber safety as a notion. There is some debate right now as to whether cyber safety and cybersecurity should be considered part of the same general group. I think they should. Some of our more technically minded friends might disagree with me... Given how connected humans are to technology in general, they should at this point be considered part of systems and infrastructure.

ARI: I would disagree with you saying what you do is not cybersecurity. 'Cyber' comes from this [idea of] 'cybernetics', a self-governing system. You have a complex system of systems; there are all sorts of interactions going on. This presents a massive... I would call it a threat surface - there are all sorts of things that could go wrong. This surface is quite wide. You can't ignore safety. To find vulnerabilities and gaps and places where things could go wrong, you have to understand that there are different moving pieces. People are, of course, a really important part of those systems. Cyber safety - that's incredibly important. The nature of what we do is interdisciplinary; it doesn't fit into preconceived buckets.

CLAUDINE: The types of people with a valuable (cybersecurity) skill set have changed significantly, even just since I started in 2018. It's really important to be flexible. There are always going to be tensions around what's considered part of the field, what isn't, and where we draw the line in terms of being inclusionary or exclusionary.

Last year, I was working on a project circumventing dark patterns in apps. This plays into social media - as you pointed out earlier, Ari, to do with ads. Dark patterns can play a huge role in targeted advertising, and often do.
I wanted to figure out how people felt about dark patterns - whether users necessarily felt that dark patterns were bad. Even the words 'dark pattern' sound menacing and ominous. It's really hard to ask people about dark patterns. They're hidden, not obvious... We can ask about them conceptually, but that's not asking about experiences. That's why I'm doing experience sampling now. You can ask people a questionnaire about their social media use and ask, "What types of social media content make you angry, sad, happy...?" and "What do you think were factors in your life that influenced that?" I can ask those questions, but you're never going to get the precision of data [that you get by] asking people in the moment, when they're experiencing the emotion or using the technology. You're never going to get the same level of clarity about context in hindsight.

ARI: There's this whole conversation we have about privacy. [Experts] say people don't care about privacy as much as they SAY they do, because they DO nothing about it. And a lot of the studies that this has come from are based on reported data. They asked people, but not when they were making a choice to share data.

ARI: I would like to ask you about the Online Safety Bill.

CLAUDINE: It's a Bill currently working its way through [UK] Parliament. It's been evolving for the last few years. The purpose of this Bill, if it passes, is to impose a duty of care on social media providers and search engines to protect users from harms that they might encounter when using their services. They do things called 'calls for evidence' periodically. (I am currently working on one which I need to send out very soon!) Unfortunately, academia tends to be a little bit underrepresented among the experts; it's been a lot of industry people and people from non-profit organisations. There have not been a lot of academics involved.

ARI: Why is the Bill important?

CLAUDINE: One of the aspects that has been neglected... (and I should say that this is my personal opinion... it does not reflect the University of Oxford or anyone in my group... disclosure over) They have not solicited experts in the field of human-centric computing, specifically around issues like user controls. There's a big thing that's been added to the Online Safety Bill about empowering users to have more control over harmful content that they see. Theoretically, that's great.
The problem is there's no detail in terms of how that would be done. The language is vague - I'm paraphrasing, but it says that users should have tools [to] reduce exposure to certain types of harmful content. What does that mean? What is the scope? It could mean what we have is sufficient. There really isn't much of a focus on users' needs in this Bill; that's the main focus of the recommendations I'm going to submit. What does that mean? Potentially, what's happening is that the scope is wrong. For example, content discussing self-harm [could be] about supporting individuals attempting to recover from, or manage, self-harm. The concern is that if the guidance to be provided to social media companies (given by Ofcom) is not sufficiently narrow and is not sufficiently specific, then there is a risk that this power will either be delivered far too broadly (the potential good is wiped out by potential harm) or far too narrowly (the mark will be missed for many groups whose interests are ignored). Specific interests may not be factored into the assessments that social media companies will [need to make] to determine whether content is at risk of being harmful.

ARI: Do you have any tips for keeping up to speed with cybersecurity?

CLAUDINE: Twitter is always a good place to start! For the work that I do... journals like Social Media + Society or New Media & Society. The ACM CHI Conference on Human Factors in Computing Systems. Wired (https://www.wired.com) is a good mainstream publication. And a blog (https://www.cyberleagle.com) run by lawyer Graham Smith.

ARI: Where can our listeners find out more?

CLAUDINE: You can follow me on Twitter (@ClaudineTinsman).

Join us next week for another fascinating conversation. In the meantime, you can tweet at us @HelloPTNPod, and you can subscribe on Apple Podcasts or wherever you listen to podcasts. The title there is PTNPod. See you next week. [ARI: Bye!]

This has been a podcast from the Centre for Doctoral Training in Cybersecurity, University of Oxford, funded by the Engineering and Physical Sciences Research Council.