ARI: Hello, I'm Ari. [CLAUDINE: And I'm Claudine] Welcome to Proving the Negative.
ARI: We're a podcast all about exploring the different sides of cybersecurity, from political science to computer science, international relations to mathematics. Join us as we talk to our friends about the work they do.
CLAUDINE: Today, we're going to be talking about how hate groups operate online: how they use social media platforms to coordinate efforts or spread ideologies.
ARI: It's a topic for governments, it's a topic for the public, and it's of interest, of course, to law enforcement.
CLAUDINE: The way online platforms were used to coordinate the January 6th attack on the Capitol shows how organised groups can use social media to coordinate offline actions. Those actions can slip under the radar, so it's a topic I'm keen to learn more about.
FATIMA: My name is Fatima, and I'm a PhD student in cybersecurity at the University of Oxford. My research looks into online hate and how it's used across multiple platforms.
ARI: Your elevator pitch: what would that sound like?
FATIMA: We've seen over the past couple of decades how online platforms and social media have become an increasingly integral part of communication; this has especially been highlighted during the ongoing COVID pandemic. But it has, of course, had an impact on how malicious actors communicate as well. Extremist and hate groups make use of platforms to spread propaganda, recruit individuals, or just form stronger networks with like-minded people. We've seen how this behaviour can lead to catastrophic events, like the ISIS Paris attacks and the Christchurch mosque attack. So this isn't just a problem for the online community; it has a huge impact on the offline world as well. And it isn't a problem faced only by specific platforms; it's faced by the majority of online platforms today, and oftentimes hate groups will make use of multiple platforms, no matter their audience size. I argue that we can't look at online hate from the perspective of a single platform. I look into how we can use a cross-platform perspective, to see whether we can gain any insight into how platforms are used differently and whether we can map hateful behaviour across multiple platforms, so that we have a more organised approach to countering it.
CLAUDINE: Could you clarify what you mean by a malicious actor?
FATIMA: What I'm considering a malicious actor is... I'm looking at hate groups that have been identified by law enforcement agencies, or people who have identified in some way that they are supporters of a hateful ideology that has itself been identified by law enforcement agencies. Various other malicious actors use social media as well.
ARI: So what kind of malicious actors do we have out there now?
FATIMA: They're not just humans. You have bots as well, propagating all kinds of content, whether they're trying to influence the opinion of the online audience or just trying to get people to engage in certain activities online through which you can get information from them. We saw this with Cambridge Analytica. It wasn't so much malicious as it was intrusive, exploiting things that online platforms provide that hadn't been brought to the attention of the general public before. The definition changes all the time, as does how these actors make use of platforms.
ARI: The end result is to have this unified approach: different platforms have one problem in common, particularly the problem you look at. Why do we have to work together, and who needs to work together on this?
FATIMA: It's not so much that we need to have one unified approach. It's more that we haven't looked at how multiple platforms are used in this space. Research has tended to say, "this is what's going on on Twitter this year, this month, so this is how we need to address online hate". But that's specific to Twitter, so we can't apply it to multiple platforms. We need to start looking at all platforms and seeing how hate groups use them. Do they use them for different things? Are they used together in some way? Is some content posted on one platform and other content posted on another? The rules on one platform may be more lax than on another, so a single unifying response wouldn't work. It's more that we haven't looked at the perspectives of different platforms, so we haven't included that in the counter-approaches we're looking at.
CLAUDINE: I realise it's a work in progress, but what have you found so far, in terms of taking that global perspective across multiple platforms?
FATIMA: I have seen differences in how platforms are used, and they can be very subtle. I was looking at online hate during the US elections.
I was looking at hate groups posting on different platforms, 4chan and Reddit. Reddit had a much more organised response during the actual election, posting links for fundraising, whether for hate groups or Trump-supporting causes. There was a lot of repeat posting of URL links. 4chan, on the other hand, discussed the results more; there wasn't as much of an organised approach. It's interesting to see how different communities use platforms differently. That was one thing I've found so far.
CLAUDINE: Could you clarify 'hate group'?
FATIMA: Hate groups have different definitions across different areas, jurisdictions and countries; each law enforcement agency will have its own definition of a hate group. Generally, it's when you're going against acceptable social norms, and a lot of the time that's specific to the country you're in. This is applicable to social media as well. Social media platforms generally have their own body to monitor online content, and each country also has its own rules, which social media platforms have to follow. Similarly, hate groups are going against what is socially acceptable behaviour.
ARI: They say things that mean certain groups of people have a worse time. People being mistreated: that's a tangible result. We don't want hate groups because people will suffer in real life. These ideas being tossed around in online forums become a real-world harm. These things are bad because people are impacted; it's not entirely a data problem. Is this a fair assessment of why hate groups are undesirable?
FATIMA: Exactly. It's why most countries have now listed online hate as a harm, even though it is a grey area in terms of defining it and having one unified response everywhere and one unified definition of hate across all countries and languages. It's universally agreed that hate groups cause harm online, and have consequences offline as well.
CLAUDINE: You saw a more organised response from these groups on certain platforms than on others. Have you seen a change in the relationship between hate groups and the platforms that they use? If platforms respond to new and emerging threats posed by hate groups by reinforcing standards, how does that affect the way those hate groups use social media platforms, and which social media platforms they use?
FATIMA: This is a huge area of research. The main example is ISIS' prime on Twitter: from 2014 to 2016, we saw a lot of pro-ISIS support posted on Twitter.
But after a lot of the attacks that were happening offline, in particular the Paris attacks, Twitter took a stand and started suspending multiple accounts. At first it became a badge of honour for these pro-ISIS supporters: they'd create new accounts and say, "Oh, I got suspended X many times before", as if they'd been doing more for the cause. But of course, this meant that eventually they were kicked off the platform, so they had to move to a different platform to spread more extremist content. We saw them move to more encrypted platforms like Telegram, which champions itself as providing encrypted communication: your communication isn't going to be accessible to law enforcement, no one's going to publish it, it's going to stay within your groups. We see platform migration a lot in this space, even over the last year or so. Around the US elections and the BLM resurgence last summer, we saw a lot more hate in certain Reddit communities, and they were eventually banned by Reddit because they were increasing their hateful content. Any time one of these groups is suspended from a platform, they move to another. That platform will have more lax rules, until eventually pressure mounts on it to monitor its content and moderate whatever is being posted there. So you get platforms with a lot more reach, the ones with more rules and moderation, and platforms with less reach but more lax rules, and the platforms are used together: you're still recruiting on mainstream sites, but not posting as much explicitly extremist or hateful content there.
ARI: They're being used in the way they were intended to be used as social media. Then it's a question of whether you censor. Free speech is not freedom from consequence, though.
FATIMA: This is huge, especially for platforms with more lax rules. Their main ethos is "we're providing freedom of speech: we won't be monitoring, we won't have as many rules". You will see that they're used by particular hate groups and not others. For example, Gab rose to prominence following Trump's election, when a lot of the content being posted on Twitter had come to be identified as hate. This other platform, Gab, was saying, "Oh, we won't have any rules here", so it was used more for Trump-supporting, white-supremacist types of hate. You won't find as many ISIS supporters there, which is an interesting part of this. Mostly, it's under the control of the platform providers. Twitter will have their own set of rules.
And so a lot of the time these hate posts will just be suspended or deleted. It's mainly the platform provider, especially when it comes to removing content, though pressure can be put on them by governments and even the general public, if there's enough of a response. It's just so big in terms of perspective: there are countless platforms and countless hate groups, so I have to narrow down my scope a little. I was looking at this from a specific use case, just so we'd get an idea, and then identifying hate groups or hate ideologies present across multiple platforms. That would help narrow down the platforms and hate ideologies we include.
CLAUDINE: What was the most surprising or unexpected thing that you've come across?
FATIMA: I didn't know whether there would be a difference in how each platform was used; each of these platforms has its own audience and its own type of content. There was more explicit content on 4chan, where people weren't afraid to link pictures and graphics. Reddit was more specific, because Reddit has more rules and moderation. That could have had an impact: "we don't want to get kicked off this platform, so we have to stick by the rules". 4chan was less organised; you could post anything, and there were a lot more agreements and disagreements.
ARI: If you had all the resources and all the money in the world, what big question would you explore?
FATIMA: For this particular area of research, there are so many platforms online and so many hate groups. Hate is posted in different languages and cultures, and there are so many definitions. There hasn't been a lot of comparison in this area, whether that's comparing hateful content in another language to English (most research in this area focuses on English-language hate); we haven't had comparison across different countries, languages, cultures and platforms. This is something we need to do to better understand how hate operates online. So if I had an unlimited amount of time and money, I would look at hate in different languages, seeing how it compares, whether there's cause for concern in other countries as well, and whether that has an impact on online hate in other places. We saw with ISIS' attacks that a lot of the time they were organised in Arabic, but the actual attacks happened in non-Arabic-speaking countries.
ARI: Is there a word we can use instead of hate?
FATIMA: This is interesting. When I was starting, I was focussing on extremism and extremist groups, but there were fewer groups referred to by law enforcement under that particular term. They would have a list of hate groups because, I guess, for them a lot of extremist groups are hate groups that have been linked to attacks offline; there's a definite impact from the online to the offline. ISIS is clearly an extremist group. Then you have individual extremists, like the Christchurch shooter, who isn't operating under a group but is still very much an extremist. The Christchurch shooter only became an extremist after he carried out the attack; he wouldn't have been considered an extremist while posting online. It would have just been online hate, because he was attacking certain groups; it wouldn't have been extremism per se. I went towards hate in my research because it has a slightly broader definition than something specific like extremism. Currently, with how legislation is being proposed across most countries, 'hate' is what is generally used, and you will find more resources and information on the types of groups or ideologies that are counted as hate.
CLAUDINE: Have you noticed any overarching similarities between different hate groups? Obviously not necessarily in terms of ideology, but in terms of the approaches they use to market themselves, recruit or organise?
FATIMA: A lot of the time there wasn't enough content for me to compare, so I tended to group my hate groups into one overarching hate ideology and compare them across multiple platforms. That being said, while I was doing my research I saw that if, say, one hate group was propagating hate through selling merchandise (t-shirts, music and so on), this would be promoted by other hate groups: "here's a link, help our cause". So there was collaboration between them. It wasn't part of my research, but it's something I found interesting. Is this one hate group? Is this two hate groups? They have different names, organisational bodies and websites, yet there was a lot of overlap between them online. Many people are celebrated across the groups, whether it's politicians who are very radically right or certain public figures; a lot of the time it's public figures who have come to light in some way in this area. So they're celebrated across groups as some kind of idol.
So there's a lot of overlap between them.
ARI: Let's talk more about you. What are the frustrations, and the things you enjoy, about the work that you do?
FATIMA: It's the one space where I'm constantly learning; I don't know if I would get as much of that anywhere else. One thing I found difficult was finding an area where I could provide insight. That's something I struggled with. There's so much research out there; you realise, "wow, there's so much research going on, how do I find the gap here?" That's one thing I found difficult. Oftentimes you don't realise how long it can take just to identify that gap, just to find a novel area of space that you can work in. I would say, have one question that you're constantly answering, no matter what kind of analysis you're doing. That can also be a little difficult in your research. But once you have these clearly outlined, it becomes so interesting, and then you start enjoying the whole process of actually learning from your analysis. There are going to be frustrating days where nothing's working!
CLAUDINE: What are you working on now?
FATIMA: This came out of a summer school. It's a group project with different researchers across the UK and Europe, and we're using similar methods (text and social media analysis) to look at how sentiment across online communities changes during crypto hypes. For people who don't know, this is when cryptocurrency values rise and fall. We're looking at online crypto communities where this is discussed, whether the community gets involved when prices change, and so on. It's not specifically feelings we look at; it's the overall group opinion. We did find a few interesting things: when the price of Bitcoin increased, the online community became more individual and independent, so they would talk about "I bought or sold this stock in Bitcoin". Next, I want to look at how these platforms are used within networks of hate, and how activity maps across platforms: whether there is a larger network of hate that encompasses multiple platforms. It's something I haven't done before, so hopefully I can get something out of it! It's more about using network analysis approaches.
CLAUDINE: If people are interested in learning more, what resources would you suggest?
FATIMA: There's a conference called ICWSM, for general research on online social media platforms.
There's a lot of research going on there across loads of different topics, not just online hate; it ranges as far as looking at pandemic sentiment around vaccinations. Whether you're looking at online hate or not, this conference will have a lot of methods applicable to your research. It's a resource I found really useful. It also publishes datasets, loads of different types of datasets that are really useful. The research project VOX-Pol has a lot of ideas and a lot of articles, whether that's fully fledged research publications or articles along the lines of "we've seen X happening online". So that's been a starting point for me.
With how much technology and how many online resources we're using day to day, at home or at work or what have you, realising how these can be misused, and how to use these technologies in the proper way: that's my understanding of cybersecurity. It's such a broad area; I found way more definitions of cybersecurity than I had anticipated.
CLAUDINE: That was our interview with Fatima. Join us next week for another fascinating conversation. In the meantime, you can tweet at us @HelloPTNPod, and you can subscribe on Apple Podcasts or wherever you listen to podcasts; the title there is PTNPod. See you next week.
ARI: Bye!
CLAUDINE: This has been a podcast from the Centre for Doctoral Training in Cybersecurity, University of Oxford, funded by the Engineering and Physical Sciences Research Council.