ARI: Hello, I'm Ari. [CLAUDINE: And I'm Claudine.] Welcome to Proving the Negative.

ARI: We're a podcast all about exploring the different sides of cybersecurity, from politics to computer science, international relations to mathematics. Join us as we talk to our friends about the work they do.

CLAUDINE: Today, we're talking about how app design may affect children - specifically, apps that are targeted towards kids - and how data collected from those apps could potentially be putting kids at risk. Ari, is that something you know much about?

ARI: I can't say I know much. It sounds like it's going to tie in with safeguarding (safety/looking after kids). Does it draw any parallels with your work?

CLAUDINE: Not really. I tend to focus on adults in my research, but this topic is very relevant, given that online and digital safety for children has been all over the news, particularly in the UK (legislation like the Online Safety Bill and the Age Appropriate Design Code).

ANIRUDH: When people hear the word cybersecurity, the first thing they think about is hacking and technical things: websites, databases, things like that. It's grown way beyond that. It's not just technical aspects; [it's] also about people, about what is right and wrong. It's related to the security of computers, to people, and to how it all connects together. It's as much about people as it is about technical elements.

My name is Anirudh. I'm a third-year student and part of the Human Centred Computing group. I focus on ethical and responsible app[lication] development for children. I do this because the products that both children and adults make use of are designed to keep us busy, to keep us occupied and hooked, and to collect as much data as possible. This can lead to mental health issues: addiction, attention loss or violent behaviour in kids.

Data sharing and loss of privacy can also have consequences. Once a child grows up and decides to apply for a job, their CV could go through an automatic profiler, which might make use of data collected when the individual was a child. This means that when you're designing services for kids, there are certain things we have to keep in mind. We shouldn't try to keep them hooked for as long as possible. We shouldn't persuade them to disclose unnecessary data.
I'm working with developers to see how we can incentivise apps that are age-appropriate and child-friendly.

CLAUDINE: Why are you interested in researching kids, digital resilience, privacy and responsible design?

ANIRUDH: In a way, I just rolled into it. Kids are a vulnerable population and they require different protections from adults, because adults are responsible for their own decisions. The same cannot be said for kids, because they don't make those same decisions. They might not be fully aware of what privacy is [or] why it's important. They require extra protections. I've heard developers say, "you know what, the things we do for children - it wouldn't be a bad idea to do them for adults as well". I'm hoping that whatever we do here, we can take to a wider audience.

Initially I thought I was going to be working [directly] with children. However, that didn't turn out to be the case; I decided to work with developers. A lot of the apps on the Google Play Store or the Apple App Store are collecting a lot of data through data trackers and third-party libraries. One of the first questions we had was, "why are developers doing this?", "are they bad people?", "why are they creating apps which are harmful, or could potentially be harmful, for kids?" That's how it started out - we interviewed developers and asked them, "what are your development practices?", "what do you think about data collection?", "what do you think you should be doing about this?"

Initially, we didn't really know what to expect. We thought they would say, "we want to earn money, so we're collecting data". Once we spoke to the developers, we found out they're not bad people at all! Their motivations are good. They tell us, "we don't want to harm children; we understand they're in their formative years, they're vulnerable, and we bear a certain responsibility when creating apps for children".

However, it's quite challenging to create child-friendly apps. There are several reasons for that. One of them is analytics: we need to know what users are doing, how they're behaving. And this means that developers have to use third-party analytics tools. The most popular ones are from large data controllers such as Google, meaning that Google has access to all of this data.
It's almost impossible to create apps or services without third-party tools, and developers didn't always know what these are doing in the background. Sometimes they're collecting data that we're just not aware of. No one really knows exactly what is happening in the background. That's the second problem.

The third problem is that, at the end of the day, [developers] have to make money. How can you compete? Having users pay for the app doesn't work: first of all, you might not get enough downloads, so not enough people buy the app for you to make a living. The only option that's left is targeted advertising, which means using kids' data to influence behaviour and drive installations. Sometimes users want this. They can't pay for the apps, and if [developers] don't make them free, they're going to leave bad ratings, forcing developers to remove payment features and make the apps free.

These are the three challenges we found that developers face in creating child-friendly apps. The big question for my research is, "what is the solution?" That's what we're trying to answer.

CLAUDINE: What is a third-party tool, or a toolkit?

ANIRUDH: A third-party tool or library is software you haven't developed yourself. Big companies (Google, Facebook, Unity) have third-party libraries [that] they put out there, saying, "we created these, feel free to use them". Google Analytics is one of them. 'Analytics', in this case, means keeping track of who's coming to your app or website: where they are coming from, their gender, how long they use the app... Very useful, of course - you know exactly what's going on with the service.

A data tracker is not one particular [piece of] software. It's a generic term for software collecting data and sending it back to a server. Apps ask for certain permissions... to look into your contacts or record your calls. Apps like Facebook want access to as much as possible. There is a reason why a lot of these third-party libraries are free. They are making money with it, just not the way you think they are. They're collecting as much data as they can, and this is used for advertising purposes. This is how Google earns their money: the moment you click on an ad, they connect advertisers with you and they get a commission. The reason Google Analytics is free is so they have data about you and can show you ads you are more likely to click on. They'll end up printing more money. The same is true for Facebook. They want all these permissions to collect data for exactly the same purposes.
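To illustrate the kind of "data tracker" Anirudh describes - a library that quietly gathers device and session details and ships them to a remote server - here is a minimal, hypothetical sketch in Swift. The `ToyTracker` class, the endpoint URL and the field names are invented for illustration and do not correspond to any real SDK.

```swift
import Foundation
import UIKit

// A deliberately simplified, hypothetical tracker. Real third-party SDKs are far
// more sophisticated, but the basic shape is the same: collect, batch, send.
final class ToyTracker {
    // Invented endpoint, for illustration only.
    private let endpoint = URL(string: "https://collector.example.com/events")!
    private let sessionStart = Date()

    // Gather device and session details the user never explicitly typed in.
    private func snapshot(event: String) -> [String: Any] {
        return [
            "event": event,
            "device_model": UIDevice.current.model,        // e.g. "iPhone"
            "os_version": UIDevice.current.systemVersion,   // e.g. "17.4"
            "locale": Locale.current.identifier,            // e.g. "en_GB"
            "session_seconds": Date().timeIntervalSince(sessionStart),
            "timestamp": Date().timeIntervalSince1970
        ]
    }

    // Serialise the snapshot and POST it back to the collector.
    func log(_ event: String) {
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try? JSONSerialization.data(withJSONObject: snapshot(event: event))
        URLSession.shared.dataTask(with: request).resume()
    }
}

// Typical usage inside an app: one line per interaction,
// which is why this kind of instrumentation spreads so easily.
let tracker = ToyTracker()
tracker.log("level_completed")
```

Note that nothing in this sketch needs a special permission: device model, OS version, locale and session length are available to any code running inside the app, which is part of why even well-intentioned developers may share more than they realise.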
ARI: Why do children and young people need to be resilient? We put them in environments that we build, and we force them to interact with each other and with adults in these spaces. What are your thoughts?

ANIRUDH: Digital resilience means children have the digital and mental tools to cope with the challenges they're facing. Digital tools are things like browser plug-ins and add-ons for your phone to prevent data sharing... The mental tools are important as well. If you find out that your data has been shared, or a photo of you has appeared online which you didn't want to be online, how do you cope with that? They are the ones participating in social media, overwhelmed with information. At the end of the day, they're the ones facing all of that. That's what digital resilience is about.

Facebook, and a lot of companies, have funders and investors who just want to make money, because that's very important in this process. They don't have children's needs in mind. The responsibility to protect whoever is using [the app] falls on the user; with kids, the responsibility is shifted onto parents or caretakers.

We have to do it because there really is no one else - it's not going to get better on its own. Facebook isn't going to disappear, and we're not going to stop developing technology. If anything, we'll become more connected, and children will grow up more and more in a digital world. The pandemic saw children attending school through the computer. Where before we taught children to navigate the physical world, now we have to teach them how to navigate the digital world as well. It developed so quickly. 10 or 20 years seems like a long time, but we're not equipped to deal with rapid technological developments. Now we're trying to catch up, and now we're finding out that being on Facebook can actually lead to depression, because we're constantly comparing ourselves with people who seem to have it better. How do we deal with this? How do we tell children, "OK, you know what? This is not actually the truth"? We're playing this game of catch-up.
CLAUDINE: There are often assumptions made about levels of competence for adults, too, which isn't necessarily the case.

ANIRUDH: Parents play a big role. They might not know what the dangers are, and they don't have the tools either. Parents have a responsibility here, but how do we tell parents about this? How do we educate them? Should schools have get-togethers to discuss these things with parents? I'm not sure what the answer is.

I don't know what these platforms know. On the one hand, they know exactly what they're doing; they know exactly how to hook users. But the long-term consequences need research. For a child growing up using Facebook, what are the consequences once that child has become an adult? How does this person engage with other people? How does it impact their relationship with their kids or the people around them? These are questions we cannot answer yet, and I doubt these platforms know. These platforms will do research which benefits them. They'll research "how can we keep people engaged?" and "how can we improve their experience?", but not so much harm prevention.

There's no easy solution to this. If we tell someone to create an ethical app, how can we make that happen? There is no strict definition of what an ethical app is, of course, but let's just say that you don't persuade children to use your app for hours and hours, and you don't collect unnecessary data. You're likely to earn less money. That's a big issue.

We don't have a lot of power, actually, if we want to change this ecosystem. You need cooperation from the major players. What Apple has done is add the pop-up where you can opt in to tracking: do you agree to this, yes or no? That's good for end users, and it's not something we could have done ourselves - we really need Apple or Google or Facebook to cooperate.

We're doing two things. One, we're raising awareness, informing developers: "Listen, designing for kids is important. These are the reasons you should do it." Secondly, a lot of these rules tend to be a little bit abstract. "Be transparent in what you do" - that's great, but what exactly does that mean? What do you have to do? Does that mean you have to immediately show your privacy policy the moment people open up your app? If you have a huge privacy policy, a child is not going to read it. By raising awareness and developing tools, we hope to make it easier to develop for kids.
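The pop-up Anirudh refers to is Apple's App Tracking Transparency prompt, introduced in iOS 14. A minimal sketch of how an app requests that opt-in is below; the `enableAdTracking` and `disableAdTracking` helpers are hypothetical placeholders for whatever the app would actually do with the user's decision.

```swift
import AppTrackingTransparency
import AdSupport

// Hypothetical helpers standing in for the app's own logic.
func enableAdTracking(idfa: UUID) { /* hand the identifier to ad/analytics SDKs */ }
func disableAdTracking() { /* run without a cross-app identifier */ }

// Requires an NSUserTrackingUsageDescription entry in Info.plist explaining
// why tracking is requested; without it the prompt is never shown.
func requestTrackingConsent() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // The user opted in: the advertising identifier (IDFA) is available.
            enableAdTracking(idfa: ASIdentifierManager.shared().advertisingIdentifier)
        case .denied, .restricted, .notDetermined:
            // The user opted out (or the prompt could not be shown):
            // the IDFA comes back as all zeros and cross-app tracking is off.
            disableAdTracking()
        @unknown default:
            disableAdTracking()
        }
    }
}
```

The point made in the interview is that this gate sits at the operating-system level: individual developers or researchers could not have imposed it on every app themselves.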
CLAUDINE: We have developers, social media platforms, advertisers and users. What role do you think regulation should play in managing this problem?

ANIRUDH: Regulation has started to play a larger role. A few years back we saw the GDPR, which changed the landscape a little bit. In the UK, we saw the Age Appropriate Design Code, which tells developers and organisations how to develop services for children and how to make them age-appropriate. I don't know how much impact they will have, though. Big organisations are so powerful, and it's hard to bypass them. A few years ago, Shoshana Zuboff wrote a book called "The Age of Surveillance Capitalism". She describes how organisations circumvent the role of governments and regulation. One of the things they do is make their practices the new normal. The example she gave was Google Maps: Google started driving cars around and photographing everything, including people's houses, and suddenly it became the new normal. We were so used to having these maps available that no one questioned them. When there was pushback, they made minor changes - not really changes at all - and then we forgot about it. Regulation can sometimes be a little bit slow. By the time it tackles issues, we might have gotten used to [a] new way of life.

CLAUDINE: In terms of children and their use of apps and social media, you said the collection of their data could potentially harm them in the long term. What did you mean by that?

ANIRUDH: There are several different harms. One is mental health and well-being - for example, addiction. If you're scrolling Facebook or playing an online game, they're designed in such a way that they keep you busy for as long as possible. And addiction can lead to all types of problems: loss of sleep, increased violence, and depression and anxiety when using social media.

The second important category is privacy. These harms seem very far away; they're not immediately visible. But they relate to decreased opportunities in life, and to risks associated with data theft, fraud and identity theft. Privacy is, of course, a huge part of security - it's all related. Some of the harms related to loss of privacy are not immediately clear. Google collects my data, but I haven't felt anything; nothing has happened.
So what exactly is the issue? There are harms which can arise from this. Data theft: Google or Facebook can be hacked, and suddenly your data is in the hands of someone else, who can use it for fraud. They might get hold of your bank details, or they could impersonate you.

More interesting are the harms associated with long-term opportunities. Facebook and Google might have data about you which they've collected over the course of, let's say, 20 or 30 years, and they might be sharing this with third parties. Your insurance rates might be higher because of previous behaviours. It could simply be how you used your mobile device - lots of hand movement, so you're judged to be less careful, and your insurance rates are higher. Is that fair? How you behaved 20 years ago might be very different to now. This is an example of how loss of privacy and data sharing that happen when you are a child can impact you when you're a lot older. There have been examples where CVs have been scanned by an algorithm and [the algorithm] discriminated against women, for example. Data collected when you were a kid, from a toy or a website, gets combined with your CV - and suddenly you're rejected. That's an example of how data sharing can impact you negatively.

CLAUDINE: If you had unlimited time, money, personnel and resources, how would you tackle some of the challenges that you face?

ANIRUDH: I'm interested in mindfulness. I'd be curious to find out if digital resilience could be added as a school subject. Kids could be taught about the issues we've discussed, so they're better equipped to handle them later on in their lives. This would be interesting for researchers and beneficial in the long term. That might be one of the things I would allocate those funds to.

ARI: What have you and your colleagues learned from kids and young people?

ANIRUDH: What we have learnt from kids is what their needs are. We keep talking about making [apps] ethical, child-friendly, age-appropriate, but what I've not discussed is what that means. We can learn from kids. We can ask, "what is important to you?", "what are your needs?" We can think about how this feeds into what developers are creating and keep it in mind - for example, don't send a pop-up notification at eight o'clock in the morning, because they should be focused on school. What we learn from them, we can use to inform developers.
CLAUDINE: What is next for you?

ANIRUDH: Developing for kids is critical beyond the UK; right now, we're focused on the UK market. Next, we are hoping to develop a collection of tools for developers which makes age-appropriate and child-friendly design easier - for example, a video course, design templates, code libraries... [We'll ask developers] "What exactly will be useful for you? What will you use?" Then we'll prototype and develop [the tools] and release them to a wider audience. We'll establish a community around that, which we can take internationally.

CLAUDINE: Where can people keep up with the work that you're doing?

ANIRUDH: I don't have a huge online presence. I've got an Oxford page...

CLAUDINE: If people are interested in learning more about the topic, what kind of resources would you suggest they look at?

ANIRUDH: General news sources, and Twitter influencers can be quite useful.

CLAUDINE: That was our interview with Anirudh. Join us next week for another fascinating conversation. In the meantime, you can tweet at us @HelloPTNPod. You can subscribe on Apple Podcasts or wherever you listen to podcasts - the title there is PTNPod. See you next week. ARI: Bye!

CLAUDINE: This has been a podcast from the Centre for Doctoral Training in Cybersecurity, University of Oxford, funded by the Engineering and Physical Sciences Research Council.