My name is Nicola Aitken and I am a policy manager at Full Fact. Full Fact is the UK's independent fact checking charity. We're a team of independent fact checkers, researchers and policy specialists who find, expose and counter the harm that bad information does. And I'll hand over to Professor Phil Howard, who is joining us this evening. Phil, over to you to introduce yourself.

Thank you, and thank you very much for hosting us tonight. So, yes, I'm Phil Howard, I'm director of the Oxford Internet Institute, and we'll talk a little bit about our latest misinformation work and the book Lie Machines.

The book was written, I guess, over the last two years, and the project as a whole started when I was based in Budapest. In the summer of 2014, when the Malaysian Airlines flight was shot down over Ukraine, I watched my Hungarian friends get several kinds of misinformation about who had shot the plane down and why. There was the story of democracy advocates who had shot the plane down, maybe because they thought Putin was flying to Malaysia on a commercial airline. There was a story of American soldiers, U.S. soldiers, who had shot the plane down. And then my favourite story is the story of a lost tank from World War Two that had come out of the great forests of Ukraine, confused, and accidentally shot the plane down.

It's at that point that I realised that there were actually several nuances to the strategy of Russian-origin misinformation. It wasn't a strategy that involved seeding one story that people could respond to or contradict. It was a strategy of seeding multiple conflicting, sometimes equally ridiculous stories that nobody could respond to in a consistent way; the opposition couldn't pick which narrative to address. And what surprised me over the years, and I tell some of these stories in the book, is seeing how this strategy of misinformation has gone from being something that leaders in authoritarian countries used on their own people, to being something that dictators use against voters in democracies, to being a strategy that political leaders in democracies use against their own voters during elections. That slide, that transition, is what the book Lie Machines is all about.

Now, the book is broken into several different chapters, several different kinds of case studies. I think the important place to start is probably with a definition. A lie machine, for me, is a system of technical infrastructure and social organisation that puts some untruth into the service of a party or a political ideology. Let's unpack that.
I think one important part is the social organisation: say, an ultra-conservative or white supremacist group that wants to come up with a bit of misinformation about the impact of immigrants on the economy. And then the technical system is a social media platform that serves up that junk news, that fake news, to the people who will be receptive to it, or who might be receptive to it. So the lie machine actually has those two parts: the social organisation and the technical system, the algorithms for distributing content.

Over time, I think we've seen that there are actually several different functions here. There's a production function, where you have teams of trolls, in Russia and in many countries now, who come up with a story, come up with a bit of misinformation. There's the distribution system, which involves social media algorithms and your social media feed, and puts content in front of you. And then there's the marketing system, which is often secondary news organisations giving stories additional spin or letting stories build on each other.

And I think we've seen that at work, unfortunately, around COVID. One of the greatest misinformation packages I've seen involves RFID chips and Bill Gates and the origins of COVID in a lab in Colorado, or maybe it was northern Italy. It's a complex story, a nuanced story of conspiracy. It builds on the anti-vax movement, a longstanding movement of misinformation around the value of inoculations. It also builds on our lack of trust in what the government might be doing around COVID. So there are multiple things that a complex lie machine can build on. And the misinformation around COVID is particularly pernicious. One of the storylines involves a conspiracy by Bill Gates to inoculate us against COVID through a process that puts RFID chips in our arms, a big overarching story. In some ways it's easy to refute with evidence; in other ways it's such a large package that it's hard to know what part of that kind of storyline to start with.

The problem for us as researchers is figuring out where this stuff comes from. And as a researcher, often it's best to be multi-method. So we have some people who do fieldwork with the labs, the trolls who produce this content. We do some big data analysis to figure out the trends, how somebody actually games Google search or games YouTube search. There's a real craft to figuring out how to make use of social media algorithms, and then figuring out how political actors make use of them.
And right now, we're at a sensitive moment where we know a lot of the traffic around misinformation is on platforms that we can't study: Instagram, and I have a Tinder story I could tell later, but also TikTok, YouTube, WhatsApp. There's a range of platforms that are actually pretty tough to study in a comprehensive way, and that's where a lot of the action is now in terms of misinformation trends.

I'm going to talk for another ten minutes or so, then Nicola and I can have a conversation about what she's observing, what we're observing together. I thought I'd offer a quick run through some of the evidence that we prepared for the Senate Select Committee on Intelligence when it was looking back at what happened in 2016 in the US, up through 2017 and 2018. As you may recall, the Senate asked the heads of major technology firms to come and testify, and each of the firms turned over some fairly large data sets of accounts that they knew were managed from St. Petersburg. These data sets are special because they're amongst the few that are fully attributed: the accounts were clearly set up and managed from IP addresses in St. Petersburg. And the firms, Facebook in particular, handed over some three and a half thousand fake U.S. citizen profiles, US-based profiles that had been managed for a long time.

Looking at the data, we found that some of the profiles had been active on other social media as early as 2012. So one of the surprises was that the Russian accounts were active much, much earlier than we expected them to be. When we look at the rhythm of their activity during the electoral year, it's pretty clear that their bursts of activity are timed to natural points in the electoral calendar. The big national conventions, the debates between Clinton and Trump: each of these sees a burst of Russian activity. Election night, a burst of Russian activity. So they started much earlier than we expected, and their activity parallels events in politics.

One of the other surprises for us is that the bulk of the activity we studied happened after 2016. Facebook turned over data starting from about the middle of 2015, and the peak of Russian activity was not on election night in 2016; it was much later, early in 2017. That's when the bulk of Russian activity occurs. And to me, this suggests that it's almost as if the Russian government decided that they were being successful, having some impact, and put more money and more staff into managing these accounts.
So they started much earlier than we expected, their activity parallels the natural rhythm of the electoral calendar, and the bulk of their activity has been since 2016.

Now, another interesting finding is that the bulk of their activity moved from Facebook onto Instagram. Instagram is a fairly closed platform; independent researchers don't have systematic access to what's found there. That's a real challenge for researchers, and it makes it difficult for me to say what kinds of misinformation trends are there. But we do know that the centre of the Russian activity really burst onto Instagram after 2017 and 2018.

Facebook ad purchases were a relatively minor part of the campaign, and maybe you and I can talk about the role of ad buys. I think in this data that we were looking at for the U.S. election, political ads were not as important as the organic content, the content coming from fake users who post about soccer scores and soap operas and family photos from fake families, and then start talking about politics. Those are the bulk of the profiles that we found in this Russian dataset. And the themes that these accounts work on are all the polarising themes that we know in the U.S. context: Black Lives Matter, gun rights, abortion, a series of hot button issues in the US, some of which are also hot button issues here, around race and political speech. And the Russian government's strategy is to pick these up, to try to generate polarising content. The most successful of their campaigns are the ones where they can get two different groups to meet on the street at the same time to protest, to have a physical, violent interaction if possible. That's one of the goals of the misinformation campaigns that we've tracked.

Since having a close look at what happened in the US in 2016, our teams at Oxford have looked at how many other similar efforts there are around the world. In 2017, we counted 28 countries with organised misinformation campaigns. These are not lone wolf operations. These are organisations with receptionists and hiring plans and performance bonuses. Sometimes they have retirement plans. They have regular job ads, phones and desks. These are formal organisations. In 2017, we found 28 countries that have these operations. In 2018, there were 48 countries, and last year there were 70 countries.
In authoritarian regimes, as you can imagine, these kinds of organised misinformation campaigns are often military units that have been retasked to do disinformation. But in democracies they're regular PR firms, political consulting firms that dabble in subcontracts and create these kinds of astroturf movements, as we sometimes call them, to put out misinformation. So it's not just a challenge coming from authoritarian regimes.

One of the big changes when we did our inventory last summer is that we found there were several governments imitating the Russian government's apparatus: Saudi Arabia, India, Pakistan, Venezuela and others now had small operations like what the Russians had. In fact, in two countries we found that the governments had sent staff to Moscow for training on how to do information operations. So there's a real learning curve between these types of regimes. And since COVID, a significant amount of that expertise has gone into COVID-related misinformation.

Let me say a little bit about what that looks like, because it's different all over again. In an important way, China has really arrived as a producer of misinformation. For the last few years, they hadn't really produced a lot of content in English. When protests erupted in Hong Kong, they started to worry about what English-language social media users were thinking about the protests, and they put out content about how the protesters were rioters, thugs, criminal gangs. That was the first moment, and they worked over multiple social media platforms that aren't actually accessible in China. So that was a moment where it was clear the Chinese government was interested in influencing opinion on Twitter and Facebook in English.

Having developed that skill, once COVID spread around the world, and it appears to have come from Wuhan, the Chinese government stepped up messaging around the origins of COVID. And they put CGTN, the major broadcaster, and other broadcasters into production as a sort of source, I mean, into producing fake news stories about COVID. Now, what's challenging is that I think the very definition of misinformation has changed over time. A lot of the Chinese-origin stories around COVID are actually about asking questions. So they don't argue that COVID actually originated in a lab in the United States.
But they'll ask the question: did COVID-19 really originate there, or did it come from a lab in Colorado? Question mark. That's the flavour. And just by asking the question, they're allowed to introduce doubt. They may have a quote from a doctor somewhere in the world who also asks the question, but there's no evidence beyond them asking the question.

There's a range of themes the Chinese government hits, and the Russian government also does misinformation around COVID. Both regimes have two or three common themes they hit. They like to talk about how democracies are failing, how the political institutions here can't make good decisions, aren't being responsive, can't control the virus. They talk about the aid that China and Russia are offering to northern Italy, to the White House, masks for the White House, and this becomes a big story. And then the third consistent story is probably about how China and Russia are leading the science.

Phil, may I stop you quickly? Your microphone is rubbing against your shirt. May we ask you to adjust it?

Got it. Thank you. Of course. And so the misinformation is about failing democratic institutions, it's about how China and Russia are leading the science, and it's about how those two governments are providing aid to the rest of the world. All stories which they're able to push out to a billion social media users a week. We got that number by adding up the total reach of all the Twitter accounts, YouTube accounts, Facebook accounts, Reddit accounts and Instagram accounts that have signed up for content from Russia Today and CGTN and a range of other national government sources. So it's quite a large audience. Once in a while, the social media reach of stories from those two governments will outstrip the reach of a story from the BBC or The New York Times. Most of the time that won't happen, but once in a while they can actually reach more people than a professionalised news outlet.

The last thing I want to offer is a longer-term look at what I think is the existential threat to democracy. I think it's probably safe, or conservative, to say that every national security issue, every budget bill, every tax bill will come with some kind of automated or troll-based campaign for it or against it. It could be a complex humanitarian disaster, it could be an economic crisis; it almost doesn't matter what the issue is. Somebody will try to blame immigrants or try to blame another country.
And that mechanism for spreading misinformation is probably going to be a consistent part of national politics in most democracies. I think we've seen China arrive, and I think China will be much more active on a range of other issues after COVID. I think we'll see many of these techniques applied to special interest lobbying of government: whenever some lobbyist needs legislative relief for some client, these tools, these tricks, will be in the toolkit for the political communication managers.

And then, I don't think we've seen artificial intelligence behind many of the campaigns yet, but I think we will; I think it is on the horizon. We've seen a few dramatic examples of how fake political images and videos can be generated. But beyond that, I think if a lobbyist can work out what kind of face we will reply to, or will be most interested in hearing from, there's behavioural research that suggests women respond well to men with a deep voice and men respond well to women with a high-pitched voice. Some of that behavioural research will start to feed the construction of faces and messages that are customised directly for us. That sort of feeding of behavioural data into messaging is one of the things I think is on the horizon, not for the next election, but possibly the one after that.

And then I think the really deep threat here is to the role of science in public life. So many of these campaigns are about undermining our confidence in experts and evidence. A lot of them support politicians who go with their gut on key issues and ask critical questions, but not in the sense of looking for evidence; they just repeat or retort with whatever a lobbyist fed them most recently. So the role for science in public life, I think, is being diminished by misinformation campaigns, especially around health.

But I don't think it's too late, and I hope the book doesn't end up being trite, because there are a couple of things we can do. Individually, making sure to cull the trolls that might be on our own accounts, and reading things before we share them. Governments need to invest, schools need to invest, in media literacy campaigns to help young people learn critical thinking skills and not forward everything they see. I think probably we need a new economic model for journalism itself.
Some of the revenue that's ending up with the social media firms probably does need to be redirected to professional news outlets, so they can do the investigative work, do the fact checking and produce the truths that help us make decisions when we vote. There are certainly a lot of big public issues in front of us, to do with race and to do with health, and so I think it's important that we clean up public life in this way.

So, there is a code, YLIES, for purchasing the book, if you want to use that. I hope you're still interested in this topic at the end of the conversation. But, Nicola, I'm happy to start chatting. What do you think? What's the most serious misinformation campaign you've looked at in the last few weeks?

Thanks, Phil, for setting that out. It was really interesting to hear, particularly around the hostile states that you've seen manipulating the stories in this area, because from my perspective, and it feels like we mostly look at the misinformation that's circulating in the UK, a lot of the narrative that we've heard around misinformation on COVID-19 in the UK has been focussed on family and friends sharing organic content and sharing stories, particularly on social media. So I think it's really interesting to hear that hostile states are actually active and taking advantage in this space.

I mean, just to pick up on that a little bit: you mentioned China's strategy of asking questions rather than setting out the argument, and I note that RT, the Russian state broadcaster, their slogan is 'Question more'. So how do we push back on this? Is questioning actually a bad thing? Should we be questioning less?

Yeah, that is a great question. I would say that there are some institutions and organisations that we can trust, and should trust a little more. We can't all individually be expert in everything. And, you know, inasmuch as a government has both experts and political appointees, figuring out which experts to trust is part of the challenge of being a modern citizen. Now, doing that work of figuring out who to trust probably involves having a diverse media diet. So if you're a liberal, every once in a while you should check a conservative newspaper, and likewise, if you're a conservative, every once in a while you should see what the liberal papers are writing about. And understanding that we can't all be expert in every single issue is, I think, a reasonable limit to modern democracy.
Being passionate about a few things that we do research on is the way we can still contribute to public life.

Yeah, I completely agree. Just to talk from a Full Fact perspective, when we see a claim that we want to fact check, we will always publish that on our website and always publish the source that we have made that decision from as well. So we're really keen that people make up their own minds, using the data that we have used, to come to their own conclusions.

I'm really struck by what you're saying about the variety of conspiracy theories, basically, that we've seen in the last few months, and it's something that we've certainly picked up on at Full Fact as well. So you mentioned Bill Gates, and I think what's interesting about Bill Gates is that there's not just one story about Bill Gates; there are about ten different stories about what Bill Gates is apparently doing and how he's trying to manipulate various things, or make money, or microchip all of us in some way. I just think it's really fascinating that there's such a variety of different things out there, and not just stories that are circulating in the UK or even Europe. It's becoming a real international information space, and that's certainly helped by the way that we are receiving information these days; the social media companies obviously are global. I feel like we're certainly in a kind of more precarious situation just through the prevalence of the English language as well; that content certainly seems to be spreading a lot more across other countries. But then equally, we are in a privileged position in that there are many more people who want to rebut those stories and are able to fact check them in some way, whereas in other countries with minority languages, perhaps that's not happening as effectively.

Yeah, I think we've noticed, too, that there is very little fact checking in Arabic, for example. If a country has a big civil society and maybe has some donors who are willing to support fact checking, that helps. Otherwise it's up to users to flag content, and then it's up to the social media platforms to evaluate the flagged content and act responsibly. And I think the real challenge, which probably you and I would both be frustrated by, is that we have very little sight of how the internal fact checking happens, or what the internal flagging process is, at so many of the firms.
Absolutely. I know that Twitter most recently have been fact checking various posts, most famously Donald Trump's, obviously, but we have no insight into who's making those decisions in the companies, what the process is for deciding which posts to check, or what content they're working on.

I mean, I'm curious about your impressions. I have the impression that in response to COVID, Twitter has actually been pretty responsive. They're certainly sharing more data sets, and, as you say, they're flagging posts that are misleading or maybe expressions of hate speech. I believe Facebook is doing a little bit more, maybe not as creatively as Twitter, but both firms are doing more around COVID than they ever did in the previous three years around other kinds of political issues. So what do you think? How do you think the different firms have been responding?

Yeah, I do think we have to give the different firms some credit for how they are responding to this very unique information crisis. As I mentioned before, it's not just around one country this time. Obviously the internet companies were thinking that the biggest information event of the year would be the U.S. election; actually, they've had to pivot and tackle these claims across basically every big country, because the whole world has had some version of this. So I do think we need to give them some credit. We have seen them take action, we've seen them make policy changes. I mentioned Twitter; as you said, Facebook have done various things, as have Google.

I've been particularly impressed by how much the companies are willing to promote accurate information ahead of other sources. So particularly if you take Google search, or YouTube when you're searching, and even Twitter and Facebook when you look for specific terms in the search bar, it's accurate and informative sources that come up first, which is a step further than we've ever seen them go before. So I think that's really positive and we have to acknowledge that. That being said, there's certainly a lot more that they could be doing; there's always more. I think for me the main thing is in that transparency space. So I mentioned Twitter not explaining how they're making their decisions; actually, that's a problem all of the internet companies have. And I think transparency would be the biggest thing, especially when we're talking about content and what people are trying to say online, particularly beliefs.
You know, we talked about conspiracy theories before. There's certainly nothing that says that you cannot believe a conspiracy theory. So these are really big questions around freedom of speech at the minute that we have to be conscious of, and I think the best way to tackle that is around greater transparency.

I agree, I agree. I'm a liberal. I think one of the last conversations we had was about the fake fact checking sites. Have you noticed any of these in your domain? Are they closing, or are more of them opening?

Sorry, say that again?

Fake fact checking sites: so they pretend to be fact checking sites but are not actually doing the checking work that you're doing.

Yeah, I think that's a slightly niche part of the problem, but it's certainly a really important one. You mentioned before that people need to know who to trust, or know which experts they can trust to give views on things that are way more complicated than they want to look into or have time to look into. Having fake fact checkers certainly is a real threat to the real fact checkers out there who are trying to do good work and are trying to be transparent. So I think that's certainly a problem. It's not something I've seen in a structured way, particularly around COVID-19, but I think there are certainly a lot of people out there who are trying to promote the truth of what's happening in the way that they see it, and trying to do that in a way that looks factual.

Shall we take a few questions, then?

Absolutely. There's a great question from David Horrigan. He's asking: do Britain, the USA and other Western governments have similar programmes as part of their soft power? I think that's picking up on what you were saying around Russia and China. I'd be really interested in your view on that.

I think the answer is yes, but they're different. So when we do our inventories of government expenditures on misinformation, if a democracy, for example, has an information campaign that's trying to discourage women from becoming ISIS brides and moving to Syria, we don't count that in our count of what governments are spending on social media manipulation. That might be an example of a fake grassroots movement that is normatively valuable, so we don't include those kinds of campaigns. In a democracy, if there's a PR firm that has clearly been caught running a misinformation campaign for a client, then we do count that. So the answer is yes, these things do exist in democracies.
And if they're tied to a reasonable national security issue, where there's judicial oversight, we don't include those campaigns. But if it's freelance lobbyists just playing with technology, then we do count those in. There also aren't very many of those; there aren't as many examples of these operations in the militaries of Western democracies, because they have to be authorised.

Thanks. Let's take another question, then, from Chris; I think it's linked to that. He's asked: are you investigating what kinds of audiences are susceptible to different kinds of misinformation?

That's a really interesting question. The thing about the different types of campaigns that various organisations or countries are running is that it's actually really tricky to study outright. There are no statistical models that will relate a tweet to a changed vote; we can't make that kind of direct causal connection. What we do know is that there's a very long tail to misinformation. So you can still measure the number of young voters in the US who think that Hillary Clinton was up to something in that pizzeria outside Washington, D.C. They don't quite know what, but they think there was something going on there. You can still measure the number of UK citizens who think they'll be saving hundreds of millions of pounds for the NHS by leaving the EU. So there's a long tail to a story: once it gets out and reinforced, it's difficult to disabuse people of it.

We think that older voters may be a little bit more likely to share misinformation. We don't know that they are more likely to believe it themselves; so it's not simply that older voters are more susceptible to believing the stuff, they are just more likely to share it. We do think that people who have more of their news diet coming from social media are exposed to more misinformation than people who have a diverse media diet: some TV, some print and some digital. So there are a few effects we know of.

The other effect we know of is that it's fairly common, across many countries we've studied, to find examples of prominent female politicians or women journalists or feminist intellectuals who are driven off social media because of some particularly nasty campaign that's targeted them. So one of the definite effects has been pushing some women out of public life.
It's really interesting what you're saying about this prevailing view in the US that Hillary Clinton was up to something. I wonder if that's actually one of the biggest detrimental impacts of disinformation campaigns: it's not that one particular story or claim is believed, it's this kind of deepening distrust in either a particular person or a particular institution or process. And that's one of the things we really need to fight against, but it's really hard to fight against because it's so intangible.

Well, as you say, it can be about process. So if there's misinformation about an election, that affects trust in elections or vote counting. And especially if major politicians say things that undermine their own electoral system, that becomes a vicious cycle where people aren't sure they should even bother to vote: they don't vote, or they make unusual choices. That creates a cycle of mistrust in politics and in our own institutions.

Absolutely, that's what I think too. Another question, actually, from Mike Thomas: do you think that the motivations and intentions of information operations have changed? He's mentioned since the early nineteen hundreds. Or are we just more susceptible to misinformation, given the amount of fake content that we're exposed to in modern life? It would be interesting to hear your views on that.

Fascinating. So I think several things have changed since the last century. Most misinformation campaigns then were very large: they were coordinated by states in times of war or major crises. They involved paying large amounts of money for radio stations that would broadcast over large areas, or dropping pamphlets from aeroplanes. They were propaganda campaigns, but only major governments could do them, and it usually involved military units and the resources of the state to enact them. So the first thing that's changed is that now it's much cheaper to do direct propaganda messaging over social media that will target particular people on the basis of race or gender. It's much more focussed, it's much less expensive, and it's so much faster; it's direct and fast.

I also think there's a qualitative change in how the propaganda is structured, in that so much of it is now A/B tested: there'll be one message that's tested on one subpopulation and another tested on the other side of the population. And once you get the click-through rates, you pick the better one, say A, and then you pit it against a variation, B, then B prime, and probably make another variation. That process of A/B testing, rapid testing of a message, refines the message so quickly and delivers it to a more and more accurate target. And so I think that's a significant qualitative difference from the information operations of years gone by.
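As a purely illustrative aside, the A/B loop Phil describes here can be sketched in a few lines of Python. This is a toy simulation, not anything from the talk or the book: the click-through rates are random stand-ins and the message names are hypothetical, but the structure shows the basic pattern of pitting the current best message against successive variations and keeping whichever performs better.

```python
import random

def measure_ctr(message: str) -> float:
    """Stand-in for real ad-platform reporting: return a simulated click-through rate."""
    return random.uniform(0.01, 0.05)

def make_variation(message: str, round_no: int) -> str:
    """Produce a slightly altered variant (B, B prime, B double prime, ...)."""
    return f"{message} [variant {round_no}]"

def refine_message(seed_message: str, rounds: int = 5) -> str:
    """Repeatedly pit the current best message against a new variation,
    keeping whichever gets the higher measured click-through rate."""
    best = seed_message
    best_ctr = measure_ctr(best)
    for round_no in range(1, rounds + 1):
        challenger = make_variation(best, round_no)
        challenger_ctr = measure_ctr(challenger)
        if challenger_ctr > best_ctr:  # keep the better-performing variant
            best, best_ctr = challenger, challenger_ctr
    return best

if __name__ == "__main__":
    print(refine_message("Example campaign message"))
```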
But Nicola, what do you think?

Yeah, I agree. I also wonder whether people are much more aware of these tactics now, and are aware that this could be happening. Even in the U.K., we had a particular rise in consciousness in 2018, particularly around the Salisbury poisoning, where we saw a really coordinated campaign from the Russian state, and obviously this year as well. As I mentioned before, there have been a lot of news and education campaigns encouraging exactly the kind of things you mentioned: 'Take care before you share', the UN campaign that ran last week; the BBC have done ones as well around checking your source, thinking before you share, and all of those things. So I do wonder if people are much more aware of it now, and whether that also means that the tactics have to change.

I think journalists are also getting better at reporting this kind of stuff. There are more fact checking organisations like yours that help everybody by providing a resource, and I think for news organisations it's now pretty standard that, for pretty much every election, somebody will write about the information operations at play. You know, Nicola, listening to you talk about innovation in this, I do want to point out that much of the innovation happens during a U.S. presidential election, because that's when tens and hundreds of millions of dollars are spent doing tricky things with new technologies. So what happens in November 2020 is going to have follow-on consequences for other democracies. Whatever tools and tricks get practised over the next five or six months in the US will come to other democracies. There are things that the platforms will do, I hope, to try to prevent the spread of those tactics, but the big cycles of innovation seem to be 2012, 2016, 2020; that's when innovation in this area happens.

Yeah, I think that's completely true. Do you think, though, that there is a risk that we get too caught up in the US election cycle in particular?
Yes, I think that's a problem. And you would know this from your own work history: many of the things that political leaders do in the US are illegal in the UK or the EU. They simply could not be done here; the punishments and fines involved would be serious and swift and immediate. In the US, it tends to be a much less regulated environment. Elections officials don't have the resources to track everything very well, and there's a lot of variety state to state: in some states elections administrators will catch things, and others just don't have the resources to do it. So I think you're right, we shouldn't assume that whatever nasty tricks get developed in the US will be applied elsewhere. But equally, it would be nice to be ahead of the game and make sure that they don't come here.

Absolutely. Ahead of the game is definitely the goal here, and I wonder if that's ever possible, given the amount of resources that various actors put into, as you say, A/B testing and finding the best techniques.

There's one interesting way, we were talking about YouTube earlier, one interesting way we are ahead of the game, and that is with fake videos. It turns out that it's fairly hard at the moment to produce a good fake video, and when a fake video is uploaded to YouTube and compressed, YouTube can detect something about the fakeness in the way the compression works. So for the moment, they are ahead of the game there. Technically, it's complicated; I'm not sure there's ever been a successful misinformation fake video uploaded to YouTube, not that I'm aware of.

What I think is interesting is that people are swayed by fake videos, and actually there's almost the other problem: somebody gets caught on video doing something, you know, odd or less than ideal, and then there's the possibility that they could claim that it was a fake video. So we've also got that risk, and these technologies are becoming more and more available as well.

We've got a couple of questions here about the role of the media particularly. So, John Rosenfield has asked: is there a problem of misinformation in the media due to a problem of editorial process?
A couple of people have also picked up on the point about the demise of local media, and that outlets are all competing for attention and so potentially pushing their stories to be more sensationalist or to have more dramatic headlines. How much of a problem do you think that is, and how much is it contributing to the misinformation and disinformation space?

I think it's a medium-sized problem. I think the instinct is right that more and more outlets need to generate sensational titles and headlines, and that editorial decisions shape the quality of reporting. I would also say that there's some evidence that people, social media users on their own, do tend to have good information behaviours on the whole. So when we see a mistake on Wikipedia, we tend to edit it and correct it; it's called positive herding, where we do things as a group that tend to be helpful. And when we see an obvious piece of junk news, we tend not to forward it. Certain people do all the time, but most people, most of the time, don't share misinformation. Most of the junk news out there isn't for the average social media user; it's not consumed by the average voter. It's actually a fairly particular niche that is the audience for that stuff, and it doesn't come from mainstream outlets.

For example, we did a study once of misinformation on national security issues targeted at military personnel who were on duty overseas. So this is misinformation about Syria and about what was going on in India and Pakistan, targeted at NATO military personnel who were doing their service. And one of the things we found is that people doing military service are amongst the most sophisticated news consumers we've ever seen. They don't share any of that; they know where it comes from and they don't fall for it. The challenge is their friends and family, the people around them who are not doing the military service but who are interested in military conspiracies; they're the ones more likely to spread misinformation on national security. So editorial process is definitely a part of it, but I think if more people got their news from professionalised news outlets, they would have a higher quality information diet.

Yeah, I agree with you, it is a medium-sized problem. Obviously at Full Fact we actually spend a lot of our time fact checking newspapers, particularly where they have misrepresented statistics or polls. Polls are the worst, actually; a lot of people misinterpret polls.
So we do quite a lot of fact checking of the newspapers. And I think what's been interesting with COVID-19, actually, and I'm very much generalising here, is that a lot of the newspapers in the UK were publishing some of the conspiracy theories, particularly around 5G, for example, at the start of the year. But we've really seen that stop. I think newspapers are being a lot more careful, particularly about things like conspiracy theories or cures; that's another thing that we've seen during COVID-19.

Which, may I ask you a question?

Yeah, of course.

So since starting at Full Fact, and since having such a big impact, have you noticed that media outlets are doing less fact checking on their own? Most professional outlets are supposed to have fact checkers, right? There are different processes, but most of them are supposed to do their own fact checking. Have you noticed that journalism is pulling back a little bit on its own fact checking staff and resources because Full Fact is there to do that proofing?

Yeah, I think that's a really interesting question. I mean, the answer is I don't know specifically whether newspapers or media outlets are withdrawing from fact checking. Obviously, it's a really difficult time in the media landscape, and various organisations are cutting their staff, and I think there's probably a mixture, maybe an amalgamation of fact checkers and investigative journalists. And I think investigative journalism is certainly one of the key aspects of the media that needs to be protected, because it is such a valuable job, beyond the reporting of a single story: actually diving underneath and explaining how we got here and what's happening, and making sense of some really complicated issues. If that gets lost, then that only adds to this problem of only getting a surface understanding of a story; that's certainly an issue.

What we have seen, though, in the last couple of years in particular, and especially with COVID-19, is more organisations setting up teams who will look at claims that are circulating on social media in particular. So we saw this last year with the elections that happened in the UK, and we're seeing it with COVID-19: the BBC have an excellent team looking at social media and the claims that are being circulated there, but also Channel 4. So I think that's certainly increased, and I think it's been a really positive increase as well.

We're getting towards the end of this session.
So why don't we finish on a bit of a positive note? 448 00:49:57,100 --> 00:50:01,580 So, what's your top tip on how we solve this problem? 449 00:50:01,580 --> 00:50:09,660 So I think the top tip for individuals is to be more careful about forwarding, right, 450 00:50:09,660 --> 00:50:17,520 to go through our own follower lists and get rid of the people we really don't know, and to be careful forwarding content. 451 00:50:17,520 --> 00:50:22,210 That's the individual tip. I do think that there's a structural thing we can do. 452 00:50:22,210 --> 00:50:25,930 And in some ways this is pie in the sky thinking. 453 00:50:25,930 --> 00:50:33,460 But I think data is how we express ourselves now: the stuff we buy on our credit card, 454 00:50:33,460 --> 00:50:38,920 our physical location in space, what we do with our mobile phone, all of that. 455 00:50:38,920 --> 00:50:44,450 All those devices, and the users of those devices, generate politically meaningful data. 456 00:50:44,450 --> 00:50:50,140 And right now, most of that politically meaningful data goes to Silicon Valley. 457 00:50:50,140 --> 00:50:55,570 So my top tip solution is something I've actually borrowed from the blood diamonds campaign. 458 00:50:55,570 --> 00:50:58,990 One of the innovations of the blood diamonds campaign was 459 00:50:58,990 --> 00:51:04,780 this argument that if you could tell a consumer where a diamond came from, 460 00:51:04,780 --> 00:51:11,020 most consumers would not buy the diamonds that came from the nastiest pits of Africa. 461 00:51:11,020 --> 00:51:18,310 They would buy clean diamonds. And I think we should be able to look at any device that we have and ask it to tell us 462 00:51:18,310 --> 00:51:24,640 who the ultimate beneficiary is of the data that's collected on that device. 463 00:51:24,640 --> 00:51:30,130 For the diamonds, you can say who the ultimate beneficiary of purchasing a diamond is. 464 00:51:30,130 --> 00:51:39,100 Who's the ultimate beneficiary of the location data coming from my phone or the consumption data coming from my refrigerator? 465 00:51:39,100 --> 00:51:48,700 At the moment, devices can't even do that. The answer is that it's mostly some Silicon Valley firms and their third party organisations. 466 00:51:48,700 --> 00:51:57,800 But if we could get a device to tell us who's benefiting from our data, then maybe we could add organisations to that list of beneficiaries. 467 00:51:57,800 --> 00:52:07,180 If I as a citizen want to add my faith based group, some COVID health researchers, or a couple of academics to the flow of my data, 468 00:52:07,180 --> 00:52:14,230 I probably would. I think many of us would volunteer to put data in the hands of health researchers if we could. 469 00:52:14,230 --> 00:52:16,390 Right now, we don't even have that infrastructural choice. 470 00:52:16,390 --> 00:52:25,590 So I think the first thing we really need to do is break this monopoly flow of data to Silicon Valley. 471 00:52:25,590 --> 00:52:32,460 Yeah, that's really interesting. I love that comparison to the blood diamonds and everybody being more conscious.
472 00:52:32,460 --> 00:52:35,190 I think there's a lot to say about conscious choices, 473 00:52:35,190 --> 00:52:44,520 particularly when we think about the environment, and if we can apply that same mentality to our choices about how we spend our time online or which platforms we use, 474 00:52:44,520 --> 00:52:47,190 I think that's a really interesting concept. 475 00:52:47,190 --> 00:52:56,040 I guess I would build on that and say that we need collaboration across a variety of different sectors to push for this change. 476 00:52:56,040 --> 00:52:57,980 It's been really fascinating, what we've seen in the last couple of weeks, 477 00:52:57,980 --> 00:53:05,650 with a variety of advertisers pulling their adverts and spend from a variety of different platforms. 478 00:53:05,650 --> 00:53:11,550 I'm not sure that's the long term solution, but it's certainly an interesting lever that's being pulled. 479 00:53:11,550 --> 00:53:18,030 And I think it has seen the platforms respond in some way and make some changes, particularly Facebook. 480 00:53:18,030 --> 00:53:23,580 But, yeah, I think collaboration and collective action is needed to tackle these problems. 481 00:53:23,580 --> 00:53:28,560 And just for our last minute, we've had one request which has been very much backed. 482 00:53:28,560 --> 00:53:32,850 Could you please tell us the Tinder story? The Tinder story. 483 00:53:32,850 --> 00:53:41,340 So one of our researchers, a few years ago, for the 2019 UK election, 484 00:53:41,340 --> 00:53:47,820 found a Tinder bot that would flirt and then talk about Jeremy Corbyn. 485 00:53:47,820 --> 00:53:56,310 And the only reason we know about it is that the campaign managers who built the bot on Tinder 486 00:53:56,310 --> 00:54:05,970 went on to Twitter and thanked the bot for giving them a few percentage points' edge in a couple of key districts. 487 00:54:05,970 --> 00:54:13,220 So they actually went on Twitter, said thank you to the Tinder bot, and named the districts where they thought it had given them an edge. 488 00:54:13,220 --> 00:54:18,840 I don't know whether the Tinder bot actually did get particular MPs elected, 489 00:54:18,840 --> 00:54:25,780 but the campaign managers who testified to it on Twitter seem convinced that it was worthwhile. 490 00:54:25,780 --> 00:54:32,100 So whether or not you believe it, the important point is simply that whatever platform emerges 491 00:54:32,100 --> 00:54:37,900 will have people talking politics on it a little bit, and will have other people trying to manipulate politics. 492 00:54:37,900 --> 00:54:45,670 That's certainly a great example of innovation in surprising places. 493 00:54:45,670 --> 00:54:47,120 Yes. Absolutely. 494 00:54:47,120 --> 00:54:54,370 And we haven't even touched on some of these new platforms that are emerging and the role they have to play in the information environment. 495 00:54:54,370 --> 00:54:57,980 But we'll pause there. We have run out of time, although I'm sure we could discuss 496 00:54:57,980 --> 00:55:03,350 these issues for longer. Apologies for those questions that we didn't get a chance to answer. 497 00:55:03,350 --> 00:55:07,670 Please do have a look at Phil's great book. 498 00:55:07,670 --> 00:55:13,880 You can buy the book. There should be a link just at the bottom of your screen that says buy Phil Howard's latest book. 499 00:55:13,880 --> 00:55:18,470 Go ahead and click that. And Phil, do you want to repeat the discount code as well?
500 00:55:18,470 --> 00:55:22,920 It's capital Y, S, O, I guess, right. 501 00:55:22,920 --> 00:55:27,820 And it works on the Yale University Press website. Great. 502 00:55:27,820 --> 00:55:33,350 So thanks very much to the Oxford Martin School for arranging this event. Thank you to everybody for joining this evening. 503 00:55:33,350 --> 00:55:41,870 I hope you enjoy the rest of your evenings and, yeah, think before you share the creepy headlines. 504 00:55:41,870 --> 00:55:47,304 Thanks very much, everybody.