Interviewer: So could you start by giving me your name and saying what your position is?

Richard Fletcher: Yes. I'm Richard Fletcher, and I am a senior research fellow and leader of the research team at the Reuters Institute for the Study of Journalism at the University of Oxford.

Interviewer: Okay, thanks very much. We don't have time for your entire life story, but could you give me a rough summary of how you got to where you are now, and how you became interested in your field of research?

Richard Fletcher: So I originally studied computer science at university, and after that I did a master's degree in science, technology, medicine and society, then a PhD in sociology. After that I was looking for research opportunities, and I found one studying journalism at City University in London. That project was looking at how it would be possible to build a tool that could be used by journalists and the public to verify information that they find on social media. And this led to a sort of follow-up position here at the Reuters Institute; that was in 2015, and I've been here ever since. Here we research journalism and how people get news and information across different countries. And, of course, that fed into some of the work we did on coronavirus starting in 2020.

Interviewer: And we'll come back to that. But I'm quite interested in the transition you made from being a computer scientist to being somebody who is, I suppose, broadly speaking, in the social sciences.

Richard Fletcher: Yes. It wasn't planned; it was something that just happened to work out quite well. I had worked on projects which involved data when I was studying computer science, although data wasn't really seen as being as important then as it is now. In the intervening time the social sciences became more data-oriented, I think, and so those skills, working with data and programming, became useful for social research in ways that I didn't anticipate. But it's worked out reasonably well.

Interviewer: And when you went to study journalism, had you had any thoughts of actually becoming a journalist yourself at that point, or were you always interested in studying it as a field?

Richard Fletcher: No, I'd never wanted to be a journalist. I was interested in journalism and the news media, but it's not something I'd wanted to work in actively.
I was just interested in being a researcher and in understanding society and people better, and people's relationship with the news media, how they get news and information, has always been interesting to me.

Interviewer: And that's something that's massively changed over the last, I don't know, do you want to say the last 30 years?

Richard Fletcher: Yes. One of the ongoing projects that we have here at the Institute is called the Digital News Report, and that started in 2012. It was one of the first projects that really tried to look at how people got news and information across multiple countries. People had done this in specific countries, but they hadn't compared them that often. So it started before I was here, but it's still going. It's based on an annual survey, now in about 40 different countries, of how people get news and information. And since 2012, when it started, up until this year, we've seen huge change: in particular, the rise of social media as a source of information, the decline of traditional forms of news media like print and, to a certain extent, television, and the growth of smartphones as a way for people to connect to the Internet and then access news.

Interviewer: And is it possible to put figures on the size of that change?

Richard Fletcher: Yes, roughly, from memory. In 2013, in many countries, about a quarter of people said that they regularly accessed news and information on their smartphone, compared to around 70% in some countries now. So it's a huge change, and it's something where, even if you look at it year on year, you can see large increases. This is why we do the project every year rather than every five years, which is what a lot of longitudinal studies do, just because the rate of change has been so dramatic in some areas. Of course, this varies across countries, and some countries haven't changed as much as others. But that's what makes it interesting, and that's what justifies the inclusion of lots of different countries in the study.

Interviewer: And that presumably also involved a proliferation of providers, from having a relatively small number of recognised news organisations within each country to having a vast number of sources available.

Richard Fletcher: That's right. There's been a huge growth in information supply, but at the same time the demand side hasn't really changed a huge amount.
So there's much more choice than there was. And of course this pre-dates the Internet to a certain extent, with the growth of cable television and so on, but the Internet has made it more pronounced. And although it's still true that the traditional news organisations in this country, the BBC, The Guardian, the Mail and so on, are the most widely used sources of news, there are many more outlets in the so-called long tail, meaning that if people are aware of them, they can access them online. And this has added huge diversity to people's news and information repertoires in some cases.

Interviewer: And does that also mean that there's been a fragmentation on the demand side, that people fall into groups that access their news from a particular source and would never look at another source? Is there any evidence of increased fragmentation like that?

Richard Fletcher: Not really. I think it's still the case that for most people the large traditional news brands are the most widely used, and that accounts for a huge amount of the total news use. If there is fragmentation, it's on a smaller scale, among smaller groups in society who perhaps have turned away from these large news sources and are accessing alternative sources. But even many of these people are still coming back to, for example, the BBC and the Mail and The Guardian and so on as a reference point. So it's only a very, very small part of the population that has abandoned the mainstream news sources entirely.

Interviewer: And can you tell me a little bit more about how the Digital News Report works, maybe step us through how you collect the data?

Richard Fletcher: So the data we collect each year through an online survey which is administered by the polling company YouGov, but the questionnaire is designed by us, a team at the Institute. We go into the field in January and February each year, and that gives us enough time to analyse the data ready for publication in the summer, usually in the middle of June. So it's based on an online survey, but we also supplement it with qualitative research, not in as many countries: we typically run focus groups or interview sessions in a handful of countries, which we can use to better understand some of the data we get from the survey.

Interviewer: And YouGov makes the selection? It targets a representative sample?

Richard Fletcher: That's right, yeah.
Obviously it's an online survey, so we can't reach anyone who doesn't have access to the Internet, but in many of the countries we study internet penetration is very high. And we use quota sampling, so YouGov ensure that we have the right number of men, women, people in certain age groups and so on, to match the national population.

Interviewer: And what kind of questions are you asking?

Richard Fletcher: Some of these are quite fundamental questions: what sources of news people are using, how often they're using them. But we also look at people's attitudes towards the news, whether they trust it, whether they think they're avoiding the news more than they used to, and things like that. So it's a mixture of behaviour and attitudes. About two thirds of the survey we keep the same every year, because we want to track developments over time, and the remaining third is devoted to new issues that have emerged that we want to get people's opinion on.

Interviewer: So finally, turning to coronavirus, just from a personal point of view, can you remember how you first came to hear about it and realised that it was something that was going to be relevant to your research?

Richard Fletcher: I can't remember the exact dates, but I certainly remember hearing about it as something that was still quite far away in January 2020, and discussing it with friends and so on. And on Twitter you would see experts, epidemiologists, speculating, but also, I remember, seeming quite concerned. I think that was probably the very first moment that I thought this could be something very serious. And then I think it really dawned on me that this was going to have a large impact on daily life when European countries started going into lockdown, and at that point it felt almost inevitable that the UK would follow.

Interviewer: And did you see that as an opportunity to dedicate part of your data gathering, your research, to focus on how that was going to impact on the consumption of news and trust in news and so on?

Richard Fletcher: Well, I'm biased, but I always think that communication is important for any sort of political event, and I think particularly so during a crisis. And I do remember thinking that maybe we could do some research that would help. Particularly in the UK, we had a good sense of where demand for the sort of research that we do might come from.
We thought we were quite well placed to be able to make a contribution, because we thought we could do something that was timely, we imagined the situation would change very quickly, and we were agile enough to be able to generate some reliable data that could feed into people's understanding.

Interviewer: And so what did you do? Did you need to raise funding in order to get something started, or were you able to simply adapt what you were already doing?

Richard Fletcher: It's a bit of both. One of the first things we did, and this was in April 2020, was a six-country survey of news use, but specifically coronavirus news use, and the funding for that came from an existing project that was ongoing, the Misinformation, Science and Media project, which was funded by the Oxford Martin School. We were able to use some of that funding to work on this. Simultaneously, I think, we also sought funding from the Nuffield Foundation for a project looking specifically at the UK over time. We proposed a ten-wave survey, starting with a large panel of respondents and then coming back to them every two weeks for several months. For that we bid for funding from the Nuffield Foundation, and we got it, so we were able to start that in April as well.

Interviewer: So let's take those two projects. Obviously misinformation has been an issue throughout the pandemic, but it was an issue before in a variety of other political and social areas. So what were the findings of the first study?

Richard Fletcher: The primary purpose of the six-country study, I think, was to look at the differences between countries, and we deliberately selected countries that were at different stages. So we had countries from Asia-Pacific, where the pandemic had been going on for a little bit longer; we had European countries which were just going into lockdown; but we also had Latin American countries where the pandemic hadn't really arrived to the same extent. So that was the point of comparison. One of the things we found, despite those differences, was a very large increase in self-reported news use: people said they were accessing news a lot more. And at the same time we also noticed an increase in people's trust in the news.
We called this a "rally around the news" effect, which borrows a term from political science, the rally-around-the-flag effect: in a crisis, people's support for their leaders tends to go up, I guess because people are concerned about the future and have little choice but to put a bit more trust in those in charge. And we saw a similar effect across all countries, because even though the pandemic hadn't arrived everywhere, it was still a global news event and people were aware of it.

Interviewer: So that's interesting, that people were more trusting.

Richard Fletcher: Yeah. I suppose we'd expected news use to go up, that wasn't a huge surprise, but we weren't quite prepared for the increase in trust. Just to keep in mind that since 2015, when we started tracking trust in the news, there's been an almost consistent decline in many countries. So it was very unusual to see trust go up in that way.

Interviewer: And do you think the reason was the fact that there was this global crisis?

Richard Fletcher: We think so. Later on, when we asked similar questions in the Digital News Report, which is of course across 40 different countries, we saw an increase in trust, even though that was a bit later, in almost all of them. And we thought that there was really only one global event big enough to create that sort of parallel rise everywhere. Of course, then the next question is what was actually going on, and there we're not so sure. But my thinking is that coronavirus squeezed out a lot of the political news and replaced it with something that was a bit less partisan, a bit less divisive. When you ask people why they don't trust the news, they often mention bias and this kind of thing, and I think there was less room for that. Not in every country, of course, but in many countries there was less of that, and I think that filtered through to people's attitudes.

Interviewer: And presumably there was just a hunger for information. I'm thinking particularly of things like, and I don't know whether this was part of your six-country study, but in the UK there were daily briefings with the Prime Minister and the chief scientific and medical advisers. And I think a lot of people, I mean, I only watched them occasionally, but a lot of people felt that knowing what had been said at those briefings was an important part of their day.

Richard Fletcher: That's right.
One of the other things that we saw early on was an increase in the use not just of news organisations as a source of information about coronavirus, but of other sources as well. Politicians and scientists were much more prominent than they would normally be, especially scientists who, as you mentioned, were part of the daily briefings in the UK and were used as a source of information much more widely than they would be in normal times.

Interviewer: And, I mean, it's relatively new that the general public have had access to things like the Johns Hopkins data, which was published and updated at regular intervals, and some of these other online data sources that simply presented the figures, which I suppose previously would have needed to be mediated through some kind of journalistic filter.

Richard Fletcher: That's right, yes. Ofcom, I think, were doing a study that was quite similar to ours in the UK, I think it's still ongoing, and they were tracking that as well: the use of certain dashboards tracking cases and deaths and so on. In their data these were quite widely used, and I think checking in with these sources became part of some people's, not everyone's of course, but some people's daily media habits.

Interviewer: So turning to the UK study, that ran for about eight months, is that right? Exactly when was it?

Richard Fletcher: It ran from the end of April to the end of August in terms of data collection, and then we were still publishing results up until October. But the fieldwork was April to August.

Interviewer: And what were the main questions you were asking?

Richard Fletcher: They were similar to those we'd asked in the six-country study, but the strength in this case was collecting data from the same people over time, so we could really see how people's attitudes and behaviours were changing. One of the things we saw fairly early on in the project was that the peak of news use, and also of trust, was really at the end of April and the beginning of May, and the decline after that was quite sharp. Alongside that, you could also see people's behaviour changing. We were going into the summer at this point, and people were describing how they were following the guidance slightly less religiously than they had been, even if the majority were still following the guidance very closely.

Interviewer: I guess that was one of the questions that you asked?

Richard Fletcher: Yeah, we did.
We asked that, and I think, not just us, this was something that others were tracking as well, and one of the defining themes was the high degree of public willingness to follow the rules. One of the other things we did was ask people questions designed to tap their knowledge of coronavirus, how much they knew about the virus and the rules surrounding it. We didn't have anything to directly compare it to, but what we saw were very high levels of awareness and knowledge of the virus and of what people needed to do in order to stay safe from it.

Interviewer: So the peak and decline in trust and the high level of knowledge were two of the findings. Were there others?

Richard Fletcher: Yes. As part of this project we published the results of each survey shortly afterwards, at two-week intervals, in what we call factsheets, which had the numbers. We also did a couple of longer reports that tried to bring together some of the findings from those ten weeks. The first of those looked at what we called inequalities in news and information use. At the beginning of the pandemic, when news use was at its highest, there were really very small differences between demographic groups in terms of how much news and information they were getting, because more or less everyone was getting news at that point. So it had papered over some of the long-standing inequalities in news use that we know exist. For example, younger people tend to consume less news, women tend to consume less news, people with lower levels of formal education consume less news, and so on. At the beginning of the pandemic in the UK those gaps had temporarily disappeared, and what we saw over time was their re-emergence. Although news use was still a lot higher, even during the summer, than it would be in normal times, those gaps re-emerged. And because we think that people's news and information use is very important for how they behave and how much they know during a crisis, that was a cause for concern. Later on we tried to come up with a concept that captured this, which we called the "infodemically vulnerable".
This was the group that consumes almost no news and information about coronavirus and also has very low trust, or no trust, in that information even if they were to come across it. Partly because of the inequalities that had re-emerged during the pandemic, but also because we saw a rise in news avoidance, people stepping back from the news because they think there's too much of it or it's having a negative effect on their wellbeing, we saw the size of the infodemically vulnerable group grow slightly over time. And of course this can be linked to lower levels of knowledge, less willingness to follow the rules and so on.

Interviewer: So how big did that group get?

Richard Fletcher: It was still quite small. When we first started to measure it, I think it was about 5%, and by the end of the fieldwork I think it had reached roughly 15%. So it's not a huge group within society, but the fact that it was growing was still a concern.

Interviewer: Yes. So that study finished before the vaccination programme?

Richard Fletcher: Yes, it did. We thought about continuing it, but that would have been quite complicated, because you start off with a panel of a certain size and people drop out over time, and eventually, because you need to keep talking to the same people, you end up with a group that's too small to make any sort of meaningful claims about the population. So that project had finished a few months before the vaccination programme started. But we were still interested in attitudes towards vaccines, because of course they were already being talked about at the time, and we did have some data on people's hypothetical willingness to get a vaccine.

Interviewer: And have you been able to ask questions relevant to the pandemic in your bigger news survey?

Richard Fletcher: Yes. The six-country study that we conducted in April 2020 we were able to repeat in 2021, with eight countries this time, and we asked many of the same questions so we could see the year-on-year change. By that point we were able to include questions on vaccination, because the programme had started in some countries and people were much better informed, so it was less hypothetical than it had been. And also, yes, in the Digital News Report we had some questions on coronavirus as well, but normally a bit broader than
the questions that we had in the coronavirus-focused studies.

Interviewer: And did you see much change between 2020 and 2021 in the kinds of things you were seeing?

Richard Fletcher: Yes. It's difficult to capture it in one sentence, but in many senses, yes. Essentially, and it's not really a surprise, we saw in the data a year later that people were using far less news about coronavirus. That's perhaps not a surprise. But we did also see declines in trust, not just in news organisations but in most other sources of information about the crisis: governments, politicians, international health organisations, scientists. In some cases these declines weren't huge, but there was certainly evidence of them. I think by that point, even though scientists in particular were still very highly trusted, perhaps a small part of the population had picked up on the natural uncertainty that goes with a situation like this, and had perhaps lost trust after, you know, mistakes were made and so on. But they were still the most trusted sources of information in almost every country. And, as I mentioned, we also asked questions on vaccines, and in particular we were interested in people's belief in misinformation about vaccines. There the picture was quite encouraging: we found relatively low levels of belief in some of the most widely disseminated false claims about vaccines, the claims about Bill Gates and that kind of thing, but also claims about some of the supposed health risks and so on. We found relatively low levels of belief in those, which was encouraging, and I think that broadly tallies with the data on uptake, and also data, not from us but from other sources, showing vaccine hesitancy decreasing quite a lot over time as the vaccine programme was demonstrably working. So I think it aligned with that as well.

Interviewer: Shall we have a brief pause? The coffee is probably still a bit hot.

Interviewer: You've mentioned that other organisations have been doing similar research. Has your research become more collaborative as a result of the pandemic? Has it prompted you to form wider links with other organisations, or have you always been working that way?

Richard Fletcher: I think many of the links that we already had we used during the coronavirus research.
So, for example, on the Digital News Report, where we are studying lots of different countries, we don't claim to be able to understand all of those countries in the way that people on the ground do, so we have a network of partners in those countries who help us understand some of the data and also formulate the questions, and we were able to draw on that network when we were looking at coronavirus as well. The other organisations that we used for the coronavirus research were fact-checkers; we drew a lot on their work. And I've already mentioned Ofcom, who do an enormous amount of media research in the UK; their work, which they were also publishing in real time, as we were trying to do, helped inform what we were doing as well. And of course there were a handful of other projects doing similar things, like the Winton Centre for Risk at the University of Cambridge, who were looking at similar things, not with a focus on the media but with a focus on public attitudes towards coronavirus. So I think those were the main projects and organisations that we were, not collaborating with directly, but communicating with and looking at their work.

Interviewer: And who would you say are the main consumers of your research? You said at the start that you wanted to do something to help. Who is using the kind of data that you generate?

Richard Fletcher: We see our main audience as a mixture of journalists and others who work for news organisations; policymakers; commentators and observers who are interested in developments in news and the media, an example of that might be people who write about tech or perhaps even work for big tech companies; and also academics. The reports that we publish, we hope academics read them, but they're mainly designed and written in a way that people who don't have academic social science expertise can understand. And we publish academic articles as well, for the academic audience. But I think we see it quite broadly, and we certainly aim to act as a bridge between research that is conducted to academic standards and policymakers, journalists and so on.

Interviewer: And how do you think your findings have influenced policy?

Richard Fletcher: Well, particularly during the beginning of the UK project, we had a lot of media coverage that was picked up by politicians. During Prime Minister's Questions,
for example, we were mentioned in one of those. And also I gave evidence to an inquiry by the Public Affairs and, I'll have to check the exact name of the committee, which was looking into the use of data during the pandemic, and I gave evidence there based on our research in the UK.

Interviewer: I'm still thinking about this article I read suggesting that the pandemic will end when people stop looking at the dashboards. I mean, can people have too much data, too much information?

Richard Fletcher: I think it's possible, particularly when there's so much uncertainty, like there was at the beginning and to a certain extent there is now as well. Certainly at the beginning of the pandemic there was a huge amount of uncertainty and there was a lot of data: data coming from multiple countries on cases, all of which were being measured in slightly different ways, and also hospitalisations and so on. And it was easy at that point to use that data and arrive at a misleading conclusion, even for people with the best will in the world, and sometimes even experts in the topic, faced with a new situation. I think it was very difficult, and if experts are finding it difficult when this much uncertainty is involved, then the public will likely struggle with it as well. Whether that means there was too much information, I don't know, but I can certainly see the risk. And of course this is in part where the term "infodemic" comes from and why the WHO started to use it. I think there were some instances of that, as I said. But there's an equally pressing problem, not so much in the UK but in some other countries, where there's a real lack of data and a lack of transparency around the data that does exist, and of course that's a serious problem as well.

Interviewer: So turning to things more personal, how did the first lockdown impact on what you were able to do?

Richard Fletcher: Well, I certainly remember at the beginning of the pandemic, well, not the beginning, but when the UK went into lockdown, a huge amount of uncertainty around how it would work and even whether big parts of society might cease to function. That was certainly a concern, and I know people around me were thinking along similar lines.
I think it's easy to forget how real some of those risks felt right at the beginning. In terms of working, that part of it was relatively straightforward, just because most people in the research team at the Institute were used to working from home; in their previous jobs they had perhaps worked from home a little bit as well, and social scientists can do quite a lot with just their laptop. So in one sense a lot could carry on as normal. Other things became almost impossible: face-to-face qualitative research wasn't an option, so we had to move online, and we did that in some of the research, not just on the Digital News Report but on a parallel project called the Trust in News Project, where we wanted to do face-to-face interviews and just weren't able to do them. So there are some practical aspects of research that became much harder. But for the day-to-day work, I was quite proud of the people I work with, who were able to adapt to using video conferencing software, doing everything by email, accessing the university systems remotely. It wasn't easy, but that wasn't the most challenging part, I think.

Interviewer: What was the most challenging part?

Richard Fletcher: Well, just everything to do with the social interactions that you miss. I really believe that they're important when it comes to generating ideas for research, brainstorming sessions to come up with questions to include in the survey, things like this, which just don't really work well online. And I think that, combined with the challenges of working from home. Personally I'm relatively fortunate, but for those who had children they were home-schooling, or who were sharing a small flat with a partner who's working there as well, that kind of thing. There are of course people on our team who were in that situation, so I think they would certainly have found it difficult.

Interviewer: Have there been any positive outcomes? We've all got extremely familiar with working remotely, and sometimes it is just more convenient. Do you think it's something that's going to stay part of the package of things that shape how you work in the future?

Richard Fletcher: Yes. I think people are different: some people struggle with remote working, and for others it's really a kind of blessing. People often don't like commuting,
and they perhaps don't feel like they thrive in an office environment and prefer to work at a slightly different pace remotely. That's some of the feedback that we've got, not necessarily at the Institute, but more widely. So I think it suits a lot of people, and I think it makes sense, therefore, to think about how that sort of hybrid working can carry on.

Interviewer: What about travel? A lot of your work has an international dimension. Were you doing much travelling before?

Richard Fletcher: Yes. When you're covering lots of different countries, eventually the people in those countries want to hear what you find, because we can't put all the data from a 46-country study into one report, so there's lots that isn't included, and maybe you can do a kind of deep dive on particular countries, and people are interested in that. But also, you know, international bodies, the European Parliament and so on, there's travel to do events there and to meet with colleagues, often outside of academia. We did a lot of that before the pandemic, and I still do some of course, but that's the kind of thing that can be done remotely. I think it remains to be seen whether that will go back to in-person events, but it's increasingly something that can be done online, I think.

Interviewer: I suppose there's also a growing awareness that climate change pressures suggest that travel should be minimised.

Richard Fletcher: Yes, exactly. I think there are good arguments there, and also, now we have a kind of proof of concept that this kind of thing can be done online, that might make it easier to change habits.

Interviewer: And what about your working hours? Did they stay much the same, or were you working harder than you had before?

Richard Fletcher: The UK project was on a two-week cycle, and it sounds like that should be manageable, and we thought it would be, but it was quite relentless. As soon as we finished writing up the results from one wave of the survey, we immediately had to start on the next one.

Interviewer: How many of you were working on that?

Richard Fletcher: There were four of us.
Myself; Rasmus Nielsen, who's the director of the Institute; a former colleague called Antonis Kalogeropoulos, who's now at the University of Liverpool; and Felix Simon, who is doing a PhD at the Oxford Internet Institute. Those were the four of us on the research side; of course, we have people working on the publications within the Institute as well.

Interviewer: And how was the data gathered? Were people sent questionnaires, or were you actually, I don't know, ringing them up or whatever to gather some of the data?

Richard Fletcher: So again, YouGov were administering it: we designed the questionnaire, and they sent it out to their panel and delivered the results back. But to do it every two weeks, and then, before going into the field for the next survey, we wanted to publish the results, which we did in the form of what we call factsheets, kind of eight-page summaries of the data. I'd be lying if I said that was a nine-to-five job. But other than that, for me personally it's been quite similar: I've spent the time that I would have spent in the office at home, and then stopped at a certain point. But you often hear about other colleagues, not necessarily here but in the wider university, whose hours have perhaps changed a bit because there's less of a barrier between work and home.

Interviewer: Do you have any involvement in teaching?

Richard Fletcher: No, I don't do much of that.

Interviewer: So that's not something that affected you. But were you involved in any discussions about your department's safety regime and the measures that were brought in, or did you just stick with what was decided?

Richard Fletcher: No, not at the department level. But we have our own set of policies for this building, which I've been involved in discussions and decisions about: when we come in, when we don't come in and so on. But in general we've kept quite closely to the government guidance and also the university's policy.

Interviewer: So that wasn't something that caused division among your colleagues?

Richard Fletcher: No, no. We have tried to be as flexible as possible. We have policies that apply across the board,
but if people can't stick to those for good reasons, and it's within what's allowed by the university and the government, then we can easily accommodate it.

Interviewer: I think a lot of people found the whole lockdown business very hard emotionally and psychologically. But do you think the fact that you had a specific project to work on, one that, as you said, you thought would help, contributed to your own wellbeing?

Richard Fletcher: Yeah, I think it did, just because you've really got something to focus on. Of course it's related to the thing you might otherwise be worried about, but I think it is possible to detach slightly from that and focus on doing the work. And particularly at the beginning, as I mentioned, I really thought we could help, not, of course, to the same extent as doctors and scientists working in other areas, but in a small way I thought it would be useful to have this information, and that's obviously motivating. So I think it probably did, overall, help to have something like these projects to focus on.

Interviewer: Did you ever feel personally threatened by the virus? You did sound worried when you talked about parts of society possibly ceasing to function.

Richard Fletcher: Yeah, it maybe sounds slightly silly to think about it like that now, in hindsight, but I certainly remember thinking that at the time. I was less worried for myself: I think even very early on it was quite clear that this was a virus, or a disease, where the risk profile for those in the older age groups is obviously much higher, and if you're, you know, sort of my age, it was quite clear that it was less serious, as for many younger people. So I personally wasn't worried about the virus for my own sake. But of course you worry about your parents, or the people you know and interact with who perhaps have a greater risk for whatever reason, so that was a concern. And you do wonder whether, for example, near the beginning, when food started disappearing from the shelves in supermarkets, that was the sort of main concern, that things would start to go wrong on that front. In the end I don't think it was as bad as some people feared, but there was a brief time when it looked very serious.

Interviewer: Okay, I think we've just about got that.
Just a couple of final questions. So, has the work you've done on COVID raised new questions that you'd be interested in exploring in the future, in relation to news consumption and trust?

Richard Fletcher: Yes. I think it will, excuse me, I think it will have a long-term effect, and it's actually still too early for some of our projects to have picked up on this. But I think, for example with something like trust, it will have an effect on the consumption side and the public side in different ways, and it will certainly have an effect on what news organisations do and what journalists do. There's already been quite a big shift in the working practices and the ways that news organisations are run that I don't think is going to change back, and some of that can be positive. You can imagine, for example, that if people are working remotely, journalists can be based outside of London more easily and have a closer connection with the community that they're reporting on, and so on. So some of that could be positive. There's also, I think, the issue of where the pandemic has accelerated some of the declines that news organisations are struggling with, the decline of print, for example. We saw that print was very badly affected by lockdown for obvious reasons, but that could have broken habits for some people, and there could be an acceleration of what was already declining. And then the debates around misinformation on social media and so on were, I think, amplified by something like coronavirus because of the clear link to risks to people's health as a consequence. When we look back, it may have been something that accelerated the move to more content moderation, different policies from social networks, more action from governments to deal with it and so on.

Interviewer: And has the experience changed your attitude or approach to your work? Are there things you'd like to see change in the future?

Richard Fletcher: It's made me think more about what the right working environment is for different people, and how it's wrong to just assume that everyone thinks the same about, you know, working from home or working from the office and so on. So I certainly think differently about that. And, sorry, what was the second part of the question?

Interviewer: Things that you'd like to see change in the future.
Richard Fletcher: Well, I hope, building on that, that working practices, not just in academia but everywhere, can become a bit more flexible, to meet people's different circumstances. And I was also thinking about this in relation to the news media. It's been different in different countries, but I think on the whole in the UK the way that vaccines in particular have been covered and reported on has been quite good. I think that model, and the sort of seriousness that has been attached to the vaccines, it would be nice if that approach could be applied to other topics as well, because I think the news media, and to a certain extent the government, have been doing a good job. For example, the government dashboard took a while to arrive, but when it arrived it was very useful. And I think that could show how similar situations could be dealt with, if we learn something from it.

Interviewer: And I suppose, again, climate change is the obvious example. People are now saying, you know, we've shown we can act against one threat, we ought to be able to...

Richard Fletcher: That's right.

Interviewer: I like that attitude. Good.