Thank you very much. Hello. It's very kind of you. I don't know if any of you have ever been to New York, but just on the corner of Central Park and Fifth Avenue there is an iconic building, the Plaza Hotel. It's beautiful, as is the Oak Room bar — although unaccompanied ladies used not to be allowed in the Oak Room bar. I think they changed that slightly more recently than one might expect. And many great meetings have been had there. Lady Gaga has played there. A bunch of finance ministers signed a currency manipulation agreement there in the 1980s. It has hosted all kinds of things, really the whole gamut. But the meeting that I want to discuss today took place in 1953, ten days before Christmas.

And it was a meeting between the presidents of four of the largest tobacco companies in the United States. They were meeting with John Hill, who was a PR guru, the head of the Hill & Knowlton public relations firm. And it was a crisis meeting. The reason for the crisis was: it turns out cigarettes give you cancer.

German scientists had demonstrated this fairly clearly in the 1930s, but nobody really wanted to talk too much about what German scientists had been doing in the 1930s. Then Richard Doll and Austin Bradford Hill in the UK demonstrated a pretty clear link: basically, almost everyone with lung cancer was a smoker. Then in the United States, Ernst Wynder and Evarts Graham demonstrated, in a randomised controlled trial, that you could induce cancer if you painted condensed tobacco smoke — the tar that comes from condensed tobacco smoke — on the backs of mice. You could produce skin cancer on the backs of mice. For some reason, that seemed to be an even more powerful finding than what Doll and Bradford Hill had found.

But really, none of that was the problem. The problem was that the Reader's Digest, which was the most popular, widely read magazine on the planet, had published an article with the title "Cancer by the Carton".

So you have a problem, because your only product is not only obviously addictive but also, it now turns out, incredibly dangerous. And of all the modern discussions about healthy eating and causes of cancer — well, does burnt toast cause cancer? Maybe bacon causes cancer. Maybe clingfilm causes cancer. Some of these connections are real and some of them are nonsense. But what's obviously true is that none of them is remotely close to being as significant as the link between cigarettes and cancer and all kinds of other illnesses. This is an absolutely fatal product.

So it's awkward. It's awkward.
And Alistair Cooke, the great observer of America, writing in 1954, harked back to Sir Walter Raleigh bringing tobacco to Europe. In 1954 Alistair Cooke wrote that the publication of the next serious scientific study into the link between cancer and cigarettes might well end what Sir Walter Raleigh began.

But John Hill, the PR guru, had a plan. And the plan, it turns out, worked incredibly well. Because for decades the tobacco companies were able to fend off regulation and litigation and even the perception among their customers that the cigarette — I mean, yeah, maybe it might be slightly bad for you, might give you a cough — they managed to hold back all of these things for a tremendously long time.

And this was despite absolutely unarguable scientific evidence, unimpeachable facts, incredibly credible scientists. And yet the credible scientists weren't believed. The indisputable facts ended up being disputed. And the doubts just rolled on and on, for years and then for decades.

There were lots of different arms to John Hill's strategy, lots of different ways in which it worked. But we now know quite a lot about it, and one of the reasons we know a lot about it is that after a huge 1998 settlement in the United States, a lot of the internal documents produced by the tobacco companies and by their public relations firms became publicly available and publicly searchable. And they've been exhaustively examined by academics and journalists.

And there's one document written in 1969, a particular memo that I think has become iconic, which discusses how the industry was going to fend off these claims that cigarettes are incredibly bad for you. And there's this line in the memo, and the line is: doubt is our product. We're not just going to make cigarettes; we also need to manufacture doubt.

And for all these years, the expert evidence, the statistics, the facts — they couldn't quite break through against this wall of doubt.

I'm a fan of facts. Of course I'm a fan of facts. But I couldn't help noticing the irony that Facebook chose to celebrate the 63rd anniversary of the meeting at the Plaza Hotel with a press release. I don't think it was deliberate, but the press release was all about how Facebook was going to combat fake news. All these lies that people tell on Facebook — Facebook was going to fix that. And the way they were going to fix it was: if people thought that something wasn't true, they'd be able to flag up this fake news story and say, I think this is not true.
Then — none of this is entirely transparent — the story would be passed to the fact checkers. And of course, you know, I love fact checkers. Some of my best friends are fact checkers. I myself have dabbled in fact checking. Okay, so it goes to the fact checkers, and the fact checkers would check the facts and they would say, well, this turns out not to be true. And then it would go back to Facebook, and Facebook would put a flag on the story saying: this story is disputed by independent fact checkers — you may want to go and see what the independent fact checkers found.

And it might even — although the details of this were not discussed — it might even be downgraded in the algorithm, which means that you would just be less likely to see it. Facebook is showing you things you might want to see, and maybe you might not want to see things that are manifestly untrue, so it might be downgraded in the algorithm.

Why were they doing this, and when were they doing this? Basically, because Donald Trump got elected. They were doing this because it was in the air. Suddenly we had noticed that a man who was perfectly happy to just say something that wasn't true — that was demonstrably untrue — would then be corrected: you know, Mr Trump, you say you never said that China invented global warming; here's your tweet saying that China invented global warming. And he'd just say: I never said that. You know, you spread all those rumours about Barack Obama not being born in the United States. No, that was Hillary Clinton, she spread those rumours. It just seemed impossible to get through this wall of willingness to just say things that weren't true.

I didn't put that slide up there. But, you know, let's not mention the bus.

It just seems like politicians have always been willing to say things that aren't true. But it seemed that a line had been crossed: when you say something that is demonstrably untrue, clearly untrue, and authoritative figures — for example, in the case of the bus, Sir Andrew Dilnot, the former presenter of More or Less, the Warden of Nuffield College and the chairman of the UK Statistics Authority — say, you know what, that number on the bus, that's misleading, it's not completely true. This willingness to just plough on regardless is perhaps one of the reasons why our friends over at Oxford Dictionaries named "post-truth" their word of the year. So there's this sense that somehow the truth no longer has any purchase, that the truth no longer seems to matter to people. This is what Facebook were responding to.
And more traditional journalists have responded in a related way: we need to check more facts, we need to do more fact checking, we need to rebut untrue claims. It's basically the same thing Facebook is doing, except without the algorithms. And the idea is, well, that will then lead to a more informed electorate and more reasoned political discourse and better decisions and more respect for the truth. That's the idea.

I'm not sure that that is actually going to happen. I'm not sure — or at least, I'm not sure that it is remotely enough.

Now, remember that memo? Doubt is our product. It's not enough to just have fact checkers rebutting untrue claims. As long as there is still doubt out there, the controversy continues, and some people thrive off the controversy. We have to remember that the tobacco companies were able to muddy the waters partly by hacking the norms of science and the norms of journalism. They had this judo move that would turn both journalistic norms and scientific norms around to become self-defeating.

So: you just call for more research. We need more research. Who doesn't want more research? All I'm ever told by science — whenever I tune in and listen to Robin Ince and Brian Cox on The Infinite Monkey Cage, they're always telling me that all scientific truth is provisional, always subject to disproof. So the tobacco companies just take that idea and run with it: well, some people say cigarettes cause cancer, but we need more research.

And the journalistic norm is one of objectivity and balance. So if a tobacco company produces something that seems objective — and of course they have a view on whether cigarettes cause cancer — then balance demands that you report the tobacco company's view on whether cigarettes cause cancer. And so the controversy continues.

Now, in the 1950s and the 1960s, we weren't wise to this. It was a tremendously sophisticated campaign, and I understand why a lot of people were caught out by it. A lot of my people — journalists — were caught out by it. But it's 2017 now. And I still see some of the same moves being used, and I still see them having the same effect. And I think that some of the techniques that we're using — the rebuttals, the insistence on, well, let's just start with the facts — I love facts, but I'm not sure it's working.

I mean, take the fake news issue. Let me give you a piece of news: Pope Francis Shocks World, Endorses Donald Trump for President. It's the most successful story on Facebook in America in the three months before the election.
By most successful, I mean it had the most engagements: the most likes, shares, comments. People were interacting with this story. And that, of course, is part of the problem. In 2014, Mark Zuckerberg described Facebook — he keeps saying it's not a media company, but he also described Facebook as the perfect newspaper. He said: our goal is to build the perfect personalised newspaper for every person in the world. He said: we're going to show you the stuff that's going to be most interesting to you.

But of course, if you are really interested in the story that Pope Francis endorsed Donald Trump for president, that's the perfect something — but I'm not sure it's the perfect newspaper. Because, spoiler alert, Pope Francis did not endorse Donald Trump. Okay? Didn't happen.

So, of course, we get very, very excited about fake news. I was at a meeting of the Royal Statistical Society last night, and there was a lot of talk about fake news. But I think that's partly journalists like me feeling insulted that this sort of stuff looks like news and gets traded around Facebook. And by the way, a lot of it is not coming out of the Kremlin, or indeed out of hyperpartisan right wing blogs. A lot of it is just being manufactured by teenagers, from California to Macedonia, because people click on it. And if people click on it, you can sell adverts. So this is the ultimate insult, because I can assure you newspapers are not making a lot of money. It turns out that fake news is more profitable than actual news. And it seems it's more politically successful as well.

So, of course, we're insulted and we want something to be done about it. And I think something should be done about it, and I suspect that Facebook's tweaks to the algorithms probably will be effective. But what I also think is that the focus on fake news is a distraction. This is not the main issue — and we can quantify this.

Ironically, the Royal Statistical Society panel discussion last night did not try to quantify fake news. But we economists, we're the proper nerds — forget the statisticians, the economists are the proper nerds — we try to quantify these things. So a couple of economists, Hunt Allcott and Matthew Gentzkow, did. Matthew Gentzkow, by the way, is one of the most respected economists in the United States. He won the John Bates Clark Medal, which has also been won by Nobel laureate Joe Stiglitz, Nobel laureate Gary Becker, Nobel laureate Paul Krugman, and by Steve "Freakonomics" Levitt. I mean, it's serious company. And he studies media markets, polarisation and media bias.
So Hunt Allcott and Matthew Gentzkow, with all this fake news going around, said: we need to quantify this, and we need to get a research paper out quickly. So they did various trawls through Facebook's data, checking against independent fact checking sites, and they think they quantified the amount of fake news on Facebook.

And it turns out that if you just look at the top most-shared stories, fake news looks like it's doing extremely well. Part of the reason is that there is only one story that says Pope Francis endorsed Donald Trump, whereas for any true story there are hundreds of different versions — if you go to Google News and search for any true story, there are hundreds of versions of it, so any particular version of that story may not be very widely shared. But it turns out that the total volume of fake news is not that big. It's not that small, but it's not that big either.

There were 38 million shares of fake stories in the 90 days before the election. 38 million sounds like quite a lot. 30 million were pro-Trump, 8 million pro-Clinton. Apparently they tried getting fake news stories going about Bernie — Bernie Clifton? No, not Bernie Clifton: Bernie Sanders. They tried to get Bernie Sanders fake news stories going, but it turns out Bernie supporters weren't interested in clicking on them, so there's no money in trolling Bernie Sanders.

So there were 38 million shares of fake news stories. Sounds like a lot. Well, that's about 420,000 shares a day. A paper copy of the New York Times contains multiple true stories on every page, and it has a lot of pages. And there are more than 420,000 paper copies of the New York Times sold every day. So 420,000 fake news shares a day — it's not nothing, but it's not that much.

And Allcott and Gentzkow tried to quantify: could it have swung the election? It was quite a close election — could it have swung the election? Maybe — if one fake news story had the same impact as 36 TV campaign ads. Which is possible, but I think stretches the boundaries of credibility. Allcott and Gentzkow estimate that the typical voter — there are 130 million voters — could remember seeing approximately one fake news story, I think it was 0.98 fake news stories, in those 90 days. So a lot of people couldn't remember ever seeing any fake news stories. And, of course, a lot of people who did see fake news stories didn't believe them.

So it's unlikely that fake news is really the issue.
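For anyone who wants to follow the arithmetic quoted above, here is a minimal back-of-the-envelope sketch in Python. It uses only the round figures given in the talk (38 million shares over 90 days, split 30 million pro-Trump and 8 million pro-Clinton, set against a New York Times daily print run of more than 420,000 copies); nothing else is assumed.

# Back-of-the-envelope check of the fake-news figures quoted in the talk.
# All inputs are the talk's own round numbers; nothing is estimated here.

total_fake_shares = 38_000_000      # shares of fake stories in the 90 days before the election
pro_trump_shares = 30_000_000
pro_clinton_shares = 8_000_000
days = 90
nyt_daily_print_copies = 420_000    # "more than 420,000 paper copies sold every day"

shares_per_day = total_fake_shares / days
print(f"Fake-news shares per day: {shares_per_day:,.0f}")          # roughly 422,000
print(f"Pro-Trump vs pro-Clinton: {pro_trump_shares:,} vs {pro_clinton_shares:,}")

# One day's worth of fake-news shares is comparable in number to one day's print
# run of a single newspaper, each copy of which carries many true stories.
print(f"Ratio to NYT daily print copies: {shares_per_day / nyt_daily_print_copies:.2f}")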
I think the problem lies somewhere else. Remember the tobacco strategy. There's a handwritten note in one of the tobacco memos from the 1970s, and it says: the key point — keep the controversy alive. As long as we're arguing about fake news, we're not arguing about the issues that really matter.

Now, I want to say: I think fact checking — both by the narrow specialist websites like fullfact.org, which is wonderful, and by independent think tanks like the Institute for Fiscal Studies, and fact checking by mainstream journalists, the BBC, the Financial Times, the Guardian, the Times — just establishing what is true and what isn't, and how we know, and what the sources are: it's incredibly important. It's the foundation. If we don't have the statistics to understand how the country works — what the budget of the National Health Service is, what the deficit is, what the unemployment rate is, what GDP growth is, what the inflation rate is — if we don't have those sorts of statistics, and if we don't do our very best, because no statistics are perfect, but our very best, to collect the most rigorous and robust statistics, and if we don't have agreement on what those things are, then we're really in trouble. We've got nothing really except feelings about politics.

So facts are incredibly important. But it feels comfortable for people like me, and I suspect for many of the people here, to think that we should just explain the facts. Just give people the facts. Show them the facts. Show them what's true. It feels like that should be enough. And it's not enough. It's never enough. And I have some facts to prove that the facts are never enough.

So, one issue well documented in psychology — well, there are really three different issues, but they all have the same name, which is confusing; blame the psychologists, don't blame me — goes by the collective name of the backfire effect. The backfire effect is: I give you facts and you get stupider.

Okay, so how does this work? Well, number one, the backfire effect sometimes operates on memory. The most ridiculous example of this is the one you sometimes see in court dramas: somebody says something outrageous in court, and then the judge says, the jury will disregard what was just said. And of course, no one can ever disregard what was just said. You can't just make yourself forget. We're not computers; you can't press delete and wipe that memory.
Of course you remember what was said, and it turns out it's surprisingly difficult to forget untrue information. One classic study of this gave people an account of a fire: there's been this fire, you need to read this account of it, and various pieces of information come out. And it turns out there were some paints stored in this warehouse where the fire was, and that may have been something to do with why there was so much smoke, and so on. Then it turns out that a later investigation reveals that the thing about the paint is not true — there weren't any discarded paints in the warehouse, no paints at all — and maybe there's even further discussion of the fire.

And then you start to ask people: well, why was there so much smoke? And they would say: well, because of the paints. But wait — weren't you told there were no paints? Oh yes, I remember that, there weren't any paints. But people's interpretation of what they'd seen was coloured by the fact that they'd been given false information. The false information had been withdrawn, but it never really completely gets withdrawn.

Now, the irony is, the more often I rebut a myth, the more often you hear it. And the more often you hear it, the more often you remember it — and it's kind of difficult to remember that it's untrue.

When I spoke to one of the leading economists on the Leave campaign — a man who I believe is very truthful — I was asking him about this claim on the bus: £350 million a week. It's not true; it's completely untrue, provably untrue. And I asked him about it, and he said: yeah, I know, it's not really true, is it? And he said: they didn't ask me, but I wish they hadn't gone with that number. Because they could have said — I forget — £225 million, maybe. I think you can really justify £225 million; it's really hard to say that that's untrue.

And to be honest, who cares whether it's £350 million or £225 million? The basic message was: we send money to the EU, we could spend it on the NHS. The basic message would be the same — just one version would be false and one would be factually correct. And we can argue about the economics of it all. But the thing is, I think this was an honourable remark, but from a political point of view he was completely wrong. Because if they had put £225 million on the bus, nobody would ever have talked about it ever again.
And that would just have faded away. But all the Remain campaign did — aided by me, and aided by my fact checking friends — was just bang on about the £350 million and how it wasn't true. And of course, a lot of what people were hearing was: £350 million... yeah, I remember that. Familiarity is tremendously powerful.

And of course, familiarity is often a very powerful way of making decisions. The psychologist Gerd Gigerenzer has assembled stock market portfolios by stopping people on street corners, showing them a list of companies and asking: do you recognise any of the companies on this list? And he assembles portfolios of stocks that people have never heard of and stocks that people have heard of. And it turns out that, robustly, the portfolio of stocks that people have heard of does better over the next year. He did that in the middle of the dotcom boom and outperformed all of the professional stock pickers. People said, well, that was the dotcom boom. So he said, okay, we'll do it again — and he did it again in the bust, and it still worked. The companies that people had heard of did better. So familiarity, just having heard a claim — there are a number of evolutionary reasons, I think, why we cling on to the familiar. But if you want to rebut something, constantly going on about the thing that isn't true doesn't seem to help. That's one aspect of the backfire effect.

The second aspect of the backfire effect is simply that a really simple message sticks, and fact checking is often complicated. If you go to Reality Check on the BBC or to Full Fact — these are great sites, I'm not criticising these sites — you go down and you get the footnotes and the explainers, and it all depends what you mean, and, well, if you measure from 1978 then it is true, but this claim was from 1984 and so it's not really true. And I know exactly why they do this. They have to do this, because they have to be absolutely meticulous, because they're interested in the facts — and not only in the facts, but in documenting their reasoning: exactly how did they come to reach this conclusion? And you can follow the hyperlinks and look up the sources and go to the Office for National Statistics. Of course it's vital that you do it that way. If you are in the fact checking business, there is no other way to do it.

But it's not persuasive. It's not supposed to be persuasive; that's not the design of the website. But that's why, when Facebook flags up "this may not be true,
go and look at a fact checking website" — I'm not sure that's going to work, because people remember the simple truth over the complicated one. And this has been documented in incredibly complicated research papers that I'm not going to describe to you, because I hope that you're going to absorb the simple truth of what I just said.

Now, the third issue, the third source of the backfire effect — I need to take a step back. It's all about something we call motivated reasoning. Now, motivated reasoning: you know, I'm an economist, we don't do this psychology stuff, so all I wanted to do to understand motivated reasoning was to go to a classic psychological study. And it was a study of a football game — an American football game or, as they call it in America, a football game. In 1951 there was a college football game between Princeton and Dartmouth. And it's not a very nice game. A lot of fouls; it's pretty ill tempered. Both quarterbacks have to leave the field with injuries, and one player has to be stretchered off because he's got a broken leg. It's not a nice game.

A short time after the game — I think it was a week — two psychologists, Albert Hastorf and Hadley Cantril (Hastorf taught at Dartmouth and Cantril at Princeton), who were aware of this game, gave the same survey to their students, the Princeton students and the Dartmouth students, and asked them about the game. For the students who saw the game: was it bad tempered? Was it tough but fair, or was it tough and unfair? And if you thought there were problems, who started the problems — was it Dartmouth or was it Princeton? All these sorts of questions. And what they found, I think unsurprisingly, is that basically the Dartmouth students thought that Dartmouth was pretty blameless and Princeton were a bunch of animals, and the Princeton students thought the same — only about the Dartmouth players. So I don't think that's incredibly surprising.

But then, a couple of years later, the researchers did a much more interesting study. They got video of the game, and they showed it to the students, and they said: what I want you to do is count — maybe not quite objective truths, but nearly objective facts about the game. How many infringements did you see? Who committed the infringement? Was it a serious infringement or a trivial infringement? Of course there's room for opinion in these judgements, but fundamentally I'm not asking you who behaved better; I'm asking you to count fouls, and who committed them.
And the title of the study is "They Saw a Game". The idea behind that title is: well, it would be nice to say they saw the game, but it seemed that everybody was seeing their own version of the game. They were watching the same videotape, but they weren't seeing the same events. They were filtering them through their own partisan bias.

Of course, that's football, and you've met football fans — maybe it's just about that kind of tribalism. Well, it turns out not. It turns out we see the same thing in politics. There's a fascinating study by Dan Kahan at Yale of the same sort of thing, only with a political protest. He has video of protesters outside a building — placards, yelling away, doing what protesters do. And there are four arms to the experiment; it's a randomised controlled trial.

In one arm he shows the footage to liberal students, lefty students, and he says: these are anti-abortion activists, and they are protesting outside an abortion clinic. In the second arm — still liberal, lefty students — he says: these are lesbian, gay, bisexual and trans activists, and they are protesting outside an army recruitment centre against the then policy of Don't Ask, Don't Tell, where if you were queer and you went into the military you had to keep quiet about it, and they promised not to ask any questions. So: protesting outside the army recruitment centre. In the third arm of this randomised controlled trial, they showed the video to conservative, right wing students and said: these are the anti-abortion protesters outside the abortion clinic. And in the fourth arm — of course, they showed them all the same video — they showed it to conservative, right wing students and said: these are LGBT activists protesting Don't Ask, Don't Tell.

And then they asked: what do you think of this protest? But they didn't just say, well, how does it make you feel, do you think the protest is justified? They asked things like: did you see the protesters obstruct the entrance to the building? Did you see protesters screaming at passers-by?

And it was just like "They Saw a Game". What people saw was heavily mediated by their emotional responses to the protest. If they felt that the protest was justified, they didn't see anybody obstructing the door, they didn't see any screaming, they didn't see any trouble.
But if they felt that the protest was unjustified, in a cause they disagreed with, then they saw all kinds of infractions and infringements and unacceptable behaviour.

So our perception of what should be objective reality is coloured by our feelings, by our very identity. And there are very interesting studies of this when you present people with ideas about gun control, when you present people with ideas about climate change, when you ask people whether the US army found weapons of mass destruction in Iraq or not — highly partisan issues. And it turns out that for people who are in general more informed — so, for example, people who know more about science, who are better qualified in science, who score higher on tests of scientific reasoning — you would think that when you present them with a scientific question like climate change, the more informed they are, the less their political views should matter. If you don't know anything, then maybe you just fit in with your tribe; but the moment you know something, of course you cleave to the science. In fact, the opposite is true. The partisan divide between Democrats and Republicans about climate change is wider among those who have a high degree of scientific literacy, and it's narrower among those who don't know anything about climate change. Which is disturbing.

Now, there was a fascinating study of the backfire effect — of what happens if you give people more information about something they believe strongly — conducted by two researchers, Brendan Nyhan and Jason Reifler. Jason Reifler is at Exeter; Brendan Nyhan, I think, is at Dartmouth. They've studied this in various contexts, but I think the most famous study was of the flu vaccine. It turns out that 43% of Americans believe the flu vaccine can give you flu. The flu vaccine can't give you flu. Well, there I am with my facts — and that's not going to work. So they ran a randomised controlled trial. They got a whole bunch of people — maybe they were going to vaccinate themselves, maybe they weren't, and they had different feelings about the flu vaccine — and they showed some of them the Centers for Disease Control advice, specifically on the misconception that the flu vaccine causes flu. Does the flu vaccine cause flu? No. It turns out the flu vaccine does not cause flu. Here's why it doesn't cause flu. Here's what it might cause — soreness of the arm, maybe a headache — and here's why that happens. And if you want more detail:
here are the randomised controlled trials; I'm going to summarise them, but you can click through and read them.

Now, here's what's interesting. If you give people that information and then you say, okay, you thought that the flu vaccine causes flu — what do you think now? — people say: okay, I accept the flu vaccine does not cause flu. But then if you say, okay, are you going to get vaccinated? No way. They were significantly — both statistically and practically significantly — less likely to express an intention to be vaccinated after they had been shown, and had accepted, the evidence that the flu vaccine doesn't cause flu.

And what seems to be going on — this effect is now under exhaustive examination by psychologists; it's fascinating — what seems to be going on is that my concerns about the flu vaccine are not really about whether it causes flu. They're about something much deeper: about sharp needles, or not trusting the government, or polluting my body, or something. There's something quite primal going on. And when you give me the evidence that the flu vaccine doesn't cause flu — and there was a very, very similar study done of MMR and autism, with the same result — I accept that the MMR vaccine does not cause autism, I accept that all of the evidence on that was junk science. Yeah, I believe you. Are you going to vaccinate your kids? I don't think so. I don't think so, Mr Scientist. What's going on is: I give you the information in a narrow way, and you accept the information — okay, I accept that the flu vaccine doesn't cause flu — but then, emotionally, you're thinking of all the reasons you don't want to get that flu vaccine. You're fighting back, and subconsciously you're calling to mind all of the different reasons why you don't want this to be true.

So: do you think Donald Trump is a sex pest? Absolutely not. Here is tape of Donald Trump boasting about sexually assaulting women — now do you think he's a sex pest? Okay, now I accept Donald Trump is a sex pest, but my vote for Donald Trump was never about him being polite to women; it was about how he was going to shake Washington up. So you give people a narrow fact, they accept the narrow fact, and then back they come with all kinds of other reasons — because there were always other reasons.

Now, Nyhan and Reifler, by the way, did run a randomised controlled trial of fact checking. And the good news for my fact checking friends is that fact checking probably helps a little bit. They exposed people to fact checks in the three months before the 2014 congressional elections in the United States.
People either got to read regular news or fact checking briefs from sources like PolitiFact and Full Fact. And it turns out that the fact checking improved people's knowledge. They knew more about the facts; they were better informed about politics — I mean, from an incredibly low level to a low level, but they were better informed. So we didn't see direct evidence of the backfire effect there. But what we also saw, in that particular randomised controlled trial, was that the fact checking worked a lot better for people who already had some facts — people who already knew a bit about politics. You give them the facts: they know more, they absorb more. When people knew very little about politics, the fact checking didn't seem to work at all.

So I don't think the facts are the answer — or at least they're not, by themselves, the answer when you're faced with people who have a strong reason to believe. And just remember, again, the case of smoking: who could be more motivated to believe that smoking doesn't cause cancer than somebody who is addicted to cigarettes? What could be more fundamental a belief than: this thing that I chose to take up and now can't stop — it's not really going to kill me. And by the way, many of the journalists who the tobacco lobby were most effective at using to spread their misinformation were smokers. Of course: motivated reasoning in action.

Now, mentioning Facebook calls to mind this other issue that often comes up, which is the filter bubble — the idea that we live in an echo chamber. It's the people who say: I can't believe that the country voted to leave the EU, I've never met anybody who wanted to leave the EU — where did all these people come from? And it's quite a natural response, because we do tend to hang around in groups of people who think like us. Now, Cass Sunstein — one of the co-authors of Nudge, and he was an official under Barack Obama, a very smart guy — wrote years ago about the problem of echo chambers online. His idea was: the more choice you have, the more you tend to seek out people who think exactly like you, and you just talk in your own little bubble to people who think like you. And then Eli Pariser, who was a digital activist and now runs the site Upworthy, wrote a book, about five years ago now, called The Filter Bubble. And Pariser added something to Cass Sunstein's argument.
He said: not only are we in this bubble, but the algorithms are making it worse. So, when I search online — I was just researching, for my series (thank you, by the way, for recommending Fifty Things That Made the Modern Economy; you should all subscribe, it's free) — I was researching the plough, and I wanted to find out stuff about the plough. I've got some deep research here, but for some basic facts let's just have a Google around: what does the internet tell me about the plough? And of course, when you type "the plough" into Google, it says: here's a pub in Oxford called The Plough; here's another pub in Oxford called The Plough; here's a third pub in Oxford called The Plough. Because it knows I'm in Oxford, and it obviously knows I like a beer.

So Eli Pariser's argument is that your Google searches and Bing searches are all personalised, and that, in their quest to show you what you want to see — because of course they want to show you what you want to see, there's no malice in it — they might lead you towards biased sources of information. And the same thing with Facebook. Remember Zuckerberg's claim: I want to show you the content that's most relevant to you. Well, Pariser says he's on Facebook, he's politically active, he's left wing, but he follows lots of right wing sources because he wants to be aware of what's going on. But he's clicking like and share on the left wing sources and just reading the right wing sources. And Facebook's algorithm seems to go: well, he doesn't seem that excited about these right wing sources, we'll remove those from his feed — don't worry, we'll show you stuff that's more relevant to you and your interests. Again, no malice in it. There's no great Satan here; they just want to sell adverts. It's not designed to do this; it's a side effect of the algorithm.

But again, as with fake news, I think it's worth asking: how serious is this problem? There was a fantastic, quite informal study by Emma Pierson, who was based here in Oxford — maybe Emma's here, you never know — a statistician at Oxford. She studied tweets during the troubles in Ferguson: a young black man shot by a policeman in Ferguson, Missouri, and then people started tweeting about it, because that's what people do. And Pierson analysed all the tweets, and she sorted them into the red tweets and the blue tweets. The red tweets say: this is looting, this is outrageous disorder,
and the police are responding in the right way, and by the way that police officer was framed and the guy he shot had it coming — those are the red tweets. And then the blue tweets, which say: this is noble protest, it's not a riot, it's not looting, it's principled civil disobedience, and the police response is outrageous and disproportionate, and so on.

And there was no contact. No contact at all. I mean, we have all these complaints about Twitter trolls, and of course Twitter trolls are a problem. But the real problem is that these people were not talking to each other at all; they were just retweeting each other. The blue tweeters are retweeting the blue tweets, the red tweeters are retweeting the red tweets. But here's what's interesting when you think about Eli Pariser's filter bubble thesis — that the algorithms make it worse: Twitter didn't have an algorithm in the summer of 2014 when this was all happening. This was all purely based on our own choices, on who our friends are and who we hang around with.

And Facebook have studied this. They published something in Science, which is a respected journal — although it is internal research from Facebook. Facebook argues that the algorithm does politically filter — that is a side effect of the algorithm — but it's quite modest compared with the way that we politically filter by just following people who think like us.

And there have been other studies of this. Seth Flaxman, who is a statistician here in Oxford, and a couple of colleagues have tried to study this. They got 1.2 million browser users and checked what news they were reading. These were Microsoft browsers that were sharing data — people who had agreed to share data with Microsoft. So the question is: what news do you read when you just type ft.com or bbc.co.uk/news or guardian.co.uk into your browser, versus when you search for a story, versus when you click through from social media? Which method of reading news is more polarised? And they found there was a small polarisation effect from Google and Bing searches and from social media, but it was not that big. Basically, if you click through from social media, you're more likely to be served stories that match your leaning — if you tend right, you'll be served right wing stories; if you tend left, you'll be served left wing sources — so there's more polarisation, but you're also more likely to be shown sources from the opposing point of view.

I mean, the thing is, what is a filter bubble? The Guardian, or the FT,
or The Times — they're all filter bubbles. The Daily Mail is a filter bubble. They're all filter bubbles: reading one newspaper — and it should be the Financial Times, obviously — you're exposed to a particular viewpoint, and you're relying on the editors of that newspaper to expose you to alternative ways of seeing the world. So they found that actually there wasn't a massive amount of polarisation for people clicking through from social media, and there was arguably more diversity if you did get your news from social media. All of which suggests the filter bubble may be a problem in the future, but it's not a huge problem yet.

But you know what is a huge problem? Just a little detail in the footnotes of the Flaxman study. There were 1.2 million users that they were examining to see how they consumed their news online. In order to actually be analysed for filter bubble behaviour, you had to consume ten news stories and two op-ed pieces a month — and sports stories didn't count. You want to guess how many of the 1.2 million people read ten news stories and two op-eds a month? 50,000. That's 4%. The other 96% — yeah, they had a filter bubble. The filter bubble is: don't read the news.

So I think we have bigger problems. We have bigger problems than the filter bubble. And to the extent that there is a filter bubble in the way we consume news, it's us. It's our friends, and how we're influenced by our friends. We can't blame Facebook; we have to blame ourselves.

Just a thought, by the way, on smoking. Smoking is socially contagious — smoking itself exists in a filter bubble — and quitting is socially contagious. There's a wonderful study by the economist David Cutler and a colleague, who looked at what happens when you've got a husband and wife who both smoke, and one of them is affected by a workplace ban on smoking. It makes it more difficult for him or her to smoke; it doesn't in any way directly affect the other smoker. And they found this enormous effect: if your spouse finds it more difficult to smoke, you are highly likely to quit smoking. So smoking, in a way, is part of a filter bubble as well.

So. If part of the problem — if a major part of the problem — is basically just that people are not that interested: it's not the filter bubble, it's not motivated reasoning, it's just that people broadly don't care about any of this stuff, they don't care about the facts, they're not interested — well, what would we expect to see from people who want to mislead us?
473 00:46:46,160 --> 00:46:51,980 We would expect to see not lies so much as distraction. 474 00:46:53,360 --> 00:46:57,860 We would expect to see arguments about whether Donald Trump owns a bathrobe or not. 475 00:46:59,000 --> 00:47:02,600 We would expect to see arguments about whether the crowd at Donald Trump's 476 00:47:02,600 --> 00:47:06,800 inauguration was bigger than the crowd at Barack Obama's inauguration or not. 477 00:47:07,760 --> 00:47:11,780 Fortunately, none of these things has come to pass. People are really focusing on the issues that matter. 478 00:47:13,760 --> 00:47:20,659 There's a wonderful new study that's been published of the 50 Cent Army, and we would know who the 50 Cent Army were if we were Chinese. 479 00:47:20,660 --> 00:47:25,160 You would know the 50 Cent Army. So the 50 Cent Army are, allegedly, 480 00:47:26,080 --> 00:47:32,230 people who are paid $0.50 an hour to talk up the Chinese government on Chinese social media. 481 00:47:32,800 --> 00:47:34,990 So reportedly, 482 00:47:34,990 --> 00:47:45,010 tens of millions of people paid by the government to basically just spread propaganda on social media favourable to the Chinese government. 483 00:47:45,640 --> 00:47:51,430 This fantastic study, it's like reading a spy novel, how this works. 484 00:47:51,430 --> 00:47:55,569 First they managed to get hold of a big data leak, of loads and loads of people in 485 00:47:55,570 --> 00:47:59,560 the 50 Cent Army sending screenshots of their work to a government office saying, 486 00:47:59,680 --> 00:48:03,999 I've been posting all this propaganda, can I have my money now? There was this big leak of all this data. 487 00:48:04,000 --> 00:48:07,300 So they were able to identify people who were employed by the government. 488 00:48:07,810 --> 00:48:14,410 And then they had this fishing expedition where they managed to persuade these people that they were also in the 50 Cent Army. 489 00:48:14,410 --> 00:48:17,890 And could they give them some advice about the best kind of posts to make? 490 00:48:18,130 --> 00:48:23,140 And they were just able to track exactly what the strategies were and what posts were being made when. 491 00:48:23,500 --> 00:48:30,430 And the bottom line of this study is a lot of this stuff was not the propaganda you might expect. 492 00:48:30,550 --> 00:48:37,870 It was not people arguing against dissidents. It was not people straightforwardly lying about what the government was doing 493 00:48:38,080 --> 00:48:40,900 or attacking people who told the truth about what the government was doing. 494 00:48:41,290 --> 00:48:50,250 What it was instead was people changing the subject, talking about this cool movie that was on TV last night or, 495 00:48:50,260 --> 00:48:55,540 with the Chinese New Year coming up, talking about the Chinese New Year celebrations, or just creating noise, 496 00:48:55,540 --> 00:49:00,170 just creating a distraction, anything to prevent us focusing on the issue. 497 00:49:01,330 --> 00:49:06,340 Which is interesting, I think. Now, do we know who Stanley Prusiner is? 498 00:49:10,090 --> 00:49:14,310 He has a Nobel Prize. It's in medicine, though. 499 00:49:14,320 --> 00:49:21,730 I mean, obviously there's no Nobel Prize in maths. Stanley Prusiner got the Nobel Prize in medicine for discovering prions. 500 00:49:22,510 --> 00:49:27,280 So prions are these rogue proteins that cause Creutzfeldt-Jakob disease 501 00:49:27,280 --> 00:49:30,640 and mad cow disease.
502 00:49:32,120 --> 00:49:37,460 Stanley Prusiner had just started investigating this in 1972; he had a patient who had Creutzfeldt-Jakob disease. 503 00:49:37,550 --> 00:49:42,530 Nobody knew what caused it. And there were similar sorts of diseases out there, like scrapie in sheep. 504 00:49:43,550 --> 00:49:46,850 And the theory was it was some kind of really, really slow acting virus. 505 00:49:47,900 --> 00:49:51,440 And scientists kept looking for the virus and looking for the virus, looking for the virus. And they could never find the virus. 506 00:49:51,440 --> 00:49:54,530 And Prusiner was looking for the virus, looking for the virus, and he couldn't find the virus. 507 00:49:54,680 --> 00:49:58,130 And in the end, he discovered it's not a virus. It's a protein. 508 00:49:58,820 --> 00:50:03,360 A protein causes this condition. Which is insane. 509 00:50:04,540 --> 00:50:09,579 And everybody told him it was insane and that he was a crackpot, and he was gradually marginalised 510 00:50:09,580 --> 00:50:14,080 and his funding was withdrawn; it's the whole sort of story you hear about these scientific heroes. 511 00:50:15,170 --> 00:50:20,850 But he managed to find a source of funding to continue his work. 512 00:50:22,380 --> 00:50:28,290 R.J. Reynolds, makers of Camel cigarettes. And this is no secret; 513 00:50:28,290 --> 00:50:35,140 he thanks them in his Nobel Prize acceptance speech. The discovery of the cause of mad cow disease and Creutzfeldt-Jakob disease, 514 00:50:35,610 --> 00:50:38,940 the Nobel Prize winning discovery, was funded by big tobacco. 515 00:50:39,960 --> 00:50:44,690 And this is in no way a slight on Stanley Prusiner, no. I've got no argument with Stanley Prusiner; he's a great, 516 00:50:44,760 --> 00:50:48,420 truly great scientist and, as far as I know, he's never said a word about tobacco, nothing. 517 00:50:48,720 --> 00:50:52,960 He's not interested in it. But you know what's interesting about prions? 518 00:50:54,470 --> 00:50:57,680 They are not the old story, cigarettes cause cancer. 519 00:50:58,490 --> 00:51:05,180 They are something new and different and interesting. And one of the things the tobacco lobby did was to fund all kinds of interesting research. 520 00:51:05,480 --> 00:51:10,010 Maybe lung cancer is caused by a fungus; there is a fungus that does seem to mimic the symptoms of lung cancer. 521 00:51:10,220 --> 00:51:15,740 Or sick building syndrome; the tobacco industry funded that too. I'm not saying there's no such thing as sick building syndrome. 522 00:51:15,920 --> 00:51:19,999 It seems a bit vague to me, but the point is, it doesn't really matter whether it's true or false. 523 00:51:20,000 --> 00:51:23,270 It's not cigarettes cause cancer, it's something else. 524 00:51:24,500 --> 00:51:27,080 So I spoke to Robert Proctor, who's a historian who studies this. 525 00:51:27,410 --> 00:51:36,110 He told me that ten Nobel Prizes have been awarded to scientists who basically were funded by the tobacco industry. 526 00:51:36,320 --> 00:51:39,379 And they're not kind of junk science. 527 00:51:39,380 --> 00:51:44,960 They're not trying to cast doubt on the claims about cigarettes. They're just something else to talk about. 528 00:51:45,830 --> 00:51:47,630 They're just a distraction, something new. 529 00:51:48,950 --> 00:51:56,990 And Robert Proctor said, what's going on here with a lot of these new stories is the opposite of terrorism. 530 00:51:58,060 --> 00:52:02,530 Terrorism is getting excited about a risk that isn't that big,
getting upset about a risk that isn't that big. 531 00:52:02,770 --> 00:52:08,950 So the opposite of terrorism is trivialism. It's managing to distract people from a huge risk and get them to look at something else. 532 00:52:09,910 --> 00:52:15,800 And that's what happened with tobacco. You know, I mean, we still know that cigarettes cause cancer. 533 00:52:16,040 --> 00:52:22,280 It's perhaps the most, or among the most, important facts about our health today. 534 00:52:22,340 --> 00:52:27,080 It's still one of the most important facts about our health. Do you know how often we mention it in the news? 535 00:52:27,260 --> 00:52:30,710 Not very often. Because it's not new. It's not news. 536 00:52:32,420 --> 00:52:36,440 We do sometimes mention it on More or Less, because that's how we roll, but not very often. 537 00:52:38,460 --> 00:52:43,620 So I've told you that just debunking 538 00:52:45,220 --> 00:52:49,660 doesn't necessarily work. Just debunking can backfire. It can reinforce myths. 539 00:52:50,320 --> 00:52:58,660 It can confuse people. It can create a backfire effect where people fight back so hard, they don't want to listen to what you're saying. 540 00:52:59,800 --> 00:53:06,129 I've told you that actually a lot of the problem is not the fact that people believe false things. 541 00:53:06,130 --> 00:53:10,210 A lot of the problem is that people are just not paying any attention at all, not interested in the facts. 542 00:53:10,780 --> 00:53:15,580 They have their feelings, and they don't want pesky facts. Is there a solution? 543 00:53:16,660 --> 00:53:21,070 Well, I said that fact checking is part of the solution. But fact checking is just the foundation of the solution. 544 00:53:21,760 --> 00:53:31,570 Is there a solution? Maybe there is. So Dan Kahan of Yale published, just a couple of weeks ago, 545 00:53:32,770 --> 00:53:40,059 a paper looking at this whole question of people who know stuff about science and what they believe about climate change. And the existing finding, 546 00:53:40,060 --> 00:53:46,610 remember, is that if you know more about science, then the polarisation is wider. 547 00:53:46,630 --> 00:53:51,820 Republicans and Democrats are further apart in their beliefs about climate change if they're more informed about science. 548 00:53:53,360 --> 00:53:58,200 So Dan Kahan measured something else. Curiosity. 549 00:53:59,620 --> 00:54:03,790 Like, what do we want to know? How interested are we in surprising results? 550 00:54:03,970 --> 00:54:07,390 You can measure it in all kinds of ways. He asked people how often they read science books. 551 00:54:07,660 --> 00:54:12,970 He asked people if they enjoyed watching science documentaries, nature documentaries, that sort of thing. 552 00:54:13,270 --> 00:54:17,020 It turns out interest in science is distributed across the political spectrum. 553 00:54:17,020 --> 00:54:19,300 Republicans are just as interested in science as Democrats. 554 00:54:20,740 --> 00:54:26,470 And interest in science is correlated with scientific knowledge, but it's not that closely correlated with scientific knowledge. 555 00:54:26,980 --> 00:54:30,850 You can be curious and not really know a lot, and you can know quite a lot and not be very curious. 556 00:54:32,680 --> 00:54:39,340 And then having established this measure of curiosity, he asked, well, do we have the same polarisation or not? 557 00:54:40,960 --> 00:54:46,970 And he found, no. I mean, Republicans are still more sceptical about climate change than Democrats.
558 00:54:47,510 --> 00:54:52,070 But consistently, the more curious both of them are, they move in lockstep. 559 00:54:52,430 --> 00:54:58,790 They get more and more concerned. The more curious they are, the more concerned they get about climate change. 560 00:54:59,740 --> 00:55:05,920 And there are various other studies that are starting to suggest that an interest in the scientific method, an interest in where facts come from, 561 00:55:06,070 --> 00:55:10,690 how scientists work, maybe even how economists work or how statisticians work, 562 00:55:11,950 --> 00:55:16,180 might be more productive than just hammering people with the facts. 563 00:55:17,360 --> 00:55:22,850 So what I'm really saying is we need to make people care about the truth, 564 00:55:23,510 --> 00:55:28,340 not just give them the truth, but make them want to know, make them curious, make them ask questions. 565 00:55:29,000 --> 00:55:32,360 We need to encourage that spirit of adventure, that curiosity. 566 00:55:33,380 --> 00:55:41,120 What we basically need is the Brian Cox of statistics, economics, politics and social science. 567 00:55:45,380 --> 00:55:48,380 Yesterday, he died. 568 00:55:51,220 --> 00:55:55,030 So Hans Rosling was the most amazing scientific communicator. 569 00:55:55,450 --> 00:56:02,589 I saw him speak many times. He gave this amazing TED talk that's been seen zillions of times, with beautiful 570 00:56:02,590 --> 00:56:07,360 bubbles floating all over the place and amazing data visualisation. 571 00:56:08,440 --> 00:56:17,230 He also swallowed swords. He also gave a fantastic demonstration of demography with just a pile of toilet rolls, one on top of another. 572 00:56:17,500 --> 00:56:20,920 And it wasn't just a fancy prop. 573 00:56:21,390 --> 00:56:24,460 I really did understand demographic change better, 574 00:56:24,760 --> 00:56:30,550 having seen Hans Rosling messing around with toilet rolls. And the other thing he did: 575 00:56:31,540 --> 00:56:36,649 he wasn't afraid to give people a real bollocking if he felt that they were asking 576 00:56:36,650 --> 00:56:40,970 questions that just came from a place of ignorance, and to encourage them: 577 00:56:41,050 --> 00:56:45,320 you have to inform yourself about the world. And what 578 00:56:45,350 --> 00:56:48,680 Hans was interested in was what he called factfulness. 579 00:56:48,960 --> 00:56:53,300 I'm not interested in making an argument, I'm just interested in giving you the facts: 580 00:56:53,900 --> 00:56:57,390 that's what he said. But that wasn't quite true. 581 00:56:59,160 --> 00:57:03,030 He was never just giving us the facts. He was bringing them to life. 582 00:57:03,720 --> 00:57:09,690 He was making us realise that they were important, that they mattered to us, that they mattered to making us better people, 583 00:57:09,840 --> 00:57:14,910 making us better citizens, better voters, understanding the world around us. 584 00:57:15,540 --> 00:57:19,560 And he made them an absolute joy. He made you want to find out more. 585 00:57:21,220 --> 00:57:26,080 That's what we need more of. We're going to miss him. Thanks very much for listening.