So in December 2016, a man by the name of Edgar Welch walked into a pizza restaurant in Washington, D.C., armed with an assault rifle. He was there to self-investigate a bizarre conspiracy claim he had heard about, which alleged that Hillary Clinton and other Democratic politicians were involved in a child sex trafficking ring in the basement of that very restaurant.

Much to his surprise, that turned out to be false. There was no child sex trafficking ring in the basement of that restaurant. In fact, there was no basement at all. As he would later tell The New York Times from jail, the intel on this one wasn't a hundred percent.

Now, the story that brought Welch to D.C. that day is, you know, often how people frame this topic. It is seen as the epitome of fake news, by which I mean deliberately misleading news stories spread for fun and profit and political gain, and Welch's actions are often cited as the prime example of the harm that fake news can do. But conspiracy stories like Pizzagate, as it's commonly called, are just one form, just one form, of a broader phenomenon that would be better called information pollution.

Okay. As I understand the term here, information pollution is the dumping of potentially toxic information into the media environment. And information can be toxic in a variety of ways. There's a lot to be said about that, but I'm not going to say much about it here. I'll just say this: some of the ways in which information can be toxic, for example, are by being false, deceptive, misleading, or simply not based on good evidence. So there are degrees of toxicity, and there's lots to talk about there. But I'm just going to take it for granted that you have an intuitive grasp of what toxic information is.

We are clearly living in a golden age of information pollution. And the talk today is going to be about the harms that information pollution brings. Some of those harms, the ones I'm going to concentrate on, I think don't receive enough attention.

Now, the background to why this is a golden age for information pollution is worth noting. It's worth noting a background condition that makes information pollution of a variety of sorts so easily spread around. And that background condition is the personalised internet.

As most of you, I hope, know, everything that you encounter on the Internet — from the ads that you see online, when you consult The Guardian or The New York Times or what have you, to the news that comes down your Facebook feed —
all of that is tailored to fit your preferences. The Internet is personalised, as we say. Basically everything that one encounters on the Internet has some tinge of personalisation; the ads you're offered are almost certainly ones selected with you in mind. That fact is going to be lurking in the background — I'm not going to talk about it much — but it is that fact that makes much of what is happening on the Internet today, in terms of information pollution, effective. All right.

So Russian troll farms and research firms like Cambridge Analytica use the personalised Internet to help them place targeted political advertising, as we know, and to get people to follow fake social media accounts and feed them news stories which reinforce their political opinions and encourage them to vote one way or another in a variety of elections. And as we all know, that's still going on, certainly in the United States. There are active information pollution campaigns being run by a variety of state and non-state actors right now. Just a few months ago, Facebook announced with some fanfare that it had finally figured out — oh my God, they were shocked — that such campaigns were still operating on its platform. They were shocked; they, and none of the rest of us, of course.

So this is still an ongoing problem. I'm not just referencing 2016; the episode I've been talking about, as I'm sure you all know, is just one instance.

Now, most people tend to assume that information pollution is wrong because it involves lying. When people talk about what is misleading about fake news — besides the fact that my president has now changed the meaning of that term, so that any news story you don't like is 'fake news'; he's co-opted it as a term — if you think about it in its original sense, lots of people are misled by the idea that, well, what is fake news? A false news story. They tend to think that what's wrong with it is that it must be a lie. Somebody is lying to us. And headlines exposing fake news stories as fake are often put in terms of lies. But that actually underplays, or mischaracterises, the danger. The danger isn't lies per se; it's deception. And lying and deception, as many people have pointed out, are not necessarily the same.

When I lie, I say what I believe to be false, roughly with the intention of deceiving my audience. But of course, I can deceive you without lying. Silence, for example, can be effectively deceptive.
And I can also lie to you without deceiving you — because you are sceptical. You don't believe me; that is, I lie to you, but you just see right through me. Or perhaps what I say turns out to be inadvertently true. Either way, you're lied to without being deceived. Now, this might suggest that deception occurs when someone actually comes to believe what is false. Deception, we might say, is a success term, if you like. But that's only halfway there. Deception can happen even when no false belief results.

So reflect on that old con, the shell game, where somebody has three cups and a coin or something, and the person running the con is very adept at moving the cups around in a rapid fashion. Then you're supposed to guess where — under which cup — the coin is. Or, going back to the original version, the con artist presents three shells and moves the shells around. What he's doing, in general, is this: he may be getting you to believe falsely that the coin is here rather than there. But in most cases what's actually happening is that you're not forming a belief at all — you just don't know what to think. Things move so quickly that you give up and you just make a guess. That too can be a form of manipulation. The point is to get the money from you.

I want you to keep that analogy with the shell game in mind as we move forward, because one way of understanding the use of social media to spread political misinformation online is that it's basically just a giant shell game. And when we later talk about the weaponization of information pollution, it's important to keep in mind just how it's being used and for what reason. The propagandist doesn't really care whether anyone, or even most people, believe the specific things they're selling — although it often turns out that people do. They don't have to get you to believe the coin is under the wrong shell. They just have to confuse you enough that you don't know what is true. That's stealth deception. And it is the kind of deception that dreadful for-profit conspiracy sites and Russian-sponsored troll farms have been particularly adept at spreading on social media. No doubt some percentage of people actually believe such postings, but a far greater number of people come away ever so slightly doubtful about what is true or whom to trust. They just don't know what to believe or whom to trust.

Now, it used to be that when someone would say something outrageously false — like that the moon landing was faked, or that the earth is flat —
— I don't know if you know this, but in the United States there's been a revival of flat-eartherism. I mean, really. If you look into it you may be surprised; flat-earth culture is running rampant, spreading through the schoolhouse, or so I'm told.

So it used to be that when somebody would say something outrageously false, like that the moon landing was faked, most people would ignore it, because they'd say: well, if that were true, I would have heard about it. Social epistemologists have made this point. And what they had in mind was something like: I would have heard about it from credible, independent sources — not just from you, the person I'm talking to. They would have been assuming that there were people — credible people, editors, experts and the like — who were sorting through the various sorts of information and weeding the facts out from the rest.

Now the Internet has made that sort of reasoning moot, simply because so many of us are ensconced in our own information bubbles. Few people can reject crazy claims on the grounds that they would have heard about them before, because chances are they already have heard about them, or something close to them, from the sites that tend to confirm their own biases. And when people do hear about such a claim — well, first of all, they may not regard it as crazy at all; and secondly, even if you do think it's crazy, you've probably already heard about it, because it has come up on your social media feed, even if only as something to make fun of.

So, in any event, the point I'm making is that the spread of information online has a way of making it more difficult for us to reject things out of hand, because almost anything that somebody can say has already been spread rather widely on the Internet — that goes for almost anything. And that can actually have the weird result of encouraging us to be more closed-minded and dogmatic, and for weirdly rational reasons. Part of being open-minded, presumably, is taking relevant alternatives to your view seriously: looking into the alternatives, checking them out, and modifying your view if they check out. That is, after all, just what we want people to try to do. And that's what Welch did.

Edgar Welch, believe it or not, told the Times that he had not actually had an Internet account when he first heard about Pizzagate. He'd heard about it from some friends at a job site he was working on. They were talking about this story — you know, Hillary Clinton involved in trafficking children out of a pizza restaurant — and he was like: wow, that sounds terrible.
So I should check into it. So he got himself an Internet account, and he did: he said he did the research. He looked at the sites his friends recommended, and there it was, on those very 'reputable' news sites. And he saw it and he thought: well, somebody needs to do something.

What's interesting is that he was being open-minded, in a sense. I'm not saying he was epistemically virtuous otherwise. But in that sense he was doing what we would sort of think people should do: he heard something that sounded odd, and he did the work. I'll get back to his actions later, because they're also curious in another way.

But now notice — think about the rest of us. The rest of us know that there's all sorts of crazy stuff coming down our Twitter feeds. So all this craziness means not only that naive people like Mr. Welch can be deceived. It means that the rest of us now need to decide which crazy stories to pay attention to and which ones just to dismiss. And when they aren't heard from sources we already trust, we don't really check them out, because — at least we tell ourselves — there are simply too many outrageous claims out there. And that is definitely true. If you've been following the Kavanaugh case right now — the reporting on this now-possible Supreme Court justice and the sexual assault allegations against him — in just the last twelve hours there has been a variety of claims made, some of which I was having a hard time trying to check out just before coming here: claims that two other people are now saying they, not he, were the person involved in the alleged incident. So you can see how this goes, right? I heard one claim which seems to be wrong. But anyway, the point is that in these situations so much crazy information is coming at all of us — I'm sure you're aware of this — that you try to go to the same sites: you go to the sources that you trust, and you'll have good reasons for doing that. Those are your go-to sources, and in many cases people have really good reasons for trusting them.

But here is one of the saddest ironies of our time: the technology that allows us, now more than ever before, to access more information faster than ever before, may actually be making us more closed-minded — and not just because it's inflating our information bubbles, or not only because of that, but because it's causing us to feel like we need to dismiss more and more alternatives to our views simply because we don't have the time. There are just too many options available.

All right.
So these first remarks have been an attempt to point out some of the obvious harms — deception, confusion — that information pollution causes. People are confused; they don't know what to believe; they've become maybe more closed-minded; possibly they're being lied to. Those are the more obvious harms of information pollution — harms that are inflicted on us by it, from the outside, as it were.

Now I want to move on to a type of harm that I think is less obvious and that we don't talk about nearly as much.

Information pollution is not a new phenomenon; it has always been with us. It is the chief weapon of the propagandist: it is by the use and misuse of information that those who desire to manipulate hearts and minds have always acted. And the purpose of that action is almost invariably to rouse the passions of the ordinary citizen and to instil a dogmatic, unforgiving attitude with regard to some thought, ideology or strategy. For it is by instilling some attitude — some passionate attitude — that people can be made to do the most inhumane things to each other. That is the case in totalitarian regimes, and it is the case in contemporary liberal democracies, where propaganda, subtler in character, is no less attractive.

Now, the term 'information pollution' can itself be misleading. The metaphor assumes that the broader information culture is like nature: that it is given, and, save for the interventions of polluters, pure. And that, I now want to note, is definitely misleading. It is misleading because it is we humans who make and convey information; we construct, and live in, the wider information environment. The Internet is not just something that has happened to us. It's a world we've created. And, as every good propagandist knows, one can't move hearts and minds without appealing to something that is already there. So it is with fake news and information pollution: the digital world we've created reflects our tendency to care less about the truth, and it encourages us to be more arrogant in our convictions, more sure of ourselves.

And so I think there's a case to be made that the right term to describe our present media culture — and by that I mean, in particular in this talk, our social media culture — is not 'polluted'; not information pollution, but corruption. Corruption is not the same as pollution. Pollution is something that happens to a system; corruption is something that happens within a system. One mark — and I emphasise only one; this is not an analysis of corruption —
— one mark of a system being corrupt, one way of its being corrupt, is when its stated rules are not the actual rules that it operates under. So, for example, a judicial system is corrupt when its stated rules — treating everybody equally under the law — are not the actual rules it operates on; when, for example, bribery is common. All right. There's more that could be said, but that, roughly, is the mark I have in mind.

Now, I'm going to go on to argue that there is a clear sense in which our present social media culture is corrupt, or can be corrupt, in just the way I sketched. In particular, what we might call its epistemic rules — the rules under which information is traded, distributed, consumed and produced on these platforms — the rules we think we're operating under are not the actual rules that are, in fact, operative.

In order to warm you up for that, let's return to the Pizzagate example. This is a small example — I'm going to talk about our ordinary life online in a moment — but let's start here.

So that story was very specific. It was a very specific story about Hillary Clinton. For those of you who have thought about conspiracy stories in general, you'll realise that one mark of the typical conspiracy story is its overwhelming specificity, and this one was extremely specific: it was a particular restaurant, on a particular street, owned by real people. That was where it was all supposed to be happening, right? And the story was widely circulated. But by the time Mr. Welch showed up, not a lot had actually been done about it — not enough, you might think.

So it was widely circulated; lots and lots of people talked about it. Alex Jones of Infowars — a dreadful character in the United States, a conspiracy theorist with a big following — talked about it at great length.

But what were people doing? Well, there were a lot of phone calls. The restaurant's owners got death threats. It was terrible. People would call up and say, you know, you've got to stop selling children with your pizza, we're on to you — and they would say nasty things. And then, also bizarrely, restaurants along the block got death threats too. It wasn't really clear why that was — I guess, I don't know, better safe than sorry. And there were protests, except the protests tended to be just one or two people standing outside with signs.
Well — this was a widely circulated story, okay? Not hidden away in the dark corners of the Internet; widely circulated. Lots of people professed to believe it. Actual, real politicians floated the possibility of checking into the story.

Not so Mr. Welch, though: the first time he heard about it, he said, he felt he actually had to act on it — I need to do something. Why he thought a good thing to do was to turn up in person, as opposed to, say, calling the police, I'm not sure. And why no one else who professed to believe it called the police, or did something like a proper job of research — they didn't, really — well, that alone shows that something weird is going on, doesn't it? Something weird is going on. Something doesn't quite add up.

And then there's a very interesting thing — a shot across the bow, if you like — which is the reaction within certain far-right media circles immediately following Mr. Welch's attack. Because he went in and he looked for the basement; he opened the door that was supposed to lead to the basement, and it was just a closet; and then he realised there was no basement. So he put down his gun and he came back out with his hands up, right? He was like: well, I was just checking it out. You know — you try your best.

But right afterwards, in those circles, something happened that has now become very familiar in the United States: there was a flurry of tweets like this one. I'll give you the gist. It's from a fellow — a real guy, by the way, who is still happily disseminating false information all over the Internet. He tweeted out 'CONFIRMED' — in all caps — that the Comet Ping Pong gunman was a plant, part of an effort to squelch what was in fact a factual story. And this got spread around very quickly: the claim that, actually, Welch was an actor, recruited by the very people the conspiracy story was supposedly about; they had staged the whole thing; it was all fake.

But what's interesting is that this same guy was one of the first people, along with Alex Jones, talking about this conspiracy theory on social media. His tweets, and his posts on his other platforms, were shared by thousands, hundreds of thousands of people. And right after Welch does this, he says: Welch was an actor, a crisis actor. The tweeter himself, by the way, was a projects director for a group called Citizens for Trump, and the story had been promoted heavily by that group.
223 00:25:45,520 --> 00:25:49,720 And I was under those who supported the Pizzagate theory almost instantly began claimed 224 00:25:49,960 --> 00:25:53,800 that the man would really follow up on the one that would follow up on the story. 225 00:25:53,920 --> 00:25:56,530 Was actually an actor paid for by a little conspiracy. 226 00:25:57,670 --> 00:26:03,460 I mean, so go conspiracy theory might say there's no getting around the craziness, but this is a bit different. 227 00:26:03,790 --> 00:26:11,500 It was almost as if Trump supporters who insisted they believe the story now wants to label that very fact, 228 00:26:12,580 --> 00:26:18,280 the fact that they seem to believe it or said that that's fake news. 229 00:26:19,240 --> 00:26:23,170 And in fact, they did. They started to claim that they had never seen these tweets. 230 00:26:23,350 --> 00:26:28,330 You know, they were there. In fact, in this case, Crosscheck didn't even go to his tweets. 231 00:26:29,670 --> 00:26:38,020 So those are. They wanted to deny that anyone like them would act on a story that they possibly said required action. 232 00:26:38,890 --> 00:26:43,480 And this wasn't even a transaction. The cast of the drama. 233 00:26:43,570 --> 00:26:47,230 Somebody has to do something. Somebody has to do something. 234 00:26:47,470 --> 00:26:50,740 Children at risk. Yeah. 235 00:26:50,740 --> 00:26:53,770 When somebody gets something, that's absurd. 236 00:26:54,070 --> 00:27:00,010 No. Why would anybody do things to an actor or a crazy person? 237 00:27:02,660 --> 00:27:07,000 This shows that there's an element of self-deception in our work and what we're doing all along. 238 00:27:07,010 --> 00:27:15,830 We're not. Just part of the danger or not is not just the deception that can be brought on by information. 239 00:27:17,300 --> 00:27:22,910 It also shows that to some degree we seem to be playing by the rules that we say. 240 00:27:23,760 --> 00:27:29,660 Now, in the case of Mr. Spock and the other conspiracy theorists. 241 00:27:30,320 --> 00:27:39,890 That's again, fairly obviously a clear start to my case, less obvious when it comes to the rest of us, but more important. 242 00:27:42,340 --> 00:27:48,159 Okay. So now I want to move into my name in a moment. 243 00:27:48,160 --> 00:27:52,600 So just for my main case, I'm on, right? 244 00:27:54,340 --> 00:28:02,740 So one of the things that people are president, also people like you, if you do, all of you are going to be social media. 245 00:28:02,750 --> 00:28:07,660 I understand that some of you or not good for you, but for the rest of us, including myself, 246 00:28:07,810 --> 00:28:14,020 one of the things you do on social media, you don't use it at my work is to share news stories. 247 00:28:15,460 --> 00:28:21,920 In fact, it's one of the most politically common ways in which people politically engage on your track, 248 00:28:21,940 --> 00:28:25,690 whether it's Twitter or Facebook or any variety of other sites. 249 00:28:27,940 --> 00:28:32,230 New stories are shared online like this for political purposes. 250 00:28:36,140 --> 00:28:41,880 And of course, this is what makes social platforms so appealing to farms because. 251 00:28:43,290 --> 00:28:47,430 But let's focus on what we're doing when we're sharing news stories. 252 00:28:49,690 --> 00:28:54,420 Let's hope so. But then you can talk it out of a share. 253 00:28:54,430 --> 00:28:58,509 So take a share. 
You can take a real one — like sharing a story online about, say, the latest updates on the Blasey Ford testimony that just happened; I shared some of those on Facebook. Or you can take a fake news story, like one that I ran across once, which was: revealed — Hillary Clinton is really working for the Russians — or whatever it is. Just take it as a news story with a headline that makes a clear claim like that. All right.

Now, the act of sharing that story, I'm going to claim, can be understood as a communicative act. It's not a linguistic act, strictly speaking — unless you're going to be very permissive in your use of the term 'linguistic' — but it is a communicative act. After all, communication is not just verbal, not just linguistic. It can be visual; it can involve pictures and drawings, or stick figures, for that matter. And, as we all know, communication can take a variety of forms. With words, of course — as is well known, especially in this institution — we can do different things: we can question, or command, or assert and describe. And sometimes, indeed many times, we do more than one thing at a time.

So when I say that the food in a restaurant is tasty — I say, you know, 'the food at that restaurant is tasty' — and I say it in the context of our wondering where to go for dinner, I perform at least two acts of communication. On the one hand, I assert, or seem to assert, that the food at that establishment is tasty. And at the same time I also endorse it as the place where we ought to eat. Right? Because it's in the context of our wondering where we ought to go for dinner, my gesturing at the restaurant and saying the food there is tasty — well, I've asserted one thing, but I've also endorsed the place; and I could have made that endorsement another way, more explicitly.

All right. You might now see where I'm going. I want to claim that these acts of sharing stories are communicative acts, and that these acts, like any other acts of communication, should be understood as capable of doing a number of different things — note carefully: we can do several things at once with one act of sharing a story.
Now, when we share a news story online — especially when we do so without comment, which, as it turns out, is what happens almost always; most of the time, most people who share don't comment, they just share — when we do that, we typically appear, to others and to ourselves, to be doing something similar to the restaurant example, to saying that the food at that restaurant is tasty. We appear to be engaging in one or both of the following.

One thing we appear to be doing is engaging in an act of testimony — that is, asserting that something is the case. So one thing we could be doing when we share a story with that headline is, well, not asserting strictly speaking, because we're not literally asserting anything, but doing something we might call proto-asserting. In any event, whether or not you like the idea of extending the notion of assertion like that, it certainly seems that we're engaging in something of that sort. What are we testifying to? Well, to the headline — exactly that. So that's one of the things we seem to be doing. We don't have to be doing that, but typically we take ourselves to be. If I share some story — 'Trump Is a Liar', or whatever the story is — I seem to be testifying to that.

Now, I could always be doing something different. I might not believe it; I could be doing something else. I could instead be endorsing: not testifying to the claim exactly, but endorsing or recommending a piece of information, saying it is worth your attention, or possibly even your belief. We might even say so explicitly when we post — we might say, 'everybody needs to read this.' So another thing we could be doing is saying: look at this. Something along those lines. That's just one example; there are many variants.

All right. My point is that I'm trying to build the case that on the surface — and I emphasise that: on the surface — it seems that what we're doing when we're sharing stories is something like what we're doing when we say that the food in that restaurant is tasty.

Okay. Of course, sometimes we are not actually asserting or endorsing the story at all. There are lots of exceptions: we share it only as something we think is amusing, or really dumb, or ironic. When we share something like that, we're engaging in a self-consciously expressive act. We are expressing our amusement, or our detachment, or our frustration. We might not be stating anything; we're just expressing our amusement.
We aren't trying to convey anything with factual, descriptive content at all — at least on the surface. Yeah, well, that happens. But it isn't the typical case, as evidenced by the fact that most people feel obligated to signal, one way or another, that their act of sharing shouldn't be understood as an endorsement. When you share something that you don't agree with — this is a general rule of how people interact online — when you share something that you don't agree with, that you think is misleading or false, even if you think it's funny, you often try to find some way, maybe a contextual way, to signal that that's the case.

That's how it works. On Twitter, for example, because the potential for misunderstanding is there, it's not uncommon for people to declare on their profile page that 'retweets are not endorsements' — a kind of blanket disclaimer: don't come after me if it turns out that what I retweeted isn't right. And of course, that disclaimer would make no sense unless the default assumption were that, of course, retweets are endorsements; otherwise people wouldn't bother. So it's pretty clear. I'll stop there, because this is really more a matter for mainstream social science; just go out and experience the online world and you'll see that it's the case.

So shares typically seem to us like assertions and/or endorsements. Again — typically, and again on the surface — we seem to treat them that way. But what if that appearance is just that, appearance, and not reality? What if we are just confused about the way communication actually functions online in this case? As it turns out, there are reasons, I think, to think that this is so. These reasons have to do with what we do, and what we don't do, with the content that we share online.

So let's start with what we don't do. What we don't do, it turns out, is read it. Some of you will know this result already; the rest of you may be shocked. Current research estimates that roughly 60% of news stories shared online are not even read by the person sharing them. In 2016, researchers at Columbia reached this result by cleverly studying the intersection of two data sets. The first was made up of Twitter shares over the course of a month — tweets containing shortened links to news stories, tracked via those links. The second set of data was comprised of the clicks, over the same period, connected to that same set of shortened links. So on one side they were tracking shares, and on the other they were tracking clicks — requests for the actual pages. Right?
These data sets were massive: 2.8 million shares, responsible for 75 billion potential views, and, against all of that, almost 10 million actual clicks. So the disparity is pretty easy to see, and much of the cleverness lies in the methodology for matching the two sets — matching shares to clicks. What they found is that only about four in ten people who tweet a news item have actually read it.

And it may go further than that. Here's an anecdote. I was at an event at the National Press Club, and the event was operating under rules about what could be repeated afterwards: we were allowed to talk about what happened, but not to attribute it to anyone. But I will say that a senior social media executive told the group — speaking just about his own company — that their own internal data was, he claimed, actually much worse: internal data showing that up to 90% of people don't actually read what they share. Now, there are interesting questions about how both studies measure things — what constitutes 'reading' — and those are different problems. It turns out that in the case of Facebook's own internal data, 'reading' was really just clicking through and spending time on the page — they could tell how long you were there — and just a few seconds counted. So 'reading' in any demanding sense is not really what was being measured. And I can't verify that figure for you the way you can check the Columbia study. But that study, and a variety of others, all seem to point the same way: news — information posted online — is very largely shared without being read. Is the Facebook data correct? I'll also note — and this won't surprise anybody here — that the company has officially said it isn't. But then, why would they undermine their own product by confirming it?

So that's what we don't do: we don't read what we share.
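To make the shape of that share-versus-click comparison concrete, here is a minimal sketch in Python — not the Columbia team's actual pipeline, and with entirely invented data and link names — of the kind of join the study describes: match a log of shares of shortened links against a log of clicks on those same links, then count how many shares point to links that were never clicked at all during the observation window.

```python
# Toy sketch (not the Columbia team's actual pipeline) of the share-vs-click
# comparison described above. All data below is made up for illustration.
from collections import Counter

# Hypothetical inputs: (user, shortened_link) pairs for shares,
# and one entry per recorded click on a shortened link.
shares = [
    ("alice", "bit.ly/a1"), ("bob", "bit.ly/a1"), ("carol", "bit.ly/b2"),
    ("dave", "bit.ly/c3"), ("erin", "bit.ly/c3"), ("frank", "bit.ly/d4"),
]
clicks = ["bit.ly/a1", "bit.ly/a1", "bit.ly/c3"]

clicks_per_link = Counter(clicks)

# A share counts as "blind" (a rough proxy for shared-without-reading) if the
# link it points to received no clicks at all during the window.
blind_shares = [(user, link) for user, link in shares if clicks_per_link[link] == 0]

print(f"total shares:                    {len(shares)}")
print(f"shares of never-clicked links:   {len(blind_shares)}")
print(f"fraction shared unread (proxy):  {len(blind_shares) / len(shares):.0%}")
```

On the toy data above, a third of the shares point to links nobody clicked; the corresponding aggregate figure reported by the real study is the roughly 60% mentioned in the talk.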
So what do we do when we share? What we do, it turns out, is share content that gets us worked up. Researchers at the University of Pennsylvania, for example, found that the best predictor that a particular item will get shared widely is the strength of the emotions associated with its content — emotive words, emotionally loaded words, and pictures too, of course; pictures are part of this. In large part it's emotions like affection: think about fluffy kittens — those strong, warm feelings, awe, affection, right? Those are very good predictors of sharing.

But none of us will be surprised by the other emotion they found to be particularly salient — and this will, again, perhaps not surprise you if you think about it: outrage. An NYU study suggests, for example, that morally laden emotional language is particularly effective — that is, words that are moral and emotional at once. In fact, each moral-emotional word in a post increases its chances of being shared by about 20%, and particularly words expressing negative moral assessments.

And as the neuroscientist Molly Crockett, of Yale, has argued, social media actually tends to increase our feelings of outrage: acts that would not elicit as much outrage offline actually elicit increased levels of outrage online. Really interesting. Now, Crockett has a hypothesis about why. It may be due in part to the fact that the social benefits of expressing outrage online — such as increased tribal bonding — still exist, while the costs largely don't. We express outrage, Crockett says — and I think she's got a good point — to signal that we belong to a group, to our tribe; and outrage is something that can circulate, along with other moral emotions, to bind the tribe together. That can happen online as well. But one thing that doesn't happen as much online is the cost of outrage. In real life, when you get outraged at a person on the bus — you stand up and yell at them — they might yell back, or hit you. There's a cost, right? We know that; you think before you get outraged in person. But online it's just a click of a key, and whatever lands on the other end, you don't feel it.

Moreover, lots of research has shown that outrage feels good to people. It just feels good — and not just in a loose sense: it makes people feel good to be angry; certain reward systems are active. That's interesting, because our digital platforms are designed to maximise shares and eyeballs on posts, and outrage does that. So it's not surprising, perhaps, that social media turns out to be a great mechanism for producing and encouraging the spread of outrage. We might say it's an outrage factory: moral outrage is like fire, and social media is like gasoline.
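As a back-of-the-envelope illustration of the roughly 20%-per-moral-emotional-word effect mentioned above — a toy model, not the NYU team's actual analysis, with an invented word list and an arbitrary baseline — here is how that multiplier compounds across a headline:

```python
# Toy illustration (not the NYU study's model) of the reported effect size:
# if each moral-emotional word multiplies a post's expected shares by ~1.2,
# headlines stacked with such words become markedly more "shareable".
MORAL_EMOTIONAL_WORDS = {"outrageous", "evil", "shameful", "corrupt", "disgusting"}
EFFECT_PER_WORD = 1.2    # ~20% increase in sharing per word, as reported
BASELINE_SHARES = 100.0  # arbitrary baseline for a neutral post

def expected_shares(headline: str) -> float:
    """Scale the baseline by 1.2 for every moral-emotional word in the headline."""
    words = [w.strip(".,:;!?") for w in headline.lower().split()]
    k = sum(w in MORAL_EMOTIONAL_WORDS for w in words)
    return BASELINE_SHARES * EFFECT_PER_WORD ** k

for headline in [
    "Senate passes budget bill",
    "Shameful senate passes corrupt budget bill",
    "Outrageous: evil, corrupt senate passes disgusting bill",
]:
    print(f"{expected_shares(headline):7.1f}   {headline}")
```

Four such words roughly double the toy's expected shares (1.2^4 ≈ 2.07), which is the kind of compounding the 'outrage factory' point trades on.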
Now put these two points together: what we do with our shares, and what we don't do with them. I think if you put those together, it becomes difficult to believe that the primary function of these acts is really to make assertions or endorsements — at least not in the way that linguists and philosophers of language usually understand those rather complicated notions. And yet that's typically what we think we're doing — again, remember: 'retweets are not endorsements'. So we think we're doing one thing; but the data I just described seems to me to require an explanation. And I think there is one. The explanation is that we're not really trying to share our knowledge. The primary function of these acts is not to engage in testimony, or even endorsement. The primary function of sharing content online is to express our emotions. And in particular, when it comes to political news stories, we often share them to express our outrage — and to spread it.

Now, the reason I talk about the primary function — or what we might call the proper function — is roughly this: a communicative act's primary function is that which sustains the use of that kind of act; that is, it is why that kind of act continues to be employed by a community of agents. So a kind of communicative act could have a primarily expressive function even though, of course, we also sometimes use it descriptively. That's the idea: what matters is what sustains the practice. And what sustains the practice of sharing is not that we are learning things from each other, or even that we're trying to inform one another. What sustains it is the fact that we are expressing our emotions. Sharing is very useful for that, and expressions of emotion — like moral outrage — are how tribes are built and how social norms are enforced.

And the outrage factory that is social media is made worse by the fact that most folks aren't aware of this, or don't want to be aware of it. It is this lack of awareness, presumably, that trolls and the fake news industrial complex find so useful. Purveyors of fake news are keenly aware, I suggest, of what we're really doing when we share. Even if they're not aware of any theory of speech acts, they are aware that what we're doing is not what we think we're doing. And, you know, once that's laid out — in general, it's easier to manipulate people and get certain results from them if they are under that kind of illusion:
they think that they're playing by a certain set of rules, but they're actually playing by a different set of rules.

Now, philosophers in the audience will have seen where this is going; in the written version of this there's a long digression about the semantics here. What I'm essentially claiming, to put it bluntly — and in that section I say, well, expressivism, yada yada — is a sort of simplified version of the case: essentially trotting out old-fashioned expressivism and saying, well, it seems to work for the Internet. And then I say: but I'm not endorsing expressivism — not expressivism about moral language in general; I'm not an expressivist about language in general. What I'm saying is: if you wanted to know what old-fashioned expressivism would look like if it were true somewhere, look at Facebook. I'm sorry — I had to try that line; I've been wanting to deliver it in a venue where it would land better.

Okay, let me mention one more thing — I have a couple more minutes. So consider a thought experiment. For this thought experiment you need to know how Facebook works; if you don't use Facebook, I'll tell you. On Facebook it used to be the case that when somebody shared a post, people could react by giving it a thumbs up — or not. All right. Nowadays you can give it the thumbs up, but you also have little emoticons: a little happy face, an angry face — much like outrage, really — and so forth and so on; a sad face, a laughing face. So you have these emoticons, and that's how Facebook regulates how you react emotionally. That's how it goes.

Now contrast that with this — and I've actually put this to Facebook representatives, whose reaction I find very amusing when I suggest, very earnestly, a small undertaking for them: imagine that instead of having emoticons, you had buttons that said things like 'justified by the evidence', 'not justified by the evidence', or 'need more information'. They look at me as if to say: what are you on about? They have other things to do.
But actually, when you think about it, what you realise is that even if they did do that — if what I'm claiming here is right, that we think we're playing under one set of rules, the rules of justification and testimony, while online we're actually not playing by those rules at all, because we're not playing the truth game at all — if that's true, then changing the buttons wouldn't mean a thing. I mean, at first people might be more inclined to click 'need more information' along with 'fine by me'. But pretty quickly, don't you think, if I'm right, 'justified by the evidence' would start to take on an emotional valence of its own. It would come to mean something more like 'yeah — you go, girl!' rather than: this really is supported by the evidence. I think that's what would happen, though I haven't had much success getting anybody to run the experiment for me. But, you know, somebody should.

Even as a thought experiment, this one is idealised — of course it is. I just want to highlight the crucial point: we don't typically react to the stories shared online in the way that we react to information shared by a person on the street, or in a courtroom, or by a newspaper. That's no surprise. Facebook and Twitter weren't designed to be news platforms — and yet, as a matter of fact, that's what they've become. A very large share of people now claim they get their news from Facebook; it has an overwhelming share of that market. And what does that tell us? That we are creating a world of illusion to live in — all the while complaining that it is our critics and our enemies who are providing us with fake news.