I should say that what I'm going to talk about tonight was part of a book on the ethics of cybersecurity, which I'm writing with a computer scientist by the name of Terry Bosso Meyer. Terry, unfortunately, can't be here. He's the man who knows about all the technical details, so any hard questions that I can't answer would inevitably be ones he would have answered if he'd been here. Nevertheless, I have to take full responsibility for what I'm about to say, though I did get some assistance from Terry for this particular presentation.

Okay. So I'll just quickly run through the structure of the presentation, so you know roughly what's in store for you. It's in four parts. Depending on time, I may have to traverse some of these parts more quickly than might otherwise be desirable.

The first part looks at the nature of the problem in this area, because I think it's a complex set of problems rather than a single one.

Secondly, I'm going to define the three key notions: fake news, hate speech and political propaganda. It's political propaganda that's my focus, and I take it that fake news and hate speech are typically components of political propaganda. I won't spend too much time on these definitions, since this is a philosophical chestnut of an area.
In Part C, the third part, I'll talk about how the right to freedom of communication is to be understood, because I think the dominant tradition in philosophy, at least in the analytic world, is derived from Mill. I don't want to diverge too much from that tradition, but I think we need a slightly more expansive notion of freedom of communication. And of course the key problem is that, on the one hand, we want to give a lot of weight to the moral right of freedom of communication; we want to allow people to express themselves and so on. But on the other hand, we've got this problem because a lot of people are abusing that right: giving us hate speech, giving us fake news and spouting propaganda. And so we want to restrict that. But how do we square the circle?

Okay. So I'll talk a little bit about the relevant notion of freedom of communication. Then, in Part D, I'm going to try to say some things about how we could go about countering political propaganda whilst, in doing so, respecting this right to freedom of communication, but at least having some impact on reducing the amount of fake news, hate speech and political propaganda. A particular thing to keep in mind here, which you can see on the board under D1, is the role of what I refer to as epistemic institutions. 'Epistemic' is a philosopher's term, but it basically just means concerned with knowledge.
So one thinks of schools and universities, but also the press, for example, which is concerned to investigate matters, to figure out what's actually going on out there; but also, of course, parliaments and parliamentary committees, because it ought to be their business to try to think through what would be the best policies, helping themselves, of course, to expert advice. They should be, as it were, deliberative bodies, and therefore to that extent epistemic in my sense. So it's a fairly broad-ranging notion of the epistemic that I have in mind.

Right. So Part A is about unpacking the problem. Putting things somewhat simplistically, what we've had in recent times is the development of a whole range of social media platforms, Facebook, Twitter, etc., now used by literally billions of communicators worldwide, along with related tech developments and tech giants such as Google and so on. The advent of these tech giants and the underpinning technology has, of course, been remarkable in terms of the unprecedented flows of information and the capacity to find things out, but also in terms of the capacity of people to exercise their right to communicate with one another.

What is of particular interest to me is the right to communicate, not simply with anyone on the Internet or with your friends on Facebook. What I'm interested in, given my focus on political propaganda, is the following.
I'm interested in the moral right to public communication. And by public communication I don't just mean, as it were, your private communications with your friends. There are two features of public communication I would draw attention to; the notion is rather more complex than this, but there are two features that I think are worth highlighting.

First, the message communicated goes to, and is intended to go to, a large number of people. But each of the persons communicated to knows that a whole lot of other people have also been communicated to. So it's not just that you've been communicated with, but that you grasp that this message is in everyone else's grasp as well: we all know that we've all grasped this message. There's that dimension to it, which in the literature is sometimes referred to as common knowledge: I know that p, you know that p, I know that you know that p, and so forth.

The other feature of what I'm calling public communication is that to say it's public communication doesn't simply mean it's in the public domain, in the sense that in principle you've got access to it. There's a lot of information in the public records that in principle is in the public domain, but no one ever bothers to look at it.
I'm thinking of public communication in the sense of messages and information out there that are, as I say, pretty highly visible, so that we're all aware of them; it's not just a question of the capacity to access them.

Okay. Unfortunately, in addition to this increase in our capacity to come to know things and to communicate, there's also been an exponential increase in the topic of this talk, namely hate speech, fake news and political propaganda. And, relatedly, there's been a considerable empowerment of extremist political groups (bearing in mind that I'm interested in political propaganda), notably groups like Islamic State and so on. These developments have also enabled various political actors to intervene in democratic processes. One thinks here of Cambridge Analytica: the interventions in relation to Brexit, but more recently in the European elections as well. This capacity to intervene has been facilitated not only by the Internet, obviously, but by social media and by some of these AI techniques, machine learning for example.

A particular example of hate speech that's often drawn attention to is the Facebook hate speech emanating from the Myanmar military in relation to the Rohingya in Myanmar, which evidently led to, or at least facilitated, the murder of many of those individuals.

Okay.
If we come back to this notion of public communication and the sphere of public communication: the paradigm that people sometimes have in mind, at least people who've studied the history of Western civilisation or the history of Western philosophy, is Athens and the forum. There was a relatively small number of people who were citizens (they were only males, they had to be aristocrats and so forth), but this small group could all sit around the forum and talk to one another, and everyone had a chance to speak. So you had this notion of public communication such that everyone could speak, at least in theory if not in practice, to pretty much everyone else; they could publicly communicate; it was a public space.

So one of the things that seems to have happened, insofar as we've had a public sphere in that sense: take the UK, where a kind of public sphere was created, to some extent, by organisations like the BBC, such that most people in the UK were aware of certain developments, everyone got roughly the same news, and everyone knew what was going on. That was formerly the case to a considerable extent, but it's been undermined to some extent by these new developments.
So there's been a kind of splitting or fragmentation of this public sphere, and you've got a whole range of different audiences that are focussed on other members of their own narrow audience and don't necessarily get the same news, or read the same things, as members of some other audience. So you've got a development of, or at least a considerable increase in, so-called narrowcasting as against broadcast communications.

And if we look at the international sphere, in the world as it's come to be with globalisation, it's clear that there's a whole range of common problems that we all, as members of the human race, need to know something about; one thinks of climate change and so on. So one would have thought we would want a kind of global public sphere of communication, so that we can think about, and be informed about, these problems that we confront collectively as humans. But unfortunately there's been a kind of Balkanisation, if you like, of this would-be global public sphere, partly because we've got some nation states, such as, say, China and Russia, who want to slow down or censor certain sorts of information. They don't want Chinese people to have access, for example, to websites outside China. The Russians, again, are looking at doing things in this space.
So there's a kind of fragmentation, a Balkanisation, going on in the would-be international public sphere.

And so the upshot, perhaps somewhat paradoxically, is that the important development that seems to have arisen as a consequence of social media, the Internet and so on is that the pre-existing public spheres of communication have been, to some extent, splintered and fragmented. And what one might have expected, namely the development of a global sphere of public communication, has come about much less than one might have thought, and certainly less than one might think desirable.

Okay. Now, aside from this notion of a public sphere of communication, and the idea of people having rights to communicate to the public as opposed to simply being able to communicate with their friends in interpersonal communication: if public communication is to work and succeed, and to be a development for the good of humanity, or the good of nation states, or the good of organisations, it will need to be communication that is compliant with various epistemic and moral norms; that is, with regularities in behaviour that people are committed to. Take a simple example: the norm of aiming at the truth, of trying to tell the truth. This is a pretty fundamental norm.
Because if all of us were to breach this norm, if I were to say, well, I'm not going to bother telling the truth, and you decided you weren't going to bother telling the truth either, communication would quickly start to break down: I wouldn't trust you to tell the truth, and you wouldn't trust me to tell the truth. So these norms are obviously of great importance.

Other kinds of norms that have developed, and which are under pressure, include the norm that you don't defame people: you don't go around trashing people, saying false things about them. Of course, there's a body of law that has developed in relation to this; but on the relatively unregulated Internet, you can get away with a lot of things that you wouldn't have been able to get away with previously.

Another kind of norm concerns the distinction between factual, truth-aiming communication, which is concerned with how things are, as opposed to mere entertainment. We think that's an important distinction that we want to hold on to. When someone says something, we want to know: are you saying it because you believe it to be true, or are you just trying to have fun, making a silly, entertaining claim?
And that distinction between truth on the one hand and entertainment on the other seems to have been coming under increasing pressure. It was coming under pressure before, with phenomena like infotainment, but it's now coming under greater pressure still.

Okay. So it's very important that we attend to these epistemic and moral norms of communication, and the lack of compliance with them is very problematic in terms of the quality of public communication. There are two general ways in which we try to ensure compliance with these norms. One way is to have enforceable laws and regulations for the most serious breaches: laws against defamation, laws against incitement to violence and so on. But those laws and regulations in and of themselves are not going to be sufficient. We also need people collectively to have attitudes of respect for those norms: they know what the norms are, they think they're important, and they will express disapproval and pull people up if and when the norms are breached.

And of course, the greater the amount of non-compliance, the weaker those norms become. If you get enough people disobeying the laws, enough people just giving up on telling the truth ('You've got to tell the truth? Oh, it doesn't really matter', and so on), then the norm starts to be undermined.
And of course, if you have state actors deliberately flouting these norms, such as, say, Russia in relation to various matters, or, to take an example nearer to home, an American president who's quite happy to tell lies and put about false things about people willy-nilly, then this is very unhelpful in terms of the maintenance of these epistemic and moral norms.

Okay. So we're coming to the end of this section. I said earlier that I didn't think there was just a single, simple problem of freedom of communication versus banning or prohibiting hate speech, fake news and propaganda. So perhaps now I can draw your attention to the set of different problems that have emerged in the course of this discussion.

One thing we've got to do is figure out what actually is the difference between politically motivated fake news and other news that may be false, for example, but isn't essentially fake news. What is the difference between propaganda and political claims that may be motivated by political interests but which, nevertheless, we might want to say are not actually propaganda? So there's a range of questions in that space.

Another kind of question, which we've touched on, is: what is the nature and extent of the moral right of freedom of communication? Where does the right to freedom stop, and where do restrictions on what one can say start?
Yet another kind of question, an ethical one, is: what are the kinds of principles we need to use if we're going to regulate content in cyberspace? Are we going to help ourselves to principles of necessity and proportionality? Facebook, for example, when it decides that certain material should be taken down, helps itself to community standards; and of course censors, those who censor communications, have always helped themselves to community standards. But the question then arises: what are the community standards, and which community ought to be the one we reference when we say that something does or doesn't meet community standards?

Another set of questions concerns who ought to decide these matters. It's one thing to say we've settled on a definition of fake news and decided we want to make it illegal; but then there's the question of who gets to decide what is or isn't fake news. At the moment, for example, some of the big tech companies themselves, Facebook among them, are making a lot of those decisions. Is this the way we should be going? Are they the right institutional actors to be making those sorts of decisions?

And then finally, there's a whole range of questions to do with how we are going to respond to all this, how we are going to combat it. Presumably, we want to reduce hate speech.
We want to reduce fake news and political propaganda, but what are we going to do to reduce them? And the critical question there is: what policies do we want to put in place, and what regulations have we got to introduce, consistent with liberal democratic principles? Because it may be that we can reduce a lot of these things but, in so doing, undermine principles we take to be constitutive of liberal democracy.

I might skim through Part B reasonably quickly, since it's mostly definitions. So, fake news, of course, is news; it's not simply anything that is false. It's something in the public sphere, as I indicated earlier, that suggests to you that it's news. It is false, and it's news that is not believed by the originator. It might not be actively disbelieved by the originator, but as long as the originator doesn't believe it, then, I suggest, that is a condition for being fake news. But the key condition I would draw your attention to is that fake news purports to be true; that is, it presents itself as being true. And that's quite an important defining condition, I suggest, of fake news. Fake news is, of course, problematic: all of it is false, but it's likely to be believed by many people, and that's going to be a problem.
And it has a lot of potentially bad political consequences, some of which I mentioned earlier.

What about political hate speech? Well, political hate speech has actually proved quite a difficult concept to define, and there are a lot of differences of opinion in different jurisdictions about what it is and whether or not it should be banned, and so on. I would suggest to you that it's speech that is intended to incite hatred against some group and that has a reasonable chance of inciting hatred, even if it doesn't actually do so. And if it's political hate speech, it will have an ulterior political purpose. But a critical component of hate speech, as I understand it, is that the truth of what's said, even if it does have some truthful content, is not what interests the speaker of hate speech. The interest lies in inciting hatred; that's the motive.

Okay. And again, it's obviously problematic: it's likely to be false, and it's harmful to various groups, as we saw in the case of the Rohingya in Myanmar. It also has some other characteristics. For example, it's likely to generate conflict, and it undermines an important liberal democratic value, namely fraternity, which people often don't think about these days: they think of freedom, they think of equality.
But fraternity, the sense in which a political community has to work together, its members respecting one another in moving forward to pursue policies for the common good, is obviously undermined by hate speech. And of course, fake news and hate speech in cyberspace have been particularly harmful, given the communicative reach of the Internet and the power of these social media platforms.

Okay. What about political propaganda? Well, political propaganda is, I suggest, public communication in the service of a political ideology. What is political ideology? I think here it's important to hold on to a distinction between political ideology and systems of political ideas. The two things are often collapsed, and people start talking about a set of political ideas as an ideology. I think it's important to resist that.

Like a system of political ideas, an ideology is a structured set of beliefs and assumptions, and, like political ideas, it serves political purposes. But I think the critical difference is that ideology only exists because it serves a political purpose, and not because the world (the social, economic, legal, whatever reality) is as the ideology says it is. Ideology is not motivated by truth; it's motivated purely by political purpose.
And of course, in the current environment, in the cyber world, political ideology has been able to rely on the power of the Internet and social media, in harness with a lot of psychologically based, manipulative marketing techniques, to great effect, as is evident, for example, in the Cambridge Analytica scandal mentioned earlier.

Okay, so Part C; I'll go through this reasonably quickly as well, in order to get to Part D.

One of the points about the right to public communication in a mass society is that it's necessarily indirect, because there are so many people in such societies. It's something of an illusion to think that everyone can communicate with everyone else; there are just too many people. So what's historically happened is that people's communication with one another has been mediated by various institutions, including, notably, the press, print and TV and so on.

Now, with the advent of social media and the Internet, there is this capacity to engage in direct public communication with everyone, as, for example, Trump does, but as a lot of other ordinary people do too. And this has created, I would suggest, the illusion that we can now return to the image I mentioned earlier of the Athenian forum, where we're all sitting around and can all talk to one another, because in principle any of us can reach anyone else directly. Well, I think this is an illusion, because there are just far too many people.
In actual fact, take Trump, for example: he's apparently got 3 million or so members in his audience. He can regularly, and does regularly, communicate with all 3 million. And some members of that 3 million audience can communicate, and occasionally do communicate, to the whole of the 3 million. But not every one of those 3 million can possibly communicate to all the other 3 million. So it's something of an illusion to think that, through social media and the Internet, we've overcome that fundamental problem confronting mass societies, although these technologies have certainly extended the capacity to communicate, including the capacity to communicate directly, for some minority. Here, I think, we need to distinguish between the right to communicate and the right to be a follower, a receiver, of communications. Everyone can receive the communications of Trump, but few can actually communicate to all.

Okay. So what is this right to public communication? What has it actually been reduced to in the contemporary setting? Well, we think of the right to public communication as an individual right: a right that I have to communicate to the public at large, a right that you have to communicate to the public at large, and so a right everyone has. I suggest that that right (and I'm talking now about public political communication) has in effect been reduced to the following.
246 00:30:31,480 --> 00:30:38,080 It's not so much a right as an opportunity to compete with others for a public audience. 247 00:30:38,380 --> 00:30:40,900 So you don't have a right to communicate with everyone. 248 00:30:40,900 --> 00:30:45,040 You do have an opportunity to compete with a whole lot of other people for a large audience. 249 00:30:47,050 --> 00:30:54,040 But unfortunately, when you compete with others to communicate with a large audience, 250 00:30:54,940 --> 00:30:59,319 you do so under conditions of unfair competition, because you don't have the resources, 251 00:30:59,320 --> 00:31:04,270 the wherewithal, that a lot of organisations, or people backed by organisations, 252 00:31:04,270 --> 00:31:09,190 have to collect all of the data, to use machine learning techniques and so on, 253 00:31:09,190 --> 00:31:13,690 and so to refine your message so that it's targeted to certain people. 254 00:31:13,690 --> 00:31:19,150 And so, unfortunately for you and for me, we're competing under conditions of unfair competition. 255 00:31:20,260 --> 00:31:32,710 A third point, which I've already touched on, is that unfortunately the norms governing this communication are frequently flouted, 256 00:31:33,380 --> 00:31:43,150 they are frequently not complied with, and the space of public communication is much less regulated, both in terms of laws and regulations 257 00:31:43,150 --> 00:32:00,250 but also in terms of ordinary social attitudes, than was the case before. And so in exercising, or trying to exercise, your 258 00:32:00,610 --> 00:32:08,710 opportunity to compete with others for an audience, you do so in conditions of unfair competition in which the norms governing communication are flouted. 259 00:32:10,510 --> 00:32:21,700 This whole process is facilitated by the social media platforms which you use to make your communications.
260 00:32:22,720 --> 00:32:27,760 And those platforms, far from being, as it were, 261 00:32:27,910 --> 00:32:32,080 entirely neutral platforms that are only interested in making sure that everyone 262 00:32:33,430 --> 00:32:36,850 has a fair shot at communicating whatever it is that they want to communicate, 263 00:32:37,660 --> 00:32:49,390 are actually interested in ensuring that they make the most money, and so drive audiences to the material that is going to make them more money. 264 00:32:49,810 --> 00:32:56,050 So the idea isn't that they're going to adjudicate matters, 265 00:32:56,890 --> 00:33:01,750 adjudicate things, in a manner in which the most important political information, or the most 266 00:33:01,750 --> 00:33:06,460 truthful or the most accurate political information, is gained by the greatest number of people. 267 00:33:06,760 --> 00:33:19,150 Rather, what they're interested in is ensuring that the kind of material that's pumped around maximises the cash flow. 268 00:33:20,200 --> 00:33:31,930 So that isn't very helpful either, in terms of quality and of compliance with moral and epistemic norms. 269 00:33:33,910 --> 00:33:42,969 Okay. So if that's the general situation, the specific problem, let me remind you again, is, on the one hand, 270 00:33:42,970 --> 00:33:51,190 we've got political propaganda, fake news, hate speech and so on, which is exploding exponentially. 271 00:33:51,190 --> 00:33:53,080 So there's a huge amount of it out there, 272 00:33:53,620 --> 00:34:03,850 which is creating a lot of problems. On the other hand, we have, as I say, this important right to freely communicate. 273 00:34:05,110 --> 00:34:12,129 So what I want to do now is take a little bit of a step back from this right to freely communicate.
274 00:34:12,130 --> 00:34:18,430 So the first thing: when people think about their right to communicate, they think of an individual right to communicate 275 00:34:18,790 --> 00:34:21,459 that I have to you and you have to me. 276 00:34:21,460 --> 00:34:27,850 And I've suggested that that's not quite right, that what you've actually got, if you're talking about public communication, 277 00:34:28,540 --> 00:34:34,600 is not quite a right, because you can't actually communicate with everyone and everyone can't actually communicate with you. 278 00:34:34,900 --> 00:34:41,200 It's more like you've got an opportunity 279 00:34:41,200 --> 00:34:44,410 to compete with others to get your communication to a larger audience. 280 00:34:45,070 --> 00:34:51,639 So what is the fundamental right that we're interested 281 00:34:51,640 --> 00:34:59,710 in protecting here? What's the important right that we want to make sure is, if not 282 00:35:01,730 --> 00:35:17,990 entirely respected, at least respected a considerable amount of the time, and is a right that feasibly could be respected? 283 00:35:19,430 --> 00:35:25,370 Well, the right that I think we need in this space is something like this. 284 00:35:26,900 --> 00:35:30,950 I'll refer to it as a sort of freedom to seek the truth by reasoning with other people. 285 00:35:31,400 --> 00:35:34,400 So it's a kind of complex, but I think fundamental, right. 286 00:35:35,720 --> 00:35:47,120 It embraces a freedom of thought component, because as an individual you need to be able to think freely, and also a capacity to reason as an individual. 287 00:35:47,510 --> 00:35:51,499 But it also involves the freedom of communication: 288 00:35:51,500 --> 00:35:58,370 you say things to people, they say things back to you and so on. But in addition, a kind of capacity to discuss things.
289 00:35:58,400 --> 00:36:03,710 So it's an interpersonal, if you like, or social kind of phenomenon. 290 00:36:04,250 --> 00:36:11,540 So it's not simply you as an individual saying things. It's you as an individual thinking things and reasoning about what you're thinking about. 291 00:36:11,870 --> 00:36:18,329 But it's also individuals gathering together and communicating and discussing things. 292 00:36:18,330 --> 00:36:24,070 So there's a kind of deliberative dimension to it, a kind of collective deliberative dimension, that's important. 293 00:36:26,270 --> 00:36:33,709 And moreover, if we're talking about political communication in the current environment, most particularly in a mass society, 294 00:36:33,710 --> 00:36:41,180 albeit one with the Internet and with social media and these other enablers of public communication, 295 00:36:42,050 --> 00:36:45,170 it is to some extent exercised both directly and 296 00:36:45,170 --> 00:36:56,840 indirectly, as has been the case for hundreds of years, via elected leaders. 297 00:36:58,280 --> 00:37:02,849 So I'm going to put on the table this notion of freely seeking the truth by reasoning with others. 298 00:37:02,850 --> 00:37:06,110 And I'm going to suggest that it's not so much an individual right as a joint right. 299 00:37:06,110 --> 00:37:11,000 It's a right that I have and you have, but one that we have interdependently with one another. 300 00:37:11,780 --> 00:37:18,050 And it's a right that involves reasoning: not only formal, inductive and deductive reasoning, but informal reasoning. 301 00:37:19,340 --> 00:37:27,110 And of course, it's a right that is governed by epistemic and moral norms: 302 00:37:27,110 --> 00:37:30,380 tell the truth, respect evidence, and so on and so forth.
303 00:37:31,760 --> 00:37:40,610 So if you think about the sort of thing I have in mind, and I'm not suggesting that this has been an ideal process in this case, 304 00:37:40,910 --> 00:37:49,489 the sort of thing I have in mind is collective decision-making on the basis of reasoning with others. 305 00:37:49,490 --> 00:37:57,260 A lot of people got together and reasoned with their friends, but they also listened to what elected leaders and various other people had to say. 306 00:37:58,850 --> 00:38:04,370 There was, as it were, a collective political conversation, which is ongoing, in relation to Brexit. 307 00:38:05,270 --> 00:38:09,200 So that seems to me to be an example of the sort of phenomenon I have in question. 308 00:38:09,530 --> 00:38:12,620 It's a lot of people thinking about an important matter. 309 00:38:12,620 --> 00:38:16,100 They've got different viewpoints, they're operating directly with one another, 310 00:38:16,640 --> 00:38:22,310 but they're also operating indirectly via, as it were, thought leaders and political leaders and so on and so forth. 311 00:38:22,760 --> 00:38:27,110 So there's a kind of political conversation going on in relation to this issue. 312 00:38:27,110 --> 00:38:35,600 And at the end of that conversation, decisions are made via an election, or potentially via another referendum, or potentially 313 00:38:37,760 --> 00:38:43,160 simply via Parliament, or perhaps via the Prime Minister without the help of Parliament, 314 00:38:44,030 --> 00:38:52,280 depending on how things go. Now, I'm not suggesting at all that that process has been an ideal one. 315 00:38:52,520 --> 00:38:56,060 What I'm suggesting is that that's the kind of process I've got in mind. 316 00:38:56,390 --> 00:39:03,050 And it's a useful process for thinking about what has gone right and what has gone wrong. 317 00:39:03,740 --> 00:39:07,310 You know, to what extent have people been telling the truth?
To what extent has there been respect for evidence? 318 00:39:07,700 --> 00:39:14,180 To what extent has there been manipulative outside influence, and so on and so forth? 319 00:39:15,500 --> 00:39:20,930 So it's the kind of process that, even if in that instance 320 00:39:20,930 --> 00:39:30,709 it's been extremely imperfect, comes under the heading of freely seeking the truth by reasoning with other people. 321 00:39:30,710 --> 00:39:32,030 The truth in this case, of course, 322 00:39:32,030 --> 00:39:43,250 isn't so much a particular factual truth as a policy or course of action that is believed to be the best course of action for the country. 323 00:39:45,770 --> 00:39:53,809 Okay. So often when you make these sorts of points, that we've got to think through the importance of reasoning, 324 00:39:53,810 --> 00:39:58,970 get everyone to respect the truth, look at the evidence, 325 00:39:59,330 --> 00:40:07,340 you get a whole lot of people, the cynical and the people who really know how it all works, 326 00:40:07,340 --> 00:40:13,430 the marketing experts and so on, saying, oh, well, you know, it's really all propaganda. 327 00:40:13,430 --> 00:40:16,790 There isn't really a distinction between propaganda on the one hand and 328 00:40:16,790 --> 00:40:21,050 knowledge acquisition and dissemination and serious rational deliberation on the other. 329 00:40:21,440 --> 00:40:25,969 It's just all a kind of grand marketing exercise, 330 00:40:25,970 --> 00:40:29,480 and it's just that some people are more powerful than others, and so on and so forth. 331 00:40:30,080 --> 00:40:32,959 I think it's very important to maintain this distinction.
332 00:40:32,960 --> 00:40:39,350 And secondly, I think it's very important to realise that, far from there not being a distinction, 333 00:40:40,040 --> 00:40:51,199 the propagandists and the spreaders of fake news and so on are actually parasites on the more fundamental 334 00:40:51,200 --> 00:41:00,020 process, which is bona fide communication: trying to respect the truth, trying to look at the evidence, and so on and so forth. 335 00:41:00,470 --> 00:41:09,800 And they're parasites in the literal sense that a parasite wants to feed off the host 336 00:41:10,490 --> 00:41:16,970 but doesn't want the host to die, because once the host dies, it can't extract anything from the host. 337 00:41:17,390 --> 00:41:21,080 And that's what's actually going on with propaganda and hate speech and the other stuff: 338 00:41:21,410 --> 00:41:28,190 if everyone was lying and everyone was simply manipulating everyone else, and everyone realised this, 339 00:41:28,580 --> 00:41:32,840 then the whole communicative enterprise would simply collapse. 340 00:41:33,020 --> 00:41:38,280 No one would bother communicating with anyone, and no one would be able to manipulate anyone, because there would be no one left to manipulate. 341 00:41:38,280 --> 00:41:43,790 And so the manipulators and the propagandists and the spreaders of fake 342 00:41:43,790 --> 00:41:49,940 news are actually parasites in the sense that they rely 343 00:41:50,330 --> 00:41:56,420 on others to actually comply with the epistemic and moral norms. 344 00:41:58,730 --> 00:42:04,900 Okay. So finally, Part D, where we come to the solution, in 5 minutes. 345 00:42:05,510 --> 00:42:14,960 This should be rather quick. Well, it's not really the solution, but some suggestions.
346 00:42:14,990 --> 00:42:21,649 I've done so much work setting up the difficulty of this problem, while realising that I don't have the answer, 347 00:42:21,650 --> 00:42:24,950 that I have not given myself an easy task. Okay. 348 00:42:24,950 --> 00:42:30,500 So let's look at some of the possible things that can be done in response to this. 349 00:42:30,800 --> 00:42:34,730 Well, one of the things that I think can be done, which might sound rather silly and obvious, 350 00:42:34,730 --> 00:42:40,610 is to address some of the legitimate grievances that are being exploited by the propagandists. 351 00:42:40,970 --> 00:42:47,690 If there actually is a problem, if a lot of people are extremely poor or deprived or downtrodden, it might be an idea to do something about it, 352 00:42:47,690 --> 00:42:52,400 because then you would rob the propagandists of some of their firepower. 353 00:42:53,220 --> 00:42:58,820 Another thing that has to happen, of course, is that we have to enact justified laws and regulations, 354 00:42:58,850 --> 00:43:04,489 but we have to think hard about which things ought to be prohibited and which ones ought not to be prohibited. 355 00:43:04,490 --> 00:43:07,520 And that's a complex process. 356 00:43:09,230 --> 00:43:14,090 Another kind of area, once we've got through that (this is obviously schematic; 357 00:43:14,090 --> 00:43:16,430 I'm not going to do this for you in the time that I have), 358 00:43:17,570 --> 00:43:23,480 is the issues that are problematic in relation to enforcing laws and regulations. 359 00:43:23,930 --> 00:43:31,550 So obviously you're looking at removal of illegal propaganda, but there are a lot of sophisticated new methods that can undermine that, 360 00:43:32,510 --> 00:43:36,170 for example, the methods of the Russians and those in the Cambridge Analytica case. 361 00:43:37,280 --> 00:43:41,749 Some of these methods are not easy to counter, but a couple of things could be done.
362 00:43:41,750 --> 00:43:46,219 One is obviously to provide more adequate protection for personal information, in 363 00:43:46,220 --> 00:43:51,050 the case of the Cambridge Analytica affair on the part of Facebook and so on. 364 00:43:52,940 --> 00:44:04,860 But another thing that you can look at is offensive cyber attacks on those nation states that want to be propagandists. 365 00:44:06,290 --> 00:44:14,869 This is a very real possibility and one that may need to be countenanced: rather than just taking defensive measures, 366 00:44:14,870 --> 00:44:17,720 you might have to look at offensive measures of various kinds. 367 00:44:20,060 --> 00:44:25,910 One suggestion that a lot of people have made is a kind of counter-propaganda. One thing, of course, 368 00:44:25,910 --> 00:44:31,760 is to have counter-narratives espousing liberal democratic values; I won't go into that here, but it's something that can be done. 369 00:44:32,030 --> 00:44:38,180 Counter-propaganda proper, of course, means going in for the business of spreading disinformation, 370 00:44:39,440 --> 00:44:47,780 telling a few lies and so on, in the manner that, for example, the UK did in the Second World War. 371 00:44:48,170 --> 00:44:52,489 This seems to me to be a really bad idea, even though it's something the marketing people can tell us how to do: 372 00:44:52,490 --> 00:45:00,250 we can use machine learning, we can battle this propaganda with counter-propaganda. I think 373 00:45:00,350 --> 00:45:05,600 it is very problematic, particularly for liberal democracies committed to evidence-based 374 00:45:05,600 --> 00:45:10,910 rational inquiry and open discussion, to go in for propaganda in this way. 375 00:45:11,420 --> 00:45:18,110 Which is not to say that the intelligence agencies don't need to spread a bit of disinformation, as it were, secretly.
376 00:45:18,440 --> 00:45:22,280 But if you're talking about the public space, I think this is highly problematic. 377 00:45:22,580 --> 00:45:31,610 Counter-propaganda may actually just be ultimately counterproductive and devalue the liberal democratic currency. 378 00:45:32,000 --> 00:45:33,620 So I would recommend against that. 379 00:45:34,760 --> 00:45:48,559 Some of the things that you can do are to work to restore or strengthen epistemic norms in the citizenry via various institutions, 380 00:45:48,560 --> 00:45:53,360 schools and universities, and to try to maintain or strengthen the intellectual and 381 00:45:53,360 --> 00:45:57,140 moral health of epistemic institutions such as journalism and Parliament. 382 00:45:57,350 --> 00:46:02,839 I mean, one of the things that seems pretty obvious is that if you want a strong, robust, 383 00:46:02,840 --> 00:46:12,230 resilient epistemic institution, say the media (and a lot of the damage here is due to my fellow countryman, 384 00:46:12,680 --> 00:46:22,310 Murdoch), then if you're in the business of undermining these norms within the journalistic and media companies themselves, 385 00:46:22,670 --> 00:46:26,150 obviously you're doing the work of your adversary for them. 386 00:46:28,370 --> 00:46:29,120 And similarly, 387 00:46:29,120 --> 00:46:37,010 if you've got a parliament that is so obsessed with promoting the self-interest of individuals, or so obsessed with the interests 388 00:46:37,580 --> 00:46:47,420 of the political parties, that it can't actually get ahead with its deliberative work on behalf of the nation, 389 00:46:47,750 --> 00:46:51,230 which seems to be the case with the Republican Party in the US 390 00:46:51,800 --> 00:46:59,720 and now, from what I can see from the outside, seems to be a feature of the Conservative Party at the moment.
391 00:47:00,140 --> 00:47:07,250 I mean, obviously this is very unhelpful, because far from your epistemic institutions being a counterweight to these problems, 392 00:47:07,250 --> 00:47:16,879 they're becoming part of the problem. Another dimension to this is not only to strengthen existing epistemic institutions 393 00:47:16,880 --> 00:47:27,740 but to try to embed those epistemic institutions, as it were, inside cyberspace. 394 00:47:28,460 --> 00:47:36,710 I mean, think of, say, a quality newspaper that supports investigative journalism 395 00:47:37,070 --> 00:47:44,420 working and operating in the cyber realm, as its material appears in, 396 00:47:44,750 --> 00:47:46,280 for example, Facebook and so on. 397 00:47:48,230 --> 00:47:57,980 But it can also do things to hold accountable those who are responsible for fake news, hate speech and so on. 398 00:47:58,280 --> 00:48:06,890 So embedding these epistemic institutions in the cyber realm is another thing that can be done. 399 00:48:07,160 --> 00:48:11,060 Let me finally, then, turn to social media platforms themselves. 400 00:48:11,960 --> 00:48:20,990 This brings me to the end of the talk. The Internet and social media platforms 401 00:48:21,410 --> 00:48:25,310 have been claimed to render traditional epistemic institutions redundant. 402 00:48:26,180 --> 00:48:28,970 I think this has now been shown to be completely false. 403 00:48:29,000 --> 00:48:39,610 In fact, the reverse is the case: we actually need to strengthen and extend the traditional epistemic institutions of universities, 404 00:48:40,940 --> 00:48:49,070 the free and independent press and so on. So it's proved to be a complete illusion that they've been rendered redundant.
405 00:48:51,280 --> 00:48:55,400 But secondly, it is quite clear that the tech companies themselves have failed to 406 00:48:55,640 --> 00:49:00,440 adequately self-regulate and ensure compliance with epistemic and moral norms. 407 00:49:00,860 --> 00:49:08,480 There are various reasons for that, partly the overriding commercial interests. 408 00:49:08,780 --> 00:49:14,629 And so regulation, 409 00:49:14,630 --> 00:49:22,840 strong regulation, is required. 410 00:49:24,040 --> 00:49:30,580 Let me just come to the end of this: there are a couple of different general options here. 411 00:49:32,710 --> 00:49:37,570 One is: if the tech companies are to remain market-based companies, 412 00:49:37,570 --> 00:49:41,530 then obviously they have to be made to comply with principles of free and fair competition. 413 00:49:41,950 --> 00:49:48,190 And that would require downsizing them, for example, reversing past acquisitions: 414 00:49:48,520 --> 00:49:57,460 in the case of Facebook, its acquisitions of Instagram and WhatsApp. On the other hand, if we come to the view, which I think we probably do need to come to, 415 00:49:58,180 --> 00:50:04,239 the view that the tech companies themselves in a sense espouse, that they're simply infrastructure providers, 416 00:50:04,240 --> 00:50:09,660 that they're not actually publishers, they're just platforms for others to use, 417 00:50:09,940 --> 00:50:15,459 then we need to ensure that that's exactly what they're doing: that they're not actually facilitating particular publications, 418 00:50:15,460 --> 00:50:17,260 they're simply providing infrastructure.
419 00:50:17,650 --> 00:50:23,620 And the next thing that seems to follow is that perhaps they ought to be transformed into public utilities, 420 00:50:24,820 --> 00:50:34,030 because public utilities would perform that role much more satisfactorily, arguably, than the market. 421 00:50:37,240 --> 00:50:40,690 Okay. You'll be pleased to know that this is the last slide. 422 00:50:44,680 --> 00:50:51,219 Certainly one of the things I don't think we can leave alone is the current 423 00:50:51,220 --> 00:50:57,430 situation where the tech giants are not legally liable for material, 424 00:50:59,260 --> 00:51:07,660 for hate speech and so on, that may appear on their platforms, but simultaneously, 425 00:51:07,870 --> 00:51:13,870 somehow, they've taken responsibility, or there's now co-responsibility, for removing or regulating that material. 426 00:51:14,180 --> 00:51:21,339 I don't think they can have it both ways: either they're responsible for regulating, in which case they're going to be held legally 427 00:51:21,340 --> 00:51:27,340 liable when they fail to regulate, or they're out of the business of regulating 428 00:51:27,340 --> 00:51:32,470 and we don't have them making those decisions, but rather, say, an independent entity. 429 00:51:34,660 --> 00:51:38,230 So, and this is the final point, 430 00:51:38,620 --> 00:51:44,650 I've been talking largely in these last few minutes about regulation and 431 00:51:45,070 --> 00:51:54,280 the law in relation to hate speech, fake news and propaganda. 432 00:51:54,640 --> 00:52:01,060 And of course, the law and regulation can do this work, and are a necessary condition for doing it. 433 00:52:01,270 --> 00:52:06,010 But I think the final point would be that they are actually not going to be sufficient, as I pointed out earlier.
434 00:52:06,370 --> 00:52:20,890 And that means there has to be a strong counterweight from outside, from, as it were, 435 00:52:21,430 --> 00:52:27,190 social forces, if you like, other than the law and regulation and the enforcement of law and regulation. 436 00:52:27,640 --> 00:52:37,450 And that means, to some extent, I think, the attitudes of ordinary people, the citizenry taking responsibility. 437 00:52:37,450 --> 00:52:48,100 But it also means enhanced professionalisation, or the return of professionalisation, in a lot of the relevant occupations in this space. 438 00:52:48,640 --> 00:52:53,980 Professionalisation being something that is to some extent subject to regulation, 439 00:52:53,980 --> 00:53:04,930 but is very much a matter of an occupational group or an organisation taking a stand and developing 440 00:53:04,930 --> 00:53:14,770 a kind of culture that is oppositional to non-compliance with the epistemic and moral norms. 441 00:53:15,160 --> 00:53:16,510 Thank you. Thank you, Ruth.