Welcome, everybody. This is a very exciting occasion. It's the first ever event of Oxford's new Institute for Ethics in AI. My name is John Tasioulas and I'm the director of the institute. And it's with great pleasure that I'm chairing our event today, which is the launch of Carissa Véliz's important new book, Privacy Is Power: Why and How You Should Take Back Control of Your Data. Carissa is an associate professor of philosophy at Oxford University and a tutorial fellow at Hertford College. She's also a colleague of mine in the new Institute for Ethics in AI. Her book is engaged philosophy at its best, engaging with a momentous contemporary issue: the threat to privacy in a world of surveillance capitalism. It is written in clear, attractive prose that is fired by a real moral passion, reminiscent in many ways of Peter Singer's writings on animal liberation, although here it's humans who are being liberated.

Just in a brief way, I want to highlight three ways in which the book is especially valuable. First, it's a real storehouse of information, some of it quite hair-raising, about the threat to our privacy created by the websites, apps and gadgets with which we constantly interact, and the corporations and governments that use the personal data they collect. Second, it offers a very compelling account of what is at stake, what we stand to lose in a world in which there are systematic intrusions into our privacy. In particular, Carissa's analysis is really fruitful in highlighting two dimensions of the value of privacy. The first is an individualistic dimension whereby privacy bears on some of our deepest interests, for example, in intimate personal relations with others, or being able to think freely for ourselves. And secondly, a public or collective dimension to privacy whereby it's a common good, one that has an especially prominent role to play, according to Carissa, in sustaining the possibility of an authentic, deliberative, participatory, democratic culture. And thirdly, the book goes beyond description and diagnosis and offers concrete prescriptions for how to improve privacy protection before it's too late: prescriptions regarding institutional and legal change, for example, banning personalised ads and the trade in personal data, but also prescriptions regarding our own personal behaviour with respect to our privacy and the privacy of others.

We are very fortunate in having two commentators, two distinguished commentators, to discuss Carissa's book. The first is Sir Michael Tugendhat, who is one of the leading British experts on the law of privacy.
He's also a philosophically minded lawyer whose books include Liberty Intact: Human Rights in English Law, published by OUP in 2016. And I understand that he is about to publish another book, in French, on unjustly neglected intellectuals, a category in which most academics probably feel they belong, so it should get a wide readership. Our second commentator is Stephanie Hare, a leading writer on technology policy, as well as a journalist and broadcaster. And she is writing a book which should come out, I think, next year, entitled Technology Ethics. So thank you to both of them for coming today to discuss this important new book. There will also be a Q&A session afterwards, so please, if you do have a question, put it into the YouTube comment section. But first, I'm going to ask Carissa to give us her view of the main point and purpose of this great book that she's written. Carissa.

Thank you so much, John. Thank you for your words. And thanks, everyone, for being here, particularly John, Michael and Steph. It's wonderful to have such good company to talk about these issues that worry me so much. So maybe I would like to tell you a bit about how I came to this topic. Six years ago, I was deep in the archives of the Spanish Civil War researching my family history. And I dug out many things that my grandparents had never told me or my mom. And we started wondering whether we had a right to know these things. And I looked to philosophy for some answers and I felt unsatisfied. And it was in that same summer that Edward Snowden came out with his revelations. And I thought that was so important, and I wasn't sure that philosophy or the world had the tools to think about the significance of these revelations and what they meant. Eventually, that led me to write my dissertation on the ethics and politics of privacy, and to read as much as I could about the topic. And with this book, Privacy Is Power, I wrote the book that I wish I had been able to read at the time. It's a book that's very concise. It's short and it really goes to the point. I think we are at a crossroads. It is urgent to fix our privacy landscape. And so the first chapter goes through a person's day just to illustrate how much data we're losing at this point. Everybody knows more or less that we're losing data. But do we know to what extent and exactly how it's being collected and how it's being used? In many ways, it's very surprising. Even privacy experts get surprised by some of the details, because most of the practises that go on are largely unknown to the public at large.
And even to privacy experts, because they're very experimental and companies typically don't advertise exactly what they're doing. So I'll share a couple of examples just to give you a taste of the kind of thing that you might find. If you have a smart car, when you drive, your seat is actually recording your weight. That information could be sold to insurance companies, to prospective employers, who knows. Another example is when you're going to a store and there's music in the background, there might be audio beacons being transmitted that your phone picks up even though we can't hear them. And that's to identify you as one and the same customer. So that's if the company wants to know that, you know, you saw an ad in the morning on your laptop, then you went to the store in your neighbourhood and you bought that product that you saw an ad for. And the way they triangulate that data is through these audio beacons that can be broadcast in music and TV, and that we don't know about.

The book explains a bit how this came about. It's just such a weird system that we have, the data economy, this enormous industry that is earning so much money off of our data. How did that happen? Who had that idea and how did it come about? And it explains a bit how Google was a main protagonist in this story, and how they really developed the whole concept of personalised ads, and how the data economy took off at a moment in which the United States government was considering regulating data along some of the lines that the GDPR has implemented many, many years later. And then came 9/11, and suddenly privacy wasn't that important anymore. Security really took the stage and our privacy legislation got shelved.

So, OK, now we have this situation: what does it mean, why is it important? Many people have the idea that privacy is just a personal preference, that if you're not very shy and you're not a criminal, you have nothing to fear, nothing to hide, and so it's up to you whether you want to be careful or not. And the third chapter is really about how that conception is very misleading. In fact, privacy should be a political concern. It's a collective matter. It's really not a personal preference or an individual matter, or much less so. And I take this idea from the philosopher Bertrand Russell that we should think about power as energy. One of the characteristics of energy is that it can transform from one thing into another. And in the same way, power can transform from one type of power into another.
So there are three kinds of power that we're very familiar with in history, and that's economic power, political power and military power. If you have enough money and the system doesn't work very well, you can buy votes, you can buy politicians, or you can have an influence in politics in a way that shouldn't be allowed. If you have enough political power, you can get yourself military power, and so on. And I argue that there's this fourth kind of power that kind of crept up on us, and that's a power related to having too much personal data. And it's a power that is related to having the ability to influence behaviour and predict the future. The reason why it crept up on us is because we thought that we would identify monopolies whenever there was a company that could increase their prices without losing customers. And here we have companies that are free but are really charging us our data, or rather collecting our data from us in return, and our attention. And they can come up with very abusive policies and change them at will without even notifying us. And we are losing out as consumers, but we didn't identify them as monopolies because it wasn't a matter of price in the traditional sense.

So really, we should think about personal data as power. And if we give too much personal data to corporations, then the wealthy will rule. If we give too much personal data to governments, we risk sliding into some form of authoritarianism. And I argue that for democracy to be strong, we need to have control over our personal data as a collective, as a citizenry. And one of the things I wrote in the book is how creepy data science can be, really. I mean, you have these data scientists who are creating kind of avatars of the population. They have like a reality game of the population, and they try things on these virtual zombies. And whenever it works, then they try it on the real population. And you might think, well, you know, I'm an intelligent person, I read, surely I won't be affected. But when you are influencing millions and millions and millions of people, really, it doesn't take a lot of influence to be able to sway an election. We have to think that some elections are won and lost on the basis of a few hundreds or thousands of votes, and to make that difference isn't impossible. In fact, Facebook published a study in Nature in 2010, if I remember correctly, which showed that just by showing ads to people to remind them to vote, they could increase voter turnout by about four percent. And that could sway the election in itself.
And the scary thing is that we don't know who Facebook shows these ads to. There is no kind of auditing. And in fact, they have used this kind of 'I voted' button in many elections. We know that not everybody saw it, and we don't know what criteria Facebook used. And with the coming elections now in the US, this is something outrageous that should really concern us, because the rule of law has to be more robust than just having trust that Facebook won't abuse their power.

So the gist is, we've given too much data to corporations and they have too much power. And what's special about this book is that it's the first one to really call for an end to the data economy. I argue that personal data shouldn't be the kind of thing that can be bought and sold. That we allow data brokers, these companies that have files on every internet user which they then sell to other companies and governments, to profit from very sensitive information, like whether somebody has been the victim of a rape, or whether they have HIV, or whether they suffer from all kinds of diseases, is just outrageous. That should never be. Even in the most capitalist of societies, we agree that there are certain things that shouldn't be part of the market: typically votes, people, the results of sports matches. And I think personal data should be in that category. So one of the innovative parts of this book is that it suggests that personal data should be treated as a toxic asset, and that we should think about it as something like asbestos. It is poisoning individuals by making us vulnerable to harms like extortion, like data theft, like identity theft, like discrimination. And it's poisoning societies by undermining equality, because we're not being treated as equals, we're being treated on the basis of our data, and by undermining democracy.

So at the end, I have two chapters that are very practical. I'm a philosopher, but I also like to have one foot in the real world and to come up with solutions and proposals. And one chapter is for policymakers. Amongst the many measures I recommend, the first is to ban all trade in personal data. The second one is to ban all personalised content. We should be able to see the same thing as everybody else sees, because otherwise our public sphere will be fragmented and we won't be offered the same opportunities. And if we have access to information that is contradictory, it's just going to put us against each other. And the third one is to implement fiduciary duties.
So fiduciary duties are duties that are implemented when there is a professional relationship in which there is an asymmetry. So doctors and patients, lawyers and clients, financial advisers and clients are some examples in which there is an expert who can help you, but who knows a lot more than you and who may have a conflict of interest. So you can imagine that your doctor might be very interested in performing a surgery on you to practise their skills, or your financial adviser might have an interest in selling you a particular kind of stock. And fiduciary duties are there so that in those kinds of cases, the interests of the client or the patient have to come first. So I argue that anyone who wants to collect our data has to accept fiduciary duties, such that our data will always be used to help us and never against us.

And then finally, the last chapter is for ordinary citizens, about what you can do to better protect your privacy and that of your family and your friends. Because, as I argue, privacy is collective, so if you expose yourself, you're really exposing others around you and others like you. And even though you'll never get it perfect, you don't need to; really, it's a matter of making an effort and also expressing dissent. One of the arguments that companies make when they have these outrageous practises is that people are consenting, that they don't care about their privacy, that we're happy, and to resist that is incredibly important. First of all, it can actually protect you. But secondly, it makes a statement and it leaves a data trail, such that when regulators see that, they can say, hey, look, your customers weren't accepting willingly; they were trying to say, I don't want my data collected, and you violated that. So I have lots of advice, really practical advice, as to what kind of apps to use and what kind of practises to follow. And I hope that you will read the book, that you'll enjoy it and that you will tell me what you think.

Thank you so much, Carissa, that was really excellent and gave a really vivid sense of what the book is about. So I'm now going to turn to the first of our commentators, and that is Michael. Go ahead, Michael.

Hello. It's a privilege and a pleasure to be here at what I'm told is the first event of the Institute for Ethics in AI in Oxford. This book is indeed an important one. I have enjoyed reading it, and it's easy to read. It's in the tradition of one of the most influential books written in the 20th century: in 1962, Rachel Carson wrote Silent Spring.
Carson exposed the dangerous side effects and unintended consequences of agricultural chemicals. DDT helped to prevent the spread of malaria, which was a huge benefit to humanity, but it also had devastating effects on the environment. There have been similar exposures of other products and services that have brought great benefits to the human race, but also great dangers. For example, fossil fuels.

That privacy is power is not a new idea. One of the most important laws to protect privacy that was ever passed was the law that introduced the secret ballot in England. That was in 1872. Before that, elections were decided by bribery and intimidation. Landlords could intimidate tenants with threats of eviction, and employers with threats of dismissal. The secret ballot is privacy in the polling station. Without it there can't be any democracy. It's a very clear example of how privacy gives to the electorate, that's all of us, the power that we wouldn't have if we were voting under the threats and bribes that existed before.

As Carissa Véliz said, the opponents of privacy say that if you have nothing to hide, you have nothing to fear. Some even say privacy is a fraud. In other words, they say you should not do or say anything in private that you would not do or say in public. Carissa exposes the fallacy of these arguments with crushing force. For much of my career, first as a barrister and then as a judge, I have been engaged in cases in which a balance has to be struck between rights that sometimes conflict. For the publisher, for example, of a newspaper, or the user of the internet for blogging, privacy may come, or seem to come, at the expense of freedom of expression. For the state and for the citizen, one person's privacy may come at the expense of other people's security. But for the ordinary user of the internet, as Carissa shows, privacy is necessary for freedom of expression, and it is necessary for security too. There are many topics that we need to discuss with our friends, our families, our employers and our colleagues, our advisers and other professionals. But we can only discuss these topics freely if we can discuss them in private. Indeed, the fact that we are engaging in the discussion at all could be a very private matter, let alone the contents of the discussion. So too, journalists and police officers insist on privacy, although they never use the word. They do not disclose their sources or methods of work. If they did, the sources would dry up and the methods might become unusable.
From my own experience, I know of the dangers to people's lives and security from breaches of their privacy. The invention of the smartphone with a camera has led to a plague of blackmail, intimidation and abuse. Kiss-and-tell stories in the tabloids and revenge porn are examples. In the past, it was common for people's addresses to be listed in the telephone directory and nobody objected. Today, for most people, that would be unthinkable. You can even have your address removed from the electoral register now. And why? Of course, this is to protect people from all kinds of harassment, abuse and worse. These are the dangers that Carissa is seeking to publicise and, with her suggested courses of action, to remedy.

I have to say, I don't think her task will be an easy one. Agricultural chemicals and fossil fuels have also brought huge benefits to mankind. They both help feed the world, and fuel keeps us heated and mobile. That is why they're still in use in spite of the damage that they do. The bartering of personal data, which is what we are discussing tonight, is similarly a two-sided invention. The advantages which the internet has brought us are obvious. That is why we click the consent button with little hesitation. If you want to buy this book online, you will have to consent, or perhaps you've already consented, to the use of your data by the bookshop and by others. And look at the privacy policies on websites: for example, the most honourable not-for-profit bodies, such as Oxford University, have cookie policies which are not consistent with Carissa's recommendations. Universities do not turn off the heating to save a few pounds, and they do not abandon their cookie policies. But something has to be done. The solution to the problem will come, but it will only come if people are informed, as they will be if they read this book, and if they have put the matter at the forefront of their concerns.

Two days ago, as it happens, the Director of Public Prosecutions, Max Hill, Queen's Counsel, made a speech. It was entitled 'The Internet of Things is helping to provide key evidence in criminal trials'. That was also my experience as a judge in criminal trials. I saw many horrific murder cases solved by the use of mobile phone metadata, DNA and other information available only from the internet. In those cases, it was hard to imagine how the crimes could have been solved in any other way. There were no witnesses, no fingerprints. What Carissa has done is to highlight the price we pay for these enormous benefits. It's a very unusual book. It is written by an Oxford philosopher.
But in spite of that, it can easily be read by anyone. This is no small achievement. Some readers might find it surprising. They might even think that it's exaggerated. But do not doubt what you read. I know from my own experience as a judge and as a barrister how much of what it contains is true. It really is. And if you care about your freedom and security, I urge you to read it too. Thank you.

Thank you so much, Sir Michael, and I can't emphasise enough how right you are about what an achievement it is for academics to write accessibly. And this book certainly is a very accessible book. So we're very lucky now to turn to our second commentator, Stephanie Hare. Stephanie.

Thank you very much. And that was a wonderful review by Sir Michael, and I wanted to hold up the book just so we can look at it and celebrate it again. This is such an achievement. I congratulate Carissa for writing something that is so accessible, and I have many, many thoughts to share, so I'm just going to dive right in. I would recommend this book to people who think they already know this topic, of which I was one of those narcissistic people, and I found myself underlining pretty much on every page. So as somebody who's a supposed expert, I learnt a lot and was challenged a lot by this book. But I particularly think it will be of value to people who think they don't need this book. Anyone who thinks that they have nothing to hide, nothing to fear, would really benefit from this book. And also, I think, younger people, because I think some of my assumptions about why I thought I didn't need to read this book are because I lived through so much of what Carissa so ably summarises, which is 9/11 and the changes that were brought about in the United States by the Patriot Act, and that history that she gives the contextualisation of. Really, not just the surveillance capitalism model that Shoshana Zuboff discusses in her book, and that Carissa does a very excellent job of bringing to our attention here, but also the very genuine good intentions that I think a lot of liberal democracies had in the wake of 9/11, and the power and promise of data to fight crime, to fight terrorism. And we're seeing some of those arguments revisited today in the fight against a pandemic. So I think having that historical context, for people who are perhaps younger, and I'm thinking of all the people who might be wondering if they should read this, if they're studying at university or even in high school, or their parents or their teachers,
this is a really great introductory text that is also surprisingly useful for experts. So you will hit so many different levels of knowledge and curiosity and assumption-challenging with it. It's great if you don't have the scars of the 2016 US election and the Brexit referendum, if you don't have the scars of 9/11, if you don't have the scars of finding out what happened to the data that you were perhaps putting in very good faith onto the internet in the 2000s and 2010s. This is a nice review of why some of us are so, so concerned about this. And for those of us who have lived through it, as I say, there's still quite a lot in there that is really new.

I like this idea, this question of: is data something we should be thinking about as a question of ownership or as a human right? And again, Carissa really brings that question to the surface, because you start thinking, why do all of these data brokers have all of our data? I don't transact, you don't transact, ever directly with a data broker. Yet they are able to buy and sell our data and we can't stop them from doing it. We're powerless to stop it. And even with all of the very wonderful tips at the end of this book, in the last two chapters in particular, about what we can do on an individual level, there is no stopping Equifax, which is one of the big data brokers in the United States, as an individual American, or indeed as one of the many non-Americans who were not based in the United States and were not US citizens, but who still had their data taken because Equifax failed to protect it. And the US Congress did nothing, right? They had a hearing and that was it. So I think one of the things is, you might want to read this book with a stress ball, because you'll read it and you will find yourself thinking, why are our elected representatives doing so little to protect us? What do we need to do about that?

So there's a really nice tension in this book between what you can do as an individual, how it isn't just about you, it's about everyone in your network, and indeed just the society and tone that we're creating about data and privacy. And then there's a limitation to that at a certain point: our regulators and lawmakers need to be doing a better job. And the perennial complaint of regulators is they never feel they have enough staff and they never feel they have enough money, right? So they don't have the resources to do the job they need to take on these very big companies, who have bottomless pits of money and all the lawyers they can hire. So the question we have to ask, and we might have to get awkward about this, is, well, what?
What is enough money for a regulator today? What does the ICO here in the United Kingdom, the Information Commissioner's Office, need to be able to truly take on a Facebook? Right? Because there are all sorts of companies getting away with murder in this country and the ICO can only do so much. And all of our lawmakers, who I also hope will read this book, and I hope that they all get a copy of it if they don't buy it themselves, need to ask themselves: what are they doing by not taking action?

And one of the things that I would love to hear in the discussion that we do later, after my remarks, is to discuss the role of the GDPR, the General Data Protection Regulation, and what Carissa thinks about it. Is it enough? Is it just a start? How is it influencing our thinking about data protection around the world? There was a really great assessment that was just published today about California's data protection act, which takes a lot of inspiration from here in Europe, and just your thoughts on that as a philosopher, I think, would be absolutely fascinating to hear. There was discussion in the book, I think, but don't quote me on it; I'd love to ask about whether or not you think we need to ban facial recognition technology, and if so, in what uses. So are we talking about using it to unlock my phone, and banning that, using it at work, or using it out in public, or the use of it for law enforcement, et cetera?

Then there's this concept of consent, and there was an article I remember from a few years ago called Putting the Con in Consent, which is that it's a joke. We're being asked to violate our own privacy repeatedly, over and over again, and you can't really use a smartphone or go on the internet without being asked to violate your own privacy. And I've noticed, I was really good at fighting a lot of this, and frankly, since the pandemic, I'm just so psychologically tapped that I just find myself often consenting to things, because I'm on the internet so much for my work that I would be asked to consent, which is to set my privacy settings, you know, hundreds of times a day, which is just not realistic. And that's what I mean about the model of consent being so flawed. It doesn't work on an individual level. It's not working in aggregate, as your book so aptly points out. And so the question becomes, well, then how do we fix it? And that's more than an individual engineer can fix. It's more than one country can fix. This is something we're going to have to really get thorny with.
You raise some excellent points about democracy, a question that is close to my heart at the moment, with what's coming in the United States in just a few weeks' time. And again, there's this question of not just being manipulated by ads, because I think a lot of people have pushed back on that argument, saying, I know very well what I think about my politics, and, you know, it wasn't Cambridge Analytica that made me vote one way or another, or not vote. And we can argue that point or not. But I think the point that you raise in the book that's so important is: is it just about undermining confidence in the entire process? And who's getting access to our data? And there's really another question that we must ask: is it simply Russia that's interfering in elections? Or do liberal democracies, which have had a history of interfering in elections elsewhere around the world, do we ever do that? And do companies based in liberal democracies allow that to happen? And is that an underexplored area? So you raised some points that I thought were so provocative.

And the biggest one, the one I will end on, is that for so long, people have posed a tension between privacy and security and said you have to sacrifice one or the other, and that if we allow everybody to be private online, then all the criminals and terrorists can run around and do what they want, so we have to sacrifice our privacy and open everything up, and then we can have greater security because law enforcement can do their job. That's been a really long-running argument; we'll call that the sort of post-9/11 argument. But you make a different argument, I think, and it's a compelling one, which is that actually privacy enhances personal security and privacy enhances collective security. And I personally am very much of that argument and would agree with you. And then the counter-argument that I now find myself arguing against in my own head, and hope to argue with all of you, arguing in the fun and philosophical sense, not in the antagonistic sense, is that we need data in this pandemic. We need really granular data. We need data about people's contacts to do contact tracing, whether it's with a human contact tracing team, which is really privacy-invasive, or an app, or a wearable device like we're seeing in Singapore. We need public health authorities to be able to know when a cluster has been identified. We need to be identifying super-spreader events so that we can stop them.
And we also need to see if it's certain people who are spreading this disease more than others, because as we're learning more and more about it, we're starting to find out that it seems to be only a small percentage of the population that's spreading it a lot. And that might be part of having to do the virus response management, according to public health authorities, which, if you are into privacy, is such a challenge, because you're thinking about things not just for your own personal health or that of your family; you're also thinking about it collectively, which is that the sooner we can get out of this mess, the better for all of us. So I love that: privacy versus security, privacy is security, and then the pandemic edition, which I would just call WTF. I don't know, I don't have the answers to it, but I loved the way that your book made me think, and I love how accessible it is to beginners and to so-called experts too. So thank you so much.

Thank you, Stephanie. That was fantastic. Look, that's a very rich array of comments there, Carissa, and it would be unfair to get you to respond to all the interesting issues that were raised there. But is there anything in particular, a couple of points, you'd like to address arising from those two sets of comments?

Yeah, sure. Thank you so much for those very insightful comments. Maybe one thing to address that I think is on everybody's mind is whether this is realistic. Is it realistic to call for an end to the data economy? And of course, if we look for reasons for pessimism in history and human nature, it's not difficult to find them. They're everywhere, you know, screaming at us every day. Ecology is a big one. But there is one reason that I think gives us cause to be optimistic, and that is that we have skin in the game individually, in a way that with ecology is less obvious. Many people think, well, you know, I won't be here by the time, you know, the earth roasts. But with privacy, it's becoming more and more obvious how people are at risk. So in a recent survey that I carried out with Siân Brooke from Oxford, we found that about 92 percent of people have had some kind of bad experience related to privacy online. Sometimes it's identity theft, sometimes it is public humiliation, doxing, all kinds of bad experiences. And that really makes people realise the risk that they're taking when they give up their privacy, in a way that we don't yet realise as immediately about ecology.
And also, even though privacy is collective, there is also this individual side, which hopefully can motivate people to act. And there are some examples of people overcoming barriers and coming together and organising and being very successful. The one I give in the book, but there are others, is the ozone layer. You know, a few decades ago it was in really bad shape. We came together, we realised this, we banned CFCs, and today it's recovering and it will completely recover in a few years. So that is an incredible success story and an example of the things that we can do when we put our minds to it. But there are others. If you think about child labour, at some point it was unimaginable for it to go away. And yet we changed things and we changed rules. And even though today, particularly in the context of the pandemic, the digital economy seems so entrenched, we should bear in mind that it's really very new, and there is still a lot to be digitised, and that this is very brand new and there's a lot of opportunity to change it in time.

About the GDPR, which people are probably also wondering about: I think it was a magnificent thing. You know, three years before it happened, nobody thought it was even possible. Everybody just laughed. And that it came to be was an incredible achievement. And it really put privacy in the minds of so many people. And it made companies have to change so many things and deal better with data and think about it and realise that personal data is a liability for companies. So in a way, it's wonderful. But no, it's not enough. Not nearly enough. We need to go way beyond that. One of the main problems is, as Stephanie said, that it's based on consent. And that doesn't make any sense, for many reasons. One, it's too burdensome for the individual. Two, there's no way of giving informed consent when you don't know what kind of inferences can be made from the data; not even data scientists can know. So when we speak about consent, how are you consenting if you have no idea what you're consenting to? And third, because personal data is collective in this way, it's not okay for you to consent to give your data when your data includes data about me. So when my family gives up their genetic data, they're giving up my genetic data, and that of their siblings, their parents, their kids. And so we don't have the moral authority to consent to personal data being collected, in the way that we have moral authority to consent to what's being done to our bodies in the context of medicine, for instance.
Regarding the question that Stephanie mentioned about whether liberal democracies also interfere in elections in other places in the world, I think that is indeed something that's understudied and that we should pay more attention to. We know that there are many data firms out there similar to Cambridge Analytica, hundreds of them, and we should really be looking at what they do exactly and whether it's OK.

And then finally, this argument, the post-9/11 argument, in the context of Covid. I think that it is a false dichotomy to think that we have to choose between security and privacy. In fact, when we sacrifice privacy, we sacrifice security. And one example is the internet: the internet is very insecure, and it's insecure to allow for data collection. It's kind of a conscious decision, but making the internet so insecure means that we are all at risk. So if hackers hacked even just 10 percent of electrical appliances in a country, they could bring the electrical grid down, and that would put the country on its knees. So that's how risky it is to have bad privacy practises. And with regards to the coronavirus pandemic, of course there is a debate to be had, and in some cases you do need to give up privacy in order to get some benefits. I mean, obviously, when you go to the doctor, if you don't tell them what's wrong with you, you're not going to get the care that you need. But at the same time, I think that because it's such a sexy debate, it distracts from the main point, which is that we need medical solutions. You know, we need a good testing system, which we don't have. We need mass testing, and we need protective equipment for the right people, and we need social distancing, and we need medicines, and we need a vaccine. And these are the medical requirements that will make the pandemic go away. And no app can substitute for that. And I worry that too much, too many times, the debate kind of gets distracted by these finer points, whereas the main thing is not being addressed.

Great. Let me raise some questions that have come through from our audience that actually relate to some of the points you've made. I'm just going to mention two of them that are distinct, but I think they relate to what you discussed. So you began with some optimism that we can bring an end to the data economy, the personal data economy. This is a question from Tristan Gertz, who says: how can we dismantle this economy, given the huge and pervasive power of big tech in all aspects of life, including our personal lives? So that's the first thing.
I mean, I think also about the way in which, for example, they can buy political influence, especially in America, with lax campaign financing laws and so forth. So apart from appealing to historical analogies, what's the concrete way this is going to happen? The second question was from Maximilien Kina, which is, he says: if privacy is a collective good, how significant can individual consent be? Now, I want to riff on this a little bit, because I think one of the most original things about your book is that normally when we think about privacy, we think about the right to privacy, and that means that my personal information is protected: you have duties not to pry into it, not to use it in certain ways. But in addition to talking about a right to privacy, you seem to be saying as well that we have duties of non-disclosure. I think this is really interesting. So you say we want to create a culture of restraint. You say that a culture of exposure, where we voluntarily expose information, damages society. And you say things like, people shouldn't be able, I think you say they shouldn't be able, to go and buy a direct-to-consumer DNA test and publicise the results, because if they do, then they're in a sense exposing information about other people as well. So I think this kind of raises an issue that Michael approached, which is balancing rights. Someone might say, well, look, you're imposing such heavy duties of non-disclosure, of not being able to do a DNA test and so forth, that you're actually constricting my liberty. So how do we draw the line between the liberties that I have, even to do wrong, even to engage in forms of exposure that are not good? How do we make proper regard for privacy compatible with regard for the liberty of the person?

Excellent questions. So how do we change this? We change this mainly through public pressure. When regulators and companies realise that people are really sick and tired of it, and that we're not going to collaborate anymore and we're not going to cooperate, to the best of our abilities, things will change. It's amazing how sensitive governments and companies are to people's sentiments. And you might think that nobody cares, but they're actually monitoring: all of these companies are monitoring social media to see what people think of their products. And one of the examples I give is how Google Glass didn't take off because everybody thought it was so creepy, and people who started using it were called 'glassholes', so that very soon it just disappeared.
And we can do the same with this. Furthermore, I have the hope that companies will realise that privacy can be a competitive advantage: when people want it, it can sell. And so if a big company starts giving much, much better services with privacy and people start choosing it, other companies will follow and they will up their game. So I think it will be effortful and it's going to take some time. But then again, it might not take that much time: if people really start behaving in this way and start expressing their dissent, it could change quite quickly, just like the GDPR came into effect in a matter of a few years, five years, when before that everybody thought it was absolutely and completely impossible, never going to happen, privacy's dead. So I think we should be more optimistic, and I really think that tech wants us to be pessimistic. They want us to see them as these giants who will be here forever and that, you know, we can't possibly change. But in fact, they're quite dependent on us, so it's good to have that in mind.

With regard to the collective part of privacy, yes, I do think we have duties to protect our privacy, both for ourselves, for the people around us and for our society in general. I wouldn't want those duties to be implemented in law to a very strict degree. So I don't argue that people shouldn't be able to do a genetic test. I think that, you know, you should think about it, and if you don't need it for medical reasons, you should be aware that you're putting other people at risk and that, you know, 40 percent of these results are false, and ask, is this worth it? And here, I think a lot depends on culture. Currently, we have a culture of exposure in which there is a lot of pressure to expose yourself. You have to say what you think about an issue at all times. People just assume that if they take a picture of you, they can upload it to wherever they want and tag you, and these kinds of things. And already I think it's changing. When I started working in this area, whenever I talked about this with people, they were kind of surprised. And more and more, either people ask permission before they upload my photo, or when I ask them not to, they are completely sympathetic about it and interested. And it's much, much easier. So I think it's already changing, and we're more aware of the risks online as we become more savvy online. And here I think we should think about the analogy with medical ethics. So there are a few things in medical ethics which, even if people consent, you can't do.
444 00:48:39,390 --> 00:48:45,630 So one is very extreme, bad medical practices, even if only the individual gets harmed. 445 00:48:45,630 --> 00:48:50,660 So if you go to the hospital and you tell the doctor, hey, I want you to infect me with this disease just because, 446 00:48:50,660 --> 00:48:55,020 you know, I want to try it out, I want to experience that in my own flesh, this is interesting to me. 447 00:48:55,020 --> 00:48:59,850 Even if the patient would agree and the doctor would agree, that's illegal; you can't do that. 448 00:48:59,850 --> 00:49:05,910 In the same way, for public health purposes, there are certain things in the context of the pandemic you can't do. 449 00:49:05,910 --> 00:49:13,170 So you can't, you know, sing in a choir without any masks in an enclosed space, et cetera, et cetera. 450 00:49:13,170 --> 00:49:19,380 So in the same way that we don't allow certain very, very bad practices to happen in medicine, we shouldn't allow very, 451 00:49:19,380 --> 00:49:23,520 very bad practices to happen with regard to personal data, 452 00:49:23,520 --> 00:49:29,060 even if people were to consent, because society has an interest in our being protected in these ways. 453 00:49:29,060 --> 00:49:37,380 And I think this doesn't need to be extremely stringent. So you don't need to feel like your liberty is being limited, 454 00:49:37,380 --> 00:49:41,760 just as you generally don't feel that your liberty is being limited because you're 455 00:49:41,760 --> 00:49:46,780 not allowed to go to a doctor and ask them to infect you with a deadly disease. 456 00:49:46,780 --> 00:49:50,050 Okay, let me ask you something about what you say about culture. 457 00:49:50,050 --> 00:49:58,420 So there was one question from Ross Jones, who says, look, there is cultural variation in the importance people attach to various aspects of privacy. 458 00:49:58,420 --> 00:50:05,910 The book is very hostile to personalised ads. You say at one point that they have to stop, 459 00:50:05,910 --> 00:50:14,680 page 26. Ross Jones says that in China, people have a more favourable attitude towards personalised ads. 460 00:50:14,680 --> 00:50:19,300 So, you know, number one: what's wrong with personalised ads? 461 00:50:19,300 --> 00:50:28,070 And number two, might there be legitimate cross-cultural variation in how we regulate things like personalised ads? 462 00:50:28,070 --> 00:50:33,310 Personalised ads don't seem too bad on the surface. They seem like, OK, 463 00:50:33,310 --> 00:50:40,700 so companies get to sell their products and you get to see ads that are relevant to you. 464 00:50:40,700 --> 00:50:48,380 Everybody wins. But the cost is not apparent. If that were the whole deal, then I would be totally in favour of personalised ads. 465 00:50:48,380 --> 00:50:53,090 But the risk is so high and the disadvantages are so high that the benefit might not be worth it. 466 00:50:53,090 --> 00:51:00,020 And the cost is that people will misuse that data to target people in ways that are very unfair, 467 00:51:00,020 --> 00:51:07,310 either because they're going to target them for their vulnerabilities, so you identify who is, say, a veteran, 468 00:51:07,310 --> 00:51:14,720 and then you sell them payday loans that are going to put them in a very bad position, or because the data is going to be used 469 00:51:14,720 --> 00:51:22,670 to target political propaganda that is going to inflame the population and pit people against each other.
470 00:51:22,670 --> 00:51:28,970 And one of the things that we don't realise is that we don't have unmediated access to reality. 471 00:51:28,970 --> 00:51:32,030 Most of what we know about the world, we know through our screens. 472 00:51:32,030 --> 00:51:36,620 And so when you think, you know, I won't be influenced because I don't believe ads: 473 00:51:36,620 --> 00:51:41,000 everything you watch online is part of what makes up your picture of the world. 474 00:51:41,000 --> 00:51:46,130 And if your picture of the world is completely different from your neighbour's because you don't see the same content, 475 00:51:46,130 --> 00:51:53,570 it's very hard to have a peaceful and harmonious society and a good relationship with those people. 476 00:51:53,570 --> 00:52:01,790 So the cost is too high. Regarding cultural differences, I think they have been exaggerated 477 00:52:01,790 --> 00:52:08,230 a lot. And it's more of a myth than a reality that people in Eastern cultures don't care about privacy. 478 00:52:08,230 --> 00:52:12,740 And one really good example has been China, precisely during the pandemic. 479 00:52:12,740 --> 00:52:20,780 It's been very interesting to see how people are starting to rebel against this level of intrusion by speaking more to newspapers, 480 00:52:20,780 --> 00:52:28,250 by talking amongst themselves. And there has been much more repression from the Chinese government cracking down 481 00:52:28,250 --> 00:52:32,300 on these conversations about privacy and these concerns about privacy. 482 00:52:32,300 --> 00:52:38,330 So even though there might be cultural variation, the reasons why we care about privacy, 483 00:52:38,330 --> 00:52:47,480 the interests we have in privacy being protected, are the same for all human beings, because nobody wants to be abused and nobody wants to be at risk of, 484 00:52:47,480 --> 00:52:53,840 you know, identity theft or physical harm. You don't want your physical security compromised. 485 00:52:53,840 --> 00:53:04,820 You want to be able to vote in a way that is free of pressures. And these kinds of very primal interests are pretty common across cultures. 486 00:53:04,820 --> 00:53:11,900 Let me ask one question, and hopefully Michael and Stephanie could also give their views about this. 487 00:53:11,900 --> 00:53:18,020 So, you know, we've already talked about the potential tension between protection of privacy and freedom of speech. 488 00:53:18,020 --> 00:53:22,930 The one area where freedom of speech seems to be particularly important is political speech. 489 00:53:22,930 --> 00:53:30,010 Now, you say something with respect to targeted, sort of microtargeted, political ads, 490 00:53:30,010 --> 00:53:34,970 I think in the Cambridge Analytica context. You say this on page 207: 491 00:53:34,970 --> 00:53:42,800 discouraging people who might vote for the candidate who is not your client from going to vote is thwarting democracy. 492 00:53:42,800 --> 00:53:51,170 So I'm thwarting democracy if I discourage people from voting for the candidate who is not my client. But someone would say, 493 00:53:51,170 --> 00:53:54,080 but that's the essence of political campaigning, 494 00:53:54,080 --> 00:53:59,690 that I'm trying to discourage people from voting for Trump, or whoever it is that I don't want people to vote for. 495 00:53:59,690 --> 00:54:07,160 So one might think there's a kind of worrying feature there, that, you know, someone's going to have to decide what propaganda is.
496 00:54:07,160 --> 00:54:15,350 And this is going to be something that we have to somehow ban. Isn't it better just to recognise that people will come forward with different views? 497 00:54:15,350 --> 00:54:19,160 And as long as people have access to different views, they can make up their minds. 498 00:54:19,160 --> 00:54:28,610 But it's of the essence of political speech to encourage people to vote for one person, or at least to discourage them from voting for someone else. 499 00:54:28,610 --> 00:54:32,630 I reread that page, and maybe I should edit it to be clearer. 500 00:54:32,630 --> 00:54:41,120 What I was referring to there is voter suppression. So it is discouraging people from voting, where the people you discourage from voting are 501 00:54:41,120 --> 00:54:44,420 only the people who would vote for the candidate that you don't want them to vote for. 502 00:54:44,420 --> 00:54:51,930 That is why it is thwarting democracy. Can I ask Michael and Stephanie also what their thoughts are on this, and in particular 503 00:54:51,930 --> 00:54:55,550 on an issue that Carissa raises quite powerfully, 504 00:54:55,550 --> 00:55:06,630 which is that she says personalised ads fracture the public sphere into individual parallel realities, 505 00:55:06,630 --> 00:55:13,480 and that this undermines the kind of deliberation and solidarity needed for serious democratic decision-making. 506 00:55:13,480 --> 00:55:20,830 Any thoughts on this? I certainly agree that there's a strong point there. 507 00:55:20,830 --> 00:55:31,360 The question is how far does it go. And I think Carissa's project is a realistic one in the long term, not in the short term, but in the long term. 508 00:55:31,360 --> 00:55:41,710 And that will be the result of public discussion; that will be a solution. But as for political speech, at the moment 509 00:55:41,710 --> 00:55:48,430 the only sort of information which it is not permissible to use to 510 00:55:48,430 --> 00:55:55,030 influence voters is false information about the personal character of a candidate. 511 00:55:55,030 --> 00:56:00,670 Even that is difficult to define and is not a provision which is commonly used. 512 00:56:00,670 --> 00:56:06,790 So, yes, there must be other forms of interference with elections, 513 00:56:06,790 --> 00:56:11,140 such as giving false information as to whether it's safe to go to the polling booth. 514 00:56:11,140 --> 00:56:16,600 That could be very destructive indeed. And there's plenty of other sorts of information as well. 515 00:56:16,600 --> 00:56:26,120 So I'm quite certain that Carissa has told us, particularly me, a great deal that I didn't know before. 516 00:56:26,120 --> 00:56:34,210 And what's now important is that the debate has started and we'll be able to address these issues. 517 00:56:34,210 --> 00:56:40,390 One of the things that I was thinking about as we were just discussing this was that in some countries it's a requirement to vote, 518 00:56:40,390 --> 00:56:44,230 and in others it is not. It's a problem in states where it is not a requirement 519 00:56:44,230 --> 00:56:52,390 and lots of people don't vote. And in Australia, I believe, it is a requirement. And that made me wonder if that might be a way of, first of all, removing, 520 00:56:52,390 --> 00:56:59,620 in a sense, this whole question of disenfranchising voters, by saying voting is now a legal requirement. 521 00:56:59,620 --> 00:57:06,610 You have to vote.
But I would want, if we were to do that, to have a 'none of the above' option so that you're not being, 522 00:57:06,610 --> 00:57:11,410 again, forced to kind of violate your own consent. I don't want to be forced to say that I want any of these people. 523 00:57:11,410 --> 00:57:19,390 I would love a 'none of the above' option as well, so that we could have that option of my vote for Trump, my vote for Biden, or neither, 524 00:57:19,390 --> 00:57:24,400 so that that's recorded, right, because that's the choice, and that will give us more accurate data. 525 00:57:24,400 --> 00:57:28,090 So I thought about that just as you were speaking. You do have that. 526 00:57:28,090 --> 00:57:32,380 You can spoil your ballot paper. You can, but you never know if they've done it by accident or not. 527 00:57:32,380 --> 00:57:37,590 I want them to know. I would love to be able to count that. 528 00:57:37,590 --> 00:57:41,020 But I also thought about this point, you write it so beautifully in your book: 529 00:57:41,020 --> 00:57:47,460 each of us lives in a different reality, on page 107. And that made me sort of think. 530 00:57:47,460 --> 00:57:51,380 I was trying to argue against these things a lot in my mind, because I was agreeing with you so much. 531 00:57:51,380 --> 00:57:54,690 Thank you. But isn't that true? Don't we all live in our own reality? 532 00:57:54,690 --> 00:57:58,650 And that's a big thing with this pandemic, because at the beginning everyone was saying, 533 00:57:58,650 --> 00:58:05,560 oh, you know, we're all in the same boat, all facing the same threat. There was a little, almost bizarre kumbaya, at least on Twitter. 534 00:58:05,560 --> 00:58:10,050 I just was like, what the hell are you people talking about? There are people who are out in their garden. 535 00:58:10,050 --> 00:58:14,300 I can see them. They're my neighbours. They have a garden. I do not. Right. So just an example right there. 536 00:58:14,300 --> 00:58:17,420 They've been barbecuing and having people over. 537 00:58:17,420 --> 00:58:23,280 And if you have little kids versus teenagers, versus if you're living on your own, you're having a totally different pandemic. 538 00:58:23,280 --> 00:58:28,110 If you're six months furloughed at 80 percent of your salary, versus you've lost all your work, 539 00:58:28,110 --> 00:58:33,840 versus if you're an NHS doctor and you're working double time wearing bin bags because there's no PPE for you, 540 00:58:33,840 --> 00:58:38,310 you are having a different pandemic than me, who is able to hole up and complain in my 541 00:58:38,310 --> 00:58:42,900 flat about being isolated while my friends were going to hospital and risking their lives. 542 00:58:42,900 --> 00:58:45,720 And it just made me think of these different realities. 543 00:58:45,720 --> 00:58:52,680 And what I love is how you articulate this in your book, that this is where lawmakers and regulation are going to have a role to play. 544 00:58:52,680 --> 00:58:59,010 That maybe we have to agree on some rules here for the social media age with regard to elections. 545 00:58:59,010 --> 00:59:07,740 And that made me think of France, and how France has a rule that in the run-up to an election, at a certain point, political advertising has to stop.
546 00:59:07,740 --> 00:59:07,980 I mean, 547 00:59:07,980 --> 00:59:15,660 I think I can speak for all Americans in saying that they would love to see a ban on all political ads, because they get so blasted with them, and it starts, 548 00:59:15,660 --> 00:59:21,630 you know, a year before the election. But can you imagine how peaceful and harmonious people's lives and relationships 549 00:59:21,630 --> 00:59:25,530 would be if there was a sort of two-week hiatus in the run-up to the election? 550 00:59:25,530 --> 00:59:31,450 Once it's done, just silence. Everybody has said what they have to say. In the context of the book, 551 00:59:31,450 --> 00:59:36,580 I think the greater point is that we need to have these things public, because if somebody is 552 00:59:36,580 --> 00:59:42,670 suppressing votes but we can't see it, because only the person being targeted and the company see the ads, 553 00:59:42,670 --> 00:59:49,270 then that's a problem. If ads are completely public and everybody can see the same thing, then academics can criticise them, 554 00:59:49,270 --> 00:59:53,770 journalists can fact-check them, and we can have the public debate. If we don't know what's going on, 555 00:59:53,770 --> 00:59:59,440 if we have what are called dark ads, then we can't have that kind of public debate, 556 00:59:59,440 --> 01:00:06,120 and we can't make a decision about what we want our society to look like. This has been a fascinating debate. 557 01:00:06,120 --> 01:00:10,440 I hope everyone who's joined us has enjoyed it as much as I have. 558 01:00:10,440 --> 01:00:18,210 I think all that's really left is to say thank you to Carissa for writing this excellent book and for discussing it with us today. 559 01:00:18,210 --> 01:00:23,220 Thank you also to our two distinguished commentators, Stephanie Hare 560 01:00:23,220 --> 01:00:26,160 and Sir Michael Tugendhat. 561 01:00:26,160 --> 01:00:34,270 Our next event, I should mention, is going to be on October 15th at the same time, where we'll be discussing, I'm sorry, at six o'clock, 562 01:00:34,270 --> 01:00:42,540 not the same time, six o'clock, where we'll be discussing algorithms, the ethics of algorithms, with Cass Sunstein of Harvard University. 563 01:00:42,540 --> 01:01:25,067 But thank you, everyone, for watching. That's a different screen.