Welcome to this book launch of the new book Privacy is Power by Carissa Véliz, who is an associate professor at the Institute for Ethics in AI here at the University of Oxford, a fellow of Hertford College and a fellow of the Oxford Martin Programme on Ethical Web and Data Architectures. My name is Rasmus Nielsen. I'm Professor of Political Communication at the University and director of the Reuters Institute for the Study of Journalism, where we explore the future of journalism worldwide, and I've done a little bit of work on some of the issues around privacy and data that Carissa will take us through today. My role here is to serve as the chair of the discussion. We all look forward to hearing Carissa talk about her important and interesting new book on privacy and how it connects with power: the power of states, the power of private companies and potentially the power of us as citizens. I will ask a few follow-up questions after the presentation, and then we will have a Q&A opportunity for those of you who've joined the conversation today. If you could please use the ask-a-question function on the Crowdcast platform, I will take questions from there and give Carissa an opportunity to answer them and expand on her presentation. So with that, welcome to all of you who've joined us for this conversation. Carissa, I'm really looking forward to this. This is an important, timely and interesting book. It also coincides today with the judgement just passed by the European Court of Justice banning indiscriminate mass surveillance by governments except for clearly defined purposes and periods of time, and with limited retention of data. So it's quite a timely reminder, I think, of the big stakes that Carissa's new book is about. Carissa, we look forward to hearing your talk first, and then we will take it from there.

Thank you so much, Rasmus. I'll give just a little presentation about the book, to give you an overview of what it is about. I started getting interested in privacy about six or seven years ago, and it started as a very personal concern. I went with my mom to the archives in Spain to dig out the history of my family, and we learnt all sorts of things that my grandparents hadn't told us. And I started wondering whether we had a right to know those things, whether I had a right to maybe write about those things. And of course, being a philosopher, I turned to philosophy for some answers, and I wasn't very satisfied with the answers I got. And I thought, surely we have to work more on this topic.
There isn't, or there wasn't, a lot of work on privacy in philosophy at that time. And just around that time, Snowden revealed that the world was being subjected to mass surveillance by intelligence agencies. I thought that was amazing. That really impacted me in a very deep way, and it became a professional interest. I started researching it as a philosopher, and I ended up writing my dissertation in Oxford on the ethics and politics of privacy. And the more I read about privacy, the more alarmed I got about how dangerous it is to collect so much data on the population and keep it indefinitely. It really seemed to me that the system of the data economy that we have built is absolutely insane. It's a ticking bomb. I wrote this book in that feeling of urgency and alarm that we're doing something that's not very wise, something that has led to trouble in the past and is very likely to do so again in the future, and to try to change this.

So the first part of the book is about how much data is being collected on you. Everybody knows that data is being collected on them, but that's very abstract, and data has a way of sounding very abstract and very innocuous. So I go through a whole day in somebody's life to illustrate just how much data tech giants and governments and other corporations are collecting on us. The data that is being shared about you includes your name, your age, where you live, where you work, who you sleep next to, whether you sleep well or not, what you eat, whether you smoke, how much you drink, how much you weigh, your credit history, your browsing history, your purchasing history. It's incredibly detailed. And many inferences can be made from that data, from your life expectancy to the kinds of diseases you might have in the future, to whether you're employable in a certain kind of job, to whether you will pay back a loan. All this data is being used to make decisions about our lives that impact us in very deep ways. You might get rejected for a job or a loan or an apartment, and you might not know what piece of data exactly was the culprit. And for all you know, that piece of data might be inaccurate, and we don't have access to it or any way to correct it or appeal. Many decisions are made about us.

Furthermore, another motivation for me to write this book was to refute a very common myth, a very common conception of privacy as something that's just a personal or individual preference. The idea is that it's up to you how much data you want to share.
And if you're OK with sharing data, then you have no reason not to. What I argue is that, no, privacy is actually a political concern above all. Privacy is collective in a way that this myth that privacy is personal misses; it's misguiding. Your data contains data about other people. If you share your genetic data, you share data about your kids, your parents, your siblings and even more distant relatives, and they can be affected by that sharing of data, sometimes in a very profound way. They could be deported, they could be jailed, they could be denied certain kinds of insurance. And you have no control over how that data will be used, and you probably won't know what happens to it.

The book then explains how we got here. If you were to explain this system to somebody from the 1950s, they would say, well, that's crazy, why did you do that? So it explains a bit how we came to this point in history, with the hope that understanding how it came about might help us get out of this situation. And the way it came about was that Google was a protagonist in this story. Google, and later a lot of other companies, realised that they could use personal data to target ads. This became incredibly profitable, and it became refined to a very large degree, so that you could profile somebody by where they live, their gender, their political tendencies, their fears, their desires, their ailments and so on. Around the year 2000, the FTC, the Federal Trade Commission in the United States, had recommended to Congress that they regulate data, because they saw that this could be something dangerous, and much of the advice given was along the lines of the GDPR that came afterwards in Europe. And just then 9/11 happened, and that changed everything, because suddenly privacy regulation got shelved. Security became the absolute priority on governments' lists of objectives, and intelligence agencies figured that they could actually make a copy of all the data that was being gathered and use it for intelligence purposes. Now, as I argue in the book and elsewhere in my work, it turns out that bulk data collection is not very useful for preventing terrorism, because terrorism is a very rare event, such that it seems we made this incredible sacrifice of our privacy and we're not really getting what we were promised in return.

So the third chapter talks about how personal data is a kind of power.
One of the reasons why the big tech giants crept up on us was that we didn't recognise this as a kind of power, because it's new. We're used to political power, we're used to economic power, we know how military power works. And one of the characteristics of power, and this is an insight from the philosopher Bertrand Russell, is that it can change from one kind to another, like energy. If you have enough money, you can influence politics. If you have enough political power, you can build up a big military power, and so on. Good regulation is regulation that stops one kind of power from turning into another. A well-functioning society is one in which, no matter how much money you have, you can't buy votes and you can't buy politicians. And there's this other kind of power that is related to personal data and privacy. I argue that what's special about this power is that it creates the opportunity to influence behaviour and predict behaviour. This is the reason why data is so coveted. It's not only that it can be sold. Technically, Google and Facebook don't really sell your data; they sell the power to influence you, they sell your attention and access to you.

And this is why we didn't realise what was going on. The narrative that this was a deal, that you got to give your personal data in return for all these wonderful services, is a narrative that came way after the fact, when the deal had already been struck. Just go back and try to remember the first time you opened an email account. It probably never occurred to you that you were giving up your personal data. It wasn't transparent. We learnt about the system when it was already entrenched, and it surprised us because we had this litmus test for whether something was a monopoly. The litmus test is that something is a monopoly, or can be suspected of being a monopoly, if the company can make prices go up without losing customers. But here you have companies that supposedly give free services, although they are imposing extremely exploitative conditions, and they don't lose customers. The litmus test failed because there wasn't a price tag involved, and there wasn't a price tag because we are the product. We're not the customer; the customers are advertising companies and other companies that want access to that data. So we have to think more deeply about the relationship between personal data and power, and realise that we are giving up too much power when we give up too much personal data. If we give up too much of our personal data to corporations,
we shouldn't be surprised that the wealthy rule and make the rules of our society. If we give too much of our personal data to governments, we shouldn't be surprised if we start sliding into authoritarian tendencies. And I argue that for democracy to be strong, the citizenry has to be in charge of data, in control of personal data.

In particular, I argue that we should think about personal data as a kind of toxic asset. Bruce Schneier was the first person to argue for this, and I expound on that and make an analogy to asbestos. Asbestos is a very useful material. It's very cheap to mine and it's very valuable because it keeps buildings from catching fire, for instance, and so we used it in a lot of things, like walls and buildings and plumbing. And then we realised it's incredibly toxic and there's no safe threshold, so hundreds of thousands of people die every year as a consequence of cancer caused by asbestos. In the same way, personal data is very cheap to mine, it's easy to get, it's very valuable, it's very helpful. But it turns out it has these risks that we're not paying attention to and that can really poison our societies. It can poison individuals by making us vulnerable to harms like discrimination, public humiliation, extortion and so on. And it can poison societies by eroding democracy and eroding equality. We are not being treated as equals anymore; we are being treated on the basis of our data. And that means that some of us are treated better than others, depending on how much money we have, whether we're poor or rich, whether we're fat or slim, whether we're men or women. And that should worry us.

So in the last two chapters of the book, I come up with some solutions. We need regulation; there's no way around it. But I argue that for regulation to happen, we, the public, have to be more aware of the situation and demand that privacy be respected. There are other examples historically in which public pressure is really what had to build up before regulation happened. So I argue that we should end the data economy. This is crazy: personal data shouldn't be the kind of thing that can be bought or sold. As long as personal data is profitable in that way, it will be misused and it will be used against you, against us. So I argue that, just as even in the most capitalist societies we agree that there are certain things that should be off the market, like organs, like people, like votes and like the results of sports matches, personal data should be on that list.
And if anybody wants to do something with personal data, of course we should allow them to in many cases, because it's valuable. In order for you to get medical care, you have to give some personal data to your doctor. But those people who manage our data have to be tied to fiduciary duties. Fiduciary duties are duties that apply to professional relationships in which there is an asymmetry of power. Your doctor knows a lot more about medicine than you do, and you are not in a level position because you often feel ill when you go to the doctor. Fiduciary duties mean that the doctor has to put your interests first, and whenever there's a conflict of interest, your interests come first. So if the doctor would like to perform surgery on you because they would really like to practise their skills or have a data point, they can't do that; they can only do it if it will benefit you first and foremost. In the same way, your financial adviser has to advise you in a way that will protect your interests. If they want you to buy a particular stock just to try out how it goes, or because they have financial interests in it, that's not good enough. So in the same way, if a company wants to do something innovative with data, then by all means let's allow it, but only if they can assure us that that data will be used in our interest and never against us.

I have other regulatory proposals, like banning personalised content. It seems that the disadvantages are many more than the advantages; it's a tool that is being used for political manipulation and the gerrymandering of democracy, and it's too dangerous. And apart from that, we should delete data. It's too dangerous to keep data just stored there for as long as possible, so whenever we have used data, we should delete it periodically. We should also have much better cybersecurity measures. Currently the Internet is incredibly insecure, partly in order to allow for the collection of personal data, and that puts us all at risk. Just to give an example, if hackers were to hack just 10 percent of electrical appliances in any country, and electrical appliances are increasingly connected to the Internet as smart appliances, they could bring down the national grid, and that would be incredibly dangerous. Imagine being in a pandemic, being in lockdown, and having the national grid just not work. That would be a catastrophe. So really, we have an interest as a society, for democracy but also just for security, in having much, much better privacy regulations.
And finally, at the end of the book, I propose some ways in which individuals can protect their own privacy and the privacy of others around them, and so create public pressure. Not only will it protect your privacy if you, say, stop using Google Search and start using DuckDuckGo, it also sends out a statement. It says: I care about privacy, I'm not consenting to this data economy, despite what companies say. And it creates a data trail as well. Every time you ask a company for your data, or ask a company to delete your data, even if you don't get a satisfying response, it creates a data trail, such that when that company has to face data regulators, the regulators have evidence that customers and users are actually not consenting to that kind of treatment and that those companies should be treated accordingly. So that's an overview of the book. I hope you read it, I hope you enjoy it, and if you do, please tell me what you think.

Thanks very much, Carissa. I'll second the suggestion that people buy the book and read the book. I think it's a really important topic and a very challenging book as well, because, as you just went through, you shine a light on some of the collective decisions that are made in societies and some of the for-profit decisions made by individual companies, but also on the possible acquiescence that each of us engages in individually. You know, I have what Eben Moglen from Columbia Law School called a machine that spies on me for a living in my pocket every day, and it is arguably busy spying on me right now, as I suspect it is for many, though not all, of the people on this call. I have maybe a few questions to kick us off before we open up, and I'll remind people again to please use the ask-a-question function on the Crowdcast platform; I will then take questions from there to field to Carissa later on in the conversation.

But maybe just a few things to get us started. I think your analogy of data to a toxic asset is one of the really arresting parts of the book and a powerful metaphor for where we might be. I wonder whether you can share with those who've joined the conversation today a bit more about how you think about where we are at this moment in time, about how much we know about the demonstrable harm of the kind of data collection that you're concerned about in the book, by states and by for-profit private companies. Do you think that we are closer to, say, the tobacco and cancer link?
You know, I happen to sit right now in the 13 Norham Gardens building in Oxford where Richard Doll did pioneering work in demonstrating that link. Do you think we're at the stage where there is a large volume of peer-reviewed scientific work that points in the same direction of identifying a demonstrable, objective harm? Or do you think we're closer perhaps to, say, the genetically modified food discussion, where I think it's fair to say the science is less consensual, arguably, and there may even be some scientists who believe that the harm is actually not demonstrated, but where there might still be a case for the precautionary principle, because the possible harm might be very dramatic and well beyond the modest, pragmatic gains that could be realised? Where do you feel we are with this toxicity, if you will? Is it a demonstrated fact, or is it more about a precautionary principle at this stage?

That's an excellent question. I would argue it's a demonstrated fact, with the caveat that with political and economic issues there's always more controversy as to causal relationships than with biological matters. But I think history teaches us that this has already happened, and if it has already happened, it will happen again. In the book I tell the story about the Nazis: when the Nazis invaded a city, one of the first things they did was to go to the registry, because they wanted to find the Jewish population, and that's where the data was as to where Jewish people lived. And when you compare the death rates of the Jewish population between the Netherlands and France, it's really striking. I think the most important difference between the Netherlands and France was how much data they collected in the census. France had made a conscious decision, for privacy reasons, not to collect certain kinds of data related to religious affiliation, ancestry, the location of, say, your grandparents and so on. Whereas the Netherlands had collected incredible amounts of information, because there was this one person, Lentz, who was a fan of statistics and wanted to create a system that, and this is a direct quote, followed people 'from cradle to grave'. In fact, he invented the ID cards. And this made a huge difference. In the Netherlands, the Nazis found and exterminated about 75 percent of the Jewish population, whereas in France they only found and killed about 25 percent of the Jewish population. And that difference means hundreds of thousands of people.
In particular, in France there was this person called René Carmille, who was the comptroller general of the army. He told the Nazis that, because the census didn't have this data, he would be in charge of creating a new census in which data about Jewish ancestry would be collected, and he set about this effort. What the Nazis didn't know was that Carmille was one of the most important people in the French Resistance, and he never intended to collect the data. So what he did, basically, was to hack the Hollerith punch-card machines that were being used, such that question number eleven, which was about Jewish ancestry, never got recorded. And by not collecting that data, he saved hundreds of thousands of lives. So we already know how dangerous data collection can be and how it has led to genocide. And we can see in China, for instance, how surveillance is being used to control the population, like the Uighurs, but also during the pandemic there have been incredible intrusions into the privacy of people, and we're already seeing much more rebelliousness from the Chinese population than in the past. And likewise, if you think about other kinds of authoritarian regimes, they tend to have a pretty strong surveillance police, just like the Stasi or the KGB in Russia.

I mean, I think your example is a powerful one, in particular the contemporary one of mainland China, which I think is striking and deeply worrying. Though with your historical example, I suppose General Eisenhower might have said that the Allies went straight for the registries as well, because they wanted to find the Nazis, and the data, while clearly dangerous, as you powerfully illustrate, can arguably sometimes be put to different uses. I want to pick up on these really powerful examples that you offer about, say, the historical records of the KGB and the Nazis, or today mainland China's treatment of workers in particular. You use the term surveillance society throughout the book, which is, I think, a powerful description of where we are, both in terms of state surveillance and corporate surveillance today. And you argue that the data economy has to be brought to an end, at least as we know it today, and replaced with something that puts the citizen at the centre. I wonder whether you want to share with the listeners your thoughts on whether, in the interim period before the data economy is perhaps abolished,
you think there are important distinctions to be made between different societies and different companies, say between mainland China versus the European Union versus the United States, or even between individual companies, between, say, Facebook or Google or Apple or other tech giants that are central to the collection of data at scale? Or do you think that enters into a form of moral relativism, a little bit like arguing over which form of fascism is worse, or whether Soviet authoritarianism or Nazism is worse? Is this for you something that is so clearly a violation of our fundamental rights and our autonomy as citizens, no matter how it's practised? Or do you want to draw out some differences that you think are meaningful between how different societies and different companies engage in surveillance?

Yeah, I think there are some meaningful differences. The most meaningful difference comes from asking: how does this company make its money? Whenever a company makes its money through data, that is the most concerning case by far, because they depend on that data and they'll do anything to protect it. So when one of those companies claims to care about your privacy, you should be very, very suspicious, because they are making their money through your personal data. So I do think that companies that use data, but not in a way that funds their services, are much less dangerous, and also much more open to changing in ways that might empower the citizenry and put the data under our control, because they don't have so much of a financial incentive to keep it. In the same way, even though both citizens in China and citizens in the West are subjected to a huge amount of surveillance, there are still many very meaningful differences. One of the meaningful differences, and I argue for this in the book, and it's one of the essences of liberalism, is what happens if you commit a minor infraction. In the West, say you have a party at your place and things get a little bit out of hand, the music is really loud and your neighbours are pretty upset, so they call the police. The police come to your place, knock on the door and say, hey, keep it down, lower the music. And that's it; those are the consequences. It's contained, and at most your neighbours are not going to like you very much. But in China, when you do something like that, it counts against your credit, your social credit score.
That could have effects on whether you get a job next month, or whether you can travel as a VIP on certain kinds of trains or stay in certain kinds of hotels. And that kind of system is much more totalitarian, because every little thing that you do counts towards how you're treated in general. Now, I fear that we're sliding into that in the West. When China gets criticised for this, many times what people say is, hey, that's an unfair criticism, because in the West you're doing exactly the same, and it's even worse because you don't even tell people; you make people think that it's not happening. And there's some truth to that. In the West there are systems like this: for instance, if you're living in the United States and you buy something and you're not happy with the product, you call customer service, and the time that you spend waiting might depend on what kind of customer you are, on your personal data, whether you have a lot of money, whether you're somebody influential. And that's something we don't get told about, how we are being treated. So one of my fears is that we're sliding more and more in that direction as a society. I don't think we want that, I don't think that's a smart way to go, and we should try to start walking the other way.

Well, for those worried about what Henry Luce called the American Century, perhaps wait to pass judgement until we see what might come next, one could offer. Maybe just one last question from me before I start taking questions from the ask-a-question function. Again, please use the ask-a-question function and I will pick questions from there for Carissa to respond to. One theme you strike at throughout the book, and have also spoken very powerfully to just now in your presentation, is the idea of a choice. You write in the book that we've been asked to give up our personal data in the name of the public good. I wondered whether you want to expand a little bit on how much you feel this has really been a choice, and I suppose in particular whether you have any reflections on the legitimacy of this choice, and whether it's possible that people may be what I think some commentators, and certainly some industry spokespeople, have called privacy pragmatists, who more or less knowingly decide, having listened to your argumentation, to say, well, we like the convenient things supported by data-driven ad economies, so thanks very much for the warning, but, you know, we like our search engines and our social media and our mobile phones.
So we will just carry on with your warning in mind. How much of a choice do you really think this has been for us as individual citizens? Is this something that's akin to climate change, where it has recently been argued that we have quite a lot of agency as individuals, that I can't foist onto the state and onto big companies the responsibility I take on as a consumer when I decide to jet around the world and emit tons of carbon into the air; that is my responsibility, as it is my choice, and I have to answer for it? Or do you think it's closer to something structural, where it's not really meaningful to talk about a choice that I make as a consumer when I have something that spies on me in my pocket?

I think we didn't have a choice. I think we got duped into the system, and once it was there and once we learnt about it, they told us, well, this was the deal, you knew about it, you ticked the terms and conditions, when everybody knows that nobody can possibly read the terms and conditions, and we don't even understand them, and they can change at any point. So I don't think we had a choice. I think now we have a choice, because now we know what's going on, and we have a choice in the sense that we have agency in pushing the direction of our society one way or another. If you try your best to protect your privacy, I can tell you now, you'll fail. But it doesn't matter. You don't have to be perfect; you don't have to achieve perfect privacy to make a big, big impact. I think people would be amazed at how much companies and governments are worried about what they think. People's opinions are being monitored on social media all the time, and companies are very sensitive. If they start realising that privacy can be a competitive advantage, things will change pretty quickly. So in a way it is something like climate change, because we now have the agency to push one way or another.

One aspect of your question was, well, what if people just say, OK, thank you for the warning, but I'd rather have really convenient things? There are two answers to that. One is that you don't have to give up convenience. You can use Signal and have much more privacy than with other messenger apps, and it works just the same; it's not going to make you lose anything. So this is another kind of myth that I think is completely wrong, that we have to choose between privacy and cutting-edge tech. Cutting-edge tech can work perfectly fine while
protecting privacy. The data economy is just a business model; we can fund these things in a different way, so you don't have to choose. That's one answer. And the second answer is that one of the unfortunate characteristics of privacy is that often you don't realise how important it is until you lose it, and once you lose it, it's too late; you can't recall it. So once your identity gets stolen, you have to go to court to try to show that it wasn't you who committed that crime; your privacy has already been stolen and the damage has been done. And because there is this collective aspect to privacy, and I argue that privacy is collective, it's not clear to me that we have the moral authority as individuals to say, you know, I choose to give up my privacy, because you are choosing to give up our security, and that matters as a society. So I think even if people were to say, well, I'm not that worried, they're wrong, and we should take measures, just as we take measures on public health issues or in ecology, even if there are some people who think, well, I don't have any kids and I don't care about what happens to the world in 20 years' time, so I'll just pollute as much as possible.

I mean, thanks for this, because I think in some ways it really puts a point on it for me, in the sense that I, like you, was very struck by the Snowden revelations in 2013 of mass government surveillance piggybacking off the infrastructures built by private, for-profit companies, and by all the revelations his very brave act has enabled since, by The Guardian, The New York Times and others. And at the same time, I have to be honest and say I didn't change very much in my behaviour. I felt like I learnt something, and I think there was an intrinsic value to knowing something about our societies, to these things being intelligible rather than opaque. But at the same time, I suppose that in practice my thinking as a citizen was that there are different, incommensurate values here, and a traditional conception of privacy is one of them, but it may not be the only one, and that I might have felt as a citizen that my autonomy was in fact enhanced by the data economy that is so demonstrably problematic in the many different ways you outline so powerfully in the book. And I really appreciate the book in part because it calls out people like me for being immoral, basically, and I think it puts a very clear point on why it's important to read your book and really think about the choices that we make individually,
but perhaps more importantly as societies. Now, as promised, I'll open up and start looking at the questions. Just bear with me for a few seconds as I start picking a few.

So, let's start with a question from Anna, who notes that you've written that six years ago, when you started working on this topic, few people were interested in discussing privacy. What are the changes you would most like to see in the six years after this book?

I would like to see many more regulatory changes; I would like to see really important steps towards banning the data economy. And I'm optimistic, in the sense that the US is discussing different privacy bills, and I think it's a matter of time before they happen. I would also like to see much more effort on the part of companies to be serious about privacy and to offer privacy as a competitive advantage, and I think people will start choosing privacy more and more. In a recent survey I carried out with Siân Brooke from Oxford, we found that 92 percent of people have had some kind of bad experience with privacy. So I think privacy is something that we forgot about, because it was relatively successful until the digital age, and then these tech companies told us, hey, you don't need to worry about this, it's a thing of the past, and we sort of trusted them for a while. And now we're realising, no, actually, this is harming us. More and more people are having bad experiences: either we are having bad experiences ourselves, or we know somebody who had their credit card number stolen, who had a troll do something to them online, who suffered doxxing. There are many kinds of harms, and as that happens, I think people will make more of an effort to choose privacy.

Thanks. I'm going to continue with a question from Christina, who I think presents a version of the situation I described in myself: I am merely immoral, you know, I have thought about these things and, I suppose, in practice decided just to carry on. But Christina rightly draws our attention to the so-called privacy paradox, that there often is a discrepancy between people's expressed privacy concerns in surveys and the like, and their actual documented online behaviour. So Christina asks, how can we convince people that their privacy should not be traded for efficiency? How can we sensitise people to a danger that is not immediately pressing?

I think there are many hypotheses for why the privacy paradox comes about.
My own view is that it comes about because we feel we don't really have a choice. You think, OK, why am I going to protect my privacy if I know that I'm being surveilled anyway? Why would I bother to say no to cookies when I know all these other things that I'm not even being asked about are going on? And what I'm trying to argue in this book is that it matters. You shouldn't think, oh well, either I protect my privacy perfectly or nothing makes a difference. Every time you say 'I choose privacy', it gets recorded, and it matters, and it creates public pressure. So I want to offer people a view both of how much it matters to resist, of how much they can resist, and of what the alternatives are for them to resist. And I'm hoping that this might motivate us to be a bit more privacy conscious and a bit more coherent with our beliefs.

Because it's a direct continuation of this area of enquiry, I'm going to pick one next from Anna, who argues that surveillance capitalism, as a sort of subset of the surveillance society term, effectively outsources privacy management, and the guilt for its possible mismanagement, to us as individuals. And she is interested in your thoughts on what, if anything, we can do to re-collectivise this and make it a society-level choice rather than something that each of us feels we have to navigate ourselves, one cookie banner at a time.

There are a few possibilities. Some people discuss the possibility of data trusts; that could work, it would work something like a union, and it might be a way to go. But I find it more practical just to think of regulation. We don't have to check whether what we buy in the supermarket is edible every single time. How did that come about? Well, it happened through a process of regulation. So I think we're going through a process of civilisation that is similar to what we saw in the offline world, one that made the world more liveable. We don't have to check whether our aeroplane has gone through its security checks; we can trust that our countries are run well enough that somebody is taking care of that, and in fact we have many fewer accidents than we used to have. So I think it ends up having to come through regulation, but regulation will only happen when there are enough people asking for it.
And I think there are a lot of politicians who want to protect privacy, but they need to know that they have the citizenry backing them up in order to really be able to face these tech giants, which put enormous pressure on them.

I wonder whether it's maybe worth pursuing this a little bit further, just to hear any further thoughts you have. In the book you powerfully describe essentially a triangle, with some element of a zero-sum game, between data as a source of power for governments, data as a source of power for private, for-profit companies, and data as a source of power for us as individuals, as members of the public, as citizens. I am encouraged by your description of the history of regulation in the 20th and 21st centuries, though I suppose it might come across as quite an optimistic take, if you will. I think it's true that there are areas in which we have made progress, and we arguably live in societies where we, at least in privileged parts of the world, have to be less afraid that someone is going to poison us just to make a quick buck than many people had to be early in the 20th century, and of course many people across the world still have to be very afraid, including of being poisoned by Western companies selling various things that are not actually fit for human consumption. But something that you made quite explicit at the beginning of our conversation, and that is quite present in the book as well, is that while for-profit companies, whether it's the tech giants or ISPs or, you know, back-end consumer data insight companies like Acxiom and others, are often the interface for this data collection that we encounter as citizens, it seems naive after the Snowden revelations to forget that the state is not a disinterested party, to put it mildly, and that it might in fact be quite convenient, from the point of view of quite a lot of executive-arm functions, law enforcement, security, but also more benign ones, if you will, public health and others, to have mass surveillance available and to be able either to dip in selectively or just to get all of it through the back doors, as we now know some governments have done.
And in that sense, I just wonder whether there is a risk sometimes in these discussions that we get so focussed on the very real problems that a number of for-profit private companies are central to that we forget that the state is not an impartial and disinterested actor in this, but is actually arguably quite central, and in some countries extremely central, to this whole situation, and quite happy with it.

That's an excellent point. From the start, the surveillance society has been a covenant between private and public institutions, and it happens all the time: data that you give to companies ends up with the government, and data you give to the government ends up with companies. But one reason for optimism is that I think the first reaction to 9/11 was very short-sighted. It was a panicked reaction, a reaction that was meant to show citizens that something was being done, and in part, of course, because at the beginning it was secret. But I think more and more governments are realising how dangerous it is for national security to have this insecure Internet, and that is an incredibly valuable and important realisation for us as citizens. So you're seeing now, for instance, how the US doesn't want TikTok to function with China, and how more and more these kinds of tech companies are having to put in place security frameworks for national security purposes. Another example that I give in the book is Grindr, and how dating apps are being looked at by intelligence agencies, because it's so easy to get very sensitive information about people who work in government and then possibly to blackmail them. There are many security risks there. So I think that gives governments an incentive to make the Internet much more secure, and we should use that to leverage our case. Another point is that governments, especially democratic governments, are not a single institution that always thinks the same thing; they're not an individual. There are many parts to government, and we have to make sure that, while intelligence agencies are important and their voice should be heard, there are other aspects to a good democracy that may be just as important or more important, and that we have a view of what kind of society we want to live in, rather than making these very short-sighted decisions that might take us down a wrong turn.

Thank you. There's a question here from Martin Butler that I think is pertinent to this.
449 00:45:13,970 --> 00:45:19,370 I mean, Martin asks, given the business models the tech giants use, you know, 450 00:45:19,370 --> 00:45:22,700 which depend on using our data, and for some of them, as you argue, 451 00:45:22,700 --> 00:45:29,750 this is absolutely essential while for others it's sort of additive: is it likely that we would ever be able to take back control of this data? 452 00:45:29,750 --> 00:45:36,290 And if I may sort of expand on Martin's important question. 453 00:45:36,290 --> 00:45:40,400 If we turn for a moment from whether we as citizens can influence governments, 454 00:45:40,400 --> 00:45:48,230 which you rightly argue are not a unitary actor with a single interest, to thinking about companies, I mean, 455 00:45:48,230 --> 00:45:51,320 one question, Martin's question, is whether this is ever going to happen. 456 00:45:51,320 --> 00:45:58,250 I suppose a corollary question is: let's imagine you had half an hour with Mark Zuckerberg or some other CEO. 457 00:45:58,250 --> 00:46:03,500 You know, is there a case that you could make to either of them about how it might be in the 458 00:46:03,500 --> 00:46:07,970 long-term interests of their company or their shareholders to clean up their act? 459 00:46:07,970 --> 00:46:15,560 Or is your message to them: go to hell? So, yes, I think it can definitely happen again. 460 00:46:15,560 --> 00:46:22,040 If you look at history, there are things that have happened, good things, that we never thought were remotely possible. 461 00:46:22,040 --> 00:46:26,720 A minor example is just the GDPR. When I started working on privacy, people laughed at me. 462 00:46:26,720 --> 00:46:32,510 Really, privacy? What? I mean, you're just doing history, you're not doing up-to-date philosophy. 463 00:46:32,510 --> 00:46:35,210 And the GDPR seemed like something impossible. 464 00:46:35,210 --> 00:46:41,720 It's not perfect, but for all its faults, it's a real high point in history and we have to acknowledge that. 465 00:46:41,720 --> 00:46:51,020 And more importantly, things like democracy, or votes for women, or the Internet, or getting rid of slavery: 466 00:46:51,020 --> 00:46:56,420 these were things that didn't seem possible at some point. And in particular with slavery, for instance, 467 00:46:56,420 --> 00:47:02,310 there was a very strong argument that it just wasn't economically feasible to get rid of slavery, and yet we did. 468 00:47:02,310 --> 00:47:08,000 And of course, you know, slavery was much worse than the data economy in many, many ways, and I'm not trying to make an analogy. 469 00:47:08,000 --> 00:47:13,070 But the point is that we have done things in history that didn't seem possible. 470 00:47:13,070 --> 00:47:17,720 And again, with regulation regarding safety and food and so on: 471 00:47:17,720 --> 00:47:21,460 if you were to tell somebody who lived in the fourteen hundreds or fifteen hundreds 472 00:47:21,460 --> 00:47:28,210 how the world has developed, they would be very surprised that we have such a degree of organisation and affordance. 473 00:47:28,210 --> 00:47:32,290 There are many reasons for pessimism as well. But we should focus on the things that we've done. 474 00:47:32,290 --> 00:47:38,330 And if I went to Facebook, I think I would say: look, you have a business model that's based in the past. 475 00:47:38,330 --> 00:47:40,910 If you want to survive, you have to reinvent yourself.
476 00:47:40,910 --> 00:47:48,050 Just like oil companies will have to reinvent themselves, because certain kinds of models are just not sustainable. 477 00:47:48,050 --> 00:47:57,530 If I may make a personal observation, it's refreshing to meet someone who is an optimist in October 2020, and a real inspiration. 478 00:47:57,530 --> 00:48:01,410 And I applaud and admire your optimism. I suppose, your 479 00:48:01,410 --> 00:48:06,910 person from the fourteen hundreds: if you told someone from ancient Rome how they lived in the fourteen hundreds, 480 00:48:06,910 --> 00:48:10,080 you know, she would think that the world had gone to hell in a handbasket. 481 00:48:10,080 --> 00:48:18,930 So I suppose I remain mixed in my belief in the arc of history always going in the right direction. 482 00:48:18,930 --> 00:48:24,930 Nonetheless, let's turn to Jules's questions. There have been a couple of questions from Jules that are along the same lines, 483 00:48:24,930 --> 00:48:31,410 so I'll take the chair's privilege of consolidating them. But I mean, effectively, you know, 484 00:48:31,410 --> 00:48:44,430 given that we have lived through a period of rampant data collection by private companies and by some governments, 485 00:48:44,430 --> 00:48:49,950 and arguably a lot of this data is stored, sometimes obtained using dubious methods, 486 00:48:49,950 --> 00:48:58,390 at best a grey zone, at worst things that are at the very least against terms of service and arguably often illegal. 487 00:48:58,390 --> 00:49:05,330 You know, do you think there is a world in which it becomes possible to essentially get this data deleted? 488 00:49:05,330 --> 00:49:11,750 I think so. I think it's necessary. And it's one of the areas where we have to really change how we think about data. 489 00:49:11,750 --> 00:49:16,940 Data should never be held for as long as possible. 490 00:49:16,940 --> 00:49:25,310 That's just crazy. So we need to implement systems in which data is routinely deleted for safety reasons. 491 00:49:25,310 --> 00:49:30,020 And it's not going to be easy. And we need better ways of auditing data. 492 00:49:30,020 --> 00:49:40,040 So one of the challenges we have is whether there can be a way to design data systems such that you can know exactly where your data is, 493 00:49:40,040 --> 00:49:45,200 that your data is tagged, such that you could have an app or some kind of platform in which you could track it and you 494 00:49:45,200 --> 00:49:49,870 would have control of deleting it and you could check that it was deleted. At the same time, 495 00:49:49,870 --> 00:49:56,960 that might create a privacy risk because, of course, tagging data means that it's being tied to you. 496 00:49:56,960 --> 00:50:04,700 And somebody like Tim Berners-Lee, the creator of the World Wide Web, is working on such a system, called Solid. 497 00:50:04,700 --> 00:50:13,970 And we'll see if it's possible. That would be an incredible achievement for privacy, because it's so central to this question. 498 00:50:13,970 --> 00:50:20,180 I'm going to dive down that list a little bit and surface a question from our colleague, Peter Milliken, 499 00:50:20,180 --> 00:50:25,430 because it is essentially, isn't it, this question of data that's already been collected. 500 00:50:25,430 --> 00:50:32,540 And Peter asks, effectively, whether our rights expire the moment we do. So:
501 00:50:32,540 --> 00:50:36,590 Do you think that people ought to be accorded the right to privacy after they are dead? 502 00:50:36,590 --> 00:50:44,930 And would you apply the same constraints as you would apply to the living, even after someone has been dead for, say, 100 years? 503 00:50:44,930 --> 00:50:49,950 That's a question that has been in my mind since I started researching my family's history. 504 00:50:49,950 --> 00:50:55,430 And originally I had thought that would be the main question of my dissertation. I never got to it. 505 00:50:55,430 --> 00:50:59,600 I still haven't got to it, I'm afraid to say. So I'm not sure. 506 00:50:59,600 --> 00:51:04,220 I don't think I would give it exactly the same value, but I think it has some value. 507 00:51:04,220 --> 00:51:08,690 So I think there are certain interests of the living that are more important than the interests of the dead. 508 00:51:08,690 --> 00:51:13,340 But I think there's something to be said for some kind of privacy rights after death. 509 00:51:13,340 --> 00:51:20,630 So say, you know, if there's going to be something that will totally ruin your reputation and that is not in the public interest, 510 00:51:20,630 --> 00:51:24,470 I mean, it's not relevant to the kind of contribution you have made to society, 511 00:51:24,470 --> 00:51:30,470 I think there is some kind of argument to limit what we can share. Say, for instance, on health data, 512 00:51:30,470 --> 00:51:37,310 if nobody's going to benefit from it, you know, in a medical way. Here we could look at the laws in Germany. 513 00:51:37,310 --> 00:51:41,690 Germany is one of the countries that recognises a right to privacy after death. 514 00:51:41,690 --> 00:51:48,020 But it's still something for me to work on in the future. Thanks, Charissa. 515 00:51:48,020 --> 00:51:53,630 I'm going to turn to a question that Stephen Lawrence has raised in the Q and A, 516 00:51:53,630 --> 00:52:02,810 which is effectively another dimension of the sort of reformist project that's been quite central to the discussion tonight, 517 00:52:02,810 --> 00:52:08,270 which is: what do you think about the standard of data privacy education in schools? 518 00:52:08,270 --> 00:52:15,470 And more broadly, your thoughts on education and awareness raising in this broader area? 519 00:52:15,470 --> 00:52:21,170 And Stephen remarks that, you know, in his experience, in conversation with family and the like, 520 00:52:21,170 --> 00:52:25,720 it's quite hard to get people to care, as they do not see the harm. Perhaps members of Stephen's family 521 00:52:25,720 --> 00:52:34,970 are, maybe, immoral people like myself, or they're at least caught in the privacy paradox that was raised before. 522 00:52:34,970 --> 00:52:38,540 I have a whole section in the book on children, because I think it's so important. 523 00:52:38,540 --> 00:52:42,530 And I worry about us bringing up children in this surveillance society in which 524 00:52:42,530 --> 00:52:47,180 we teach them that essentially rights don't matter, because their privacy is a right, 525 00:52:47,180 --> 00:52:49,980 but, you know, it doesn't really apply, 526 00:52:49,980 --> 00:52:57,650 and in which we don't let them mature, because I think there is a lot to be said for growing up without being watched all the time. 527 00:52:57,650 --> 00:53:03,650 That makes you into a functional adult. I mean, if you're watched all the time, there are things that you might not question.
528 00:53:03,650 --> 00:53:10,710 There are things that you might not do. And there are ways in which you might not develop. In schools, I think this is changing a lot. 529 00:53:10,710 --> 00:53:16,680 It's changing a lot, also depending on the country. In the US, it's quite frightening the degree to which children are being surveilled. 530 00:53:16,680 --> 00:53:20,910 They're being surveilled for what they search online, 531 00:53:20,910 --> 00:53:30,240 what messages they send even between themselves, and their biometric details are getting collected. 532 00:53:30,240 --> 00:53:38,700 And, of course, with the pandemic this has become a lot worse, because a lot of children are having to learn online 533 00:53:38,700 --> 00:53:44,850 and don't really have a choice but to engage with platforms that are not very privacy friendly. 534 00:53:44,850 --> 00:53:52,200 I think the consequences of this are still to be felt. And that's one of the dangers: once we really feel them, it'll be too late. 535 00:53:52,200 --> 00:53:59,240 But a few of the things that I worry about are not giving people a chance to really mature, 536 00:53:59,240 --> 00:54:03,800 not giving them the chance to explore topics that might be controversial because of the fear 537 00:54:03,800 --> 00:54:10,520 that they will be punished or somehow face bad consequences from their teachers, 538 00:54:10,520 --> 00:54:19,160 from society, from prospective employers, from universities when they apply to university, but also how they might be disadvantaged 539 00:54:19,160 --> 00:54:24,740 with respect to previous generations, in that everything they do can have a consequence. 540 00:54:24,740 --> 00:54:31,460 Every game they play could be used to infer their IQ or abilities and then make them unemployable. 541 00:54:31,460 --> 00:54:40,250 Or any party they go to, they have to be aware of how everything could be broadcast, every picture that is taken of them. 542 00:54:40,250 --> 00:54:45,780 And I think this has a risk of creating people who are 543 00:54:45,780 --> 00:54:57,420 very tame, who would have to comply with very strict rules in order to be a functional member of society. 544 00:54:57,420 --> 00:55:05,100 Thank you. Let's just have a final question before we wrap this up, which is, I think, to bring out the truly political 545 00:55:05,100 --> 00:55:10,440 nature, really, of a lot of the argument and also of the position that you advocate yourself. 546 00:55:10,440 --> 00:55:13,410 I mean, I guess, simplifying and exaggerating, at least in Europe 547 00:55:13,410 --> 00:55:20,700 I think you can say that the liberal tradition was always historically more afraid of the state and state abuse of power than anything else, 548 00:55:20,700 --> 00:55:26,550 and the socialist and sort of Christian Democrat traditions to some extent were actually more afraid of capitalism, 549 00:55:26,550 --> 00:55:32,720 really, than they were of the state. And Social Democrats were equally afraid of both of them. 550 00:55:32,720 --> 00:55:36,750 So where would you put yourself? Is it the state we have to worry the most about? 551 00:55:36,750 --> 00:55:41,990 Is it capitalism we have to worry the most about, or is it both? It's both. 552 00:55:41,990 --> 00:55:46,040 It's always about data being treated in ways that can be against us and not for us.
553 00:55:46,040 --> 00:55:53,560 So I think that's why I have fiduciary duties as one of the most important elements of what I propose. 554 00:55:53,560 --> 00:55:58,210 Well, Social Democrats may not be able to win any elections in any actually existing European countries, 555 00:55:58,210 --> 00:56:05,980 but I'm sure they'll be reassured by the thought that they might still have a chance in Oxford seminars at least. Something beats nothing, 556 00:56:05,980 --> 00:56:10,120 as I like to say. I think we should probably wrap it up here. 557 00:56:10,120 --> 00:56:14,770 Thank you so much for sharing your insights with us on this call today. 558 00:56:14,770 --> 00:56:19,570 Again, I want to recommend buying Charissa's book Privacy's Power, which I can't help but notice 559 00:56:19,570 --> 00:56:27,250 is entirely accidentally on display in the background of Charissa's living room, and I'm sure it's always standing like that. 560 00:56:27,250 --> 00:56:33,250 I warmly recommend it. And I genuinely meant what I said along the way, that I think this is such an important and interesting book, 561 00:56:33,250 --> 00:56:42,220 in part because it really challenges the sort of acquiescence or pragmatic compliance, if you will, 562 00:56:42,220 --> 00:56:48,220 that people like myself, and I suspect at least some of the people who have joined the conversation today, have engaged in, and it really challenges us, I think, 563 00:56:48,220 --> 00:56:53,470 to think in much bigger terms about what the long-term societal and collective consequences 564 00:56:53,470 --> 00:56:59,620 are of individual, discrete decisions that may seem quite innocent on their own. 565 00:56:59,620 --> 00:57:01,360 But that, I think, really 566 00:57:01,360 --> 00:57:08,220 challenges us to think about these things in a very different light, and it's a very powerful book that I hope all of you will buy and read. 567 00:57:08,220 --> 00:57:15,790 So thanks to all of you for joining us today. Thanks for all the questions, and apologies for not managing to get through all of them. 568 00:57:15,790 --> 00:57:22,750 And thanks, of course, more than anything, to you, Charissa, for this really interesting, important book and for your time this evening. 569 00:57:22,750 --> 00:57:26,260 Thank you so much, Rosmus. It has been a lot of fun. And just to end, 570 00:57:26,260 --> 00:57:31,630 I would like to invite people who think that it might be crazy to call for an end to the data economy, 571 00:57:31,630 --> 00:57:36,640 given how important it is, just to think about how crazy it is to have a business 572 00:57:36,640 --> 00:57:43,330 model that depends on the massive and systematic violation of everybody's rights. 573 00:57:43,330 --> 00:57:48,826 What a powerful place to end. Thank you so much, Charissa. Thank you. Good evening, all.