Hello. I'm Katrien Devolder. This is Thinking Out Loud: conversations with leading philosophers from around the world on topics that concern us all. This is a special edition on ethical questions raised by the coronavirus pandemic. This time, we'll be talking to philosopher Carissa Véliz from the University of Oxford. In her book Privacy Is Power, she exposes how our personal data is giving too much power to big tech and to governments, why that matters, and what we can do about it. So she seems like the right person to talk to about the use of contact tracing apps in our fight against the coronavirus pandemic. There are different types of app, and they can be centralised or decentralised, but they all aim to let people know if they have been in close contact with someone who has COVID-19. If so, they should get a test and self-isolate. So let's find out why these apps may pose a threat to our privacy.

So some people are worried about these apps, especially because they may threaten our privacy. Do you think that such threats to privacy are real? Should we be worried about privacy when we use such apps?

It's definitely a consideration. One of the problems that many apps have had, including, for instance, the one used in Qatar and the one used in India, is that they're very easy to hack. Location data is incredibly sensitive. Through location data, you can know where somebody lives, where they work, who they sleep with, whether they go to a hospital. You can infer whether they do drugs, all kinds of sensitive information. And this is particularly sensitive in the context of a disease that can kill. So if you can hack people and figure out, or try to figure out, who might have infected their loved ones who might have died from the disease, it can be very, very sensitive. So that's one concern: the possibility of a hack. Another concern is if you have a centralised database with all these connections between people. That's, again, very sensitive information. It not only tells you where people live and work and so on, everything location data tells you, but also their networks, their connections, and the networks of the people around them. So you can know whether a journalist met with a source, whether a lawyer met with a client, a doctor with a patient, and so on. And again, this is very sensitive, especially in the wrong hands. So if we think a government might be untrustworthy, then it's a really big worry. And I think many people don't really care about these privacy threats because they think they have nothing to hide, and that privacy is only an issue for celebrities or politicians or journalists maybe, or criminals.
So are they making a mistake then? I mean, should it matter to them too?

Yes, they're making various mistakes. One mistake is that they do, in fact, have something to hide and something to fear. In a recent survey that I did with a colleague, we found that about 92 percent of people have had some kind of bad experience related to privacy online. So eventually, if you're not careful with your privacy, you're going to get hacked, you're going to get your credit card number stolen, you're going to be exposed in some way, you're going to be publicly humiliated. This happens to normal people who are not lawyers and not journalists, nobody special. So on the one hand, yes, you do have something to fear for yourself. And then the second kind of mistake is not seeing that you should take care of your privacy for other people as well, for society in general. When we don't protect our privacy, we leave our democracy at risk. An excellent example is Cambridge Analytica. It's like a textbook example of how knowledge is power and how privacy is related to power in that way, because the more people know about you, the more they can hack you, in a way, or the more they can exploit your vulnerabilities. What happened with Cambridge Analytica was that they exploited people's psychology to build this psychological warfare tool that they could use on any voter to try to sway elections. And that affected us all. It affected the people in the United Kingdom, who are being affected through Brexit, but also Europeans, who are also affected by Brexit, and Americans, who are affected by the Trump election. And anybody who is touched by world politics, and that's pretty much everybody, gets affected.

One could say, well, privacy is important, but this is a pandemic. I mean, shouldn't we risk our privacy if this can save many lives?

Of course, we should consider it very seriously, but we shouldn't be uncritical. There is this phenomenon that's very common in crises in which, you know, we really want to get out of it as soon as possible. People are dying, it's really extreme, and the temptation is to say, yes, let's do whatever it takes, no matter what, and to assume that that's going to work. Actually, there are many questions about that. The first thing to notice is that, of course, a contact tracing app works on a phone; if you don't have a phone, it doesn't work. Many people don't have a phone, or don't carry their phone around all the time. But let's suppose that everybody has a phone and everybody carries it around. Even then, you have at least two problems: false positives and false negatives.
So suppose you meet someone down the street, and it's a friend, and you haven't seen each other in a while. So you give each other a hug and a kiss, and you're not in contact for more than 15 minutes, so the app doesn't record it as a contact. But actually, you were close enough that there was a contagion there. The app wouldn't be able to notice it, and so it would be a case of a false negative: you would feel safe because your app says you're absolutely fine, and in fact, you're not. And then there's the case of false positives. One case would be one in which you are very close to another person, maybe on the same floor but in different rooms, or on different floors of the same building. And because you're separated by a wall, there was never a risk. But your app will record that as a contact.

Perhaps one could say, well, many of us already give up much of our privacy through the Internet, especially through the use of apps and social media. So perhaps using a track and trace app only adds a small additional risk to the risk we already take. If that's true, the benefits of using the app can more easily outweigh the small additional risk that comes with it. So doesn't that provide a good reason to use the app, as long as we're going to keep being reckless with privacy online anyway?

Well, again, it depends on the app. If there's an app that's, you know, really well tested, and it looks very secure and decentralised, then maybe the answer is yes. If it's a centralised app, probably not, because that kind of data is very sensitive. So even though you give it up in other ways, it's harder to aggregate it in the same way. Furthermore, this is data that will end up in the hands of governments, and even though we give a lot of data to corporations, and we shouldn't, and many times governments get a copy of that data, there is a lot of leeway for abuse, and governments can force people to do things that corporations typically can't. They have a different kind of power relation with citizens. And that is a further risk.

And what do you think about the argument that the track and trace apps are actually much less intrusive than measures we already accept? So some people say, well, if we accept, you know, lockdown, social distancing and so on, shouldn't we also accept the track and trace apps, even if there is this potential harm?

I mean, there are many considerations to take into account there. One is that, you know, it's a slippery slope. It's like, OK, if we accept this, then we should accept anything, and that just leads us to a very unreliable place. Secondly, again, it matters a lot how effective something is.
So social distancing is directly effective in avoiding contagion in a way that the app is not, and so it makes a huge difference. And furthermore, there is this danger about privacy, because in other kinds of cases, when you give something up, the consequence is very direct and very probable. So when you isolate in your home, it's very direct why it's bad for you and how much you're missing people, and the opportunities that you're missing, both professionally and personally. It's very tangible, it's very clear, it's very fast. Privacy doesn't work like that. You give up your privacy and you don't feel anything. You don't bleed, there's nothing to grieve, your skin is not broken; there isn't a palpable feeling. Then down the line, maybe two months down the line, maybe a year down the line, your identity gets stolen, your money gets stolen, you end up with a government that abuses its power, or you get discriminated against by a company.

Is it conceivable to have an app that is not so dangerous, or should we just not use any track and trace app at all?

It's conceivable, although the way to do it is first to put in place all the physical conditions that are necessary for the app to work. So first you have to have a good programme of mass testing and track and trace, because even if you have an app, you need manual track and trace, because there is a kind of information that you can't get from an app. A manual track and trace programme will figure out whether two people who met were in a park or in an inside place with no windows, and that kind of information makes a huge difference. So once you have all the physical requirements in place, then you design an app that's super secure. You go to privacy experts, you talk to them, you ask for their advice, you have them test it. You go to the computer scientists and you challenge them to hack it. And, you know, people want to save lives. Nobody wants to be in this situation; this is horrendous. So experts will collaborate, and they have collaborated. Once you get the green light from people who say, yes, you know, I've tested this app and I couldn't break into it, and once a privacy expert says, you know, I've looked at this app and it's absolutely as solid as can be, then people will get behind it. But if you do it in this disorganised fashion, if you don't inform people, you don't give reasons, you design an app that is so faulty that everybody can see that it's leaking everywhere, then it's going to be a mess.

Thanks for listening to this Thinking Out Loud interview.
You can also watch the Thinking Out Loud videos on YouTube, on the Practical Ethics channel, and stay up to date with the Thinking Out Loud Facebook page.