1 00:00:02,630 --> 00:00:09,760 It's great to be here. Thank you all for coming. I'm going to talk a little bit about the place of philosophy in the ethics of AI 2 00:00:09,760 --> 00:00:19,000 and how we can contribute to this field. Practical ethics was developed in the 1970s. In the 1940s and 50s, 3 00:00:19,000 --> 00:00:25,900 it was understood that ethics didn't have much to do except analyse moral language, 4 00:00:25,900 --> 00:00:30,700 analyse the meaning of terms like good and right and obligatory and things like that, 5 00:00:30,700 --> 00:00:38,230 and maybe explore the truth conditions under which a particular utterance was true or false. 6 00:00:38,230 --> 00:00:45,340 And then in the 1970s, a lot of social movements happened that put pressure on the discipline both externally and internally, 7 00:00:45,340 --> 00:00:49,990 and that made philosophy engage more directly with political issues. 8 00:00:49,990 --> 00:00:59,680 The Vietnam War was an important thing that happened, and philosophers started to think about feminism, 9 00:00:59,680 --> 00:01:06,430 questions about abortion and sexual orientation and discrimination, and so on. 10 00:01:06,430 --> 00:01:11,590 And this really changed the discipline. First, a lot of students became very, 11 00:01:11,590 --> 00:01:18,430 very much involved in philosophy and signed up to philosophy courses at a time when philosophy was kind of losing steam. 12 00:01:18,430 --> 00:01:29,970 And whenever I read about these moments in philosophy's history, I always felt a bit of jealousy and thought I came to philosophy 30 or 40 years too late. 13 00:01:29,970 --> 00:01:36,580 You know how interesting and how important it is to change the discipline in a way that can contribute positively to the world. 14 00:01:36,580 --> 00:01:41,200 And one moment that was particularly important was this. 15 00:01:41,200 --> 00:01:50,050 In 1972, the New York Times came out with a story about the Tuskegee experiment and the Tuskegee scandal, 16 00:01:50,050 --> 00:01:55,330 and this had been an experiment that carried on for 40 years in which people who had 17 00:01:55,330 --> 00:02:01,810 syphilis were observed and they weren't treated even though treatment was available. 18 00:02:01,810 --> 00:02:08,260 And it was a huge scandal that pushed the discipline of medical ethics forward. 19 00:02:08,260 --> 00:02:13,510 And I think in many ways we are in a similar situation to the 1970s: 20 00:02:13,510 --> 00:02:18,310 we have new technologies that are facing us with new problems that we haven't faced before, 21 00:02:18,310 --> 00:02:24,470 and computer scientists are not particularly well trained to think about these things. 22 00:02:24,470 --> 00:02:33,250 And we have new scandals, like the Cambridge Analytica scandal amongst others, that make tangible the need for ethics. 23 00:02:33,250 --> 00:02:44,610 So practical ethics, more generally, tries to come up with ways to put theory into practice, and so there's a question of, well, OK, 24 00:02:44,610 --> 00:02:50,310 that's what practical philosophy does, more or less, but how can philosophy contribute to ethics? In particular, 25 00:02:50,310 --> 00:02:57,720 what is special about the philosopher, and what is special about the philosopher in the context of the ethics of A.I.?
26 00:02:57,720 --> 00:03:03,900 And this is just the latest version of a question that has been bothering philosophers for a long time, 27 00:03:03,900 --> 00:03:10,000 and many people have given different answers. So this is just a small sample. 28 00:03:10,000 --> 00:03:17,140 And although there are many controversies within this debate, maybe the most important of which is 29 00:03:17,140 --> 00:03:24,130 whether there are moral authorities and what we mean by moral authority, there's much more consensus than disagreement within that debate, 30 00:03:24,130 --> 00:03:31,570 and perhaps the most important point of consensus is that philosophy can offer conceptual analysis in the 31 00:03:31,570 --> 00:03:37,690 hope of leading to better decisions and also better justifying decisions that have already been made, 32 00:03:37,690 --> 00:03:46,420 especially to those who lose out, you know, in a decision, and with the hope that conceptual analysis can make debates sharper, 33 00:03:46,420 --> 00:03:52,340 make them shorter and make them less intractable. 34 00:03:52,340 --> 00:03:56,240 So what is conceptual analysis and what kinds of things does it include? 35 00:03:56,240 --> 00:04:03,350 It includes things like clarifying concepts. Sometimes people are fighting about something and they're not even talking about the same thing. 36 00:04:03,350 --> 00:04:04,640 And on occasion, 37 00:04:04,640 --> 00:04:10,880 making sure that people are talking about the same thing even leads to dissolving problems, which is something that Wittgenstein defended. 38 00:04:10,880 --> 00:04:18,770 Of course, that's not always the case. Philosophy can also provide nuance. Like any other discipline, 39 00:04:18,770 --> 00:04:25,880 ethics has developed a very technical language that can be much more nuanced than just ordinary moral language. 40 00:04:25,880 --> 00:04:30,180 So it's not only about right and wrong, it's about what's permissible, what's not permissible, 41 00:04:30,180 --> 00:04:36,140 what's obligatory, what might be supererogatory, above and beyond duty, and so on. 42 00:04:36,140 --> 00:04:43,760 It's also about working out the implications of views. Some views might feel very attractive at first glance. 43 00:04:43,760 --> 00:04:47,360 And then you start working out what the implications are, 44 00:04:47,360 --> 00:04:53,000 either practical implications or theoretical implications, and suddenly it doesn't seem that attractive anymore. 45 00:04:53,000 --> 00:05:00,830 A good example is how personal data should be treated. Some people think that we should treat personal data as property. 46 00:05:00,830 --> 00:05:08,150 And that sounds quite intuitive, except that when you start looking at the implications and how property differs from personal data, 47 00:05:08,150 --> 00:05:12,670 it doesn't seem like such a good idea anymore. Pointing out contradictions: 48 00:05:12,670 --> 00:05:20,130 public discourse is full of contradictions and fallacies, both in the media but also in parliament and everywhere in between, 49 00:05:20,130 --> 00:05:26,400 and philosophers can point those out. Distinguishing questions of fact from questions of value: 50 00:05:26,400 --> 00:05:33,540 it's not always obvious and it's not always easy. For instance, in the 1960s, we thought that death was just a biological question, 51 00:05:33,540 --> 00:05:37,710 a medical question, whether somebody was dead or not, just for the doctor to decide.
52 00:05:37,710 --> 00:05:45,330 Suddenly, with mechanical ventilators, we realised, well, you know, we have these bodies that are warm, their hearts are beating, 53 00:05:45,330 --> 00:05:50,640 but their brains seem to be destroyed, and we can harvest their organs. Are they alive or are they dead? 54 00:05:50,640 --> 00:05:54,180 And suddenly the philosopher comes and says, well, you know, what do you mean by death? 55 00:05:54,180 --> 00:05:57,810 Is it the death of the body, the death of consciousness, the death of the person, 56 00:05:57,810 --> 00:06:10,020 the death of the interests and rights that typically attach to people? And finally, providing theory. In practical ethics, 57 00:06:10,020 --> 00:06:15,170 of course, there's a lot of theory that comes both from normative ethics, metaphysics and so on. 58 00:06:15,170 --> 00:06:24,600 And in the process of application, many times we realise the limits and possible mistakes of the theory itself, such that 59 00:06:24,600 --> 00:06:30,030 practical cases and also empirical facts inform the theory, change it and polish it. 60 00:06:30,030 --> 00:06:35,110 And this is an interesting process because, you know, philosophy has a bad reputation for a lot of disagreement 61 00:06:35,110 --> 00:06:45,180 and not a lot of progress and consensus. But in fact, when you study the history of philosophy, theories get very much polished. 62 00:06:45,180 --> 00:06:48,340 First, there's consensus on some things where there used to 63 00:06:48,340 --> 00:06:53,340 be disagreement in the past. But even when there is disagreement, for example, 64 00:06:53,340 --> 00:06:58,890 consequentialism today is a much more sophisticated theory than it was in its origins. 65 00:06:58,890 --> 00:07:06,600 And partly, theories get polished through bumping up against reality and taking a look at types of cases. 66 00:07:06,600 --> 00:07:10,170 Secondly, it's really important for philosophy to identify moral problems. 67 00:07:10,170 --> 00:07:13,470 And again, some of these are not as obvious as you might think. 68 00:07:13,470 --> 00:07:21,150 So before bioethics came along, doctors engaged in all sorts of problematic practices that weren't seen as problematic at the time. 69 00:07:21,150 --> 00:07:29,700 So, for example, not informing patients of their diagnoses, randomising patients to treatment or placebo without informing them that they were 70 00:07:29,700 --> 00:07:40,770 part of a research study, or even conducting very invasive examinations, like rectal examinations, on patients who were unconscious. 71 00:07:40,770 --> 00:07:47,850 And this wasn't seen as, you know, doctors having bad intentions or anything like that. It's just 72 00:07:47,850 --> 00:07:55,620 the way things were done. Third, philosophy can inspire moral thought. 73 00:07:55,620 --> 00:08:07,650 Through arguments and thought experiments and analogies, we can prompt moral thought about prejudices and invite people to consider certain situations, 74 00:08:07,650 --> 00:08:14,910 and in so doing, stimulate their moral thinking and also challenge moral intuitions. 75 00:08:14,910 --> 00:08:22,530 And here I think there's a very important role for public philosophy and for engagement with the public in general. 76 00:08:22,530 --> 00:08:30,930 And then finally, philosophy can provide experience.
Ethicists have extensive experience tackling difficult issues, 77 00:08:30,930 --> 00:08:40,560 and it would be a waste not to involve so much knowledge at a time when there's so much at stake. 78 00:08:40,560 --> 00:08:47,700 Of course, a person doesn't have to have a PhD in ethics or have published in the best ethics journals to have good ethical insights. 79 00:08:47,700 --> 00:08:52,410 But spending most of your hours or most of your days thinking about ethical questions, 80 00:08:52,410 --> 00:09:01,920 trying different methodologies and learning about past pitfalls does provide some kind of experience that can be of use. And in this sense, 81 00:09:01,920 --> 00:09:06,090 practical ethics has most experience with medical ethics, of course. 82 00:09:06,090 --> 00:09:12,990 And I think there's a lot to learn from this analogy that still hasn't been worked out, 83 00:09:12,990 --> 00:09:17,730 both from the similarities between these two fields and from the differences. 84 00:09:17,730 --> 00:09:21,030 I think digital ethics is much more political than medical ethics, 85 00:09:21,030 --> 00:09:25,560 just to cite one difference, and that's going to make it very, very different. 86 00:09:25,560 --> 00:09:30,000 And also, we have a lot to learn from successes, but also from failures. 87 00:09:30,000 --> 00:09:36,900 And one of the biggest failures, in my view, of medical ethics is how it hasn't been able to regulate Big Pharma properly. 88 00:09:36,900 --> 00:09:44,160 So Big Pharma today gets away with, for instance, carrying out 100 experiments to prove that a drug works. 89 00:09:44,160 --> 00:09:50,260 99 of them show that the drug doesn't work. One shows that the drug might work, and that one gets published. 90 00:09:50,260 --> 00:09:56,890 The 99 don't get published, and sometimes researchers can't even talk about it. 91 00:09:56,890 --> 00:10:04,350 This is a huge failure, and it sort of signals a challenge that we have with regulating industry, because most, or a good part, 92 00:10:04,350 --> 00:10:09,030 of the research carried out on A.I. right now is carried out by industry and not in universities, 93 00:10:09,030 --> 00:10:14,150 and so it is a huge challenge for digital ethics to tackle. 94 00:10:14,150 --> 00:10:21,890 So just to finish, I'm involved in the following research projects: I'm thinking about what digital ethics can learn from medical ethics, 95 00:10:21,890 --> 00:10:27,050 I'm finishing a book about privacy, and I'm working on the ethics of prediction. 96 00:10:27,050 --> 00:10:31,820 Human beings have been using predictions since the Oracle of Delphi, and strangely enough, 97 00:10:31,820 --> 00:10:36,470 we haven't thought much about the ethics of prediction and what makes a prediction ethical. 98 00:10:36,470 --> 00:10:42,440 And most importantly, and the thing that I'm most excited about, is editing the Oxford Handbook of Digital Ethics. 99 00:10:42,440 --> 00:10:46,970 Many philosophers in the room are involved in that, which is great, and it's going to cover all sorts 100 00:10:46,970 --> 00:10:53,780 of things, from sex and friendship in the digital age to democracy in the digital sphere, 101 00:10:53,780 --> 00:11:00,050 the use of killer robots, surveillance, privacy and so much more. 102 00:11:00,050 --> 00:11:06,800 So it's very exciting. This is a very exciting time, and I'm no longer jealous of the practical ethicists of the 1970s. 103 00:11:06,800 --> 00:11:14,150 I think this is so much better.
And one day in 10 or 15 years, when the institute has had an enormous impact on the ethics of AI, 104 00:11:14,150 --> 00:11:21,088 each of us will be able to say that we were here at the very beginning.