Michael Parker, you're Professor of Bioethics and Director of the Wellcome Centre for Ethics and Humanities and the Ethox Centre, all at the University of Oxford, and a Fellow of St Cross College, Oxford. Welcome to St Cross College Shorts. It's now about a year since the start of the COVID-19 pandemic. Can you describe your everyday research topics before COVID-19?

It's hard to remember the time before COVID-19; it seems such a long time ago. But I'll try my best. My background: I'm a philosopher by background and I specialise in ethics. I'm interested in the ethical issues, or ethical aspects, of developments, in particular in global health and in various new technologies in relation to medicine, particularly genomics and data-driven science. One of the things I'm interested in is looking at those issues as they arise in the real world for real people. So I'm interested in moral problems that arise for, for example, scientists at the cutting edge of research in genetics, say, or in global health around the world. I have collaborative partnerships with partners in Africa and Southeast Asia. So what's it like to be a researcher in Ghana engaged in an international collaborative research project? What are the kinds of problems, the kinds of things that you worry about from a moral perspective, in this kind of network? That's one aspect. And I suppose the other thing, which is a bit more thinking as an academic or as a philosopher, is that I'm interested in the ways in which our disciplines, so the disciplines broadly speaking in the humanities, here philosophy, and I'm also interested in history, might need to change, might need to be rethought in a way, for them to be adequate to engage with the kinds of ways in which the world is being changed by neuroscience, by genomics, by global connectedness, by data science and so on. So what does it mean for us? What does it mean to do ethics now, in the world as it currently is and as it's changing? And that strikes me as a really interesting question, because obviously all of the developments I mentioned raise important questions not only about ethics, but also about what it means to be human. And the way I go about it in practice, in my work, is that I'm not the usual kind of philosopher, I suppose, in this sense: I try my best to embed my work as closely as I can in the day-to-day work of science and medicine. And that's for two reasons, really. One is that I think that context really matters
when you're trying to understand ethical issues. So that's one reason for getting close to the practice. And the other is that I think it's a good place to get a kind of early warning of cutting-edge problems and to be really out at the forefront of where these difficult moral problems are arising in novel combinations. So that's what I do. Global health and genomics were what I was working on, mostly, before I ended up in COVID.

There's a link there. So how did that change with the pandemic, with COVID?

Yeah, that's an interesting question too, I think. One immediate experience, and this is a sort of anecdote really: I was sitting in my office. I work in the Big Data Institute, which is up in Headington on the Old Road campus. I was in my office in a meeting one day with two of our administrators talking about budgets, and two infectious disease researchers started knocking on my door and looking through my glass window in a very excited way, essentially insisting that I come and speak to them immediately. One of them was Professor Christophe Fraser, who's in a group called the Pathogen Dynamics Group in the Big Data Institute. They were very excited because they had developed an idea for a mobile phone app which could be used in contact tracing. Their view was that traditional contact tracing (these were the early days of COVID, certainly in the UK; I'm talking about February), the usual kind of contact tracing, which is going around and knocking on people's doors and asking them who they've been in contact with, was just going to be too slow for this new disease that was emerging in East Asia, and that the only way we were going to deal with it in a way that was sufficiently speedy would be through some sort of digital contact tracing. So they had an idea of how to do that, and within about an hour of having the thought they recognised that there were ethical issues. So they got me involved in that, and I essentially dropped everything else and worked on it for a while, and then got more involved in the COVID work because I was approached to be a member of the UK government's Scientific Advisory Group for Emergencies, SAGE. So my life changed pretty dramatically, possibly a bit earlier than for most people in the UK: sort of February time, really, it started happening.

Can you tell us a bit more about how you became a member of SAGE? To clarify, that's the UK government's Scientific Advisory Group for Emergencies.

Yeah, that's right.
So there are two things there. I can tell you the story about how I got involved, which I will do, but there's also an interesting question about why someone would approach someone like me, and I've now got a couple of thoughts about that, but I don't really know. For the last ten years or so, arising out of the kinds of research I mentioned earlier, I've been working very closely in something called the 100,000 Genomes Project, which is Genomics England, this big initiative by the UK government and the UK health system, with funding from the government, to build up genomic testing and genomic research in the context of the NHS. So I've been doing that for about ten years, and that's an initiative which does have quite close links with public health and so on. So I know some people in the Department of Health and some of the people who are involved in policymaking around science and technology. My guess is that's possibly one of the reasons I was invited. The other is that I chaired a two-year working group for the Nuffield Council on Bioethics, which led to the publication of a report on the ethical issues in research in global health emergencies in January this year. That publication date was agreed two years ago, but it turned out to be very timely. So my guess is that those two things are the reason I was asked.

The way it actually happened was that I had a call on my mobile phone on a Sunday afternoon, I think it was. I was sitting in my living room and the phone rang with one of those numbers I didn't recognise, and you think, oh, is this going to be a kind of prankster call or something? And then I answered the phone and it was Sir Patrick Vallance on the other end, asking me whether, essentially, I would join. SAGE has been around for a long time; it's a group that responds to emergencies of a range of different kinds. The Grenfell Tower disaster, for example, was something that they were involved in reporting on. But this was the COVID version, which I think had been around since early in the year, and some big decisions had been made around the initial decisions about lockdown and so on. And I think they'd recognised early on that ethics was a component that had been missing. I don't know if I'm the first ethicist to be on SAGE, but I think I may well be. So I think they'd recognised the need for some ethics input, and someone had told them that I was someone to speak to. So I got this call and agreed.
And then I had to get up to speed very quickly: several meetings a week, all, of course, online. So that's kind of how it happened, and that's the story, I think, behind why I was invited.

How do you see the role of the ethicist in a scientific advisory group like SAGE?

That is a very good question. First, as to the question of what they were expecting of me, someone else would have to answer that; I can't, really. But SAGE is very much, as you say, a scientific advisory group. It doesn't recommend policy; it doesn't say you should be doing this or that. It basically says, from a scientific perspective, these are a range of considerations that can be relevant to any decision: scientific considerations to do with, for example, the levels of infectiousness, the prevalence of the disease in the community, what people's behaviour looks like now, and what kind of predictions might be made about future behaviour. So it's a very scientific group, and its advice to government is scientific. So then the question is, why is an ethicist in that space? I initially took my role to be essentially to advise SAGE rather than advising the government. So I would be saying, these are some things that you're talking about as scientists which look to me like they have an ethical component to them, and trying to foster a discussion on those topics within those meetings. And I have to say, they've always been very receptive and we've had some very good discussions. But actually, over time, I've managed to find a way of incorporating bits of ethical advice into the advice to government, and essentially I've taken the same approach as the scientists have: I've tried to highlight where I think there are ethical considerations that are relevant to policy decisions. It was pretty clear that my role was not to recommend policies, but to say, here are a couple of policies that you might want to consider, these are the ethical implications of doing one or the other, and these are the relevant considerations. So I see my role as partly raising awareness of ethical issues, partly helping to provide an analysis of those issues and what they mean, what the problem is, and partly exploring and setting out the implications of different competing courses of action.
And of course, ultimately, the decisions and the moral responsibility lie with Parliament or with ministers rather than with me. But I felt there is a sort of moral role for me as an adviser, and obviously for others, to speak up and say, this is something that I think is a moral problem. And then it's up to others to decide what they do as a result of that advice.

What challenges has that presented, and what opportunities?

One of the challenges has been that it's a bit like a dinner party: you arrive a bit late, everyone is standing in groups talking to each other, in this case about very high-tech, very sophisticated science, and you're trying to look for an opportunity to enter into the conversation, as you would at the party, without being pushy. So one of the challenges at the beginning was to listen really, really carefully. I'm not a scientist (I do have science A-levels, but I'm not a scientist), so I was trying to listen very carefully and get a sense of the language, a bit like being a sort of anthropologist, I suppose, and to get a sense of when there are questions that look like the sort of things I can contribute to, and to work out the conventions, really: how might one speak in these kinds of contexts? So there was a lot of that at the beginning, and then sort of putting my foot in it on occasions, or speaking when clearly, you know, it wasn't the right time, and so getting used to making mistakes. And as I say, everyone has been very welcoming and the things I've said have always been taken very seriously, so I needn't perhaps have worried as much as I did, but I did see it as an experiment in a sort of cultural anthropology, really. That's how I approached it, and I think I've managed it.

And the opportunities: there are personal opportunities here, in that my experience of COVID-19 and the pandemic has been different, perhaps, to many other people's, certainly to the people around me, because I'm in the room hearing the decisions, hearing the discussion of the evidence, and recognising and seeing some of the uncertainties and knowing that they exist, and so on.
So I think my personal experience has been enhanced, enriched, as it were, as a person. But I've also been in the room where I've been able to speak directly to government and to senior scientists and say something about what I think are important ethical questions. And rather than shouting at Newsnight on the TV, I've actually been able to, well, I haven't done the shouting, but I've had the opportunity at least to raise these issues and ask, how do you think these might relate to the policy decisions that you're considering? So that's a privilege, of course, to be asked.

You've been involved in the development of the National Health Service COVID-19 contact tracing app. What was your role in that?

So, as I mentioned earlier, in a way that's the thing that got me into thinking about COVID in an academic way at the very beginning. I'm doing less of that now; in fact, I'm doing nothing on that, really, apart from, as a member of SAGE, hearing presentations about how the app is going, so I've got a sense of how it's working in the country, as it were. But I don't do much academically on it. In the beginning, though, in February, I was working on it very intensely. I was working with Christophe Fraser and David Bonsall's group while they developed the science, and I was sitting in rooms with them while they were doing all these interesting calculations about how this app might work, and I was very involved in the ethical discussions. And we published a couple of papers. We published a paper in Science which showed that digital technology of this kind could make a difference in the context of a pandemic, and I had the opportunity to write into that paper some of the ethical questions that I thought were important. And then we published a paper in another journal which was specifically on the ethics. And at that point, really, the world kind of changed overnight, because I did multiple interviews for the New York Times, the BBC, all the various radio programmes; I was on the Today programme; lots of media interest. And essentially a very interesting discussion emerged, which I think is still very interesting, which is to what extent, and when, we would be willing for our current approaches to privacy to change. We're in the context of a public health emergency, and there's a question which arises out of that. A solution to the COVID-19 pandemic is essentially going to have to be a collective one.
It's about people working together as a society, or at smaller levels, in particular venues and so on. That's the only way the pandemic can be dealt with, certainly when there's no vaccine and no effective treatments, so there's a question about solidarity. And some of those questions about solidarity are questions about information and data. I think there was a lot of interesting discussion in the early days about those kinds of questions: the appropriate limits of privacy, within the context of a recognition that we all needed to work together to solve this problem. And I've done lots of that. I personally think the mobile phone app has worked to some extent in this pandemic, but really the lessons learnt are more appropriate for the future. I think it's going to be really interesting to think about digital technologies and infectious disease in the future, not only in the UK but around the world, for example in low- and middle-income countries. I think there's a lot of potential there for thinking about the ethics of that.

You're giving us a lot of new knowledge about the role of ethics in this pandemic.