Hello, I'm David Edmonds, and this is the Pandemic Ethics Accelerator podcast. The UK Pandemic Ethics Accelerator was a project funded by the Arts and Humanities Research Council in 2021–22 to examine the ethical challenges faced during the COVID pandemic. It combined expertise from the University of Oxford, the University of Bristol, the University of Edinburgh, University College London and the Nuffield Council on Bioethics. This six-part podcast series covers some of the themes that emerged from the research.

The Pandemic Ethics Accelerator programme was led by Ilina Singh, Oxford Professor of Neuroscience and Society. Here she explains what the programme was, what it was designed to achieve, and whether it succeeded. Ilina Singh, welcome.

Lovely to be here.

We're starting the podcast series with you because you were really the brains behind the Pandemic Ethics Accelerator. So perhaps you can begin by telling us what it is.

Thank you. But I should start by saying it was a very collaborative effort of nine of us. I was the principal investigator, and there were eight co-investigators spread across UK institutions. The idea of the UK Pandemic Ethics Accelerator really came about early in the pandemic, when we looked at what was happening: the speed with which really profound decisions were needing to be made, not just at the political level but also in everyday life. When were you going to go shopping? How much food were you going to buy in one go? Were you going to go visit your grandmother in the care home?
So we thought, actually, all of these are really important ethical decisions that people are making, in a way that they probably haven't had to make ethical decisions on a daily basis before. And the UK has an extraordinary resource in having among the best ethicists and philosophers in the world to help people understand how to think about these difficult questions, these dilemmas.

And you received a grant to do this work, which was spread out among a series of participating institutions.

Yes. So we were funded by the UK COVID-19 research and innovation budget, which was the enormous budget that was put together very quickly to fund research to help improve things during the pandemic, everything from developing vaccines to helping to shape the social and moral dimensions of the pandemic response. So we were a group of philosophers, ethicists and sociologists from Oxford, from UCL, from Edinburgh, from Bristol, and we had the Nuffield Council on Bioethics also join us, one of our core national resources for bioethics thinking and development.

And why did you call it specifically an ethics accelerator?

That's a great question. What I saw looking around was that there were a number of observatories that were coming about as a consequence of the pandemic. We had the National Economics Observatory, for example. I had initially said, well, we should be the Ethics Observatory.
And then my colleagues, I think quite rightly, said, well, that's not nearly action-oriented enough. We don't want to just observe, we want to do things. And in fact, we want to accelerate the pace at which ethics is brought to bear on the decisions that are being made, because we thought that that transparency was really lacking at the point when we were putting in the application. So we thought, well, let's call it an accelerator. And the other dimension of the acceleration was that we wanted it not to be just the nine of us sitting around talking to each other about what kind of ethical thinking we should bring into the pandemic context. We really wanted to mobilise ethics communities across the UK beyond our institutions, particularly early career scholars, to help mobilise them into this space of policy decision-making especially, where ethics tends not to have a very good reach.

Scientists can disagree with one another about the spread of the virus and so on, but there were ways of sorting out who's right and who's wrong in science. And that's just not quite so easy, is it, with ethics? Even among so-called ethical experts, you can't expect unanimity. And where there's disagreement, it's not at all clear how it's to be resolved.

I think you're right. I'm trained as an empirical scientist, and I converted to bioethics even before I finished my PhD.
So this question that you're asking, I recognised very early on as a fundamental disconnect between what empirical scientists do and what ethicists do. Empirical scientists can disagree with each other on the basis of the evidence: how is the evidence collected, how is the data analysed, how is it interpreted, and how is it then applied? The question becomes about the data. In ethics, the goodness of the argument is about the argument itself: how coherent is your argument? It's a fully analytical challenge. I think a lot of people don't understand that. They don't understand the reasoning behind decision-making. So one of our charges was really to make that available to people, to give them the tools that philosophers and ethicists have to judge arguments, the goodness, the soundness of an argument, but also to say, look, it depends on your approach. So if I'm a utilitarian and I want to maximise the good for the most, I'm going to think differently from somebody who's an egalitarian, who would be much more likely to, for example, randomly allocate ventilators rather than prioritising people through a triage system. So the way we approach these questions, the kind of ethics we bring to the table, has profound implications and consequences for what decisions we make.

So you can bring clarity both in arguments and in the concepts that we use in those arguments.

Yes. And we can also talk about reasonable disagreement.
We tend to live in a world where we think, well, we should all achieve consensus. But of course, what ethics allows us to do is to identify both bad arguments, inconsistencies, incoherence, and also good arguments from different perspectives that lead to different actions. And that is what we call reasonable disagreement. Reasonable disagreement says, well, you have to put up with disagreement. Disagreement isn't necessarily a bad outcome, as long as you can trace the ways in which people have come to those disagreements. And that's empowering. Another aspect of ethics, and the way we came at it in the pandemic, is that we wanted to empower people to actually feel like they had some agency in the decision-making in their own lives. And part of that empowerment, we felt, was to be able to have some reflection on how they themselves came to make the decisions that they made on a daily basis.

So you've had a lot of researchers in various universities researching different aspects of the pandemic and pandemic ethics. Is it your view that ethics was a serious consideration during the pandemic?

Well, it was for us. And I think for government and for policymakers there was a lot of reflection on how people were making decisions in those contexts, and the huge implications that those decisions were having for publics of various kinds.
What there was not was any public debate, or invitation to reflect or to disagree, or any transparency about what approaches to ethical decisions were being taken. And I think that that was a real failing of the government response, because what it meant, as you'll hear from other colleagues in this accelerator, is that the government largely lost trust with people who were already at greater risk during the pandemic. And I think that was a real failing.

We'll get on to that in a minute. But I wonder whether we have short memories about the pandemic. In February and March 2020, there was a real sense of panic. We had very little data about the pandemic. Politicians had to make very quick decisions. And in Britain, Boris Johnson has been accused of being too slow in introducing a lockdown. There wasn't really time, was there, to reflect too deeply about the ethics?

You're right, people had to make quick decisions. But perhaps that's one of the points here: a pandemic ethics accelerator should not have come into being a year or more into the pandemic. It should have existed prior to the pandemic. There should have been an ethics resource infrastructure within policymaking that could have been mobilised in the same way that vaccine development infrastructure was mobilised.

So that's something they got wrong. What else did they get wrong?
Well, one of the areas that we focussed on most was the lack of public deliberation: the lack of fora for people who aren't used to having a voice in policymaking, but who were disproportionately affected during the pandemic, to allow for a degree not just of reflection, but of actually educating policymakers on what their values and priorities are. Why do we have such an extraordinary rate of non-uptake of vaccines among certain groups? We could have learned a lot just by having those public deliberations. But I also want to say, because one shouldn't just point the finger one way: one of the areas in which we as researchers are, I think, still quite weak is that we don't have terribly good methodologies for what we call public deliberation. We tend to just get a bunch of people together in a room to talk, and we listen well, and then we write it up. That's not a method. I don't think that has much credibility, certainly not when you're trying to make normative decisions: what should we do, not just what do people think is the case? And so we need to get better at that. And then I think we would be more persuasive in helping governments come to us to do that kind of work. So it's a two-way street.

And is the point that the virtues of transparency and clarity and deliberation and consultation, is the point that those virtues would lead to better decisions? Or do they have intrinsic value?
Both. If you take someone like my father, who's Indian and didn't want a vaccine for a long time, until he was finally persuaded by his children, and this was not an unusual situation for members of some minoritised groups. For him, the question was not just about trying to understand the values and priorities that were driving the decisions to mandate a vaccine. He wanted to know that, for him, those decisions were inherently good, not just for his own health but for broader health benefits. In the end, that's what persuaded him to do this: not his own health, which we were focussed on, but the fact that he was protecting others by protecting himself. I think the intrinsic good drives the better decision-making.

But if there had been more consultation with minority groups, or representatives of minority groups, and that's always a contentious topic, who is a representative of a minority group? Had there been more of that, there would have been more trust, and your father, for example, might have shown less resistance to having the vaccine.

Yes. That makes it sound very simple, and it disregards an entire history that has made him sceptical of medical interventions in general, as a person of colour. So I think we can't ignore that. And I guess one has to come back to this question of what's the ground on which the pandemic response was taking place. And that ground itself is full of historic injustices and inequities, et cetera, and one can't solve those in the moment of the crisis.
These are the conditions with which any response to crises is going to have to contend. You have to tackle those conditions outside the crisis, in a kind of preparatory way.

We know now, of course, that the pandemic did have a disproportionate impact on certain groups, but presumably that was something that we could have predicted as soon as the pandemic began.

Well, health disparities certainly didn't start with the pandemic, so we certainly could have predicted it. The question is, did anybody care? And an important ethical question is: why do we care now? Do we care because of the broader impact, the cost to the NHS, the fact that we're going to be living with the consequences of health injustices, now in an amplified way, for many, many years? So is it an economic form of care? Or is it because we've come to understand the intrinsic value of caring for people who are at risk of health injustice? And I think that's part of the work of ethics going forward: to help people understand, again, that those are two different approaches to why we should care about the disproportionate impact of the pandemic.

The Accelerator programme received a fair amount of money. What are the criteria for judging whether it's been money well spent?

It's a very good question. I think for us there are a number of ways in which we thought about the success of the endeavour. So one was, did we make an impact? Have we affected policy thinking, if not policymaking?
By that we don't just mean, did Number Ten listen to us, but all the way down the chain of command of people who make decisions, and I include in that people who are making decisions about their own family members and their own behaviour. So I think the answer there is, to a large extent, no, we've not been successful. But we're also coming to understand that time will tell. The pandemic isn't over, and we're now, in part, in this period of recovery. What we're finding is that there's more opportunity now to speak and to be heard. People are listening because they want to know how to do better. To that degree, we hope to be successful even beyond the end of the grant. The other way to judge success is: did we mobilise the ethics research community to bring forward their skills in this period of time? And again, I don't think we did as well as we wanted to in mobilising the national research ethics community. But in terms of the early career researchers in the Accelerator, who were really the engine of the whole thing, I think that we did a great job, and they did a fantastic job of generating new kinds of products, learning how to do things much more quickly than academic timescales would normally allow, and getting opportunities to engage with policymaking. So I think in that way we've been very successful, and I hope that that will be a legacy of the project.
And then finally, the fact that we, as nine ethicists and social scientists, had this really intensive interaction over 18 to 24 months has also shaped a collaborative entity and a way of doing a big ethics project in a collaborative way. We've learned a lot from that, and I think going forward there will be many more opportunities for these kinds of things that usually scientists do: building these huge collaborations across institutions, et cetera. So we really found a way to come together and also maintain independent expertise as well as working as a collective. And I think that we did really well.

So you've given yourself a mixed report. You've been very honest about that. I wonder if you've been too tough on yourself. How do you know that nobody's been listening, or few people have been listening?

We know because of the things we haven't been asked to do. We haven't been invited into Number Ten. We haven't been invited, and again I want to say yet, to conferences that are looking at the legacy of the Track and Trace programme, or to other groups that are coming together to think about pandemic recovery. We have been talking to the people who are leading the COVID inquiry. One of the people leading that is an academic, which I think helps. And he said, gosh, you know, I hadn't thought ethics would be one of the strands that we look at. But as I learned about what the Accelerator was doing, I began to see that it's actually everything.
It's in all our strands. And so we'd really like to think about it as a cross-cutting resource. Could we potentially use your group on a kind of consultancy basis? We'll see if that comes to fruition. But we were really excited about that, because we thought, yes, exactly. That's exactly what we want people to realise.

You criticised the Government earlier for being slow to respond to the pandemic. I wonder whether there's a case for having something like the Pandemic Ethics Accelerator existing as a permanent body?

Yes. So I think there certainly is a case for having the infrastructure available so that the Pandemic Ethics Accelerator could mobilise again quickly. And that would mean allowing us to continue on in the background at a low hum. As you well know, nothing in academia works unless it has funding. What I would really like to see happen in the next phase of national funding is that there are infrastructure projects that are funded by all the research councils. We know that science and medicine are very good at funding these large-scale infrastructure projects, but we're really less good, almost non-existent, at funding them in, for example, arts and humanities, where ethics and philosophy funding comes from. And one of the problems, of course, is that these larger infrastructure projects have a very long lead time, and so success indicators, which you asked about earlier, are more difficult to establish.
But I think we've shown ourselves as a group to be very productive, and to a limited but growing extent impactful. It would make a lot of sense to keep us going and to have this infrastructure. And then I think it also makes sense to continue having that happen in collaboration with the Nuffield Council on Bioethics, which is itself going to be moving to more of a responsive mode, but obviously needs UK academics to support its work. And so I think that that infrastructure funding for us would be very valuable.

Ilina Singh, thank you very much indeed.

Thank you very much.

Thanks for listening to the Pandemic Ethics Accelerator podcast. You can hear more in this six-part series on University of Oxford Podcasts, or at Pandemic Ethics dot UK.