This is Bioethics Bites, with me, David Edmonds, and me, Nigel Warburton. Bioethics Bites is made in association with the Oxford Uehiro Centre for Practical Ethics and made possible by a grant from the Wellcome Trust. For more information about Bioethics Bites, go to www.practicalethics.ox.ac.uk or to iTunes.

Imagine the following choice. You can either finance medical treatment that would restore hearing to 50 people who'd lost hearing in just one ear, or you could pay the same amount to treat an entirely deaf person so that this one person could hear perfectly. Which would you choose to help: the 50 or the one?

In this series, we've mainly focussed on dilemmas and debates involving individual doctors and patients. Jonathan Wolff is a political theorist, and he believes that traditional bioethics misses a vital dimension: the distribution of resources.

Jo Wolff, welcome to Bioethics Bites.

Well, thank you for having me.

The topic we're going to talk about, political bioethics, is in a sense a new subject. Could you just say what it is and how it relates to traditional bioethics?

Traditional bioethics has focussed on what goes on in the hospital or in the GP surgery: a whole series of doctors' dilemmas, as they've been called. What should we tell the patient? What can we release to the patient's family? What should we do with the patient's organs after surgery? What should we do at the beginning of life: should we save the baby or not?
What can we do at the end of life? Can we legitimately engage in euthanasia? Those are all very good questions, but they're all about the doctor's relation with the patient, or sometimes with the patient's family. The type of thing I'm interested in, which we've called political bioethics (some people call it population-level bioethics, and some will even call it public health ethics), wants to widen the focus, to make very clear that there's a lot to do with health that is not about what goes on in the hospital or at the bedside. And these other issues also throw up moral and political dilemmas.

Perhaps you could give a couple of examples to show us what we're talking about here.

Consider what drugs should be made available on the NHS. This is a question about health care, but it's not a question about who should give consent to an operation or what should be told to the patient. It's much more like traditional questions in political philosophy, where we're talking about issues of justice and fairness. What drugs should be available? Why should they be available? Who should they be available to? These are questions about health, and moral questions, but they're not questions of the traditional bioethics form.

And what about the more general issues of health that go beyond medicine? Do they fall within the scope of political bioethics?

One thing within this literature that's very important is to make a distinction.
You could call it a distinction between health and health care, or between health and medicine. Most of the things about your health have got very little to do with the health care system. That is, what makes you ill is not anything to do with your doctor; it's your lifestyle, your genes, bad luck and so on. So the things that are needed to keep you well very often have got nothing to do with the medical profession.

If you take a longer historical look, the rise in life expectancy we've had in the last hundred years is really incredible. People are living decades longer. Why is that? Now, a lot of people will say it's these fantastic advances in medical care: we can now do heart transplants, all sorts of operations that we couldn't do 100 years ago. But how many people get a heart transplant? How many people have had these operations? The reason why life expectancy has increased, a lot of people think, is because of much more basic things like sanitation, hygiene and nutrition. These are now known as the social determinants of health: the factors in your life that determine your health. Medical care is one determinant of health, but in fact it turns out, for most people, to be a very small component.

So political bioethics tries to take the bigger picture. It doesn't just zoom in on doctors' dilemmas, or individuals' dilemmas about how they should react to the available medical resources.
It's looking at a whole range of topics that perhaps have a larger effect on your likelihood of a healthy life.

Absolutely right. One of the most important documents in generating this, and it's quite old now, was the Black Report in 1979, looking at the progress of the National Health Service. The National Health Service came into effect in 1948; 30 years later, what effects had it had? When the NHS came in, everyone thought it would be fantastic: it would equalise health, or at least improve the health of the worst off, the people who didn't have access to health care. The Black Report pointed out that the health of the nation had improved, but most of those improvements had gone to people who were already well off and who already had good health. And it argued that the social determinants of health (poverty, poor working conditions, poor housing) hadn't really changed very much for people at the bottom end, so they were still getting ill.

Now, one of the authors of the Black Report argued that because of this, we ought to spend less money on health care and more money on poverty alleviation: therefore, for reasons of health, take money out of the health system and spend more on social services. Naturally, that didn't go down very well with his colleagues on the committee, and they didn't make that recommendation in the end. But every now and again, this same point is made.
But that's a fascinating case, because that is, it seems to me, a rigorous application of cost-benefit analysis. If you're really concerned with the outcome that health is improved, then there might be better ways than pumping money into the health service.

That's true, and that was the argument. It was also realised that policy doesn't always follow the cost-benefit analysis, because there can be very powerful political actors who will make sure that their interests are looked after, even when that may not be indicated as the best policy. But as philosophers, we're interested in thinking clearly about how, in this case, resources might be well allocated.

What sort of principles should we use, then? Are you saying that it's wrong to do cost-benefit analysis, or that so far the people who've done cost-benefit analysis have focussed too narrowly on one little bit of the picture?

Cost-benefit analysis is going to be essential when you've got a fixed level of resources, or at least a limited level of resources, as we already have. In fact, raising this issue brings us to one of the most interesting and lively areas in political bioethics, which is the role of NICE, the National Institute for Health and Clinical Excellence. They have the job of deciding which drugs are going to be made available on the National Health Service. What will happen is that a pharmaceutical company will create a new drug.
It will go through the normal safety and efficacy procedures to test that it actually works and doesn't kill people. So there is a point where it's licensed, in the sense of being available, but it might be very expensive. There are all sorts of drugs that come forward, and the drug companies want the National Health Service to take them. A decision has to be made, and the way that's done at the moment is through not exactly cost-benefit analysis, but cost-effectiveness analysis. It's assumed that there is a certain drug budget, perhaps £10 billion a year in this country, and, roughly speaking, the job of NICE is to work out how to get the best health benefits for that £10 billion. That will inevitably mean that they won't license for refund on the NHS some drugs that are very expensive, because they don't think they give sufficient health benefit.

Could you just explain what the difference is between cost-benefit and cost-effectiveness?

Cost-benefit analysis requires you to give a valuation of the costs and a valuation of the benefits, and then to pursue the course of action where the benefits exceed the costs to the highest degree. So it requires the monetisation of all values. Cost-effectiveness doesn't require so much information. Suppose you're doing your shopping and you've got a fixed amount of money in your pocket, and you want to go out and get the most nutritious shopping you can.
You can make decisions between different things, and you could end up with one trolley, and you could look at it and say, 'I'm sure I could have done better with this money,' and go back and change a few things. You could be sure you've made the most cost-effective use of that money without having a view about the value of the benefits. That is, you haven't said, 'Well, ten calories is worth five pence; I've got to make sure I get that much.' So you haven't really worked out whether the benefits outweigh the costs. All you've done is spend your money in the most effective way you can, and that is what the health system at the moment does.

And the currency it measures effectiveness in is quality of life?

Yes. Primarily it uses a notion known as the QALY, the quality-adjusted life year. To people who come across it for the first time, including me, it seems a very crude and inhumane measure. But what it does is say: suppose you were to live for the next year in full health, no health problems at all. That gives you a quality-adjusted life year of one; you're getting a full year, so 1.0. But suppose you had some impairment. Suppose you had a problem for some reason with mobility and you couldn't get around very well: perhaps you've got arthritis, perhaps you've had a sporting injury. You're not in full health, so that next year will be worth less than 1.0. What would it be? Well, it might be 0.8. If you're blind, the current view is that your next year will be worth 0.5 QALYs, because there's still an awful lot you can do if you're blind. It seems devastating, but there's still a life to be lived, with lots of satisfactions. If you're blind and bedridden, that might go down to 0.2, and so on. And there are states that are thought to be worse than death, quite a few states worse than death, so they have negative QALY values.

So, to put it in its crudest form, what NICE does, and I'm going to modify this in a moment, is try to get the most QALYs it can for the health budget. And it judges new medicines on the grounds of whether they're going to deliver more QALYs than other things that are currently in the system.

So that's a way of assessing what kinds of drugs should be available to people who need them. What could be wrong with that? It seems quite a rational way of allocating resources.

Well, it's true, it does seem very rational, and for many purposes it is. But what jumps out, as a philosopher, when you start to look at it, is that we have a system in which a certain amount of money is being used to maximise the number of QALYs available. That sounds like a moral theory called utilitarianism. And the first thing we learn in our undergraduate classes is that utilitarianism is wrong.
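The QALY arithmetic described here is simple enough to sketch in a few lines. The quality weights below (1.0 for full health, 0.8 for a mobility impairment, 0.5 for blindness, 0.2 for blind and bedridden, negative for states worse than death) are the illustrative figures quoted in the conversation, not NICE's actual valuations.

```python
# Illustrative sketch of the QALY arithmetic discussed above.
# Weights are the figures quoted in the conversation, not a real tariff.

def qalys(quality_weight, years):
    """Quality-adjusted life years = quality weight x years lived."""
    return quality_weight * years

print(qalys(1.0, 1))    # one year in full health: 1.0 QALYs
print(qalys(0.8, 1))    # one year with a mobility impairment: 0.8
print(qalys(0.5, 10))   # ten years lived blind: 5.0
print(qalys(1.0, 5))    # the same total as five years in full health
print(qalys(-0.1, 2))   # a state judged worse than death: negative
```

One thing the example makes visible is the equivalence the measure builds in: ten years lived blind count for exactly the same as five years in full health, which is the feature that later drives the 'discrimination against the disabled' objection.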
Now, we might be wrong to be told that utilitarianism is wrong, but the standard method of decision-making in health economics at the moment uses what many people believe to be a flawed moral theory. This is interesting in lots of ways. One way is that it allows us, as moral and political philosophers, to draw on what is now a vast literature to try to illuminate questions in health policy. We could go back and say we were wrong: actually, we were over-hasty to dismiss utilitarianism, and we should have been utilitarians after all. Now, to know whether that is right or wrong in this case, what we have to do is think about the possible consequences of maximising QALYs.

One type of case that has been used in the literature is so-called discrimination against the disabled. Now, I mentioned before that if you're blind, your QALY value for the next year is only 0.5; if you're not blind, potentially it could be one. Suppose we were allocating the next heart for transplant on the basis of QALY value. If you were going to give it to someone who's blind, the most you can get from them is 0.5 per year for however many years they're going to live, whereas if you give it to someone who's not blind, maybe you'll get them back up to one. So if you were really going to use QALYs in clinical judgement, people with chronic, longstanding health problems would get very little attention. Perhaps a rigorous utilitarian would say that's difficult to stomach, but that, given the scarce resources, it is actually the best thing to do.
No doubt many utilitarians would swallow hard and say that. In practice, actually, things are not as bad as this, because QALY calculations are never made in clinical judgements. They're made at a different level: at the level where NICE is deciding whether a drug or treatment should be made available generally. So in practice there wouldn't be any of this direct discrimination against blind people or others. But it is true that, under the current situation, a treatment for a patient group in poor health will yield fewer QALYs than a treatment that could bring people back to full health. And this has troubled a lot of people, because if you consult your own intuitions about cases, you might think people in poor health should be the highest priority for treatment, not the lowest, even if the benefit you can give them is not as high as the benefit you could give to someone else. So this is a typical objection to utilitarianism applied to the health domain.

So how do we get out of that?

The first thing to do would be to look at other philosophical theories, because after all, the development of moral and political philosophy didn't end with utilitarianism. People came up with forms of rule utilitarianism, and later on with John Rawls. Although, of course, John Rawls didn't talk about health care, his theory can be applied in this area.
His difference principle, that we should make the worst off as well off as possible, can be interpreted in this area as a principle to give priority to the worst off. The trouble with doing this, and I think this is one of the reasons why Rawls didn't talk about health care, is that if you use the Rawlsian principle of giving absolute priority to the worst off, you have what has come to be known as the black hole problem. You may have people who are very badly off, very expensive to treat, and able to get very little benefit from treatment. And if you really believed in absolute priority to the worst off, it seems that we would have to give all our resources to those people who are very badly off, even if this gave them very little benefit. And this wouldn't be considered cost-effective in other ways. So people who use Rawlsian-type ideas have tended to hang on to this notion of giving priority to the worst off, but not absolute priority.

So a cynic listening to this might say: well, these are very difficult problems, but philosophers haven't found a way to answer them. They just present us with further problems. They reveal to us the consequences of certain strategies for allocating resources, but where are the solutions?

Well, I may well be one of those cynics myself. I think it's going to be very hard to demonstrate that one of these is the correct approach: that we should be utilitarians, or that we should believe in equality, or in priority to the worst off.
Some philosophers think that if we only think hard enough, we're going to be able to get to the single right answer. Others, such as Norman Daniels, have said that there are incommensurable values here, and what we need in the end is a procedure that everyone can sign up to, rather than a set of principles. So Norman Daniels has what he calls the notion of accountability for reasonableness. And this is very interesting: it has been adopted by NICE in this country. So they say that they're not these crude, vulgar utilitarians; they're not just trying to maximise QALYs. They're taking other things into account, for example the severity of illness. They would be much more likely to approve a drug if it cured a severe illness than if it gave a mild benefit to very many people cheaply. And you can imagine a situation where you get the same QALY value by treating lots of people and giving them a small benefit as by treating one person and giving them a big benefit. NICE is prepared to take severity into account. So it's not crudely utilitarian, but what it tries to do is have a procedure where all these values and reasons are laid bare transparently, so it can explain its reasoning to other people. And if others think it has gone seriously wrong, it is subject to challenge.

So is that a kind of casuistry, where you debate the particular case rather than impose a principle that fits every case?
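To make the 'crude, vulgar' maximisation concrete, here is a minimal Python sketch of allocating a fixed drug budget purely by QALYs per pound, the baseline that severity weighting is meant to correct. The treatments, costs and QALY figures are invented for illustration; NICE's actual process is the deliberative one described above.

```python
# A sketch of crude cost-effectiveness maximisation under a fixed drug
# budget. All names and figures below are invented for illustration.
treatments = [
    # (name, cost in pounds, QALYs gained across the patient group)
    ("severe-illness cure", 800_000, 40),
    ("mild benefit, many patients", 500_000, 50),
    ("marginal new drug", 700_000, 10),
]

def allocate(budget, options):
    """Greedily fund treatments by QALYs per pound until the budget runs out."""
    chosen = []
    for name, cost, gain in sorted(options, key=lambda t: t[2] / t[1], reverse=True):
        if cost <= budget:
            budget -= cost
            chosen.append(name)
    return chosen

print(allocate(1_500_000, treatments))
# -> ['mild benefit, many patients', 'severe-illness cure']
```

Shrink the budget to £800,000 and the pure maximiser funds only the cheap mild-benefit drug, leaving the severe-illness cure unfunded even though it would fit the budget on its own. That is exactly the kind of outcome a severity-sensitive procedure would override.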
There's a type of ideal of reason on which we should have a principle that we can apply to every case. This is the holy grail of philosophy. Ideally, we want a principle that is as small as possible, maybe two variables in one relationship, with universal coverage. But we all know, really, that we're not going to get anything like that. It would have to be much longer, with many more variables, exceptions and so on. So in the end you find that the principle has so many exceptions, it's no longer a principle. And what we're doing, really, is appealing to a whole range of values and seeing how weighty they appear to us. And quite often all you can do is hope that other people share the same view about how weighty the different values are. It's a form of case-by-case analysis. There is room for principles, but nothing is going to be determined fully by principles; you can appeal to all sorts of different types of considerations. But I think it has something in common with legal reasoning. If you think about a court case: very often a case is decided because there's some very clear principle of law, and it's all cut and dried. But most of the time, cases go to court precisely because there isn't a clear principle of law. Both sides think they've got an arguable case, and very often you can't really predict which way the judge is going to go beforehand, although afterwards the judge has to make it look as if this was the only possible decision he or she could have come up with. So they provide a set of reasoning.
They employ new distinctions that no one had ever thought of before, and they find a way of getting to an answer which is, in a sense, objective, and it creates a precedent for future cases as well. It's rational and it's transparent, but it's not applying a single principle in order to determine an answer. And I think a lot of reasoning in life is much more like the legal model than the principle model of decision-making. Health care allocation, I think, is like that too.

Well, is there a role for the philosopher in that world, then, where you're just presenting evidence, presenting arguments on either side? What can we do as philosophers?

There's a lot we can do, though maybe not what we thought we could do. We probably thought we could solve the problems by coming up with the best theory. I no longer have any confidence that this is what we're going to be able to do. But we can clarify patterns of reasoning. We can make clear what values are involved. We can show what logical relations different values may have to each other: that if you believe in one thing, then you shouldn't believe in something else. Now, I wouldn't want to say that only philosophers can do this, because other academics, intelligent people, can do it to some degree, and they already do. But we have the advantage that this is, in a way, all we do, and in this area, like many other areas, philosophical training is going to be very helpful.
But, and I've said this many times, and I'll say it again, I'm sure: in this area of public life, what we need are people with philosophical skills, not philosophical theories.

Jonathan Wolff, thank you very much.

My pleasure.

For more information about Bioethics Bites, go to www.practicalethics.ox.ac.uk or to iTunes.