This is the first in the series of mini podcasts, Common Arguments, and in this one I'm going to be looking at two very common arguments. So welcome to everyone here, and welcome to everyone watching or listening to the podcast.

There are some arguments that are used time and time again against advances in science, and especially biotechnology. 'It's not natural' is one that you'll hear often. 'It's disgusting' is one that has actually started to come up much more recently in the States. 'It's too risky' is one that'll come up a lot. And finally, 'it's a matter of opinion'. I'm going to look at the first two arguments, 'it's unnatural' and 'it's disgusting', in this podcast, and then we're going to look at 'it's too risky' and 'it's a matter of opinion' in two separate podcasts after this.

So let's start with 'it's not natural'. Louise Brown, born in 1978, was the first ever test tube baby, and many people were worried, hugely worried, about her birth, because test tube babies just seemed unnatural. James Watson and Max Perutz, both Nobel Prize winners, expressed fears of deformed babies who might be the victims of infanticide. The thought was that in the test tube the babies would become deformed, and people would then want to kill them. Many obstetricians wondered who would care for the babies if the experiment with nature went wrong. And Jeremy Rifkin, an American bioethicist, was concerned about babies growing up as specimens, sheltered not by a warm womb but by steel and glass.

So this idea that something's unnatural really worried people. But we all know the punchline of this particular story: thousands of test tube babies have now been born, and they have no greater chance of defects than babies born in the ordinary way. So even if test tube babies are somehow unnatural, that doesn't suggest there's anything wrong with them. And I apologise for the lack of the 'e' there.

So many people believe that something's being unnatural is a black mark against it. But what is it for something to be unnatural? And why is being unnatural a bad thing? Well, let's have a look at three possible meanings of 'unnatural'. Firstly, there's 'violates the laws of nature'. Secondly, there's 'artificial'. And thirdly, there's 'man-made'.

If we read 'unnatural' as 'violates the laws of nature', the 'it's not natural' worry actually becomes the 'it's too risky' worry; it's not a separate objection in this case. And when we look at it simply as 'it's not natural', i.e.
'it might violate the laws of nature', we can put our minds at rest, because actually we can't violate the laws of nature. The laws of nature govern us; we don't interfere with them. So we can really put our minds at rest if that's how we think of 'unnatural'.

To read 'unnatural' as 'artificial', on the other hand, doesn't seem to generate a problem either. I mean, artificial roses, OK, but they might be made of real silk. Artificial intelligence is real intelligence. Artificial material is made of real fabrics.

Perhaps, therefore, we should read 'it's not natural' as 'it's man-made', or 'it's influenced by man'. And in a way that makes sense, because we do think of things that are not made by man, things that are natural in that sense, as attracting a premium: we pay more for things that are not man-made, for example. But why is being man-made, or influenced by man, enough to make something bad or immoral?

It's true that some philosophers, many philosophers in fact, believe that only human beings can be immoral. Animals can't be immoral: when your dog steals the meat from your table, it's only acting naturally; it's not being immoral. It's only us who can do something immoral. But by the same token, if you believe that, you're also going to believe that only human beings can be moral. So it can't be this that makes man-made things bad.

And there are plenty of things made by man that don't seem at all immoral or bad. Anaesthetic in childbirth, for example: is that a bad thing? Is that an immoral thing? What about vaccinations? They've saved the lives of millions of human beings; what's bad about those? Yet they're man-made. Email? Now, here I'm slightly open: I think it possibly could be bad, but it's certainly not obviously bad. And I'm sure you can think of many, many man-made inventions that are not bad.

There also seem to be plenty of things that are not made by man that are bad: earthquakes, disease, tsunamis, cannibalism. I mean, there's plenty of cannibalism in nature, quite apart from man. So being man-made seems neither necessary nor sufficient for being bad.

So you'll often hear that something is unnatural and therefore bad, and I've just mentioned a few of the sort of things people say this about. Women having babies after the menopause: it's unnatural, therefore it's bad. Engaging in genetic engineering: it's unnatural, therefore bad. Creating life forms like Synthia that don't exist in nature: unnatural, bad. Transplanting pig organs into human beings
(xenotransplantation): again, unnatural, bad. And you should be sceptical. These things might be bad, but not because they're unnatural. We haven't yet found a single reason for thinking that because something is unnatural, it's bad.

So let's look instead at the idea that something's disgusting. Intuition plays a large role in our decisions about whether something is or isn't morally acceptable. Some people think that if we feel something is wrong, it must be wrong. And there's certainly some truth in that. I mean, if, when you're listening to my arguments, you're thinking 'that doesn't sound quite right', I would advise you that that intuition is possibly a good one and you ought to follow it up.

Well, the American ethicists Leon Kass and Alta Charo have said that we shouldn't make moral recommendations on cloning on the basis of reason and logic. They were talking to the ethics committee on cloning that was set up by President Clinton, and they argued against making recommendations on that basis. What they were arguing is that emotional responses are much more important when we're discussing ethical questions in the context of politics. So what matters is your intuition, not your reason or your logic.

The 'yuck' theory of morality, as I'm going to call it (so: yuck, therefore it's immoral), says that if something is disgusting, then it's wrong. So if you find it disgusting to think about transplanting pig organs into humans, or eating in vitro meat (meat that's been created in the laboratory), or using eggs from aborted foetuses for IVF or for research purposes, then, according to this theory, you should also think of these things as immoral.

But consider reading in a tourist guide that you shouldn't take photographs of certain people, because they believe that in recreating their image you're stealing their soul. Well, OK, if you read that, you'd probably refrain from taking photographs: why would you hurt these people? But this doesn't mean that you would accept that in taking photographs you are stealing their souls. Most of us would believe that the fears of these people are based on ignorance. And this is the problem with intuitions: intuitions are often grounded in ignorance.

It's understandable for people to recoil from things they don't know anything about, but such intuitive recoils are not always well grounded, and the only way to discover whether our intuitive fears are well grounded is to subject them to rational scrutiny. And to subject our intuitions to rational scrutiny is to try to pin them down.
Firstly, we need to be able to recognise an intuition: OK, we think this is disgusting. Then we need to find out, if we can, why we hold it. So we need to look at it and subject it to scrutiny: ask why we hold it, and ask whether our reason for holding it is or isn't a good one.

Often, when we try to pin down an intuition, we find we can back it up with good reasons. But now we have an argument for whatever it is we were saying; we're not relying simply on our intuitions. On other occasions, we find ourselves unable to pin down our intuitions. Sometimes this means they dissolve: we see they didn't have any grounding at all. And on other occasions we're left with the disturbing feeling that something is wrong or right, even though we remain unable, despite our best efforts, to say why it's wrong or right.

If we're in this last case, the rational thing to do is to keep an open mind, to keep trying to pin our intuitions down, and to listen hard to those who believe themselves to have arguments both for and against.

So if you find yourself, on hearing of a scientific innovation, thinking 'that's disgusting', don't be tempted immediately to infer that it is immoral. It might be immoral, but almost certainly not because it's disgusting.

In the next two podcasts, we're going to consider the arguments 'it's too risky' and 'it's all a matter of opinion'.