This is Bioethics Bites with me, David Edmonds, and me, Nigel Warburton. Bioethics Bites is made in association with Oxford's Uehiro Centre for Practical Ethics and made possible by a grant from the Wellcome Trust. For more information about Bioethics Bites, go to www.practicalethics.ox.ac.uk or to iTunes U.

What can science tell us about morality? Many philosophers would say nothing at all. Facts don't imply values, they say. You need further argument to move from facts about us and about the world to conclusions about what we ought to do. For example, most humans are altruistic. They genuinely care about the well-being of friends and family and, to a lesser extent, even of strangers. They'll give money to charity to help people they've never even met. Suppose science gives us a compelling scientific explanation for why we're altruistic. That doesn't tell us whether we should be altruistic. Professor Pat Churchland is a well-known US scientist based at the University of California, San Diego, who works at the intersection of neuroscience and philosophy.

Pat Churchland, welcome to Bioethics Bites.

Thanks so much, Nigel. It's a pleasure to be here.

The topic we're going to focus on is what neuroscience can tell us about morality. I wonder if we could just begin by sketching your view of the neural basis of morality.

It's a beautiful story.
There was a major shift in brain organisation and structure as mammals evolved. There were a number of changes that were really important in the mammalian brain, one of which was that it was organised to see to the care and nurture of offspring. In the case of reptiles or frogs or snakes, for example, basically what happens is that the female lays the eggs and goes on her merry way. In the case of mammals, because the young were born very immature, circuitry was in place to ensure that when there was separation of the infant from the mother, the mother felt pain and so did the infant. They felt pleasure and well-being when they reconnected. And we know something about that circuitry, and we know that oxytocin, although by no means the only important molecule, is in a certain sense at the hub of all of that.

Just to clarify, what is oxytocin? What does it do?

Oxytocin is actually a very ancient molecule, and it's found in all reptiles and probably almost all animals. But in reptiles it served very different purposes. It had to do with smooth muscle contraction for the release of eggs, for example. In mammals it was put to rather different uses, and its uses in the brain had to do with attachment between parent and offspring. This shift in the mammalian brain was kind of like an extension of care of oneself, seeing to one's own temperature and safety and food, to seeing to the temperature and safety and food of others.
I can see how that works with the care of one's own offspring, and there's obviously a genetic component to that in terms of looking after one's genes and so on. But how does that extend to others who aren't related?

Yes. So this has been known for a while, that is, the role of oxytocin in parent-infant bonding. But a discovery about ten years ago with regard to mate attachment suggested something that gave us insight into how care and attachment can extend beyond offspring to unrelated others, to friends and so forth. And here was the story. There are a number of kinds of voles. And as you know, voles are rodents that look a little bit like mice, except they have very short tails. Now, I'm going to contrast for you two kinds of voles: there are prairie voles and there are montane voles. And in prairie voles, something very surprising and unusual was observed, and that is that after the first mating, the male and the female bond for life. The male guards the nest, the male helps take care of the offspring, and they just like to hang out together. By contrast, let's talk about the montane voles. There is no mate attachment. They mate and then they go on their merry way. So when this behaviour was observed, Sue Carter and a number of other neurobiologists asked this question: what's the difference in the brain, in the receptors to which oxytocin binds, or the receptors to which a very similar molecule, vasopressin, binds?
The density of receptors in very specific places in the prairie vole brain was different from that of the montane vole brain. And they found, when they did the manipulations that you can imagine, for example blocking those receptors, that they could change the behaviour. Now, when you step back and kind of look at what was likely the genetic change that takes you from caring for offspring to also caring for and being bonded to mates, the second one is probably quite small, and it has to do with the particular ecology of prairie voles. After all, they live out on the open prairie, where they are susceptible to predation by kestrels. That's not true, of course, for montane voles. The insight that followed this was really important, and that is: you can make a small genetic change, having to do with small aspects of circuitry or small aspects of receptor density, and you can get things like mate attachment or kin attachment, attachment to friends and others in the group, and possibly even extending to individuals that are strangers or to individuals that are not even within your own species.

What would be the advantage of an individual cooperating with people who aren't close relatives?

Well, that's a very good question. And we can ask that same question in the context of baboons or chimpanzees or wolves, for example. A wolf pack can much more easily bring down large game like a caribou or a moose. A wolf pack jointly can drive a grizzly bear off a kill.
Your food resources are greatly enhanced compared with trying to make it on your own. A lone wolf, a lone coyote, a lone baboon, a lone chimpanzee doesn't last very long. They can't get the food resources and they're much more vulnerable to predation. Now, it's so interesting to me to think about it in this way, because Darwin clearly understood this point about the advantages of group living. But if you look at Hume and at Adam Smith, it's there. And if you look at Aristotle, when he talks about humans as social by nature and about the advantages of being social by nature, it's there as well. And so what I think is really kind of amazing about this time in science is that these earlier hunches about what was likely to be the case are turning out to indeed have a real biological basis.

I can imagine someone listening to this saying: why are you talking about animals? Human morality isn't simply a matter of group cooperation in the way it is with chimpanzees. It's got a kind of self-reflective aspect to it and a cultural aspect to it. This isn't morality; this is a long way back in the past. These are all the precursors of morality.

There's a fair point there. Part of the reason, of course, we're interested in the behaviour of other primates is because the human brain and the chimpanzee brain, for example, are so very, very similar. I mean, the brains of all mammals are remarkably similar in structure and organisation.
But of course, as you say, in the case of humans, culture is hugely important. Now, you don't want to just focus on culture as it is right now. What you also want to think about is the development of cultural institutions over time. So if you think about humans as they began in Africa about 250,000 years ago, those early groups, in their social life and social organisation, were probably pretty similar to chimpanzees and baboons. Now the difference, of course, is that humans seem to have this capacity over time to evolve artefacts and tools, and also to evolve social institutions. And once humans began to congregate in very large groups, largely made possible by the advent of agricultural techniques, then we begin to see a sort of new level of problem solving: problem solving having to do with resource distribution, private property, how to deal with miscreants, how to deal with inheritance and so forth. But those institutions came about, I guess you'd say, as a result of collective problem solving about what will work and what won't work. And if I may just go back to Hume and Adam Smith again, but also to Aristotle, all of them understood the importance of institutions in our social life, and that if you have really terrible and corrupt institutions, it's going to reflect badly on the prosperity and well-being of everybody.

Now, you've told a story about the origins of what I'd call the precursors of morality. You seem to be saying that's what morality now is.
So you're moving from a description of facts about the past and facts about neuroscience to saying how we engage with each other now and how we ought to engage with each other now.

Interesting. What we might want to talk about just briefly is what we mean by morality. I may have a slightly looser conception, one which is, again, very dependent on Hume. My conception sort of sees social behaviour on a spectrum. At one end there is social behaviour which has to do with etiquette and manners, with which hand you hold your fork in, practices that help perhaps grease the wheels of sociality but which are not really, really serious. And then at the other end, you have social practices that bear upon really, really serious aspects of social life. Where exactly you say morality begins and etiquette and manners shade off is, I think, not really as important as the fact that we recognise what the prototypical cases of moral issues are. Experimental psychology has shown that all of our workaday categories are radially structured, meaning that they have prototypes at the centre, where we agree on what counts as an instance of that category, and that, with declining similarity, the category shades off to a fuzzy boundary. Carrots and potatoes are prototypical vegetables. Mushrooms and parsley? Well, some people count them as vegetables and some people don't.
And I think the same is true of many concepts that we use in the social domain: what it is to be honest, or what it is to be fair. These we understand in terms of prototypical cases, and we may disagree at the boundaries, and there may be no right answer at the boundary as to whether something is really an instance of moral behaviour or whether it's merely social and conventional. If you think about morality in that way, as having to do with very serious matters and as being a workaday concept with fuzzy boundaries at the edge and prototypes at the centre, I think it helps us see why there can be differences amongst groups as to the way they handle certain kinds of issues with regard to such things as fairness. It also helps us see the important role of social problem solving and of the development of institutions to answer those problems; it helps us understand how that works as well.

But right at the heart of morality is this notion that we have a concern for other people's interests. And the way you described it, that concern is really just a concern for our own genes. So for many people, that wouldn't be morality at all.

Let's think about this in a slightly different way. It's not that an individual is maximising his self-interest in anything like a straightforward, conscious way. When we do things like fall in love, it's not because we explicitly say to ourselves, 'Oh well, you know, I must get on with propagating my genes.' That's all kind of background stuff.
The evolution of the brain did this really interesting thing. It took the circuitry for self-interest and it expanded it, so you genuinely do care for others. So you kind of want to distinguish between the sort of background, ultimate cause and the proximal cause that motivates people in the here and now.

You've mentioned David Hume. David Hume famously said that you can't move straightforwardly from a description of the world, the way the world is, to the way it ought to be; that there has to be some kind of implied evaluative premise to get from any descriptive account to a moral account.

That's right. When you go back and read him, of course, he's always a lot more subtle than you think. When you go back, you realise that in that famous passage, what he's actually doing is lambasting clerics in particular, who think that there is a simple inference that takes you from something that is the case to something that ought to be the case. Let's just take a hypothetical example: small boys do work as chimney sweeps; therefore, small boys ought to work as chimney sweeps. And Hume thought that any kind of inference that was simple and direct like that was stupid. On the other hand, Hume was a naturalist about ethics. Considerations of self-survival and the moral sentiment, that is, care about others, were motivating. And that meant that those were, as it were, facts about the nature of the species in virtue of which certain things ought to happen.
So Hume, as a naturalist, of course, was quite willing to see that there is a way of getting from what is the case to what ought to be the case. If he had been alive today, he would very likely have been fascinated by the developments in neuroscience.

But I'm still intrigued to know which developments can actually shed light on morality, which reflect on our understanding of what we are in relation to how we ought to be.

I think that's a wonderful question, and I don't think there is very much in neuroscience right now that can really bear upon those really difficult questions about the fairness of an inheritance tax, or about the obligation to donate an organ, or when a war is a just war, and so forth. These, I think, are questions where neuroscience, certainly today but also in the foreseeable future, is really not going to have anything to say. There are some other examples, though, where I think a discovery in neuroscience makes a relevant contribution. So, for example, the understanding that in children the prefrontal neurones are not well myelinated and aren't really well developed until early adulthood has had an impact on the courts' decisions about trying capital crimes in children. The frontal structures are known to be very important in executive control in general, and that means they're important in impulse control, in envisaging the consequences of one option versus another, in not being overwhelmed by your passions.
And in the case of youthful offenders, 18 and under, their brain lacks the maturity to be able to manage impulse control in quite the same way that an adult can.

Do you think that neuroscience could ever have an impact on our understanding of specific moral issues?

It's hard for me to see that it can, because I think many of the problems that we deal with in the moral domain involve incredibly complex social problems. And so for something like 'when is a war a just war?', I can't really quite imagine that there would be any discovery about the neurobiology of humans that would help us to understand and to answer that question.

Imagine we could put oxytocin in the food supply somehow, and it produced a big effect on people: they're much more likely to cooperate, much less likely to be violent to each other. That would seem to be the direction that this sort of discussion of neuroscience is going in. Should we do it?

Well, we don't really have the power to intervene. One problem is that oxytocin plays an important role in all aspects of the body, and in females it regulates oestrus. So if we put it in the food supply and we dysregulated female reproductive functions, then I think you have a problem. There are experimental procedures where people are administering oxytocin. I would say the results so far are really complicated, in the sense that we don't really know what we've got.
But the other, really more simple, point, I think, is that we have a fairly good idea of how, at a behavioural level, to enhance cooperation and to reduce violence. And what we sometimes need is the will to do it. The shortcut through a particular molecule is, I think, unrealistic technically, and it doesn't really make as much sense as doing the other things behaviourally, which we know we can do. There are lots of studies regarding children and how to enhance cooperation. Role playing in simple games is one of the easiest and has a big effect. So I think there are really good behavioural ways of achieving those ends. But it's just not likely in the foreseeable future that something like spraying oxytocin in a room is going to give you what you want.

Pat Churchland, thank you very much.

Thank you, Nigel. It's been a pleasure.

For more information about Bioethics Bites, go to www.practicalethics.ox.ac.uk or iTunes U.