So welcome to this St Cross ethics lecture. We have two of these per term, for those of you who haven't been before. And I'm actually filling in for Matthew Hussey, the love guru who was supposed to talk tonight, but instead you've got genetic enhancement or extinction.

I'm particularly interested in this talk in our human nature and our biology: how it contributes to cooperation and various forms of moral behaviour, but, more importantly, the limitations that are inherent in our biology and what these represent in a modern world.

About a year or so ago, Stephen Hawking, the famous physicist from Cambridge, ran a poll on the Internet and asked: how can the human race survive the next hundred years? And at the end of the poll, they asked him: what do you think the biggest risks to humanity continuing to exist are? And he said, well, the traditional risks still exist, of asteroid strikes or radical epidemics from naturally evolved viruses. But now we have the possibility of the intentional engineering of viruses, climate change, and still nuclear war. And he finished his interview with this statement. He said: there's a sick joke that the reason we haven't been visited by aliens is that when a civilisation reaches our stage of development, it becomes unstable and destroys itself. The long-term survival of the human race over many centuries will be safe only if we spread out into space, because this planet has a limited lifespan.
And then to other stars. But this won't happen for at least 100 years, so we have to be very careful. Perhaps we must hope that genetic engineering will make us wise and less aggressive.

Now, this seems fantastic: to claim that genetic engineering could make us wiser and less aggressive, and that somehow the fate of humanity lies, at least in substantial part, within our own biology. But I want to talk up the plausibility of Stephen Hawking's claim. In fact, I'll argue that it's one of the most pressing practical ethical problems that we face today: that is, our own nature represents the greatest threat to humanity this century.

So I'm going to talk about whether we're fit for the future. And by fit, I mean the fit between our nature, which I take to be dispositions to act and to be in various ways, and the external world, whether the social world or the natural world, so as to realise some valuable outcome. In this talk, that outcome is just going to be survival. Traditionally, evolutionary biologists have talked about evolutionary fitness: the fit between our nature and our survival and reproduction.

I'm going to argue that this century we are about to enter a Bermuda Triangle of extinction. On one corner is our radical technological power. On another corner is liberal democracy. And in the last corner is our own moral nature.

Now, the first corner: our radically enhanced technological power.
We have newfound abilities to destroy ourselves this century. Indeed, Martin Rees, the astronomer royal and former president of the Royal Society, said that we stand a roughly 50% chance of annihilating ourselves this century. And various people have written books on this, including Richard Posner, and those who have looked at it in a scholarly way have put the risk of us destroying ourselves this century at between 20 and 50%.

Last century, only a handful of people could have brought about this kind of destruction. During the Cold War, only a small number of people had the power to launch a nuclear attack. However, today, many more people have that power. And indeed, I'll argue that in perhaps a decade or two, hundreds of thousands, if not millions, of people will have the capacity to, if not destroy the world, create enormous devastation. It only takes one of those people to realise that potential for there to be an unimaginable disaster.

Indeed, technology continues to advance exponentially. Moore's law says that computing power doubles every two years. Something similar is probably also true of genetic technology, and indeed of all forms of advanced technology. So we're really at the beginning of the very steep part of the exponential technological curve. And some people have predicted the so-called singularity, halfway through the century, where artificial intelligence will radically outstrip human capacities.
Whether or not that's true, it's certainly going to be true that we will have much enhanced powers of self-destruction.

You're all familiar with the ease with which we can harm people. A couple of years ago, a student at Virginia Tech killed 32 people in a matter of minutes with two handguns. However, today it's possible that many countries could have access to poorly secured stockpiles of enriched uranium. Richard Posner, a US federal judge, says there's enough plutonium outside secure military installations to furnish the raw material for 20,000 atomic bombs. It's only a matter of time before a terrorist organisation or rogue individuals access that material and detonate a nuclear device.

Even more scary, however, are biological weapons, because these are much cheaper and easier to produce. The mouse at the top there is infected with mousepox. About five or six years ago, researchers in Canberra, in Australia, were looking at ways of creating a new strain of mousepox to control the mouse plague by rendering mice infertile. A very laudable aim: they didn't want to kill the mice, they just wanted them to become infertile, and to control the plague that way. However, they unexpectedly produced a super-strain of mousepox that was 100% lethal, much, much more devastating than the natural version. They published the findings on the internet. It was realised that the modification to the interleukin gene could easily be made to human smallpox, which was eradicated in the 1970s.
At its peak, human smallpox was the most deadly disease; it has killed more people throughout human history than any other disease. At its peak it was 20 to 30% lethal. And this is the sort of picture that was routine and common last century, before it was eradicated. But you could easily engineer a super-lethal strain of 100% lethality using the sort of genetic modification that the scientists in Canberra produced.

Down the bottom is an image of poliovirus. This was artificially synthesised recently from small units of DNA that are easily available and could in fact be ordered over the Internet. So in a matter of a decade or so, it won't be beyond the realms of imagination, indeed it will be a pretty simple possibility, for an individual to order components over the Internet and, in a small laboratory, construct a smallpox virus that would be 100% lethal.

Now, I don't know if any of you have seen the film 12 Monkeys, where this scenario unfolds to destroy humanity. But it wouldn't be very difficult to release smallpox at a number of major international airports. And because it takes two weeks for the signs to become apparent, and it's rapidly spread by aerosol, you would have a major pandemic that would be very difficult to control.

Okay, so that's our potential. So let's now move on: what are people like, in virtue of that power? Well, you're all familiar with sociopaths and psychopaths. This is the extreme end of human moral limitation.
Here's just one story, but you'd be familiar with lots of these stories, Ted Bundy and so on. In 1993, two bodies were found on a road in Texas. One was a boy, aged 14, who'd been shot. But the 13-year-old girl had been stripped, raped and dismembered; her head and hands were missing. The killer was Jason Massey, 20, who had decided he was going to be the worst serial killer Texas had ever seen. He tortured animals and revered other killers. He was nine years old when he killed his first cat. He killed thousands more over the years, and even six cows. He had a long list of potential victims, and his diaries revealed fantasies of rape, torture and cannibalism.

When you look at some of the quotations from parents of psychopaths such as Massey, you find these sorts of things. "I've had to work so very hard to distance myself from my daughter emotionally. I would have done anything to make it right. My husband and I have done everything in our power to help her. We can do no more. I still love her, but I know that she is who she is, and that just about kills me." And from another parent: "I also have a son, 18 years of age. He's exhibited problems since childhood. He's now off to a very good college and is extremely bright, which actually makes it more lethal." And that's typical: psychopaths are often very intelligent and very personable. "He just hasn't been right since birth. My therapist said I did everything I possibly could for him, including therapy since the age of three."
When you look at the biology of this, there is at least some genetic contribution to antisocial personality disorder and psychopathic personality disorder: five times more common amongst first-degree biological relatives of males with the condition, ten times more common in first-degree relatives of females with the condition. And twin studies have revealed a significant genetic contribution. These are not genetic disorders, but genes and biology play a role. As I'll go on to argue, what's likely is that people who have a genetic disposition to these sorts of personality disorders, given the right environmental circumstances, are highly likely to develop them.

75% of inmates in English and American prisons have antisocial personality disorder. 25% of inmates are psychopaths, and psychopaths are highly likely to be recidivist criminals.

Okay, so that's the extreme end of the spectrum. But in many ways it's even worse than that when you consider ordinary folk. Now, here I'm going to go on and talk about ordinary morality, what I call our evolved moral psychology. I don't mean the moral theories or justified morality that philosophers talk about, high-minded theoretical ideals like Kant's formula of humanity, or utilitarianism, or virtue ethics. I mean the sorts of moral dispositions, moral concepts and behaviour that characterise ordinary people in their ordinary lives: their sense of justice and fairness and desert.
For example, many people ordinarily believe that there's some morally relevant distinction between what they do and what they fail to do. I'm not going to talk about whether there really is such a distinction; I don't believe there is. That's what I call justified morality. I want here to look at what the folk believe and how they act.

Now, our evolved moral psychology is a feature of our evolutionary history. Essentially, we have a set of dispositions that hasn't changed over the past couple of hundred thousand years. We essentially have the biology and psychology of a hunter-gatherer, of Pleistocene man. During that time, humans had to compete for resources. By the time of adulthood, they were fit to survive on their own resources. It was easier to harm than to benefit, and morality evolved to facilitate cooperation. It promoted inclusive fitness: the ability of a tribe or group to pass on its genes to the next generation.

We have a very limited morality and limited moral dispositions at this folk level. We have, first of all, a bias towards the near future. The tools and other means at our disposal in our long pre-scientific history enabled us solely to affect our immediate environment in the immediate term, so we're much more focused on, and biased towards, the near future. We also have biases towards small numbers of people. Indeed, I'm involved with
Russell Powell and Steve Clarke on a potential project with Robin Dunbar, who's famous for the Dunbar number. There are in fact a number of Dunbar numbers, but the Dunbar number of 150 is meant to be the basic unit of human cooperation. If you look at societies across the world, and even today at Facebook and modern human networks, we tend to cooperate with a maximum of about 150 people. We have about five very close people, and fifteen who we're moderately close with; we recognise, I think, about fifty names or fifty faces; and we cooperate with 150. That's how humans have evolved, and that's how we continue to behave.

So we have inherently limited abilities to cooperate with each other. Hume has a famous example of a rowing boat: if one person in the boat fails to row, the boat will go in circles, and it will be obvious that that person isn't cooperating. But when humans are in larger boats, or in larger groups where their contribution is not obvious to all, they tend to free ride: they tend not to contribute, and to draw the benefits of others' efforts. Indeed, distrust of strangers is a typical reaction to free riding. Xenophobia and distrust of strangers protected us from being exploited by other, free-riding human beings. So humans have a capacity to be altruistic, but they also have a capacity to free ride.
Our altruism extends to our family and friends, to individuals who we personally know, grow to like and are concerned about, and to those it's beneficial to cooperate with. And as I said, there's an evolutionary reason for this: a disposition to do favours indiscriminately to strangers would expose us to too great a risk of being exploited by free riders. So suspicion of strangers has protected human beings in small groups from being exploited.

So here are some basic features of our history. It's been easy to harm people, much easier to harm them than to benefit them. So what's morally most important at a folk level are proscriptions against harming: we have very strong rules against harming, but no strong rights to aid in any form of folk morality. We believe also that we're primarily responsible for what we cause, in proportion to our causal contribution. And this grounds the belief in ordinary morality that acts are more important than omissions, even if there is no justifiable defence of that belief. We've had a limited capacity to affect the distant future, so we're psychologically myopic, disposed to care more about what happens to us in the near future, in excess of what the probabilistic uncertainty of the future would justify.

We lived in small groups, and we're capable of empathising and sympathising with only single individuals.
It's very difficult to empathise and sympathise with collectives in proportion to their numbers. These are all features of our psychology.

Now, maybe you don't believe this. We're much better than I'm suggesting; we're not psychopaths; we cooperate all the time; and, you know, man is made in the image of God, he's really a great thing. Well, here is some evidence that we're pretty limited. And what I'm going to argue is that there's no reason to be optimistic, based on our evolutionary history or on our inherent biology and psychology.

So let's look at aid, first of all. In 2008, only five countries had reached the very modest UN quota, set decades ago, of donating 0.7% of GDP to aid. Those countries were Sweden, Luxembourg, Norway, Denmark and the Netherlands, all northern European countries. The average for all OECD countries was only 0.47%, and the two largest economies did the worst, Japan and the US giving only 0.2% to aid.

One staggering figure, which Dara drew to my attention from Toby's website, and I think this is correct, as I'm doing this from memory, is that the US has spent $3 trillion already on its war on terror, and it has spent, over the last 50 years, $2.1 trillion on aid. So we're much more prepared to go to war, obliterate innocent enemies and cause vast amounts of collateral damage and instability than we are to provide aid.
I mean, that's a staggering figure, and I think it's objective evidence of what humanity is like.

So the facts are these. Behind our unwillingness to aid other countries, despite our enormous ability to improve things in the developing world, is the inherent acts/omissions doctrine: the belief that we all inherently have that there is something more important about what we do, and that we're not responsible for what we fail to do.

Next, limited altruism. Americans value a non-American life in the poorer nations at 1/2000th of the value they put on an American life, a figure from Cass Sunstein. And the sheer numbers of people in the developing world to whom we would have to respond can also present an obstacle. As I mentioned, we're only capable of empathising and being compassionate with a small number of people. We're unable to vividly imagine the suffering of 100 people, even if they're in sight, let alone a thousand, let alone a million. This insensitivity to the number of sufferers causes a vast underestimation, in our own imagination, of the amount of suffering in the world.

And it's getting worse. Inequality is not reducing; it's increasing. The ratio between the per capita incomes of the richest and poorest countries was 3 to 1 in 1820, 11 to 1 in 1913, 35 to 1 in 1950, 44 to 1 in 1973, and 72 to 1 in 1992. And I'm sure it's doubled since then.
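As a rough arithmetic check on that trend, here's a sketch computing the implied average annual growth of the rich-poor income ratio in each interval. The computation is mine, not the lecture's, and the 1820 and 1992 endpoints are my reconstruction of garbled audio, following the standard historical figures the lecture appears to be quoting; treat them as assumptions.

```python
# Rich-poor per capita income ratios over time. The 1820 and 1992
# endpoints are reconstructed assumptions, not confirmed lecture data.
ratios = {1820: 3, 1913: 11, 1950: 35, 1973: 44, 1992: 72}

years = sorted(ratios)
for y0, y1 in zip(years, years[1:]):
    # implied average compound growth of the ratio per year in this interval
    growth = (ratios[y1] / ratios[y0]) ** (1 / (y1 - y0)) - 1
    print(f"{y0}-{y1}: {ratios[y0]}:1 -> {ratios[y1]}:1, ~{growth:.2%} per year")
```

Over the whole span, the ratio grows at a bit under 2% per year compounded, which is what turns 3:1 into 72:1 in under two centuries.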
In 2000, the wealthiest fifth of the world's population accounted for 86% of the world's gross domestic product; the poorest fifth accounted for only 1%. The richest three individuals in the world owned as much as the 600 million people in the poorest countries. So these are just facts that express our moral nature.

Another crisis that we have caused, that is going to continue, and that won't be resolved, as the recent Copenhagen summit vividly demonstrated, is the environmental crisis. It's an example of the familiar problem of the tragedy of the commons, where herdsmen, for example, share a small area of pasture, and if they don't restrain their grazing, they'll overgraze the pasture, eventually exhausting it, and all suffer starvation as a result. Under such circumstances of limited resources and the need for restraint, people stop consuming only if they have good reason to believe that a sufficient number of other people will do so as well, and especially if this number will not be sufficient without their own contribution.

So when you've got a small number of people and an observable commons, people are more likely to know each other personally and to have concern and trust for each other. It's easy for them to keep an eye on each other and to detect free riders.
However, this is not the situation with the tragedy of the commons that is the environmental crisis, which is a global problem caused by huge states with millions of citizens who are anonymous to each other and have little reason to trust each other, and where it's easy for free riders and defectors to escape notice. There's little disposition to cooperate. As the number of agents involved in a cooperation problem grows, a stage is reached when the contribution of each agent to the total outcome becomes negligible or imperceptible. Then an agent can have no altruistic, or, even stronger, no self-interested reason to contribute. Nothing that you do will ever perceptibly affect the environment. That's why many people don't engage in restraint. The contribution of each of us to environmental degradation is virtually imperceptible, and so the prohibition of common-sense morality against directly causing harm, which governs most of our moral actions, doesn't kick in.

What could make us cooperate? Some sense of fairness or justice: that it would be unfair to free ride on the efforts of others to restrain themselves. But this feeling of unfairness is going to be much weaker when many of the other parties are anonymous, and there is no concern for them because they're strangers.
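The group-size effect described here can be sketched as a toy public-goods model. This is a minimal illustration, not anything from the lecture: the payoff numbers and the pure self-interest rule are hypothetical assumptions, chosen only to show how each agent's share of the benefit of their own restraint shrinks as the group grows, until restraint no longer pays.

```python
# Toy tragedy-of-the-commons model. Restraining consumption costs the
# individual 1 unit; the benefit created by one agent's restraint is
# shared equally across the whole group. All numbers are hypothetical.

def marginal_benefit_of_restraint(group_size: int,
                                  benefit_per_restrainer: float = 10.0) -> float:
    """Benefit accruing to ONE agent from their OWN restraint:
    the shared gain their contribution adds, split across the group."""
    return benefit_per_restrainer / group_size

def will_restrain(group_size: int, cost_of_restraint: float = 1.0) -> bool:
    """A purely self-interested agent restrains only if their own share
    of the benefit they create exceeds what restraint costs them."""
    return marginal_benefit_of_restraint(group_size) > cost_of_restraint

for n in [2, 5, 100, 1_000_000]:
    print(n, round(marginal_benefit_of_restraint(n), 6), will_restrain(n))
```

With these hypothetical numbers, agents in groups of two or five restrain, while agents in groups of a hundred or a million do not: their individual contribution has become negligible, as the lecture describes.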
So these are the reasons why we fail to address the global problem of climate change and global warming. There is the uncertainty. The effects are going to happen in the distant future, and we have a bias towards the near future. The effects will be borne by people mainly in Africa and Southeast Asia, and we don't have strong altruistic dispositions towards strangers. And the level of welfare that we could sustain as a global community is considerably lower than the one we currently enjoy. If all 6 billion people were to reach the current standard of living in countries such as the UK, the consumption of resources and the release of waste products would be 12 times higher than it is today, which is clearly impossible for the world to sustain. So if we were really serious about sustainable lifestyles, we would have to cut down.

Are people going to cut down? The evidence says no. When you survey Americans, 52% wouldn't even spend an extra $50 per month per household to conform to Kyoto. And if you ask them to spend $100 more per month, only 11% would do that. So people are selfish; they don't care about strangers; and it's unlikely that we're going to address the climate problem through voluntary cooperation. Inherent biases and psychological states place us at great risk: they prevent acceptance of any justified morality, and they interfere with the motivating force of any accepted justified morality. It's uncontroversial that we should protect the planet. It's uncontroversial that we should improve the lives of people who are starving.
But we don't, because of our biology and our psychology.

Now, this is going to become increasingly significant as our technological power increases, for two reasons. First, the capacity to directly harm ourselves will vastly increase. And secondly, the capacity to do good in parts of the globe where we have so far failed to do good will also significantly increase.

Liberal democracy: the third corner of the Bermuda Triangle. So we've discussed our nature; we've discussed our technological power. The third corner that creates a triangle of heightened risk of extinction is liberal democracy. Liberal democracy is characterised by tolerance and liberal neutrality towards different conceptions of the good and different ways of living. It maximises the freedom of individuals and it's tolerant of multiple cultures, but it provides a poor basis for the kind of coordination necessary to deal with the environmental crisis. China removed lead from its petrol vastly more rapidly than any Western country. It's a poor basis for coordination, especially when you have small, overdetermined causal contributions that affect strangers in the distant future. It's also, obviously, tolerant of minorities. So at the moment things are fairly stable within liberal democracy, but you could expect these kinds of problems to increase as we continue to maintain neutrality between different conceptions of the good, different values and so on.
So how could we address this problem of the risk of extinction? We can change the natural world. We can change society, through enhanced surveillance and encroachment on privacy and confidentiality, as occurred in the US following September 11. We could change our psychology through moral education, by inculcating a form of justified morality and attempting to persuade people. But we could also attempt to use our knowledge of biology to influence our biology. I'm going to tell you a little story now about how it might be possible to do that. And while it might not be the most effective means at the moment, it may in the future be a very important means, and it's a means we shouldn't exclude.

So: we have a basic set of dispositions, a sense of justice or fairness, that we actually share with other primates. The most basic of these is the so-called tit-for-tat strategy. The idea here is that if someone does you a favour out of altruism, you should respond with gratitude and a desire to return the favour. If someone harms you, the proper response is anger and a desire for retaliation.

More sophisticated emotional responses are possible: remorse and feelings of guilt if you act wrongly by harming someone without good reason; shame if you're less successful than others in returning favours or retaliating against wrongs; pride if you're more successful in these respects.
Admiration and contempt for others who are successful or unsuccessful in these respects. 259 00:25:33,560 --> 00:25:38,030 And forgiveness, when you realise that someone is not responsible for a wrong, or shows remorse. 260 00:25:38,270 --> 00:25:42,250 These are primitive versions of the tit-for-tat strategy. 261 00:25:42,260 --> 00:25:46,040 That's why we feel these emotions and that's why we express them. 262 00:25:47,420 --> 00:25:59,360 These emotions, which make up tit-for-tat behaviour, are very bound up with inherent, innate concepts of desert and of justice. 263 00:25:59,690 --> 00:26:08,300 When you feel gratitude towards someone, you feel this individual deserves a good turn, and that it's just that she has a good turn. When you're angry at somebody, 264 00:26:08,450 --> 00:26:12,770 that individual is thought to be deserving of punishment or harm, 265 00:26:13,760 --> 00:26:15,620 and the treatment is thought to be just. 266 00:26:17,240 --> 00:26:27,170 Now we can test the effect of these senses of fairness, and how deeply they are written into our biology, through various games, 267 00:26:27,470 --> 00:26:34,730 economic games and cooperative games. One game is called the Ultimatum Game, and this game shows, I believe, 268 00:26:34,730 --> 00:26:40,640 that our moral dispositions not only have an evolutionary origin but, as we would expect, a biological basis. 269 00:26:41,330 --> 00:26:44,959 Now, the basic idea of this game, for those of you who are unfamiliar with it, 270 00:26:44,960 --> 00:26:50,600 is that there is a proposer and a responder, and the proposer divides a reward. 271 00:26:50,600 --> 00:26:56,139 So when this is done with chimps, you might have ten raisins, and there'll be two pots distributed in different ways. 272 00:26:56,140 --> 00:27:00,440 So for example, one set will have five and five, and the other set will have eight and two.
273 00:27:01,040 --> 00:27:07,009 And the proposer chooses which of these two distributions to have for himself and the responder 274 00:27:07,010 --> 00:27:14,419 by pulling it halfway to his cage. So the proposer can pull the eight raisins towards himself, 275 00:27:14,420 --> 00:27:19,940 and the responder will get the two raisins. The responder then can respond either by rejecting the offer altogether, 276 00:27:20,120 --> 00:27:23,180 so getting no raisins, or by accepting their share. 277 00:27:23,720 --> 00:27:30,080 When you do this with chimps, they'll accept even eight-and-two distributions without any sign of dissatisfaction. 278 00:27:30,410 --> 00:27:37,610 So chimps are quite prepared to accept unequal offers, and this is thought to be a reflection of their sharing of food in the wild. 279 00:27:38,060 --> 00:27:42,620 Human responders, however, typically reject offers that deviate much from five and five. 280 00:27:44,120 --> 00:27:55,070 So they reject the offer, even if that entails forgoing the reward altogether, as a way of punishing the proposer for unfairness. 281 00:27:56,480 --> 00:28:03,740 Now, what's most interesting about this is not the fact that humans have an inherent bias towards greater degrees of fairness, 282 00:28:04,610 --> 00:28:10,909 but that individuals deviate from each other. So while you might only accept five and five, I might accept six and four. 283 00:28:10,910 --> 00:28:14,240 Somebody else might even accept seven and three of some reward. 284 00:28:15,240 --> 00:28:21,120 But when you do this with twins, even identical twins separated at birth, 285 00:28:21,300 --> 00:28:29,070 the twins accept the same distributions with a much higher statistical likelihood than fraternal twins or people who are unrelated. 286 00:28:29,190 --> 00:28:34,290 So if one twin will accept six and four, the other twin is highly likely to accept six and four.
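The structure of the game just described can be sketched in a few lines of Python. The acceptance thresholds below are invented purely for illustration, not taken from the studies mentioned; they simply contrast a chimp-like responder, who takes almost any offer, with a human-like responder, who rejects offers far from an even split even at a cost to themselves.

```python
def ultimatum_round(offer, threshold, total=10):
    """One round of the ultimatum game over `total` raisins.

    The proposer keeps (total - offer). The responder accepts only if
    the offer meets their personal fairness threshold; a rejection
    leaves both players with nothing, punishing the proposer.
    Returns (proposer's share, responder's share).
    """
    if offer >= threshold:
        return total - offer, offer
    return 0, 0

# Chimp-like responder: accepts even an eight-and-two split.
print(ultimatum_round(offer=2, threshold=1))  # (8, 2)
# Human-like responder: rejects the same eight-and-two split.
print(ultimatum_round(offer=2, threshold=4))  # (0, 0)
# A five-and-five split is accepted by both.
print(ultimatum_round(offer=5, threshold=4))  # (5, 5)
```

The individual variation discussed in the talk corresponds to each person having a different `threshold` value.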
287 00:28:34,950 --> 00:28:40,110 So this has led these researchers to estimate that the rejection of unfair offers, 288 00:28:40,410 --> 00:28:45,630 the sense of fairness that people have, is about 40% or more genetically determined. 289 00:28:45,900 --> 00:28:52,440 So whether you're a person who responds in a certain way to an offer is going to depend, at least significantly, 290 00:28:52,530 --> 00:28:56,250 on the kind of person you are, which is pretty obvious, 291 00:28:57,210 --> 00:29:08,440 and that's going to be genetic. So there's other good research which shows that we're inherently racist, 292 00:29:08,440 --> 00:29:14,530 as you would expect from our evolutionary history, even when we consciously claim not to be. 293 00:29:14,860 --> 00:29:16,920 There are various studies showing that 294 00:29:16,930 --> 00:29:26,050 even people with strongly anti-racist beliefs still have inherent racial biases when given 295 00:29:27,820 --> 00:29:31,270 implicit association tests. Okay, so could we change this? 296 00:29:31,270 --> 00:29:34,720 Could we enhance human beings? Could we change their moral nature? 297 00:29:35,740 --> 00:29:42,940 Now I'm going to just sketch a story about how it might be possible at some point to actually change our nature, 298 00:29:43,330 --> 00:29:46,360 or at least select the sorts of people we bring into the world. 299 00:29:47,440 --> 00:29:53,920 Okay. And I've called this in other places bioliberation. Recall Hawking's claim that we should make humans wiser and less aggressive. 300 00:29:54,160 --> 00:29:57,580 We could do this in two ways: one, by cognitively enhancing people, 301 00:29:58,120 --> 00:30:06,100 by making them better able to see the consequences of their decisions and to make more rational choices; or, two, directly by moral enhancement.
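The "about 40% genetically determined" estimate above comes from comparing how often identical versus fraternal twins agree. As a toy illustration (the correlation figures below are invented for the sketch, not drawn from the research cited), Falconer's classic formula estimates heritability as twice the gap between identical-twin and fraternal-twin correlations for a trait:

```python
def falconer_heritability(r_identical, r_fraternal):
    """Falconer's formula: broad-sense heritability is estimated as
    twice the difference between identical (monozygotic) and fraternal
    (dizygotic) twin correlations, since identical twins share all
    their genes and fraternal twins share about half."""
    return 2 * (r_identical - r_fraternal)

# Invented correlations for "accepts the same ultimatum split":
# identical twins agree more often than fraternal twins.
h2 = falconer_heritability(r_identical=0.6, r_fraternal=0.4)
print(round(h2, 2))  # 0.4 -> roughly 40% of the variation attributed to genes
```

The higher the excess agreement of identical over fraternal twins, the larger the share of variation the method attributes to genes.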
302 00:30:07,360 --> 00:30:13,149 Now, cognitive enhancement is an area where there's a lot of progress being made, there being drugs such as Modafinil, Ritalin 303 00:30:13,150 --> 00:30:19,510 and Adderall that have been shown to improve cognitive performance. As we develop drugs which improve concentration 304 00:30:19,510 --> 00:30:23,050 and memory for diseases like Alzheimer's disease, 305 00:30:23,080 --> 00:30:29,350 these are likely to improve cognitive performance. So cognitive enhancement, I think, is not going to be something difficult to achieve. 306 00:30:29,710 --> 00:30:33,760 I think that we will develop very powerful cognitive enhancers. 307 00:30:34,450 --> 00:30:38,470 These, however, would also increase the capacity of people to act badly. 308 00:30:38,980 --> 00:30:44,260 You know, if you're smarter, you can be even more lethal. 309 00:30:44,800 --> 00:30:50,140 So, in fact, in another paper I've argued, with Ingmar Persson, that if we do go down the route of cognitive enhancement, 310 00:30:50,140 --> 00:30:58,150 which we're likely to, we also need to enhance ourselves morally, to deploy this increased power more safely. 311 00:30:59,480 --> 00:31:05,180 So genetics may hold the key to influencing behaviour. 312 00:31:06,260 --> 00:31:09,920 Crude genetic experiments have been run for 10,000 years: 313 00:31:10,250 --> 00:31:17,270 the selection of dogs by breeding. The 300 different dog breeds that you see today are the result simply of genetic selection. 314 00:31:17,570 --> 00:31:20,690 Now those dogs have very different characters. Some are placid, some are vicious. 315 00:31:20,960 --> 00:31:26,810 That's all genetic. It's the result of genetics, of selective breeding. 316 00:31:27,740 --> 00:31:35,060 We could do this in humans now, not through a project that takes 10,000 years, but in a single generation, by genetic manipulation.
317 00:31:35,390 --> 00:31:40,190 The rabbit on the left is a fluorescent rabbit, created by introducing a gene from a jellyfish into a rabbit. 318 00:31:40,460 --> 00:31:49,340 On the right is an embryo which fluoresces in the dark. It's a human embryo created by transferring a gene from a jellyfish into a human embryo. 319 00:31:49,670 --> 00:31:55,970 So we are understanding the basis of various phenotypic qualities and becoming able to intervene at a genetic level. 320 00:31:57,970 --> 00:32:00,970 I'm sure lots of you have seen this before, but I'll quickly show it. 321 00:32:01,120 --> 00:32:06,189 This is the power of genetic modification. At the back is a normal mouse. 322 00:32:06,190 --> 00:32:10,030 At the front is a mouse that's been genetically modified, both running at 20 metres per minute. 323 00:32:10,930 --> 00:32:15,010 They both fall off the treadmill periodically for a matter of minutes. 324 00:32:15,040 --> 00:32:20,230 There goes the genetically modified mouse. Off after 200 metres, 325 00:32:20,620 --> 00:32:24,930 the normal mouse is exhausted, can't continue. It doesn't get back on at all. 326 00:32:25,240 --> 00:32:33,220 That's 200 metres. Six hours later, the genetically modified mouse collapses with exhaustion, after five kilometres. 327 00:32:33,640 --> 00:32:37,240 So that's the power of biological modification. 328 00:32:38,350 --> 00:32:42,520 We've changed the behaviour of non-human animals. 329 00:32:42,520 --> 00:32:45,520 We've made voles monogamous rather than polygamous. 330 00:32:46,150 --> 00:32:55,990 We've made monkeys work harder. We've also produced various other physical improvements, and we've produced massive improvements in memory. 331 00:32:56,410 --> 00:33:03,520 So there is no reason in principle why human moral behaviour, or its biological underpinnings, couldn't likewise be altered. 332 00:33:04,990 --> 00:33:08,020 So we could do this today through bioprediction.
333 00:33:08,200 --> 00:33:13,959 We could use genetic testing: once we identified the various genes, or the 334 00:33:13,960 --> 00:33:18,160 collection of genes, that increase the risk of sociopathic personality disorder, 335 00:33:18,430 --> 00:33:23,920 you could test embryos for these and select out those embryos which have the disposition. 336 00:33:24,250 --> 00:33:27,280 This is entirely possible, 337 00:33:27,280 --> 00:33:29,770 if we understand the genetic basis of these conditions. 338 00:33:29,890 --> 00:33:37,120 You could target social and psychological interventions at people at higher risk of sociopathic personality disorder. 339 00:33:37,600 --> 00:33:43,390 You could increase surveillance. Or you could segregate. You could also modify behaviour biologically. 340 00:33:43,400 --> 00:33:51,250 This is done in a very crude way already, when paedophiles are given female hormones to reduce their sex drive as an alternative to imprisonment. 341 00:33:51,340 --> 00:33:54,640 Now this is simply a modification of their behaviour 342 00:33:54,710 --> 00:33:59,200 biologically. We'll be able to do that in a more fine-grained way once we understand the biology. 343 00:34:00,280 --> 00:34:05,760 Just as an example of how we could do it by bioprediction: in the 1980s, 344 00:34:05,830 --> 00:34:10,840 there was a famous Dutch family, brought forward by the grandmother to a geneticist called Brunner, 345 00:34:11,320 --> 00:34:17,680 in which half of the males in the family were criminals in jail, guilty of violent crimes like arson, assault and rape. 346 00:34:18,460 --> 00:34:22,660 And when they looked at this family, they found that this was clearly an X-linked genetic disorder, 347 00:34:22,840 --> 00:34:26,530 so half of the males would be affected. 348 00:34:26,770 --> 00:34:33,880 They looked at the X chromosome and found a mutation in a gene that coded for the metabolism of a neurotransmitter in the brain:
349 00:34:34,390 --> 00:34:42,310 monoamine oxidase A. And this was highly, essentially perfectly, correlated with the males who were in prison. 350 00:34:43,510 --> 00:34:47,110 No other family has been found to have this mutation in this way. 351 00:34:47,500 --> 00:34:52,510 But when a large group of New Zealand males were studied by Caspi, 352 00:34:52,930 --> 00:35:02,589 he found that people who had this mutation and suffered social deprivation very early in life were much more likely to exhibit antisocial behaviour, and to 353 00:35:02,590 --> 00:35:11,950 be imprisoned, than people who were either mistreated but lacked the genetic change, or who had the genetic change but weren't mistreated. 354 00:35:11,960 --> 00:35:21,100 So the combination of social deprivation plus the monoamine oxidase A mutation was highly linked to criminal behaviour. 355 00:35:21,490 --> 00:35:25,000 This raised the possibility of manipulating the level of that neurotransmitter. 356 00:35:25,750 --> 00:35:33,280 But you could easily use preimplantation genetic diagnosis, or prenatal diagnosis, to have children without this mutation. 357 00:35:34,060 --> 00:35:40,480 You could identify people early and increase social support, or, more controversially, you could segregate, ahead of time, 358 00:35:40,530 --> 00:35:49,600 offenders with this kind of genetic mutation. Other substances have also been associated with moral behaviour or dispositions to cooperate. 359 00:35:50,230 --> 00:35:57,940 Serotonin is a neurotransmitter. When its level in the brain is lower, people tend to behave more aggressively. 360 00:35:58,570 --> 00:36:03,160 When you deplete it, you get more aggressive behaviour; and when you increase its level, through drugs like Prozac, 361 00:36:03,160 --> 00:36:05,770 you increase cooperation and reduce aggression. 362 00:36:06,280 --> 00:36:12,910 Oxytocin is another natural hormone, released by women when they're breastfeeding, but all people have it to a degree.
363 00:36:13,270 --> 00:36:16,840 It's been shown to influence the ability to infer other people's mental states. 364 00:36:17,290 --> 00:36:23,650 So oxytocin increases the willingness to trust other people, and it prevents the decrease in trust after betrayal. 365 00:36:24,460 --> 00:36:28,960 So this is obviously not the solution to global warming or psychopathy, 366 00:36:29,200 --> 00:36:36,550 but it shows that we can understand the biological bases of, and influence, the ways in which people cooperate and behave morally. 367 00:36:37,450 --> 00:36:40,929 Many of the world's problems, such as poverty and climate change, 368 00:36:40,930 --> 00:36:49,030 will require cooperation, and a change in behaviour at a global level, that has not so far occurred through human history. 369 00:36:49,300 --> 00:36:59,440 There's no reason to believe that we will be able to overcome those biases to the degree that is necessary to address these problems. 370 00:37:00,160 --> 00:37:05,570 So in this talk, I've described what I've called the Bermuda Triangle of Extinction, with, at one corner, 371 00:37:05,830 --> 00:37:11,860 radical technological power unprecedented through human history, and at the others, liberal democracy and human nature. 372 00:37:13,530 --> 00:37:17,880 We need science to understand our own moral limitations and the ways to overcome them. 373 00:37:18,360 --> 00:37:21,840 We need ethics to decide whether and how we should reshape our own nature. 374 00:37:23,030 --> 00:37:26,660 We could, of course, take political steps to address these problems. 375 00:37:27,170 --> 00:37:34,790 We could reduce our commitment to liberalism. We could restrict privacy and increase surveillance, as has happened in the US. 376 00:37:34,820 --> 00:37:41,540 We could promote certain values. We could give up political neutrality. We could attempt to inculcate morality and address the problem in that way.
377 00:37:41,840 --> 00:37:45,200 We could attempt to restrict consumerism and lower standards of living. 378 00:37:45,890 --> 00:37:51,950 And we could potentially restrain or segregate people at greatest risk to the community. 379 00:37:54,700 --> 00:38:01,479 This is the century where humanity will become the greatest threat to humanity: at an individual level, through sociopaths, 380 00:38:01,480 --> 00:38:08,770 psychopaths or fanatics gaining access to technologies with immense power of destruction, 381 00:38:08,920 --> 00:38:12,430 but more generally through global collective action problems, 382 00:38:12,880 --> 00:38:19,990 the inability to cooperate to address problems like climate change or the plight of developing countries. 383 00:38:20,740 --> 00:38:24,280 The answer to these problems may lie partly in ourselves, in the human condition. 384 00:38:26,060 --> 00:38:28,220 What can practical ethics learn from human nature? 385 00:38:28,640 --> 00:38:36,350 We've been very interested in our nature as animals: not man made in the image of God, but man as an animal and a product of evolution. 386 00:38:36,860 --> 00:38:38,540 When you look around at the world's problems, 387 00:38:38,780 --> 00:38:45,830 they are predictable, and increasingly understood, as products of our own evolutionary history, our own nature. 388 00:38:46,520 --> 00:38:55,310 I believe we should acknowledge our vast human imperfection, identify our moral limitations and take steps to reduce these. 389 00:38:56,340 --> 00:39:02,129 As I said, this is not to exclude other psychosocial responses. 390 00:39:02,130 --> 00:39:10,230 We could reduce our commitment to liberal neutrality, and use moral education to correct or inculcate certain behaviours. 391 00:39:11,070 --> 00:39:18,000 But we could also change our moral nature. So when Hawking said perhaps we must hope that genetic engineering will make us wise and less aggressive,
392 00:39:18,330 --> 00:39:22,620 I hope that doesn't sound as implausible as it might have seemed. 393 00:39:22,860 --> 00:39:23,280 Thank you.