1 00:00:02,690 --> 00:00:07,190 [Auto-generated transcript. Edits may have been applied for clarity.] A computer scientist, an exoplanetary physicist, and a theologian 2 00:00:07,190 --> 00:00:10,130 walk into the Sheldonian Theatre one evening. 3 00:00:10,370 --> 00:00:17,690 No, this is not the opening line of a joke, but it is the premise of this evening's event on the topic of life. 4 00:00:18,020 --> 00:00:25,970 So a very warm welcome to this, our second Sheldonian series event, following on from the launch last term. 5 00:00:26,210 --> 00:00:35,540 It's wonderful to have students, staff, and alumni all blended here together in a lovely melting pot of the Sheldonian Theatre, 6 00:00:35,810 --> 00:00:39,680 including quite a few of you who are returners from last term. 7 00:00:40,460 --> 00:00:44,090 This series, as you know, is a new addition to our calendar. 8 00:00:44,120 --> 00:00:48,950 It reflects our commitment to putting inclusive and respectful inquiry, 9 00:00:49,190 --> 00:00:55,520 as well as diversity of thought through free speech, at the heart of our life here in the Collegiate University. 10 00:00:56,090 --> 00:01:03,920 And each term, we are tackling one topic, seeking to show the different ways that we can approach the big issues and questions of the day. 11 00:01:04,670 --> 00:01:12,530 The fact that so many of you are here again reflects that there is a thirst for such an event, and so thank you for your company. 12 00:01:13,610 --> 00:01:23,360 Learning at its best happens when you approach your topic with humility, an open mind and heart, and a spirit of exploration and examination. 13 00:01:23,780 --> 00:01:30,890 That is what we strive to deliver here at Oxford and what we wish to showcase through our Sheldonian series. 14 00:01:31,550 --> 00:01:39,230 Last term, we explored the topic of democracy with a panel discussion with experts drawn from the fields of political science and journalism.
15 00:01:40,130 --> 00:01:44,330 This term, we're exploring life in all its many forms. 16 00:01:44,990 --> 00:01:51,260 How on earth, or in space for that matter, can we even begin to contemplate such a topic? 17 00:01:51,680 --> 00:01:59,600 Well, one of the many strengths we have at Oxford is our breadth and how we work across disciplines and boundaries. 18 00:02:00,110 --> 00:02:03,319 It is something that comes naturally to us in our colleges, 19 00:02:03,320 --> 00:02:08,690 where students from the humanities and social sciences come together with students from the natural sciences and medicine. 20 00:02:09,230 --> 00:02:15,860 And in our research, too, so much that is distinctive about what we bring carries the mark of the interdisciplinary. 21 00:02:16,610 --> 00:02:22,759 So whether it's challenges such as climate change or pandemic prevention, or opportunities such as artificial intelligence, 22 00:02:22,760 --> 00:02:28,640 we know that solutions will emerge from the connections that we make between ideas, findings and approaches. 23 00:02:29,580 --> 00:02:34,770 So it is then that we present to you tonight a number of perspectives on life. 24 00:02:35,670 --> 00:02:43,290 You will hear from three exceptional scholars from our community, each bringing different disciplinary expertise, 25 00:02:43,290 --> 00:02:51,300 exploring questions including the origins of cosmic life, the future of artificial life and intelligence, and what it means to live well. 26 00:02:52,430 --> 00:02:56,810 Here's how the evening is going to proceed. First up is Professor Jayne Birkby. 27 00:02:57,470 --> 00:03:01,550 Professor Birkby received her PhD in astrophysics from the University of Cambridge, 28 00:03:01,820 --> 00:03:08,750 and has previously held a postdoctoral fellowship at Leiden University, a NASA Sagan Fellowship at Harvard University, 29 00:03:09,170 --> 00:03:12,110 and an Assistant Professorship at the University of Amsterdam.
30 00:03:12,800 --> 00:03:18,290 She is an ERC Starting Grant laureate, was awarded the 2021 Philip Leverhulme Prize in Physics, 31 00:03:18,680 --> 00:03:23,150 and was a finalist in the 2024 Blavatnik Awards for Young Scientists in the UK. 32 00:03:23,720 --> 00:03:27,140 Professor Birkby uses the world's largest telescopes and highest-resolution 33 00:03:27,140 --> 00:03:31,700 instruments to study the atmospheres of other worlds outside our solar system. 34 00:03:32,210 --> 00:03:42,320 She has a keen interest in connecting these exoplanet studies with chemistry, geology and biology, and the pursuit of an answer to the question: are we alone? 35 00:03:43,540 --> 00:03:47,410 She will then be followed by the Reverend Canon Professor Luke Bretherton, 36 00:03:47,770 --> 00:03:56,049 who is the newly minted Regius Professor of Moral and Pastoral Theology at Oxford and director of the McDonald Centre for Theology, 37 00:03:56,050 --> 00:04:03,310 Ethics and Public Life. Prior to that, he was the Robert E. Cushman Distinguished Professor of Moral and Political Theology at Duke University, 38 00:04:03,760 --> 00:04:07,510 and before joining Duke in 2012, he taught at King's College London. 39 00:04:08,230 --> 00:04:12,910 His most recent book is A Primer in Christian Ethics: Christ and the Struggle to Live Well. 40 00:04:13,600 --> 00:04:18,220 Alongside his scholarly work, he writes in the media on topics related to religion and politics, 41 00:04:18,580 --> 00:04:23,650 and is actively involved in forms of grassroots democratic politics, both in the UK and the US. 42 00:04:24,340 --> 00:04:27,850 And then we will conclude with Professor Sir Nigel Shadbolt. 43 00:04:28,240 --> 00:04:35,200 Nigel is a leading researcher in artificial intelligence and one of the originators of the interdisciplinary field of web science. 44 00:04:35,800 --> 00:04:40,930 He is Principal of Jesus College and Professor of Computer Science here at the University of Oxford.
45 00:04:41,470 --> 00:04:46,750 He is chairman of the Open Data Institute, which he co-founded with Sir Tim Berners-Lee. 46 00:04:47,230 --> 00:04:51,070 He was knighted in 2013 for services to science and engineering. 47 00:04:51,430 --> 00:04:57,250 He is a fellow of the Royal Society, the Royal Academy of Engineering and the British Computer Society. 48 00:04:58,330 --> 00:05:06,100 Jayne, Luke and Nigel will all speak one after the other, each for about ten minutes. At the conclusion of Nigel's presentation, 49 00:05:06,340 --> 00:05:14,110 we will then begin the Q&A session, opening to questions from the room and to those that were pre-submitted at registration. 50 00:05:14,500 --> 00:05:18,590 So now it's over to our speakers, and I invite Jayne to come up and deliver her presentation. 51 00:05:18,610 --> 00:05:27,770 Thank you. 52 00:05:29,670 --> 00:05:33,900 Are we alone? Is Earth and its life unique? 53 00:05:34,740 --> 00:05:38,280 This is a simple question, but it has a profound answer. 54 00:05:39,060 --> 00:05:43,800 If life began on earth, how and where did it happen? 55 00:05:44,430 --> 00:05:49,770 What was the environment and the conditions that allowed it to evolve into the complex, 56 00:05:49,770 --> 00:05:54,930 advanced lifeforms that we are today, and can it exist on other planets? 57 00:05:55,740 --> 00:06:00,270 As an astronomer, my job is to look for signs of life on other planets. 58 00:06:00,930 --> 00:06:10,860 Today, I want to tell you about where we might look for signs of life, how we might do it, and how close we are to being able to do that. 59 00:06:12,030 --> 00:06:16,060 We know of one planet that has life. The Earth. 60 00:06:16,840 --> 00:06:21,220 That's home. That's us. And so far, it's the only home that we have known. 61 00:06:22,540 --> 00:06:29,080 We are perfectly adapted to live and survive in the conditions of modern-day Earth. 62 00:06:29,620 --> 00:06:32,920 But the Earth was not always like it is today.
63 00:06:33,460 --> 00:06:36,670 Its entire surface and atmosphere have evolved over time. 64 00:06:37,630 --> 00:06:41,320 So when did life begin? In our fossil records, 65 00:06:41,350 --> 00:06:45,730 we can look back to about 3.5 billion years ago. 66 00:06:46,290 --> 00:06:54,130 There is evidence, from 3.5 billion years ago, of microbial mats in the fossil record found in Western Australia. 67 00:06:55,540 --> 00:06:59,080 There is even some evidence that life may have begun 68 00:06:59,740 --> 00:07:06,340 4 billion years ago on Earth. And the Earth itself, we think, formed at the same time as the sun, 69 00:07:06,520 --> 00:07:13,780 4.5 billion years ago. So life on Earth, in some form or another, has been around for a long time. 70 00:07:15,130 --> 00:07:20,380 So what does it need? We think life, to begin, starts with chemistry. 71 00:07:20,980 --> 00:07:22,450 It needs an energy source. 72 00:07:22,900 --> 00:07:32,830 It needs an environment with the right temperature and pressure that can drive the chemical reactions of prebiotic chemistry, 73 00:07:33,160 --> 00:07:36,670 which then take inanimate objects, inanimate atoms, 74 00:07:36,970 --> 00:07:44,500 through a complex series of reactions that eventually become so complex that they become what we recognise as life. 75 00:07:47,410 --> 00:07:54,940 If we think life started around the same time as the Earth formed, and we know the kind of place that it needs to form, 76 00:07:55,360 --> 00:07:59,380 the question then becomes, do we have the material that we need to make life? 77 00:08:00,220 --> 00:08:06,850 Life uses things like carbon, hydrogen, nitrogen and oxygen, phosphorus and sulphur. 78 00:08:07,730 --> 00:08:10,730 But these atoms need to develop into more complex molecules. 79 00:08:11,450 --> 00:08:15,140 Now, it may be that the sites for life to do that exist on Earth. 80 00:08:15,590 --> 00:08:21,140 However, we know that there is leftover debris from planet formation.
81 00:08:22,220 --> 00:08:26,510 Asteroids and comets may have delivered that material to us. 82 00:08:26,810 --> 00:08:30,590 We think that some of the water on Earth came from comets. 83 00:08:31,370 --> 00:08:40,550 The recent OSIRIS-REx mission actually went to an asteroid, the asteroid Bennu, and grabbed a pristine sample of asteroid material. In it, 84 00:08:40,580 --> 00:08:45,440 not only did they find 14 of the 20 amino acids that we use in life, 85 00:08:46,070 --> 00:08:53,930 amino acids that all life on Earth uses, but they also found all five of the nucleobases used in DNA and RNA. 86 00:08:54,200 --> 00:09:02,540 So in DNA, that material forms the rungs of the ladder in the helix, and all five of those bases have been found on this asteroid. 87 00:09:03,050 --> 00:09:09,740 So it's possible that material was delivered to us and didn't necessarily have to start those reactions on the Earth. 88 00:09:11,130 --> 00:09:15,660 The other striking thing about the Earth that does make it unique in our solar system 89 00:09:15,780 --> 00:09:20,850 is that it is the only rocky planet with a liquid water ocean on its surface. 90 00:09:21,240 --> 00:09:27,270 We know of some icy moons where there are liquid water oceans below, but this is the only one where we have it on the surface. 91 00:09:27,690 --> 00:09:34,890 We think water is necessary for Earth-like life, because it is the solvent that allows those chemical reactions to proceed. 92 00:09:35,730 --> 00:09:41,070 Fortunately, it also turns out that water is one of the most common molecules in the universe. 93 00:09:42,210 --> 00:09:48,690 So we have an idea of when life formed, where it might have formed, and the conditions and the material that it needed. 94 00:09:49,020 --> 00:09:54,870 So where can we look for life elsewhere? We can look within our own solar system. 95 00:09:55,170 --> 00:09:58,830 We have two other rocky planets very nearby.
96 00:09:58,860 --> 00:10:07,740 We have Venus and we have Mars. We see evidence that Mars had surface water in its past. 97 00:10:08,310 --> 00:10:13,290 There are regions that we can see now that look like they've been carved out by flowing rivers. 98 00:10:13,650 --> 00:10:22,740 So there is the potential for life to have existed on Mars in the past, and perhaps even today it might be buried under the surface. 99 00:10:23,100 --> 00:10:25,589 So we are sending rovers to Mars. 100 00:10:25,590 --> 00:10:32,880 They are digging; they are even picking up samples and leaving them for us to go and collect later with the Mars sample return mission, 101 00:10:33,060 --> 00:10:36,480 to actually analyse what we dig out of the surface. 102 00:10:36,930 --> 00:10:39,180 We're sending missions to parachute through Venus, 103 00:10:39,450 --> 00:10:45,540 and we're sending other missions to fly through the geysers that spout out from the icy moons around Jupiter and Saturn. 104 00:10:46,410 --> 00:10:52,410 But if we want to look for Earth twins, then we need to look at planets around other stars. 105 00:10:52,440 --> 00:10:55,380 We need to understand how common other Earths are. 106 00:10:56,830 --> 00:11:06,190 In the process of doing this, we have found that every star has at least one planet, and we have asked how often Earth-like planets 107 00:11:06,610 --> 00:11:12,250 orbit where their temperature is such that they are able to have a liquid water ocean on the surface. 108 00:11:12,430 --> 00:11:15,460 This depends on how far they are from their star. 109 00:11:16,360 --> 00:11:19,780 And for Sun-like stars, it's about 1 in 4. 110 00:11:20,820 --> 00:11:26,460 For smaller stars, of about a tenth of the size of the sun and about 1% of its luminosity, 111 00:11:26,830 --> 00:11:36,959 it's more like 50%.
So if there are billions of stars in our galaxy and there are billions of galaxies in the observable universe, the odds, 112 00:11:36,960 --> 00:11:45,720 the numbers, seem to be in our favour that we would at least find Earth twins, planets of the same mass, size, and temperature as our own Earth. 113 00:11:46,500 --> 00:11:57,360 But the process by which life came to exist on Earth went through so many different events that it's difficult to know whether that is common. 114 00:11:57,750 --> 00:12:02,520 One of these, for example, is the collision in our early formation that formed the moon. 115 00:12:03,150 --> 00:12:11,340 This collision would have stripped the early Earth of any of its atmosphere, any of its surface activity, and reset the planet. 116 00:12:11,340 --> 00:12:14,790 It would have sterilised the entire planet after that impact. 117 00:12:14,970 --> 00:12:19,890 That impact is where we formed the moon. We don't yet know how common moons are around other planets. 118 00:12:19,950 --> 00:12:28,380 This is still something that we need to find out. But in looking at exoplanets, we have found an extraordinary array of systems. 119 00:12:28,770 --> 00:12:34,140 We found star systems with multiple suns. If you like Star Wars, you may have imagined these yourselves. 120 00:12:34,530 --> 00:12:42,840 We found planets that are very icy, and even planets that are so close to their host star, they're so hot that their entire surface is molten lava. 121 00:12:43,170 --> 00:12:50,220 And they have a permanent dayside and a permanent nightside. They don't get the diurnal rhythms that we have here on Earth. 122 00:12:52,680 --> 00:12:59,190 This is an artist's impression of a system called TRAPPIST-1, and it has seven Earth-sized planets. 123 00:12:59,580 --> 00:13:07,890 It is a space laboratory to go and study what happens if you take the Earth and you slowly move it away from a heat source, 124 00:13:08,370 --> 00:13:12,210 and you can see what happens.
These are all born around the same time. 125 00:13:12,600 --> 00:13:18,750 You can see how they evolved and the different scenarios that have appeared on those planets. 126 00:13:19,620 --> 00:13:25,260 At the same time, our very nearest exoplanet neighbour, the planet orbiting Proxima Centauri, 127 00:13:25,710 --> 00:13:30,390 the planet Proxima Centauri b, orbits in what we call the habitable zone. 128 00:13:30,960 --> 00:13:36,930 That is, in this nice region where the temperature is just right and we can have liquid water on the surface. 129 00:13:38,400 --> 00:13:43,530 This is where we will be looking for life on exoplanets, planets around other stars. 130 00:13:44,420 --> 00:13:48,120 But these stars are very small and they're very active. 131 00:13:48,140 --> 00:13:53,330 They have lots of flares. Now, Earth protects itself from flares using its magnetic field. 132 00:13:53,570 --> 00:13:56,570 If you've ever seen the northern lights, that's our magnetic field 133 00:13:56,570 --> 00:14:02,780 protecting us from all the harmful radiation and particles from the sun when it flares. For these small stars, 134 00:14:03,050 --> 00:14:10,100 we don't know if the planets around them have magnetic fields. So it might be that life on those planets has to evolve a way to protect itself. 135 00:14:11,050 --> 00:14:15,250 We know that Earth-like life can survive very extreme conditions. 136 00:14:15,640 --> 00:14:18,730 This is a tardigrade; they're really, really tiny. 137 00:14:18,940 --> 00:14:26,740 And they're everywhere. But these guys have survived the vacuum of space, and they can survive temperatures up to 150 degrees. 138 00:14:27,810 --> 00:14:34,950 They're extremely robust, and it's because they can put themselves into stasis that they can survive extreme conditions. 139 00:14:35,310 --> 00:14:42,090 But we also have other lifeforms on Earth called extremophiles, and they actually thrive in extreme conditions.
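As an aside for readers of this transcript: the "just right" habitable-zone temperature discussed above can be roughly sketched with the standard blackbody equilibrium-temperature formula. This is a minimal illustrative sketch, not the speaker's calculation; the function name and the albedo choice are assumptions made here.

```python
# Hedged sketch: a planet's equilibrium temperature, the rough physics behind
# the "habitable zone" idea. No greenhouse effect is included, so this
# underestimates real surface temperatures.
import math

def equilibrium_temperature(t_star_k, r_star_m, orbit_m, albedo=0.3):
    """Blackbody equilibrium temperature of a planet orbiting at distance orbit_m."""
    return t_star_k * math.sqrt(r_star_m / (2.0 * orbit_m)) * (1.0 - albedo) ** 0.25

SUN_T = 5772.0   # K, solar effective temperature
SUN_R = 6.957e8  # m, solar radius
AU = 1.496e11    # m, Earth-sun distance

t_earth = equilibrium_temperature(SUN_T, SUN_R, AU)
print(f"Earth equilibrium temperature: {t_earth:.0f} K")  # ~255 K
```

For Earth this gives roughly 255 K; the greenhouse effect lifts the real mean surface temperature to about 288 K, which is one reason atmospheres matter so much in these estimates.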
140 00:14:42,330 --> 00:14:45,600 They can survive in extremely acidic or alkaline conditions. 141 00:14:45,960 --> 00:14:49,770 They can exist down in the deep vents in the ocean, 142 00:14:50,100 --> 00:14:54,300 with thermal heat coming out, which is potentially one of the sites for life. 143 00:14:54,930 --> 00:15:01,620 So when we go to look for life on other planets, we have to be very open-minded about what that might look like. 144 00:15:02,820 --> 00:15:08,460 But on an exoplanet, the only thing we can see is its outer skin, its atmosphere. 145 00:15:09,060 --> 00:15:12,870 Now, life changes atmospheres. On Earth, 146 00:15:13,140 --> 00:15:20,550 oxygen is in our atmosphere because of photosynthesis and respiration from animals like us. 147 00:15:21,180 --> 00:15:29,700 If all life were to disappear, then within a few tens of thousands of years the oxygen combines with methane, and it disappears. 148 00:15:30,090 --> 00:15:33,240 So that oxygen is a signature of disequilibrium. 149 00:15:33,690 --> 00:15:37,290 It's a sign that something is constantly replenishing it. 150 00:15:37,290 --> 00:15:41,140 And on Earth, it is not a geological process that can do this. 151 00:15:41,160 --> 00:15:42,180 It has to be life. 152 00:15:42,450 --> 00:15:50,400 So if we can find those signatures, oxygen, methane, biosignatures on other planets, then perhaps this is a strong indication of life. 153 00:15:50,880 --> 00:15:56,460 The same goes for technological signatures, such as CFCs from old refrigerators. 154 00:15:57,030 --> 00:16:04,890 They are gases that hang around in our atmosphere. If we can see their signature, then we can say that there's potentially an advanced civilisation. 155 00:16:06,330 --> 00:16:13,470 How do we do this? We do this using spectroscopy. We take the light from the planet, and we split it into all the colours of the rainbow. 156 00:16:13,740 --> 00:16:22,470 And we look for the missing colours.
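The "missing colours" idea can be illustrated with a toy spectrum-matching sketch. This is a simplified illustration, not the actual analysis pipeline: the wavelength grid, line positions, line depths, and noise level below are all made-up values for demonstration.

```python
# Toy sketch: detecting a molecule by matching its pattern of absorption lines
# ("missing colours") against a noisy spectrum. The line positions are
# fictional, chosen only to illustrate the principle.
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(600.0, 800.0, 2000)  # nm, illustrative grid

def absorption_template(line_positions_nm, width_nm=0.5):
    """Sum of unit-depth Gaussian absorption lines at the given wavelengths."""
    t = np.zeros_like(wavelengths)
    for w0 in line_positions_nm:
        t += np.exp(-0.5 * ((wavelengths - w0) / width_nm) ** 2)
    return t

# Hypothetical oxygen-like fingerprint, imprinted weakly on a noisy spectrum.
oxygen_like = absorption_template([687.0, 689.2, 760.5, 763.8])
observed = 1.0 - 0.02 * oxygen_like + rng.normal(0.0, 0.005, wavelengths.size)

# Project the flipped spectrum onto the template: a clearly positive score
# means the pattern of missing colours is present despite the noise.
score = np.dot(1.0 - observed, oxygen_like) / np.linalg.norm(oxygen_like)
print(f"detection score: {score:.2f}")
```

The real technique rests on the same principle: a molecule's pattern of absorption lines acts as a fingerprint that can be recovered from noisy data because no other molecule replicates it.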
These missing colours tell us very precisely what molecules are in that atmosphere. 157 00:16:23,280 --> 00:16:28,380 Oxygen has a very specific pattern of lines that no other molecule in the universe can replicate. 158 00:16:28,710 --> 00:16:35,430 So if we see that specific pattern, it has to be oxygen, and potentially that is a signature of life. 159 00:16:36,300 --> 00:16:40,440 So to do this we are building the Extremely Large Telescope. 160 00:16:40,710 --> 00:16:46,770 Its mirror is 39 metres across. That is twice the size of this building. 161 00:16:47,220 --> 00:16:53,070 It is enormous. It's being built in Chile, in the Atacama Desert, and it will be ready in 2029. 162 00:16:53,460 --> 00:16:58,200 This enormous light bucket will enable us to actually look at the light from these planets. 163 00:16:59,200 --> 00:17:07,150 And so if life is common and nearby, then by the end of this decade, it is quite possible that we will have found it. 164 00:17:07,270 --> 00:17:17,090 Thank you. Okay. 165 00:17:17,340 --> 00:17:23,640 So I arrived in Oxford less than a month ago. 166 00:17:23,680 --> 00:17:26,730 I'm still slightly bewildered and unpacking boxes, 167 00:17:27,110 --> 00:17:31,499 and I'm in that slightly strange fugue state where you feel you've walked through the back 168 00:17:31,500 --> 00:17:36,390 of a wardrobe and entered a weird land where you have to ask yourself questions like, 169 00:17:36,750 --> 00:17:39,960 why should I wear a gown, or should I wear a gown to this meeting? 170 00:17:40,230 --> 00:17:49,350 And why is a working knowledge of the liturgical calendar and Latin necessary to know when anything is actually happening or going on? 171 00:17:49,800 --> 00:17:58,970 But I would say, given my job, I was very, very pleased when I went for my induction at my department and saw this question 172 00:17:58,980 --> 00:18:02,850 emblazoned on the hoarding outside the new Schwarzman Building.
173 00:18:03,060 --> 00:18:11,610 And I was even happier when I saw the arrow pointed directly to the entrance of the Theology and Religion department, which I took to be a sign. 174 00:18:12,120 --> 00:18:16,790 But joking aside, this question of what it means to be human 175 00:18:16,800 --> 00:18:24,480 is, I think, the most important and fundamental question any of us could ask in this life. 176 00:18:24,690 --> 00:18:29,940 And this question is central to all ancient and philosophical traditions around the world. 177 00:18:30,300 --> 00:18:38,480 And they try to answer this question through wrestling with a further question, namely, what does it mean to live human life 178 00:18:38,490 --> 00:18:41,790 well? What does a flourishing life consist of? 179 00:18:42,210 --> 00:18:50,400 And I don't think we can begin to fathom any answers to these questions without asking further questions about the meaning, 180 00:18:50,400 --> 00:18:54,030 purpose, and character of what it means to be human. 181 00:18:54,630 --> 00:19:03,540 Now, any account of what it means to be human must inevitably address questions about the material conditions of human flourishing. 182 00:19:03,540 --> 00:19:11,100 And today this means addressing questions about biology, physics, the natural sciences, and the kinds of things Jayne addresses. 183 00:19:11,880 --> 00:19:15,959 Some philosophical traditions contend that all one needs to know, 184 00:19:15,960 --> 00:19:21,270 in order to answer the question of what it means to be human, is what science tells us. 185 00:19:21,690 --> 00:19:27,870 They tend to reduce living well and the meaning and purpose of human life to its material basis. 186 00:19:27,900 --> 00:19:33,360 One example, famously, is Karl Marx, who quipped that humans are what they eat, 187 00:19:33,360 --> 00:19:39,690 by which he meant that human life and consciousness are wholly determined by their material conditions.
188 00:19:40,200 --> 00:19:48,000 By contrast, other philosophical and religious traditions hold that there's more to life than what we can touch, see, taste, and smell. 189 00:19:48,210 --> 00:19:53,010 Or, to use the words of Jesus, humans cannot live by bread alone. 190 00:19:53,610 --> 00:20:00,510 And it's this insight that I think sets the stage for why religion, philosophy, and the humanities are so important. 191 00:20:00,780 --> 00:20:07,469 They seek answers, or at least they should seek answers, to the meaning and purpose of life and the 192 00:20:07,470 --> 00:20:13,020 question of what should be the quality and character of relations between humans 193 00:20:13,260 --> 00:20:16,649 if life is to be lived well. At this point, 194 00:20:16,650 --> 00:20:24,210 I think it's helpful to introduce a distinction made by the early 20th-century French existentialist philosopher Gabriel Marcel, 195 00:20:24,510 --> 00:20:28,530 who distinguishes between a mystery and a problem. 196 00:20:28,920 --> 00:20:35,460 A problem is something you can stand over and solve, such as how to make a computer or solve a complex equation. 197 00:20:35,760 --> 00:20:44,460 Whereas a mystery is something we as humans are situated within that is greater than us, that we can never solve or come to the end of. 198 00:20:44,760 --> 00:20:51,170 The origins of the universe, the meaning of life, how to live a good life are just such mysteries. 199 00:20:51,180 --> 00:20:54,840 Now, that doesn't make a mystery spooky or irrational. 200 00:20:55,200 --> 00:21:03,410 We can have ever deeper understanding of mysteries and ever more insightful things to say about such questions. 201 00:21:03,420 --> 00:21:06,590 But we never resolve or solve mysteries. 202 00:21:06,600 --> 00:21:09,870 The quest for answers is always before us.
203 00:21:10,380 --> 00:21:13,950 So science is vital for understanding the material conditions of life, 204 00:21:14,190 --> 00:21:20,250 but I don't think it can address questions about the meaning and purpose of life, or how to live well with others. 205 00:21:20,550 --> 00:21:24,270 These are mysteries that require us to grow in wisdom. 206 00:21:24,780 --> 00:21:33,560 If we fail, though, to put questions about the meaning of life and its purposes at the heart of every human institution and practice, 207 00:21:33,570 --> 00:21:42,570 I think we quickly become driven by materialistic goals, seeking power over others to obtain material things. 208 00:21:42,990 --> 00:21:48,240 I think this is as true of science and technology as it is of business and academia. 209 00:21:48,630 --> 00:21:56,820 If science and technological endeavours are not situated within a broader account of moral purposes and the ends of a human life, 210 00:21:57,060 --> 00:21:59,940 then science and technology are easily corrupted. 211 00:22:00,240 --> 00:22:09,750 They collapse into manipulating and instrumentalising human lives and life itself, for the sake of the power and enrichment of the few. 212 00:22:10,290 --> 00:22:18,480 As an aside, I would go as far as to say that the enterprise of science, at least in the university, needs relocating within the 213 00:22:18,560 --> 00:22:25,220 mystery side of things if it is not to be wholly instrumentalised or commodified. When so relocated, 214 00:22:25,460 --> 00:22:33,890 science can recover itself as a mode of contemplation, discovery, and a form of deep play undertaken for its own sake. 215 00:22:34,280 --> 00:22:41,390 This perhaps points to the true meaning and purpose of science, but that is a debate for another day, perhaps in other Sheldonian lectures. 216 00:22:42,200 --> 00:22:49,370 Now, growing in wisdom about how to address the questions about the meaning and purpose of life is a struggle.
217 00:22:49,670 --> 00:22:55,370 Again, this is a key insight of all ancient philosophical and religious traditions around the world. 218 00:22:55,850 --> 00:23:02,510 We are easily distracted or corrupted, and it is hard work to try and address these questions wisely. 219 00:23:03,290 --> 00:23:10,820 Part of the struggle is personal: whether through meditation, therapy, ritual processes, prayer, diligent study, or physical training, 220 00:23:10,970 --> 00:23:20,000 we must grow in wisdom and self-awareness if we are to know how to relate well to ourselves, to others, and to the world around us. 221 00:23:20,450 --> 00:23:27,630 But it is also a collective struggle. Humans are what anthropologists call ultra-social animals. 222 00:23:27,770 --> 00:23:34,190 We are not individual atoms bouncing against each other. We are mutually vulnerable, interdependent creatures. 223 00:23:34,370 --> 00:23:42,620 We cannot survive without others. And to thrive depends on being embedded in some kind of shared or common life. 224 00:23:43,400 --> 00:23:50,510 Our individual flourishing, then, is symbiotic with the flourishing of a wider ecology of human and non-human relations. 225 00:23:50,990 --> 00:23:59,090 Each of us comes to be as a human through the quality and character of our relations with others, including non-human others. 226 00:23:59,450 --> 00:24:05,960 So our shared life is the condition for the possibility of each individual to have a good life. 227 00:24:06,590 --> 00:24:10,760 If there is no decent and affordable housing, if there are no well-paying jobs, 228 00:24:10,880 --> 00:24:16,670 if there is no safe place for the kids to play, then each individual life is diminished. 229 00:24:16,700 --> 00:24:20,420 No human is an island and all flourishing is mutual. 230 00:24:20,430 --> 00:24:25,340 I came across the old Labour Party slogan from the early 20th century which summarises this wonderfully.
231 00:24:25,370 --> 00:24:29,990 Fellowship is life. Or, as the Bible puts it, love your neighbour as yourself. 232 00:24:30,590 --> 00:24:36,080 But as many of you know first-hand, and to state the obvious, people don't love their enemies. 233 00:24:36,110 --> 00:24:44,000 Societies are not just and generous. They are mostly structured so as to benefit the few or those with the most money. 234 00:24:44,510 --> 00:24:50,810 So there must be a collective struggle to ensure a decent quality of life can be had by all. 235 00:24:51,380 --> 00:24:58,760 Central to our collective struggle, and to any account of living well, is asking difficult questions about what needs to change 236 00:24:58,880 --> 00:25:02,300 if everyone is to have a decent quality of life. 237 00:25:02,660 --> 00:25:11,450 Asking hard questions about what needs to change is based on the realisation that my life or the life of others is not all that it could be. 238 00:25:11,690 --> 00:25:19,310 To have a just and generous form of common life in which all can thrive means we have to face head-on the need for change. 239 00:25:19,970 --> 00:25:24,110 So through collective struggle and asking difficult questions about our way of life, 240 00:25:24,320 --> 00:25:30,500 we come to self-awareness and can begin to overcome the ways material and social conditions are set up 241 00:25:30,500 --> 00:25:37,040 to alienate or exploit us or those around us. Through personal and shared struggle for a better life, 242 00:25:37,280 --> 00:25:42,920 we ourselves become better able to live this life well with others. 243 00:25:43,670 --> 00:25:53,360 Part of our personal and collective struggle is to discern and grow in wisdom about which claims concerning the human condition are true. 244 00:25:53,840 --> 00:25:59,300 We must do this in a world where there are radically different visions of the meaning and purpose of life.
245 00:25:59,540 --> 00:26:04,310 Whether it is Hinduism, Islam, Stoicism, Marxism, liberalism, Christianity, whatever, 246 00:26:04,730 --> 00:26:09,020 each gives different answers to the question of what it means to live well. 247 00:26:09,680 --> 00:26:17,600 Learning how to navigate and negotiate this kind of plurality is itself part of learning how to live a good life. 248 00:26:18,020 --> 00:26:23,030 But at this very point, we meet a severe and present danger. 249 00:26:23,720 --> 00:26:28,580 There are four options of what to do when meeting someone whom I disagree with, 250 00:26:28,580 --> 00:26:37,280 or whose way of life I find threatening or strange, eventualities that are inevitable in any form of society beyond the immediate family. 251 00:26:37,850 --> 00:26:42,020 When we meet a stranger, we can do one of four things. I can kill them. 252 00:26:42,650 --> 00:26:46,520 I can create a system to dominate them so I don't have to listen to them or talk to them. 253 00:26:46,910 --> 00:26:51,889 I can make life so difficult that I cause them to flee, or I can do politics. 254 00:26:51,890 --> 00:26:59,870 That is to say, I can form, norm, and sustain some kind of common life amid asymmetries of power, 255 00:27:00,140 --> 00:27:03,890 competing visions about what life really means or what it's for, 256 00:27:04,160 --> 00:27:11,720 and my own feelings of fear and aversion to others, without killing, coercing, or causing others to flee. 257 00:27:12,200 --> 00:27:17,419 Human history and the contemporary context are full of examples of the first three options. 258 00:27:17,420 --> 00:27:23,440 Politics is the other option. Indeed, politics is essential to live, let alone live 259 00:27:23,470 --> 00:27:31,940 well. As I said already, humans cannot survive, let alone thrive, without others, and some kind of common life must be cultivated.
260 00:27:31,960 --> 00:27:37,240 If human life is to go on, politics is the name for generating this common life. 261 00:27:37,450 --> 00:27:42,580 In Aristotle's succinct formulation, to be human is to be a political animal. 262 00:27:43,030 --> 00:27:51,460 The quality and character of political relationships, as well as the goods pursued through politics, determine whether or not humans live well. 263 00:27:51,720 --> 00:27:55,170 And so politics is not a failure of the good or moral life; 264 00:27:55,180 --> 00:27:58,180 it's constitutive of living life well. 265 00:27:58,570 --> 00:28:05,110 The failure to do politics leads to killing, coercing, and causing others to flee. 266 00:28:05,110 --> 00:28:07,600 And that is not just immoral, it is evil. 267 00:28:08,470 --> 00:28:17,950 A university should be a place where you can learn to discern between conflicting claims to truth and how to navigate conflict politically. 268 00:28:18,190 --> 00:28:20,709 That is to say, through peaceful means, 269 00:28:20,710 --> 00:28:29,620 that is, debate and argument rather than force and violence, to work out right from wrong, good from evil, truth from lies. 270 00:28:30,130 --> 00:28:38,080 And that is what I take the point of this series of talks, instituted by the Vice-Chancellor, to be. Tragically, too often, 271 00:28:38,320 --> 00:28:46,600 universities such as this august institution are so focussed on the means of life they never ask about the meaning of life, 272 00:28:46,750 --> 00:28:50,740 let alone teach students how to pursue this question well. 273 00:28:51,190 --> 00:28:59,140 And that is a dereliction of duty, particularly as we live in a moment when basic questions are being asked about what it means to be human. 274 00:28:59,410 --> 00:29:06,400 As we will hear in a moment, technological advances in areas like genetic engineering, human-machine integration,
275 00:29:06,410 --> 00:29:12,190 and AI raise profound questions about the identity and purpose of human life. 276 00:29:12,400 --> 00:29:15,880 Similarly, as we have heard, growing ecological awareness, 277 00:29:16,180 --> 00:29:24,130 reflection on the origins of life, and life on other planets are reshaping our view of the interconnectedness of all of life. 278 00:29:24,550 --> 00:29:32,980 Some foresee a future of enhanced or even post-human beings, while others view humanity as merely a thread within a larger web of life. 279 00:29:33,520 --> 00:29:40,390 It is vital we are able to address these questions well and seek moral, not simply technical, answers, 280 00:29:40,660 --> 00:29:45,700 even as we recognise the contested and contingent nature of our answers. 281 00:29:46,180 --> 00:29:48,930 Politics is the means through which to do this, 282 00:29:48,970 --> 00:29:54,610 the means through which we move from the world as it is, a world in which many people are not able to live well, 283 00:29:54,610 --> 00:29:59,620 where science and technology are often instrumentalised for immoral or evil purposes, 284 00:30:00,010 --> 00:30:08,230 and where injustice and discrimination prevail, to a world as it should be, one in which we and others can live well. 285 00:30:08,650 --> 00:30:16,480 My challenge to you tonight is to pursue knowledge about life, which is a good and beautiful thing to do for its own sake, 286 00:30:16,810 --> 00:30:27,010 but also to pursue wisdom in how to live well through forms of personal and collective struggle for a more just and generous common life. 287 00:30:27,310 --> 00:30:30,820 Your life and the life of others depends on it. Thank you. 288 00:30:44,530 --> 00:30:56,410 So, following on from Jayne and Luke, I'd like to share some thoughts on life from the perspective of having researched AI for over four decades.
289 00:30:57,970 --> 00:31:00,580 In thinking about my own contribution, 290 00:31:00,580 --> 00:31:06,670 I was reminded of Darwin's concluding paragraph at the end of one of the most important books ever written, On the Origin of Species. 291 00:31:07,030 --> 00:31:16,030 He wrote: "There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; 292 00:31:16,630 --> 00:31:21,340 and that, whilst this planet has gone cycling on according to the fixed law of gravity, 293 00:31:21,820 --> 00:31:29,620 from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved." 294 00:31:30,370 --> 00:31:35,260 It is an arresting, powerful, beautifully crafted piece of prose. 295 00:31:36,430 --> 00:31:47,050 I used it as part of the title of a paper I wrote in the spring of 2022, "From So Simple a Beginning: Species of Artificial Intelligence". 296 00:31:47,320 --> 00:31:54,000 The paper reviewed the history, emergence, and evolution of different AI methods and techniques. 297 00:31:55,590 --> 00:32:02,670 It included the large language models, or LLMs, at the heart of much current excitement around AI. 298 00:32:03,300 --> 00:32:12,090 The paper describes an ecosystem of different types of AI, each occupying a niche to which it is particularly suited. 299 00:32:12,960 --> 00:32:18,150 Developments in modern AI are often presented as evolutionary trees. 300 00:32:19,020 --> 00:32:27,360 This one starts with a method called word2vec, although it too had earlier ancestors going back to the 1980s. 301 00:32:28,050 --> 00:32:30,750 There are many other trees that lead to other types of AI, 302 00:32:31,020 --> 00:32:42,480 but word2vec is an important ancestor because in 2013, Mikolov and his colleagues turned words into vectors. 303 00:32:42,930 --> 00:32:47,400 This is a profound representation of the world in this approach.
304 00:32:47,640 --> 00:32:57,570 Words aren't dictionary entries in a lexicon. Each word is represented as a vector of numbers in a high-dimensional space. 305 00:32:58,560 --> 00:33:02,250 This is achieved by training the program on large amounts of language. 306 00:33:02,610 --> 00:33:06,330 The meaning of a word is determined by the words that surround it. 307 00:33:06,420 --> 00:33:12,690 Words that are similar in meaning are close together in this high-dimensional space, called an embedding space. 308 00:33:13,080 --> 00:33:20,250 Similar words share similar vectors. Many species of AI system proliferated using embeddings. 309 00:33:20,250 --> 00:33:24,870 We can see in our tree the earliest GPT models; we can see 310 00:33:26,960 --> 00:33:32,210 models from Google; we can see models from Meta, the Llama models, 311 00:33:32,330 --> 00:33:40,150 which are open source, which means that you can test, adapt, change, and modify them. 312 00:33:40,180 --> 00:33:47,060 In fact, the Llama models gave rise to a whole proliferation of different models, many of them Chinese, 313 00:33:47,080 --> 00:33:51,370 interestingly, because the open weights were available to be adapted and changed. 314 00:33:52,330 --> 00:34:01,250 And I haven't even mentioned DeepSeek in this little tree of life, the company founded in 2023, which is sending reverberations through the world of AI. 315 00:34:01,250 --> 00:34:07,260 Most language models are possible because of vast data and prodigious compute. 316 00:34:08,690 --> 00:34:14,480 It's difficult to imagine the amount of data these systems have been trained on: most of the available internet. 317 00:34:16,030 --> 00:34:21,820 The public internet has been used, and not always respecting the moral rights of the authors who produced it.
318 00:34:23,340 --> 00:34:28,559 From Shakespeare to song lyrics, from Mumsnet to Wikipedia, social media to the New York Times, 319 00:34:28,560 --> 00:34:33,750 from computer programs to scientific articles, content in a multitude of languages. 320 00:34:34,500 --> 00:34:39,090 The data is processed and turned into patterns of embeddings, or vectors, 321 00:34:39,270 --> 00:34:48,030 and wider representations of context that express, at global scale, the likelihood of one token preceding another, 322 00:34:48,570 --> 00:34:53,160 or one token following another, given not just a few but hundreds or thousands of words of context. 323 00:34:53,610 --> 00:34:58,800 Our current generation of AIs, when given a prompt or asked a question, 324 00:34:59,070 --> 00:35:07,470 can generate coherent conversations and much else besides, all of which demands huge compute power. 325 00:35:09,220 --> 00:35:13,400 This visualisation is one way to understand the increase in compute power. 326 00:35:13,420 --> 00:35:20,170 It shows the number of transistors on a computer chip by the year of introduction. 327 00:35:21,360 --> 00:35:27,340 This straight line represents a true exponential change in our world: 328 00:35:27,930 --> 00:35:32,890 a doubling and redoubling of compute power every two years. 329 00:35:33,280 --> 00:35:40,540 In 50 years, a 50-million-fold increase in the compute power of chips. 330 00:35:41,380 --> 00:35:46,930 This is Moore's law. Our AI models become more capable 331 00:35:48,140 --> 00:35:53,780 as more compute and more data are made available. One benchmark of their continued improvement is 332 00:35:55,100 --> 00:36:04,070 the exams they can pass with flying colours, from law to maths, chemistry to English literature, improving on a whole range of other benchmarks too. 333 00:36:05,160 --> 00:36:13,260 To get even better, they're being encouraged to explain their reasoning, set out the steps they use, and learn from that.
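The doubling arithmetic behind those Moore's law figures is easy to check. A quick sketch, taking the talk's round numbers (a two-year doubling time, a 50-million-fold increase over 50 years) at face value rather than as exact industry data:

```python
import math

# A doubling every two years over 50 years gives 25 doublings.
doublings = 50 // 2
factor = 2 ** doublings
print(factor)  # 33554432: roughly 34 million-fold, the same order as "50 million"

# Conversely, an exact 50-million-fold increase over 50 years would imply
# a doubling time of just under two years.
implied_doubling_years = 50 / math.log2(50e6)
print(implied_doubling_years)  # ~1.95 years
```

So the two round numbers quoted here are mutually consistent to within a few percent, which is all Moore's law, an empirical observation rather than a physical law, ever promised.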
334 00:36:13,710 --> 00:36:23,430 And they now call up other AI methods to help. These rates of improvement are touted as harbingers of artificial general intelligence, 335 00:36:23,430 --> 00:36:27,150 AGI, polymathic superintelligence. 336 00:36:27,600 --> 00:36:31,740 What is life in this age of powerful AI systems? 337 00:36:33,000 --> 00:36:36,570 There's a lot of doom-mongering in discussions about AI. 338 00:36:36,600 --> 00:36:43,290 It's claimed they're going to take our jobs and render us obsolete, make many of our accomplishments appear trivial. 339 00:36:45,460 --> 00:36:52,690 Consider the images in this slide. Garry Kasparov, the world's greatest chess player, beaten by a machine in 1996. 340 00:36:52,810 --> 00:36:58,900 Lee Sedol, one of the world's greatest Go players, beaten by a machine in 2016. 341 00:36:59,050 --> 00:37:02,440 Each of these events was touted as a pivotal moment. 342 00:37:03,190 --> 00:37:06,220 But we haven't stopped playing chess or Go. 343 00:37:06,250 --> 00:37:09,730 We're not diminished by machines that can routinely beat us. 344 00:37:10,390 --> 00:37:17,410 Indeed, these games have been enriched by the presence of AI: the analysis of games, the formation of hybrid teams. 345 00:37:17,650 --> 00:37:25,030 More people play chess today than ever. We've been augmented by the presence of chess superintelligence. 346 00:37:25,450 --> 00:37:37,180 And remember, these systems do not derive the slightest scintilla of pleasure when they defeat us, nor any pride in a game well played. 347 00:37:37,720 --> 00:37:43,510 AI systems have nothing that corresponds to the sense of our lived humanity 348 00:37:43,720 --> 00:37:47,530 when we respond to this photograph. 349 00:37:50,540 --> 00:37:54,500 In the 1980s, I was a postgraduate student visiting MIT's AI lab. 350 00:37:55,130 --> 00:37:59,210 Pat Winston, its director, an early pioneer in machine vision,
351 00:37:59,630 --> 00:38:06,920 when talking about progress in AI, used to say, "There are lots of ways of being smart that aren't smart like us." 352 00:38:07,820 --> 00:38:11,780 And this led me to think about the ways in which smart systems might vary. 353 00:38:12,260 --> 00:38:20,719 One dimension is varying levels of intelligence. The American Psychological Association defines intelligence as the ability to derive information, 354 00:38:20,720 --> 00:38:25,460 learn from experience, adapt to the environment, understand, and correctly utilise reason. 355 00:38:26,810 --> 00:38:31,650 There are many examples of adaptive behaviour and learning in the animal kingdom, from the desert 356 00:38:31,670 --> 00:38:36,170 ant that can reliably find its way home to the octopus that can navigate mazes. 357 00:38:36,860 --> 00:38:42,410 Our latest AI systems display quite a lot of this operational sense of intelligence. 358 00:38:42,800 --> 00:38:48,650 If learning and adaptation are key characteristics of intelligence, our machines have it to some degree. 359 00:38:49,590 --> 00:38:56,190 The substrate in which these systems are realised is another dimension along which smart systems might vary: 360 00:38:57,250 --> 00:39:00,310 whether a carbon-based biology or silicon-based computers. 361 00:39:01,300 --> 00:39:07,630 But the dimension that intrigues us so much is the virtue and value we attach to sentience, 362 00:39:08,080 --> 00:39:13,000 our self-aware presence of living in the world, of being alive. 363 00:39:13,420 --> 00:39:15,880 This is the famously hard problem: 364 00:39:16,270 --> 00:39:28,149 our conscious awareness, being in the world, as we place ourselves among other species, or AIs, and perhaps, one day, extraterrestrials, 365 00:39:28,150 --> 00:39:34,270 in the broad landscape of smart systems. We will doubtless wish to explore other attributes too.
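Returning for a moment to the embedding spaces described earlier: the claim that "similar words share similar vectors" can be made concrete with a toy sketch. The three-dimensional vectors below are invented purely for illustration; real word2vec embeddings are learned from large corpora and typically have hundreds of dimensions.

```python
import math

# Invented toy embeddings (NOT real word2vec weights): each word is a
# point in a small vector space.
embeddings = {
    "king":    [0.9, 0.8, 0.1],
    "queen":   [0.9, 0.7, 0.2],
    "cabbage": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    # Cosine similarity: close to 1.0 when two vectors point the same way.
    dot = sum(a * b for a, b in zip(u, v))
    norm = lambda w: math.sqrt(sum(a * a for a in w))
    return dot / (norm(u) * norm(v))

# Words similar in meaning sit close together in the embedding space.
print(cosine(embeddings["king"], embeddings["queen"]))    # ~0.99
print(cosine(embeddings["king"], embeddings["cabbage"]))  # ~0.30
```

The same similarity measure, applied in learned spaces of hundreds of dimensions, is what lets these models treat "king" and "queen" as neighbours and "cabbage" as a stranger.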
366 00:39:35,020 --> 00:39:41,920 And as we consider our own evolution, our sense of life, remember that we didn't just make our technologies; 367 00:39:42,280 --> 00:39:51,430 our technologies made and changed us. The tools we built made and changed us, from the hand axe to writing. 368 00:39:51,580 --> 00:39:58,750 Both changed our neurology, and we are changing our neurology even now as we use computers. 369 00:39:59,140 --> 00:40:06,040 Developments in machine-brain interfaces suggest we may integrate our technologies directly into ourselves. 370 00:40:07,210 --> 00:40:17,530 Humans in the future will see, sense, explore, understand, and live in the world via technologies that further augment us. 371 00:40:18,740 --> 00:40:25,130 All of this demands an understanding of what matters, what we value. 372 00:40:25,700 --> 00:40:29,630 It is about what it is to be a moral agent. 373 00:40:30,320 --> 00:40:37,100 It will be about ethics. It will be about our sense of being human in an age of AI. 374 00:40:38,000 --> 00:40:44,060 And let's return to where Jayne began this evening's discussion, with a final question about life. 375 00:40:45,820 --> 00:40:51,490 If life arises elsewhere in the universe, as the numbers suggest it might, or indeed should, 376 00:40:52,000 --> 00:40:57,880 wouldn't another civilisation, with its own E.T. equivalent of Elon Musk (hold this thought), 377 00:40:58,810 --> 00:41:08,440 committed to their species becoming space-faring and exploiting AI, have built self-replicating machines? 378 00:41:08,440 --> 00:41:12,970 Machines that don't have to be sentient, just smart in other ways, 379 00:41:13,910 --> 00:41:17,780 mining the resources of other solar systems, replicating and spreading outwards 380 00:41:18,050 --> 00:41:24,460 to explore brave new worlds and report back. These machines have been imagined. 381 00:41:24,850 --> 00:41:31,600 They even have a name:
Von Neumann probes, after one of the founders of computer science, John von Neumann. 382 00:41:32,110 --> 00:41:35,170 On some reasonable assumptions, the galaxy should be full of them. 383 00:41:35,740 --> 00:41:39,700 E.T. AIs. So finally, a paradox: 384 00:41:40,660 --> 00:41:43,569 the Fermi paradox. Enrico Fermi, the great physicist, 385 00:41:43,570 --> 00:41:52,390 pointed out in 1950 the contradiction between the high likelihood of extraterrestrial life and the lack of evidence for it. 386 00:41:53,440 --> 00:41:58,420 If life can develop and evolve, whatever the substrate, where is it? 387 00:41:59,170 --> 00:42:09,650 Thank you. 388 00:42:10,520 --> 00:42:14,960 Thank you, Nigel. Thank you so much, Luke. Thank you, Jayne. Absolutely wonderful presentations. 389 00:42:15,080 --> 00:42:21,440 We're just going to bring some chairs along and we'll sit down, so that gives you a moment to think about what questions you'd like to ask. 390 00:42:21,440 --> 00:42:27,530 If I could invite the speakers to come and take a seat, and we'll start the discussion period. 391 00:42:28,220 --> 00:42:33,560 We've got roving microphones, so please do wait for those so that everybody's able to hear you 392 00:42:33,980 --> 00:42:40,130 and your question. Why don't we come over? I'm going to take a seat at the end so I can see you all. 393 00:42:40,730 --> 00:42:44,240 That's it. That's great. Brilliant. 394 00:42:44,270 --> 00:42:47,360 Thank you so much. Right, Jayne, come on over. And 395 00:42:47,360 --> 00:42:54,520 Luke. Terrific. 396 00:42:57,180 --> 00:43:02,970 Jayne, I'm going to take moderator's privilege and ask the first question, if that's okay. 397 00:43:03,000 --> 00:43:08,100 It links a little bit to a question that we've had pre-submitted as well, 398 00:43:08,100 --> 00:43:12,270 which will overlap with some of the points that you made, Luke, too.
399 00:43:13,690 --> 00:43:18,310 Are you constrained in looking for life that just looks like the version of life as we see it biologically 400 00:43:18,490 --> 00:43:24,069 here on Earth? I was interested to know that you're very dependent on the oxygen signature and 401 00:43:24,070 --> 00:43:27,880 limited to the concept of amino acids and the other building blocks that we've got. 402 00:43:28,120 --> 00:43:33,370 So what's the chance that we're missing other variations and forms of life out there? 403 00:43:33,370 --> 00:43:37,059 And how are you addressing that particular question? Yeah, exactly. 404 00:43:37,060 --> 00:43:41,230 So a lot of what I showed is based on looking for Earth-like life. 405 00:43:41,650 --> 00:43:51,379 I think we don't yet know, if the beginnings of life and the development of life happen elsewhere, 406 00:43:51,380 --> 00:43:54,110 whether it sort of has to happen the same way. 407 00:43:54,110 --> 00:44:03,410 This sort of stems from the idea that if planets are the debris of star formation, then life, if it occurs on a planet, 408 00:44:03,560 --> 00:44:08,840 is just a natural consequence of all the physics and chemistry that happened in forming the star. 409 00:44:09,230 --> 00:44:15,340 And so perhaps that suggests that the pathways for evolution and life might be the same, 410 00:44:15,340 --> 00:44:17,530 and so what we're looking for might be the same. 411 00:44:17,980 --> 00:44:25,450 But I would say, as astronomers, what we are looking for, to try and be as agnostic as possible about what we think life is 412 00:44:25,870 --> 00:44:29,590 and what to look for, is just any signature of disequilibrium: 413 00:44:30,160 --> 00:44:37,970 any signature in the atmosphere that geology on its own wouldn't produce in a steady state.
414 00:44:38,020 --> 00:44:46,870 So for Earth, that's something like oxygen: life creates oxygen in our atmosphere. But we can be very open about what we might see. 415 00:44:47,170 --> 00:44:54,480 And then from that, I would go to my chemist and biologist 416 00:44:54,490 --> 00:45:01,390 colleagues and ask them: okay, well, we found this, but can anything that might be life create this? 417 00:45:02,270 --> 00:45:05,350 Luke, you very much spoke from the context of human life, 418 00:45:05,710 --> 00:45:13,690 and to a certain extent, Nigel, the same for the sort of AI challenges, even with the use of, you know, "intelligence" in the expression. 419 00:45:14,740 --> 00:45:20,260 What would you say about the more basic version of biology as presented by Jayne, you know, 420 00:45:20,260 --> 00:45:26,170 in the tiny little mites or other animal systems that aren't in the human form? 421 00:45:26,470 --> 00:45:33,880 I mean, if you follow your logic, you would say that for them to thrive and live well, which they have a right to, would require a political system. 422 00:45:34,120 --> 00:45:37,269 Otherwise, it's a very Darwinian process of just survival of the fittest. 423 00:45:37,270 --> 00:45:42,909 So how do you think, having listened to Jayne, about the opportunity for living and thriving 424 00:45:42,910 --> 00:45:47,319 well with a completely different version of life that's not human? Yeah, it's a wonderful question. 425 00:45:47,320 --> 00:45:52,960 I think two quick things. One is that I think that the sense of life, 426 00:45:53,700 --> 00:45:57,450 that is, the forms of symbiosis, 427 00:45:57,600 --> 00:46:00,810 the forms of symbiosis in which there are multiple forms, 428 00:46:01,230 --> 00:46:06,720 is central to all of life and the conditions of life. 429 00:46:07,110 --> 00:46:11,400 And so that's what it means to flourish, human and non-human.
430 00:46:11,910 --> 00:46:18,270 And how we navigate that, and, you know, contentious issues like predation and this kind of stuff, 431 00:46:18,540 --> 00:46:22,559 how that figures within an understanding of living 432 00:46:22,560 --> 00:46:25,470 well, I think, is kind of a key question. 433 00:46:25,890 --> 00:46:31,620 And there are obviously lots of different religious and philosophical traditions that have things to say about that. 434 00:46:31,830 --> 00:46:41,309 I think, building on what Jayne was saying, there's a broader question of how we recognise and name life, and obviously religions, 435 00:46:41,310 --> 00:46:51,930 Christianity in particular, have lots of speculation about non-embodied forms of life and forms of consciousness which aren't materially based, 436 00:46:52,180 --> 00:46:59,610 angels being one example. Now, we can think of this as kind of magical language, but I think it's an interesting point of reflection. 437 00:46:59,610 --> 00:47:02,159 There's been lots of 438 00:47:02,160 --> 00:47:09,719 rich philosophical and moral thought about the nature of life that isn't dependent on chemistry, biology, and other things, 439 00:47:09,720 --> 00:47:15,410 and I think that could be an interesting point of dialogue between science and religion. 440 00:47:15,410 --> 00:47:20,910 It doesn't tend to factor in, because the conditions you laid out tended to be, 441 00:47:21,510 --> 00:47:24,030 you know, we're looking for biological and chemical signatures, 442 00:47:24,270 --> 00:47:31,229 whereas we have the imaginative possibilities of other modalities of life not dependent on material conditions.
443 00:47:31,230 --> 00:47:40,350 And related to that is this very, very big question, Jayne and I were talking about this earlier: could we even recognise, 444 00:47:40,500 --> 00:47:52,440 let alone communicate with, non-material-based or non-human forms of life, and what would that mean? 445 00:47:52,860 --> 00:48:01,830 And how would that stretch us? Because I think we tend to think wholly in analogical terms, from human life to some other form. 446 00:48:02,700 --> 00:48:06,050 What would it mean to think from some other form to us? Yeah. 447 00:48:06,210 --> 00:48:11,220 Yeah. No, that's very interesting. And the ability to bring the two together, actually, you're quite right: 448 00:48:11,220 --> 00:48:18,870 there is a strong tradition in religion of other versions of it. And in fact, a student, Piero, has asked a question which speaks to this: 449 00:48:18,870 --> 00:48:22,169 can you be religious and still be okay with the idea that 450 00:48:22,170 --> 00:48:25,450 there are other forms of life out there in the galaxy? And, you know, Luke would say yes. 451 00:48:25,470 --> 00:48:31,560 So there's a very rich tradition. Yes, there's a great early modern theologian, Nicholas of Cusa, 452 00:48:31,560 --> 00:48:35,620 who wrote a long treatise speculating on life on other planets. 453 00:48:35,620 --> 00:48:41,879 There was a brief moment in the Reformation when Melanchthon was very against the idea that God would make life on other planets, 454 00:48:41,880 --> 00:48:49,320 but basically there is a very rich tradition of the idea that if God creates a habitable universe, 455 00:48:49,920 --> 00:48:59,250 then we should expect life on other planets, and that that life will be very different to human life.
456 00:48:59,400 --> 00:49:04,500 That is, again, a very, very rich tradition of theological and philosophical speculation, 457 00:49:04,530 --> 00:49:09,899 and I think there's no conflict there, I don't think. 458 00:49:09,900 --> 00:49:14,060 Would you agree, Jayne? 459 00:49:14,070 --> 00:49:17,600 I mean, I approach it from sort of the physics and the chemistry. 460 00:49:17,610 --> 00:49:24,240 So if Earth is habitable, it suggests that that is a process that can happen. 461 00:49:24,570 --> 00:49:27,780 So yes, I think that life forms. 462 00:49:28,170 --> 00:49:34,559 So we have a pre-submitted question from a postgrad, Luis Moya. This is for you, Nigel: 463 00:49:34,560 --> 00:49:42,090 does a drive to build artificial life on Earth conflict with our quest to find organic life out there in the cosmos? 464 00:49:42,540 --> 00:49:45,610 Or do you see... I'll be interested in what you both think about that. Well, 465 00:49:45,630 --> 00:49:49,350 I think the point I made is a really interesting reflection on that, 466 00:49:49,350 --> 00:49:54,719 which is that you might use one to serve the other, because, again, space is very big. 467 00:49:54,720 --> 00:49:59,100 Things take a very long time. So perhaps your best hope is intelligent, 468 00:49:59,460 --> 00:50:01,170 self-maintaining systems. 469 00:50:01,170 --> 00:50:08,940 You know, imagine Voyager plus plus plus, which is out there and able to build copies of itself and perhaps send the information back. 470 00:50:09,180 --> 00:50:14,669 So no, I think that's what's really interesting: how we might use the one to address the other. 471 00:50:14,670 --> 00:50:19,200 And I think, I mean, what I do want to 472 00:50:20,660 --> 00:50:32,010 make clear is that there is a lot of nonsense around the imminent arrival of artificial general intelligence, as if that will be a new kind of
473 00:50:32,790 --> 00:50:38,280 challenging agency. We have to be thoughtful about how those are built and evolved. 474 00:50:38,400 --> 00:50:41,969 But this is an echo of ourselves coming back to us. 475 00:50:41,970 --> 00:50:44,250 These generative AI systems are extremely capable, 476 00:50:44,250 --> 00:50:50,310 but they succeed because they've internalised, in a completely different way, and represented back to us, what we find coherent. 477 00:50:50,550 --> 00:50:54,660 And indeed, they'll find patterns in systems that we can't find ourselves. 478 00:50:54,660 --> 00:50:57,750 And we've given them all their tasks at this point; 479 00:50:57,760 --> 00:51:04,889 we've set them all their goals. It will lead to a very interesting world in which a new kind of animism will emerge, 480 00:51:04,890 --> 00:51:11,850 in which the systems won't necessarily possess that sentience, that thick texture, I believe, for a long time. 481 00:51:11,850 --> 00:51:15,870 But they will be the voices of our ancestors, literally. 482 00:51:15,870 --> 00:51:20,970 I mean, the posthumous AIs that will be represented, whatever we think about that: 483 00:51:21,000 --> 00:51:29,730 we will be suffused with systems that talk to us, that maintain a representation of past experiences. 484 00:51:29,970 --> 00:51:38,460 So I can see this very interesting blurring of distinctions as our AI technology, 485 00:51:39,030 --> 00:51:46,020 without any sense of it being sentient at all, is nevertheless able to represent sets of possibilities to us that 486 00:51:46,970 --> 00:51:54,420 bring the world to us in quite different ways, which very much reflects your point, Luke, about other versions of life in different ways. 487 00:51:54,470 --> 00:51:58,070 Well, let's open it up now to the audience. So, as I said, we've got roving mics. 488 00:51:58,070 --> 00:52:04,040 Please just put your hand up
nice and clearly if you have a question that you wish to ask our panellists. 489 00:52:06,370 --> 00:52:09,910 Please, this gentleman here in the front. Thank you. 490 00:52:10,240 --> 00:52:15,300 491 00:52:15,400 --> 00:52:22,540 This question about the meaning of life, which is a kind of bugbear of our lives: 492 00:52:24,760 --> 00:52:29,079 I'd like to put it into the context of an idea that I was brought up with, 493 00:52:29,080 --> 00:52:38,860 and many people here were brought up with, this quite false idea that evolution is a kind of R&D programme to produce us, 494 00:52:40,170 --> 00:52:46,720 which clearly it isn't. There have been, I think, five great extinctions. 495 00:52:48,140 --> 00:52:54,530 One thing we know from this record of extinctions is that evolution doesn't repeat. 496 00:52:56,440 --> 00:52:59,890 There are no repeats. The trilobites are gone. 497 00:53:00,550 --> 00:53:04,810 The ammonites are gone. The dinosaurs, all but 498 00:53:04,810 --> 00:53:16,610 a few relics, have gone. And yet here we are, 65 million years since the last extinction, and possibly only a few centuries from the next. 499 00:53:17,930 --> 00:53:23,090 Well, possibly only a few decades from the next. Talking about the meaning of life: 500 00:53:23,930 --> 00:53:29,240 180 million years of dinosaurs never uttered a word. They had no vocal cords. 501 00:53:30,110 --> 00:53:36,560 They probably had no ideas. And they certainly never asked, what is the meaning of being a dinosaur? 502 00:53:38,000 --> 00:53:41,350 What is the meaning of being a trilobite? 503 00:53:42,070 --> 00:53:51,190 None of these creatures ever asked this question. Yet here we arrive, with our curious arrogance, thinking that we must have a meaning. 504 00:53:52,460 --> 00:53:56,120 Why do we believe that we have a meaning? 505 00:53:56,900 --> 00:54:00,230 506 00:54:00,500 --> 00:54:03,530 Wonderful question. Right: 507 00:54:04,490 --> 00:54:08,510 the quest for meaning, and to thrive.
I'm going to give that right to you, Luke, 508 00:54:08,510 --> 00:54:12,500 but I would like, again, all the panellists to reflect on that question. 509 00:54:13,130 --> 00:54:16,910 No, it's a wonderful point. I mean, I think 510 00:54:17,990 --> 00:54:25,310 in a sense you can kind of flip it: why shouldn't we ask that question? 511 00:54:25,340 --> 00:54:29,000 The fact that we can ask that question is significant. It's not trivial, 512 00:54:29,270 --> 00:54:36,229 it's not irrelevant. I think I'd also say, who knows what dinosaur speculation was like? 513 00:54:36,230 --> 00:54:39,709 We can't know. Thomas Nagel has his famous essay about 514 00:54:39,710 --> 00:54:47,740 whether we could understand a dog or a bird, whether the forms of consciousness in other ways of being alive are beyond us. 515 00:54:47,750 --> 00:54:53,930 In a sense, we struggle to comprehend them, but that doesn't make them of no value or of no worth. 516 00:54:54,480 --> 00:55:02,059 And so I think our capacity to engage with life in particular human ways, 517 00:55:02,060 --> 00:55:09,530 which involves the capacity to ask questions about the meaning and purpose of life, and to come up with ethical frames 518 00:55:09,530 --> 00:55:17,270 of reference, and to think beyond simply being determined by the material and social conditions which make life possible, 519 00:55:17,390 --> 00:55:26,810 and to act both against that and with that, I think is a distinct way of being alive which needs to be engaged with. 520 00:55:26,820 --> 00:55:32,899 So I don't think one needs to put it in context and say, well, it's trivial.
521 00:55:32,900 --> 00:55:38,840 The fact that we can ask these questions, I think, is incredibly profound and a remarkable gift: 522 00:55:39,380 --> 00:55:44,510 that we are the kind of animal that can ask these questions. And that needs to be embraced 523 00:55:44,510 --> 00:55:52,310 and engaged with, and how that then leads us to make sense of and value the life of dinosaurs. 524 00:55:52,610 --> 00:55:59,089 I think part of the gift of being human is that dinosaurs suddenly can be a memory, 525 00:55:59,090 --> 00:56:03,800 can be retrieved, can be engaged with and delighted in and wondered at. 526 00:56:04,340 --> 00:56:08,060 And I think that capacity for wonder, that capacity for contemplation, 527 00:56:08,330 --> 00:56:16,069 that capacity for questioning, is this remarkable gift that relates to being human. 528 00:56:16,070 --> 00:56:20,149 The joy would be to find other forms of life 529 00:56:20,150 --> 00:56:30,970 that have that capacity for wonder and question-asking, or to somehow create a form of system which could equally participate in that. 530 00:56:30,980 --> 00:56:38,540 I think it would do that in very different ways. And then the challenge for us is, can we receive that gift from others? 531 00:56:38,810 --> 00:56:46,670 But I certainly want to maintain and hold out that the need, the necessity, the vitality of our embrace 532 00:56:46,670 --> 00:56:52,370 of our capacity to ask those questions is central and constitutive of who we are as this kind of animal. 533 00:56:52,910 --> 00:56:59,660 Of course, there's an interesting line on the joy of finding others who might ask those questions, which is, 534 00:56:59,690 --> 00:57:09,140 um, perhaps one reason we haven't heard from anybody yet is that they're all keeping very quiet, because only a fool makes a noise in the jungle.
535 00:57:09,230 --> 00:57:15,559 And genuinely, we were having this debate earlier on, this kind of serious exchange between people thinking, 536 00:57:15,560 --> 00:57:25,340 should we be sending messages out there, which isn't perhaps very elevating about the kind of propensities of sufficiently advanced systems 537 00:57:25,430 --> 00:57:28,370 to think about maintaining their own interests. 538 00:57:28,370 --> 00:57:35,809 But the other thing is, and you make a good point here, that we have this tendency to think there is some ascendancy in all of this, 539 00:57:35,810 --> 00:57:40,040 some inevitable progression to something better. 540 00:57:40,040 --> 00:57:51,010 And, well, we're probably responsible for large extinction events ourselves, you know, with the megafauna and possibly close cousins. 541 00:57:51,040 --> 00:57:55,830 So it's interesting to reflect on where we might be going. 542 00:57:57,570 --> 00:58:04,170 I'll take a very physics approach to this, which is not necessarily my own view. 543 00:58:04,170 --> 00:58:08,070 The purpose of life, I think, is very personal to each individual. 544 00:58:08,490 --> 00:58:17,010 But there's this idea that life is a natural consequence of the physical and chemical processes of the universe. And the universe, 545 00:58:17,460 --> 00:58:23,310 you know, we think started with a Big Bang, and it's trying to dissipate its energy into the sort of lowest state that it can. 546 00:58:23,730 --> 00:58:28,840 And it turns out life is very good at using energy, converting energy from higher 547 00:58:28,840 --> 00:58:37,430 forms into sort of heat, the lowest state. And so if evolving an organism that can ask this question, why do I exist, 548 00:58:37,440 --> 00:58:41,520 what's the purpose, if that keeps us going and keeps us surviving: 549 00:58:41,670 --> 00:58:47,160 life has evolved to keep going.
Um, it keeps using energy, and the universe likes that. 550 00:58:47,520 --> 00:58:51,240 So maybe that's another way to think about it. 551 00:58:51,450 --> 00:58:54,989 But that's the reductionist physics view. If I may say, 552 00:58:54,990 --> 00:59:03,570 I would say that for me, it's a very personal question as to what we individually take as the purpose and meaning of life. 553 00:59:04,170 --> 00:59:12,990 Well, maybe Jessica has submitted a comment which might address your point, which is: what is the purpose of life? 554 00:59:13,500 --> 00:59:19,220 Is it that we have no choice but to keep ourselves in this quest for the rest of our lives? 555 00:59:19,230 --> 00:59:22,350 That gives it purpose. Maybe that's it. 556 00:59:22,680 --> 00:59:27,330 Let me open it up again to the audience for additional questions, please. 557 00:59:27,340 --> 00:59:31,770 The gentleman there. Thank you. And we'd love questions from students, 558 00:59:31,770 --> 00:59:36,690 so don't be shy. No question is unacceptable. So, thank you to all the speakers. 559 00:59:36,720 --> 00:59:43,709 It's been a fascinating evening. Despite humanity's intelligence, and growing intelligence, 560 00:59:43,710 --> 00:59:50,190 one might argue there is undoubtedly a history of using that intelligence for battle and destruction, 561 00:59:50,190 --> 00:59:53,190 whether against fellow man or the planet. 562 00:59:54,000 --> 01:00:03,299 So, given Moore's law and the doubling of computing power, you sort of think of a Terminator Judgement Day scenario, 563 01:00:03,300 --> 01:00:12,300 where artificial intelligence gets so clever that it decides that humans are a threat, unnecessary, etc. 564 01:00:12,300 --> 01:00:15,420 Is that pure make-believe, or is that a genuine 565 01:00:16,490 --> 01:00:20,240 risk, do you think? Well, how worried should we be, Nigel? 566 01:00:20,250 --> 01:00:27,649 How many years have we got?
I mean, this was certainly a particular concern of the first AI 567 01:00:27,650 --> 01:00:32,900 summit at Bletchley Park, just over a year ago 568 01:00:32,900 --> 01:00:41,719 now. You'll notice that at today's Paris event the UK has not signed up to a more modest version of what it signed up to 569 01:00:41,720 --> 01:00:45,020 a year and a half ago, which is odd, but there you go. 570 01:00:45,200 --> 01:00:55,340 I think it is important that we take those issues of guardrails, of safety, seriously. 571 01:00:55,730 --> 01:01:01,880 The system doesn't have to be super sentient to nevertheless be a threat, 572 01:01:01,940 --> 01:01:08,630 if we put such systems in ascendancy over tasks that are critical. Whether or not, 573 01:01:08,690 --> 01:01:13,440 and you can have all sorts of scenarios that imagine how they could become, 574 01:01:13,580 --> 01:01:18,530 in some sense, systems following a set of objectives which they don't have to be aware of, 575 01:01:18,530 --> 01:01:23,059 but which nevertheless determine their behaviour, which could lead to bad outcomes for us. 576 01:01:23,060 --> 01:01:31,640 So guardrails, regulations, some notion of where we don't want to proliferate this dual-use technology: 577 01:01:31,970 --> 01:01:36,920 like every other general technology and science, we use it for good and ill. 578 01:01:36,920 --> 01:01:38,719 And 579 01:01:38,720 --> 01:01:49,460 we've got to think quite hard about how we establish limits on how we will put this technology to use, and possibly how we will align those systems 580 01:01:49,730 --> 01:01:53,680 so they don't decide they would rather do something that we'd rather they didn't. 581 01:01:53,720 --> 01:02:00,830 So I think that's absolutely right. But you're very much answering from the context of us living alongside artificial intelligence.
582 01:02:00,830 --> 01:02:04,580 If we take some of Luke's comments in the context of how does life live 583 01:02:04,580 --> 01:02:08,750 well: how would AI, independent of the human race, live 584 01:02:08,750 --> 01:02:12,860 well? Does AI need to generate its own political system? 585 01:02:14,510 --> 01:02:19,700 Well, I think that is an interesting question. I mean, some have thought seriously about the notion. 586 01:02:19,820 --> 01:02:26,600 It's the Bicentennial Man kind of notion: at what point do these systems come to have rights, and/or 587 01:02:26,920 --> 01:02:31,969 a sense that there's a framework of behaviour? 588 01:02:31,970 --> 01:02:36,709 Why shouldn't we just treat them as automated slaves, in some sense? 589 01:02:36,710 --> 01:02:39,800 I mean, this has a long philosophical tradition, 590 01:02:40,880 --> 01:02:45,470 where Aristotle tried to convince himself that humans who were slaves somehow didn't have souls, 591 01:02:45,470 --> 01:02:47,720 and so he didn't have to worry about them in the same way. 592 01:02:48,110 --> 01:03:00,339 I think, in some sense, you're assuming that there will be a richness and a texture in them, and 593 01:03:00,340 --> 01:03:07,249 a sense of presence, that would motivate and mobilise the kind of questions that we've had here. 594 01:03:07,250 --> 01:03:14,120 And that may be; I don't rule it out at all in principle, but it's a very long way from where we are now. 595 01:03:14,120 --> 01:03:18,440 I don't think we have to worry about giving them rights for quite some time yet. 596 01:03:18,710 --> 01:03:25,610 What we shouldn't do, though, is imagine we can just use these systems in any way we may wish.
597 01:03:25,610 --> 01:03:29,870 Because in the act of doing that, whether that's a sex robot, for example, 598 01:03:29,870 --> 01:03:33,920 would you give it to a recidivist to kind of cure them of their compulsions? 599 01:03:34,340 --> 01:03:37,880 You coarsen your own interactions. 600 01:03:38,180 --> 01:03:42,800 In some sense, it's an act of contagion. 601 01:03:42,920 --> 01:03:50,390 We should treat them as if human, even in contexts in which we understand they are AI systems. 602 01:03:50,750 --> 01:03:55,040 I mean, I would say we already have a political relationship with these systems. 603 01:03:55,040 --> 01:04:02,839 Think about the impact of kind of bots on Twitter on elections, this kind of stuff. So the 604 01:04:02,840 --> 01:04:09,500 issue, I think the point that Nigel raised, is that human consciousness and the human body are always adaptive. 605 01:04:09,980 --> 01:04:14,600 And so a good example is cooking. 606 01:04:14,600 --> 01:04:19,670 In my understanding, correct me if I'm wrong, we wouldn't have evolved the kind of brains we have if we didn't cook; we 607 01:04:20,030 --> 01:04:25,640 depend on technologies, in a sense, to become the kinds of creatures we are. 608 01:04:26,060 --> 01:04:31,580 And so there's always this adaptive mutuality going on with our tools. 609 01:04:32,030 --> 01:04:36,139 And that is, I think, intensified with AI. 610 01:04:36,140 --> 01:04:40,310 It's not the kind of Terminator scenario that concerns me. 611 01:04:40,550 --> 01:04:45,560 It's the ways in which humans consistently invent tools. 612 01:04:45,920 --> 01:04:50,660 Money is a tool. What do we do? And again, religious traditions have a lot to say on this. 613 01:04:50,900 --> 01:04:55,520 We always end up worshipping our tools, and tools are made to make life better. 614 01:04:56,090 --> 01:05:05,700 And again and again and again.
It's a human kind of move to end up serving our tools, not our tools serving us. 615 01:05:06,170 --> 01:05:11,329 A state is a human tool. It's supposed to organise life better and make it better. 616 01:05:11,330 --> 01:05:15,680 That's all it is. And yet we end up fighting wars over states. 617 01:05:16,160 --> 01:05:17,840 So I think that's much more my concern. 618 01:05:17,840 --> 01:05:27,950 We're already in that process where we are creating tools which we then imagine ourselves in relationship to. A good example is the computer. 619 01:05:28,250 --> 01:05:32,000 Human life and consciousness is not machine-like. 620 01:05:32,600 --> 01:05:35,809 Again and again, I still meet people who think, oh, it's like a machine. 621 01:05:35,810 --> 01:05:39,650 They develop whole scientific models as if the human brain is like a machine. 622 01:05:39,650 --> 01:05:45,110 It's nothing like a machine. Part of the breakthrough to AI was to stop thinking like that, 623 01:05:45,410 --> 01:05:47,300 and so thinking in neural networks. 624 01:05:47,630 --> 01:05:59,060 So the sense in which we end up, through a classic human process of mimesis, mimicking our tools and then serving our tools: we're already in that game. 625 01:05:59,570 --> 01:06:07,070 And so I think that's the deeper challenge: how do we break that cycle, particularly with the intensities of this technology, 626 01:06:07,610 --> 01:06:17,090 and thereby avoid remaking ourselves in its image, rather than using it genuinely to engineer better forms of life? 627 01:06:17,440 --> 01:06:21,229 It's very interesting. I mean, it was the point you made about the sort of augmented aspect. 628 01:06:21,230 --> 01:06:29,210 So how we think about augmenting our life through technology, but also, you know, the evolution, from the medical area 629 01:06:29,240 --> 01:06:36,709 that I work in, or did work in, the sort of evolution of prosthetics and cyborgs: we are making tools,
630 01:06:36,710 --> 01:06:41,810 and you're becoming a cyborg, in effect. And how do you assimilate that as part of you? 631 01:06:41,810 --> 01:06:47,299 And how do you give that thing that's a tool, but is now embedded in your body, a sense of belonging? 632 01:06:47,300 --> 01:06:50,300 And we're going there. And that challenges 633 01:06:50,300 --> 01:06:57,220 what we mean by the human aspect of life, because we're evolving to actually embed the technology, as opposed to using it as a tool. 634 01:06:57,230 --> 01:07:05,059 Jayne, just to add to that: you know, if that's the direction that we're going, then if we do encounter other lifeforms on other planets, 635 01:07:05,060 --> 01:07:13,490 we will be taking that AI with us, or possibly sending it out ahead of us as the first contact. We already do that with our technology: 636 01:07:13,620 --> 01:07:17,130 on Mars, we send robots. 637 01:07:18,610 --> 01:07:25,210 We have to think about how we would want our AI to be treated, or how we would treat one if it came to us. 638 01:07:25,540 --> 01:07:30,290 And I think if you consider it in that form, it adds an answer to the question. 639 01:07:30,330 --> 01:07:33,400 Yeah, absolutely. I'm determined to get a question from a student. 640 01:07:33,700 --> 01:07:37,300 I've got one here. The microphone, please. 641 01:07:38,350 --> 01:07:44,190 642 01:07:44,430 --> 01:07:47,970 Can you hear me? 643 01:07:48,200 --> 01:07:57,719 Yes. Yes, yes. Yeah. So I've been noticing that there's a kind of contrast between the questions we ask about life on other planets, 644 01:07:57,720 --> 01:08:01,620 which come with, like, a sense of wonder and excitement,
645 01:08:01,860 --> 01:08:09,149 and then when we look at life here, we talk about it with a sense of, like, fear or concern. 646 01:08:09,150 --> 01:08:18,480 And I was wondering, when you ask the question about whether we can imagine life on other planets being very different from life on Earth, 647 01:08:19,260 --> 01:08:27,389 um, I also thought that maybe we are not hearing, or I haven't heard, enough thoughts that are more hopeful 648 01:08:27,390 --> 01:08:36,090 about what AI can bring, or even, like, how humans can do politics in a way that will make us feel proud and happy. 649 01:08:36,330 --> 01:08:45,420 And whether AI can inform that, even the way in which we look for life on other planets: rather than with fear, 650 01:08:45,480 --> 01:08:50,190 look for it with excitement, and think that is going to be, like, a good, 651 01:08:50,550 --> 01:08:53,300 yeah, like a good approach. 652 01:08:53,310 --> 01:09:05,910 So I was wondering if there are, like, thoughts or ideas of futures, informed by AI and by these questions about the human endeavour, 653 01:09:06,270 --> 01:09:09,629 that are more hopeful and optimistic. 654 01:09:09,630 --> 01:09:13,860 So thank you. No, it's a really good point, because we tend to go to the worries. 655 01:09:14,250 --> 01:09:17,420 Right. Nigel, and then Luke. 656 01:09:17,670 --> 01:09:22,680 Jayne, you're evidently excited to find life on other planets. I think she's going to be buzzed if we find other life. 657 01:09:23,250 --> 01:09:26,420 But we need to know why, and what will that bring to us? 658 01:09:26,430 --> 01:09:30,120 And it's not just about us as humans benefiting. But let's have some of the positives, because there aren't many. 659 01:09:30,120 --> 01:09:34,120 Yeah, I mean, again, dystopian tropes are just everywhere. 660 01:09:34,140 --> 01:09:37,250 It's easy to get very negative; there's lots of shroud-waving going on.
661 01:09:37,260 --> 01:09:44,549 I mean, with the ways in which these systems work, we should look at what technologies have done in the past. When we invented writing, 662 01:09:44,550 --> 01:09:54,090 when we learned to read, that technology changed our neurology; we developed parts of our brain to do that. 663 01:09:54,090 --> 01:09:57,870 When they go wrong, we get various forms of dyslexia. 664 01:09:57,870 --> 01:10:04,740 We know that we can use these technologies to really enable us, to help us flourish. 665 01:10:05,100 --> 01:10:13,680 And I think this is so exciting, whether it's in science, where these systems will help us discover, find patterns, 666 01:10:14,250 --> 01:10:21,750 do huge amounts of rapid assembly of suggestions and ideas that can be pursued, 667 01:10:21,750 --> 01:10:25,799 many of them computationally, not necessarily having to go through a laboratory process. 668 01:10:25,800 --> 01:10:29,040 So there are huge opportunities in science. The wonderful Royal Society 669 01:10:29,040 --> 01:10:32,910 report on science in the age of AI covers much of this. 670 01:10:33,210 --> 01:10:38,460 But in arts and creative content, people think it's the end of the creative? 671 01:10:38,970 --> 01:10:43,920 No. I mean, one reason why is that we already face a challenge with these models: 672 01:10:43,920 --> 01:10:49,860 having consumed all the stuff that's out there that humans have produced, they start to consume their own content. 673 01:10:49,860 --> 01:10:55,559 And there's this wonderful idea called MAD: model autophagy disorder. 674 01:10:55,560 --> 01:11:04,020 They consume their own content and they become increasingly, well, deranged. 675 01:11:04,020 --> 01:11:08,280 They start to dissociate in some quite interesting ways. 676 01:11:08,280 --> 01:11:20,069 So the human in the creative context is fundamentally important, not least because, of course, we are living our own humanity.
677 01:11:20,070 --> 01:11:30,389 We are expressing our sense of hopes and fears in an age where these systems will help us create new works of art, will help us in our science, 678 01:11:30,390 --> 01:11:38,700 in our health, in our leisure, in our pastimes, possibly even in helping come to more effective political settlements with one another. 679 01:11:39,070 --> 01:11:48,930 Jayne? Yeah, I think searching for life and finding it elsewhere helps us to understand our own origins and how life begins. 680 01:11:48,930 --> 01:11:52,440 It's one of the huge unanswered questions of modern-day science. 681 01:11:52,440 --> 01:11:56,909 And I would hope that, you know, centuries from now, we'll look back and we'll be thinking, 682 01:11:56,910 --> 01:12:03,120 oh, can you believe that in, like, the two thousands they didn't know how life formed? 683 01:12:03,120 --> 01:12:04,440 It's just the kind of thing you'd think you would know. 684 01:12:04,620 --> 01:12:12,029 So I think that exploration, and finding life in all its possible forms, is deeply fascinating. I don't know, 685 01:12:12,030 --> 01:12:17,190 I do this every day, so it's the most fun thing I can think to do. 686 01:12:18,030 --> 01:12:23,220 Well, before we take the final question, because sadly we are coming to the end of our time, I just want to ask one more question, 687 01:12:23,520 --> 01:12:30,060 actually directed at Jayne, but please, if the others want to join in. And this is from another undergraduate student, pre-submitted by Ben O'Donnell. 688 01:12:30,570 --> 01:12:36,600 And the question is: if ongoing research into exoplanetary atmospheres uncovered signs of technological life in our galaxy, 689 01:12:36,780 --> 01:12:40,770 what would be your foremost ethical or moral concerns with messaging that life?
690 01:12:41,160 --> 01:12:46,970 How might we learn from prior messaging attempts, such as Frank Drake and Carl Sagan's 691 01:12:47,300 --> 01:12:51,850 Arecibo message, and the exclusion of violence and religion from both messages? 692 01:12:51,860 --> 01:12:56,930 So I'd like to maybe have Jayne and Luke comment on that question, and then we'll take the final question. 693 01:12:57,260 --> 01:13:04,100 So, whether or not to message: it's called METI, messaging extraterrestrial intelligence. 694 01:13:04,100 --> 01:13:10,100 And we could already do that. We know that there are planetary systems where, you know, it's very similar to the Earth, 695 01:13:10,100 --> 01:13:12,050 and so we could just send messages to them. 696 01:13:12,530 --> 01:13:19,160 And I think Nigel already mentioned this, the sort of dark forest idea: you put out the message, and what answer comes back? 697 01:13:20,330 --> 01:13:28,310 It's the same approach you would take to engaging with any new civilisation. 698 01:13:28,820 --> 01:13:35,629 So we can actually look back into our own history and try to learn from it. As a nomadic species 699 01:13:35,630 --> 01:13:41,800 we've gone out, we've explored, we've encountered each other. And how did those situations go, and what did we learn from that? 700 01:13:41,810 --> 01:13:48,470 And can we then apply that to how we might deal with contacting extraterrestrial intelligence? 701 01:13:48,860 --> 01:13:53,999 It's not an astrophysics question; it's definitely an ethical question 702 01:13:54,000 --> 01:14:02,149 as to how to do that. Okay. Yeah, I think, as far as I recall, and I might be wrong on this, 703 01:14:02,150 --> 01:14:10,070 no humanities people were involved in the creation of those messages, and they tended to have a particular view of the human. 704 01:14:10,340 --> 01:14:19,700 They tried to have this very neutral, de-particularised, claimed-to-be-universal view of the human.
705 01:14:20,180 --> 01:14:29,890 And I think that was a mistake. I think part of being human, and what we know about human interaction and its inherently encultured 706 01:14:29,960 --> 01:14:39,980 and particular nature, is to lean into that. And what would another alien intelligence be looking for? 707 01:14:40,010 --> 01:14:43,879 It would equally be some particular form of life. 708 01:14:43,880 --> 01:14:52,100 So to send out a kind of de-particularised form of life would inherently produce the reaction: this is a weird bunch of people, 709 01:14:52,430 --> 01:14:53,720 what on earth is that? 710 01:14:54,020 --> 01:15:03,440 Whereas to actually say: there's a kind of multiplicity of cultural forms, thick with stories and ritual processes, 711 01:15:03,440 --> 01:15:06,919 and that is central to what it means to be human. 712 01:15:06,920 --> 01:15:15,770 And I think there's this problem of a kind of decontextualised idea of rationality, rather than a deeply relational, embodied notion of rationality, 713 01:15:15,770 --> 01:15:22,489 which will distinguish us from AI and other forms of planetary life, and the kind of ritual, 714 01:15:22,490 --> 01:15:28,790 religious, dare I say, and other kinds of processes that are central to the formation of us as humans. 715 01:15:28,790 --> 01:15:37,040 The other thing to say, just going back to the last question: the key thing here is what stories we tell. 716 01:15:37,040 --> 01:15:42,799 We are storytelling animals, and that is crucial to the development of science, to all this kind of stuff. 717 01:15:42,800 --> 01:15:50,750 So what stories are we putting out there, and what story of hope? And hope is a virtue and a commitment. 718 01:15:50,750 --> 01:15:59,030 It's not optimism.
It's not a kind of speculation built on 719 01:15:59,060 --> 01:16:05,389 existing facts or conditions, which might be violent and terrible and all sorts of terrible things. 720 01:16:05,390 --> 01:16:15,260 It's a commitment to, and the capacity to develop that commitment over time towards, a vision of the good, 721 01:16:15,290 --> 01:16:19,100 a vision of a more moral way of being in the world. 722 01:16:19,550 --> 01:16:24,440 And that is hard work. And that is a struggle, collective and personal. 723 01:16:24,830 --> 01:16:34,640 And I think that the cultivation of that hope, in the face of what are very bad signals, is crucial to us going forward, 724 01:16:35,120 --> 01:16:42,260 and a hopeful orientation in the face of the challenges we're facing around AI or whatever. 725 01:16:42,860 --> 01:16:48,680 But that depends on what story we tell about ourselves and about the universe. 726 01:16:48,920 --> 01:16:53,780 And as Nigel was saying, there are a lot of both tragic and apocalyptic stories being told. 727 01:16:54,230 --> 01:17:02,210 What is the kind of realistic, hopeful story which we want to tell about this university, about life itself? 728 01:17:02,510 --> 01:17:05,890 And I think that is a profound question for us. Yeah. No. Absolutely. 729 01:17:05,920 --> 01:17:10,309 Well, I'm a big fan of hope. I did a whole sermon on it just recently. And as a neuroscientist, I hope so. 730 01:17:10,310 --> 01:17:12,860 So, um, I know that we're just over time, 731 01:17:12,860 --> 01:17:17,360 but I can't resist offering one more question. If you could be really succinct, and then really succinct answers, 732 01:17:17,360 --> 01:17:20,900 then we'll wrap up, because I know that some students need to get back for their college dinners. 733 01:17:21,110 --> 01:17:27,050 Yeah. Hello. Nigel, you presented sentience as a continuous variable in your plots.
734 01:17:27,470 --> 01:17:33,620 I think that's a really interesting concept. What would it mean, or what would it be, for humans to be more sentient? 735 01:17:33,920 --> 01:17:40,760 And can AI, theology or astrophysics be used as a tool for us to achieve greater sentience? 736 01:17:41,570 --> 01:17:46,490 That's a great question. But I mean, it was a bit contentious to have those as continuous dimensions at all. 737 01:17:46,490 --> 01:17:56,270 You might say that, you know, the biological versus the silicon are kind of more categorical types rather than continuous. 738 01:17:56,570 --> 01:18:00,710 But continuous sentience is something people have talked about; if you read the books on 739 01:18:00,980 --> 01:18:06,860 cephalopods, you know, is this a system that might have some sense of awareness? 740 01:18:06,860 --> 01:18:15,139 Are there other categories of being that have, as people say, a dim flicker of all this? 741 01:18:15,140 --> 01:18:20,960 What does it mean to have that rich sense of a presence? 742 01:18:21,230 --> 01:18:25,610 And it's the classically challenging problem that none of us quite know 743 01:18:25,910 --> 01:18:32,959 what the trajectory to having that in our systems is. 744 01:18:32,960 --> 01:18:40,760 But what I suspect is, when you open up our current AI systems, there is so much that is missing from what has been 745 01:18:42,370 --> 01:18:47,530 selected for over billions of years in the history of life on Earth, 746 01:18:47,680 --> 01:18:51,040 that there's a massive amount of 747 01:18:53,320 --> 01:18:57,850 material to account for before, I think, we get to the point where we seriously 748 01:18:58,150 --> 01:19:01,600 entertain the notion that these systems have sentience. 749 01:19:01,600 --> 01:19:06,700 So how far do you go to the left or the right? Great question. I think it's something we want to be debating more.
750 01:19:07,030 --> 01:19:10,689 Brilliant. Well, on that note, this has been just fantastic. 751 01:19:10,690 --> 01:19:15,520 I've certainly learnt an enormous amount. It's been fantastic to just have this time with you all. 752 01:19:15,640 --> 01:19:19,480 Clearly, there's much for us all still to learn about life, and to develop and understand. 753 01:19:19,480 --> 01:19:21,459 So thank you all so much. 754 01:19:21,460 --> 01:19:28,600 We'll do a clap at the end. Just to announce that next term we will have another Sheldonian series event, and the topic will be truth. 755 01:19:29,200 --> 01:19:35,170 And, uh, because this is what we're about in this university, aren't we: finding it and then disseminating it. 756 01:19:35,290 --> 01:19:39,219 So let me first of all thank Tim Soutpommasane and Julias Grower and David Isaac, 757 01:19:39,220 --> 01:19:43,660 who've been the architects of this series, for all the work they've done; the events team that, of course, got us here tonight; 758 01:19:43,840 --> 01:19:48,880 and you all for coming. But most of all, can we please have a round of applause for Jayne, Luke and Nigel for a fantastic evening.