I'd like to welcome you to the Trinity term Strachey Lecture — the Strachey Lectures are the distinguished lecture series from the Department of Computer Science. We have a hashtag, which is up in front of me: the Ox Strachey Lecture hashtag. Please do tweet — and be nice about it, if you are going to tweet.

Before we begin, I would like to acknowledge our sponsors. Oxford Asset Management have for a number of years been sponsoring these lectures, and it's been a transformative thing for us, enabling us to bring in speakers and put on a show that we simply wouldn't have been able to do otherwise. So we're extremely grateful to Oxford Asset Management for their continuing support. I'm sure they would like me to point out that they are hiring, so if you google Oxford Asset Management you will find many job opportunities. But thank you to Oxford Asset Management.

So now on to our speaker, and I'm delighted to welcome Neil Lawrence. Neil is the inaugural DeepMind Professor of Machine Learning at the University of Cambridge, where he leads the university's flagship mission on AI, AI@Cam. He's been working on machine learning models for 20 years, and he recently returned to academia after three years as Director of Machine Learning at Amazon. His main interest is the interaction of machine learning with the physical world, an interest triggered by deploying machine learning solutions in the African context, where end-to-end solutions are essential. This has inspired new research directions at the interface of machine learning and systems research, and that work is funded by a Senior AI Fellowship from the Alan Turing Institute. He's interim chair of the Advisory Board of the UK Centre for Data Ethics and Innovation and a member of the UK's AI Council, which advises government on all matters AI. Neil is a visiting professor at the University of Sheffield, where he was an academic for a number of years, and is the co-host of Talking Machines. And the title of Neil's lecture could hardly be more contemporary: "Use or Be Used: Regaining Control of Artificial Intelligence". Neil, the floor is yours.

Thanks — is this thing working? Okay. Thank you for that wonderful introduction, and thank you, Mike, Leslie, everyone, for the invitation — and an apology: this invitation was originally for a year ago, and I caught COVID just before I was due to give the presentation. The title is actually the same, so imagine how much better the world would have been had I been able to give this lecture a year ago. Another consequence of the pandemic.
Okay. So, I try and use historical examples — I'm not going to give a technical talk today, really; I'm just going to talk about issues that lead to technical questions. And when I think about artificial intelligence, I like to use these three apocryphal quotes, which will run through the talk today. The apocryphal quote about Henry Ford is the one where he says that if he'd asked people what they wanted out of a car, they would have asked for a faster horse. And that provokes in my mind the question: if you ask people what they want out of artificial intelligence, is the answer a smarter human? And the question that follows is: what does that even mean? I think, by the way, that if we had a faster horse we'd already have autonomous vehicles, so that would have been a pretty cool thing — so maybe the customer's always right.

Now, I talk about this a lot, and I'm only going to go through it very briefly, because I just want to contextualise the way I think about it. When we look at humans and human intelligence — I've been trying to do public understanding of machine learning for about ten years, and I wanted to capture why machine intelligence is so very different from human intelligence — at one level it's because I'm speaking to you using sound, and machines communicate using radio waves or light down cables, so they communicate about a million times faster. And this leads to all sorts of consequences. For me to talk to you, if I'm talking at a good rate of speech, and we look at how much information I'm sharing, it's about 2,000 bits per minute. That's a bit as defined by Shannon in information theory — the equivalent information of 2,000 coin tosses. So it's about telling you the results of 2,000 tennis matches, where the odds were even, every minute — which is pretty good.

But to do that, I'm sharing with you the thoughts that are inside my head, and it's very hard to estimate how many calculations per second we're doing. If you look at the best estimates I could find, to simulate a brain you would need a machine about as fast as the Met Office supercomputer down in Exeter, which can do about a billion billion calculations per second.

Now, compare that to a simple computer. These slides actually date back a few years, so maybe they're a bit faster than this nowadays, but a computer can communicate at 60 billion bits per minute if it's using gigabit-per-second internet. So, if we turn that into a salary: imagine your university student stipend is $20 a month.
It's not great money, but it's maybe enough to survive on. And then imagine your university stipend is $60 billion a month. That's actually closer to the budget of a country — it's closer to what the UK spends on the whole of health and social security — $60 billion a month. So in terms of this information exchange, the machine is operating much, much faster. But then our best estimate for a typical machine, in terms of the number of calculations it can do per second — you can see it there — is about 100 billion calculations per second. So a typical machine is actually doing a lot less compute per second than our brains. Of course, we can't access all that compute, and neither can the computer access all the compute it does.

So imagine if I wanted to share with you one second's worth of calculations from my head. If these numbers are right, it would take me about a billion years to tell you all the calculations that had occurred in my head in one second. (My battery is running low, apparently — I'm not sure whether that was referring to my brain or the machine.) So: around a billion years for human intelligence. What about a computer? If it wanted to share one second's worth of computation, it would take about 20 minutes.

So this means your intelligence is locked in. What I mean by locked in is — you hear these stories, and sometimes in a longer version of this section I talk about a guy called Jean-Dominique Bauby, who was the editor of French Elle magazine, who was locked in and could only communicate by blinking. They made a film about him; he wrote the book the film is based on after he was locked in. And I think that's a state we all think about — we all think, oh wow, what would it be like to be in that state? Well, that's the state you're in, right? You just evolved that way, so you don't notice. But compared to the machine, that's where you're at.

So in some sense it's already a nonsense to talk about a more intelligent human being, because the human being is defined by constraints, not by capabilities. We're defined, in terms of our intelligence and the way we share information, by a limited ability to communicate. And we've evolved in that state for hundreds of thousands of years as Homo sapiens — as they'd tell you in the Natural History Museum — and for millions of years as primates, et cetera. And all animals do this to some extent — well, mammals, higher mammals, do this to some degree.
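(As an aside, the arithmetic behind these comparisons can be sketched in a few lines of Python. The figures are just the rough, order-of-magnitude estimates quoted in the talk, and the assumption that one calculation takes one bit to report is mine, purely for illustration.)

```python
# Rough, order-of-magnitude sketch of the "locked in" arithmetic above.
# All figures are the approximate estimates quoted in the talk.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# Human: spoken communication vs. estimated neural computation.
human_comm_bits_per_s = 2000 / 60      # ~2,000 bits per minute of speech
human_compute_ops_per_s = 1e18         # ~a billion billion calculations/s (brain-simulation estimate)

# Machine: gigabit networking vs. a typical machine's compute.
machine_comm_bits_per_s = 1e9          # gigabit-per-second internet
machine_compute_ops_per_s = 1e11       # ~100 billion calculations per second

def time_to_share_one_second(compute_ops_per_s, comm_bits_per_s):
    """Seconds needed to communicate one second's worth of computation,
    assuming (crudely) one bit per calculation."""
    return compute_ops_per_s / comm_bits_per_s

human_t = time_to_share_one_second(human_compute_ops_per_s, human_comm_bits_per_s)
machine_t = time_to_share_one_second(machine_compute_ops_per_s, machine_comm_bits_per_s)

print(f"Human:   ~{human_t / SECONDS_PER_YEAR:.1e} years")   # around a billion years
print(f"Machine: ~{machine_t / 60:.0f} minutes")             # minutes rather than years
print(f"Ratio between the two: ~{human_t / machine_t:.1e}")  # the communication gap
```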
So, the second apocryphal quote: the six-word novel never actually written by Ernest Hemingway, but apocryphally attributed to him — "For sale: baby shoes, never worn." Now, that's six words — about 72 bits of information, at an average of around 12 bits per word. But there's so much more meaning in those six words than 72 bits of information. And that's because you exist as humans; you sit within a wider culture, a wider society, and you can imagine what it would feel like to be the human who has to place that advert. So we use this information bandwidth in a way that allows us to lean on the fact that we are all human, and we understand what it means to be a human, to communicate things that feel like there's a lot more than 72 bits of information in there — because we can all imagine what it felt like to write that advert. So although the information bandwidth is small, our shared context is very important, and we can extract a lot of meaning from very few provided pieces of information.

So, does anyone know who this apocryphal quote is from — apocryphally, that is? "There are three kinds of lies: lies, damned lies, and statistics." It's Mark Twain. It's not apocryphal in Mark Twain, but Mark Twain apocryphally credits it to Benjamin Disraeli — and it was said by Lord Balfour, the Foreign Secretary, who attributed it to someone else at some point. But the point about this quote is that one of the reasons we end up misunderstanding and misinterpreting statistics is that we're so information-constrained that when we get little bits of information, we tend to over-interpret. We tend to see more behind that data than we have any right to. That comes out in a bad way in statistics; it comes out in a good way when you're imagining what it felt like for the person who had to write that advert.

So — apocryphally Benjamin Disraeli — I tend to think the modern equivalent of that quote is this: there are three kinds of lies — lies, damned lies, and big data. Because what's happened in this modern data era, driven by that machine interconnection requiring more and more information, more and more data, is that we're basically forgetting the hard lessons of a field known as mathematical statistics. That quote from Disraeli predates the mathematical field of statistics — at that point statistics was quite close to what we'd now call government statistics. What statisticians study is mathematical statistics; they just don't say "mathematical statistics". They study the process of understanding whether or not you can believe what a statistic is purporting to say.
And we don't even say "mathematical statistics" anymore — but it was a new field that had to come in to stop people misinterpreting statistics, a sort of science of those mistakes. So that's Karl Pearson. Unfortunately, associated with that field are a lot of fairly unpleasant things, like eugenics. Most of the early mathematical statisticians were also involved in eugenics in one form or another, because they believed that once you could measure these things, we should be looking to improve the human species in some way — and that led to some quite problematic results. So there is still an open question about what mathematical data science looks like in our era. And it's being triggered by this new age, because we now have an evolved relationship with information.

In the very far past — 100,000 years ago; we could go to the museum and check — I think we would have been in groups of about 30 to 50 operating together as hunter-gatherers, maybe slightly larger, trusting each other, basing our inferences about what's going on on those relationships of trust and on understanding who each of us is. Since then we came into cities, we've grown, we've had to introduce laws to govern us, and we also ended up with statistics to look at how those cities are being managed. And we've had to teach ourselves not to misinterpret those statistics when making decisions. We removed ourselves from that direct contact of person to person, and that's led to this evolved relationship with information.

So in the past hundred years, this was the situation: you had humans with this very low-bandwidth connection, and a lot of what we were trying to do was manage that low-bandwidth connection to ensure we weren't misinterpreting the information we were receiving. But now what we've got is this: an enormously high-bandwidth connection between the machine and data, and then a low-bandwidth connection between us and the machine. And this is leading to all sorts of problems. We can look back over the last ten years in terms of election manipulation, in terms of social media, et cetera, and what that's done, and you can put that down to the fact that the machine is now mediating what information it chooses to show. It's actually using classical statistics to work out how to draw us into our engagement with the machine — but it's not using classical statistics to say, how can I share the truth with this individual? It's using classical statistics to say, how can I cause this individual to engage with me?
And of course, the answer turns out to be: share information with them that fulfils their prejudices, that type of thing. Which is not great.

So what we have is this as the challenge for data science: we're trying to deal with this new high-bandwidth connection to the machine, which is then presenting information to us in a low-bandwidth way, and classical statistics is the traditional way of dealing with that. The first thing I want to emphasise is that when you see this connection, you realise that statistics has a massive role in mediating these new connections and understanding what's going wrong. But it's not everything — there is an enormous role for computer science as well, and for mathematics, just in understanding this new flow of information.

But then the thing I think we also need to be talking about, on top of that, is this new wave of models, which are, I would argue, operating almost directly on that link between the human and the machine — forming new ways for us to communicate with the machine. In the past, what a statistician would do if they wanted to represent data was train the human: to understand what a p-value is — or, apparently, to misunderstand what a p-value is — or what an error bar is, or how to interpret a graph. And then the machine would share that information with you in the form of a graph. What we've got now is machines that will just talk to us, and when they talk to us, they are going right to the depths of that desire to over-interpret, because they're feeding into our desire to anthropomorphise. (I can't say "anthropomorphise" very well, so I tend to shorten it.) We all tend to treat complex things as if they are human in some form — whether it's our cat, or our car, or our computer. And when machines start to exhibit language, that tendency is going to increase even more.

But the problem is that we mustn't forget this fundamental difference: these intelligences are quite different from ours — they are quite alien to us. They can augment us, but the point is that they can never replace us, and they should never be allowed to replace us, because at the core of our intelligence is our limitations. And those are things that, however capable the machine is, it will never, right from its birth, have baked into it.
Okay. So, when I think about the scale of what we're experiencing now — I was convening the UK AI fellows, students and postdocs at a hackathon, partially because I strongly believe that we are dependent on that generation of people to deploy and use these technologies in the best way possible. It's not going to be the old people like myself who understand the ways in which these things can be deployed and used; you can see in every previous revolution that it's young people. So I've tried to come up with analogies for how big I think this revolution is. Sometimes I talk about the printing press and the extraordinary effect that had 500 years ago, leading to the Enlightenment, but also to sort of 500 years of war around religion and all sorts of things going on in Europe. But then I caution that the printing press doesn't go far enough. The revolution I think we're engaged in at the moment is arguably akin to the revolution of writing.

So this is an early cuneiform tablet. I've started working with a colleague in Cambridge, an Assyriologist, who works on Nippur from 1500 BC to 1000 BC and reads these tablets — with, you know, court cases written down. At this time, as far as I can make out, the Greeks were some form of barbarians going around the Mediterranean attacking cities, and at the same time the city of Nippur had 2,000 years of written history, going back to before Gilgamesh existed. So this is a society and a culture that existed some 3,000 years ago that had 2,000 years of written history, written in the form of these clay tablets.

But then the thing I think about is: what was it like for the first person when the Epic of Gilgamesh — which was an oral tradition, as presumably most of these traditions were, like the Odyssey and everything else — was suddenly written down for the first time? The bard who has learned this whole epic and can sing it and do everything else turns up, and someone says: we've got it written down now, we don't need you anymore. We don't need you to remember things, because we can remember things on a piece of clay tablet. And the first use of these tablets was not to write down poems; it was to write down accounts: so-and-so owes me three sheep, I owe them four lots of corn. And I think there's a tremendous parallel between that and the computer, which so far has mainly been a tool of the accountants — Excel spreadsheets.
It couldn't interface with us in the poetic language we might like to use — and that's what it has suddenly gained. I mean, it's not that this is brand new — it's been going on for a number of years — but it's crossed the threshold into: oh yes, you can talk to the machine.

And I think one particular reason why this is sending society into these spasms of excitement is related to an experience I had in 2017, when we were doing a Royal Society report on machine learning. So people were thinking about this a long time ago and worrying about how the technology would change things. We were very focused at the time — as were Richard and Daniel Susskind in their work on the future of the professions — on how technology would transform the work of human experts. We were doing this work from 2015 to about 2017, and about midway through, when we were coming up with a lot of conclusions about how the lives of the middle classes were going to be affected by this revolution, we had a particular occurrence that really stuck with me, and that was the Brexit vote. We were in the Royal Society about a week later, sat around a table, all digesting the consequences of this vote. There were MPs from all parties, there were policy experts, and there were domain experts. And one of the MPs said: well, I bet no one in this room voted for Brexit. And it suddenly occurred to me: I bet you're right — and isn't that the problem? Because the people who voted for Brexit are the ones who were disengaged from the decision-making, and that's why they voted for Brexit. You can see that in populism wherever it exists. And that gave me a great fear, because this work speaks to the empowered — it speaks to all of us who are already engaged in decision-making. And that's also what you see going on now.

Now, I'm not saying our lives aren't going to be transformed as educated people, but my experience is that educated people are amongst the best equipped to handle such change. The image I have in my head is that of the coin-pusher machine. This is a machine I used to love as a kid — I don't know if anyone knows what they are. You get them at funfairs, at Blackpool or wherever. You've got these two-penny pieces, and they go onto the shelves, and the shelves move in and out, and you drop your coin in and it falls down and lands, and it should push the other coins off.
You can see the coins piled up there, and when the shelf pushes out, some coins will fall off and hopefully you'll win. Now, which coins fall off? Is it the coin you just dropped in? No — it's the coins on the edge. Because it's those people who are already on the edge of society, who are already challenged by things, who are going to be affected by this form of disruption. And my sense is: yes, those of us in the professions, those of us who are educated, will have to learn; we'll be disrupted in a major way; but we actually have the intelligence and capability to adapt our jobs. It's the knock-on effects on those people who have already given up. I met a guy in Lancaster once who had decided not to go to university — he wanted to be in a band. He went into catering, worked at the prison, and the prison got shut. And then, of course, I met him because he was driving a taxi — because if you ask any taxi driver why they're driving a taxi, you'll always get a sequence of stories like that, because anyone can sort of drive a taxi. But there are lots of places where there isn't much demand for taxis.

So we did this report for the Royal Society back in 2017; it was called "Machine Learning: The Power and Promise of Computers That Learn by Example". I still think it's a really valid report, and it's actually still referenced today. One of the things we did is speak to people and ask them what they would like machine learning solutions to do. So we did an Ipsos MORI poll on public views of machine learning — qualitative research, which I think is going to be increasingly important in this domain, and our understanding of qualitative research, even as quantitative scientists, needs to improve dramatically. So these are the sorts of things they said — I'm not sure if you can read them, but these are the areas we looked at. We asked about health, and there they could see the greatest potential benefit to individuals and society. We asked about social care: they saw the potential to ease resourcing issues, but they feared removing human involvement and emotional contact — so that's clearly a worry. But they could see that there was benefit, the best-case scenario — perhaps unachievable — being machines freeing carers to spend more time with patients. I mean, that would be brilliant, if that's what we'd done in any of our hospitals or social care; I don't think we have a very good record.
Even marketing — they thought, oh yes, it could be used to tailor marketing, and that might not be a bad idea. Transport: driverless cars could have benefits. These are just regular people — they were in Birmingham, in small groups. Finance: they thought about using it to monitor potentially fraudulent activity. People are really, really smart about what they want technology to do. On crime, participants tended to think that machine learning spotting patterns was a good idea, but wondered how accurately it might work in practice. Smart people. They saw it as a useful tool to help with police resources, but were concerned about the consequences of stereotyping individuals. People are smart. And this was at a time, by the way, when most people had not even heard of machine learning — the first thing you had to do was explain to them what machine learning was. In education, participants were concerned that tailored education based on machine learning would result in pigeonholing people and limiting them to certain career paths too young — again, really, really smart — but they also saw benefits there.

What was the one area that all participants thought was ridiculous, that we should not be trying to do, that they could see no point in whatsoever? Art. They simply failed to see the purpose of machine learning in writing poetry. Participants recognised that a machine might not do a better job than a human — but in any case, they did not think this application created art, as doing so is considered to be a fundamental human activity that machines could at best only mimic. And where have we made the greatest progress in machine learning in the ensuing six years? In mimicking what humans do.

So, in honour of that, I often get ChatGPT to write an introduction to large language models as if spoken by Mickey McHugh. This was GPT-4 — GPT-3.5 didn't do quite as good a job. Does anyone recognise who it's as if spoken by, and is that an appropriate person to create it? Right — it's the crossover one, the one where they create large language models. And, you know, nowadays you've got Better Call Saul and all sorts going on — the characters appear in other things. "Digital Athenaeum" — my goodness. But what would that fundamentally be? So: does anyone recognise what this map is of?
No? What is it a map of? Western Europe, yes. And can you guess what year it is, and what the circumstances of the map are? Well, I'll give you that it's the Second World War — and can you guess the date? There are only a few significant dates in the Second World War it could be... 1944, June — yes, June the fifth, 1944. And can you guess what the points are? The weather stations, yes — it's the pressure at the different weather stations; I actually got this from the archives.

So what was going on in June '44 was that there was a storm in the Channel. What was also going on was that the tides were at the right level for an invasion of Europe to occur. What was difficult for the Germans was that the weather comes from the west, and their weather stations were in France. Does anyone know where Rommel was on the day of the invasion? At his wife's fiftieth birthday party in Germany — because he got the weather report and said, well, there's no way there'll be an invasion tomorrow, so I can go home to my wife's birthday party and everything will be fine. Of course, he was rather rudely awoken, shall we say, and had to drive rapidly back to the front.

Now, if we look at the alternative map, this is the Allied view of the weather on that day. The Allies, on the same day, have the fronts marked, and they understand that there's going to be a gap in the weather. There were actually multiple weather reports in conflict — there's even controversy today over who agreed that there was going to be a significant gap in the weather. But the interesting thing about weather forecasting in both these cases is that it was being done by interpolation. You had a number of weather stations spaced across Britain — actually, it's interesting to me: I'm not sure, but we should have had weather information from France because of the Enigma decrypts, though whether they would have put it on the map or not I don't know. So we could actually interpolate across the Channel when we did our weather reports, whereas the Germans, I don't think, could — although they did have some weather information on the previous map, I think in Cornwall, so I'm not sure how they were getting that, or what the Cornish were doing.

But basically you're having to interpolate — and for Britain you're interpolating in the right direction, because the weather is moving the right way. A modern weather forecast is, of course, done using the Navier-Stokes equations on that big computer down in Exeter, and the Navier-Stokes equations allow them to extrapolate.
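(A minimal sketch of this interpolation-versus-extrapolation distinction. The sinusoidal "signal" and the polynomial fit are illustrative assumptions of mine, not anything from the talk — the point is simply that a purely data-driven fit behaves well inside the range of its observations and badly beyond it, unlike a model of the underlying dynamics.)

```python
# Toy illustration: a model fitted purely to observed data interpolates well
# inside the range of its observations but extrapolates poorly beyond it.
import numpy as np

rng = np.random.default_rng(0)

# "Observations": a periodic signal sampled on [0, 2*pi] with a little noise.
x_obs = np.linspace(0.0, 2.0 * np.pi, 40)
y_obs = np.sin(x_obs) + 0.05 * rng.standard_normal(x_obs.shape)

# A purely data-driven fit (a degree-7 polynomial) knows nothing of the "physics".
coeffs = np.polyfit(x_obs, y_obs, deg=7)
model = np.poly1d(coeffs)

# Evaluate inside the observed range (interpolation) and beyond it (extrapolation).
x_interp = np.linspace(0.5, 5.5, 100)                  # inside the observed range
x_extrap = np.linspace(2.0 * np.pi, 4.0 * np.pi, 100)  # the "future", never observed

err_interp = np.max(np.abs(model(x_interp) - np.sin(x_interp)))
err_extrap = np.max(np.abs(model(x_extrap) - np.sin(x_extrap)))

print(f"max error inside the data (interpolation):  {err_interp:.3f}")  # small
print(f"max error outside the data (extrapolation): {err_extrap:.3e}")  # enormous
```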
That extrapolation is key to giving us these longer-term predictions, up to five or six days, because we understand the physics of the weather. We're not just looking at what the pressure is and having a stab at where it might move from some minimal examination over time. So that's why the Germans got the weather wrong. Now, what's that got to do with ChatGPT?

Well, fundamentally, all it's doing is interpolating. It's in this position: it doesn't have access to the future, where humans are given this tool and start to work with it and start doing what they will do with it. Why does that come about? Well, the enormous quantity of data we've fed these machines is basically, at the primary level, giving a feature space in which interpolation works. All these capabilities, like the logical-reasoning stuff, are emergent, because that feature space is emergent from just looking at enormous quantities of our data. But in the final layer of this thing, it's still having to interpolate. It can't build that feature space and extrapolate in a way that says: well, how would McHugh actually react? What would they do? You can see it could say something, but we'll only know what McHugh would do when the modern McHughs actually get given the tool. And so at this very key point: given that humans, whatever their limitations, are coming at this problem from a particular perspective, you can never replace them — because as soon as you augment them with the tool, they are something that is far greater than either themselves or the tool on its own. Just as we became far greater than ourselves by writing things on clay tablets — for a large amount of time that was how we shared knowledge and built upon knowledge and understood geometry and all sorts of other things. So those models are only interpolating.

Now, this is something I talk about a lot, and I should say I've adjusted what I would have said about it a little. Because when people talk about artificial intelligence, a question that occurred to me is: what do they mean? Different people mean different things. I was at the Natural History Museum in London the other week, talking with some of the scientists there, and on my way in I got chatting to the security guard. I always like to ask people: what do you think it is? And it's quite hard to understand what the common thing behind the answers is, because intelligence is such an emotive word. So they sort of think it's like them. And it's like: well, it can't be quite like you.
But what I do want to say to you is that what this technology might and could do is be the first technology that, when it automates, tries to adapt to who we are — or at least that's what it purports to do. All previous generations of automation — if you invented a weaving loom or a spinning jenny — required everyone to turn up at the factory and service the machine. Samuel Butler wrote about this in the nineteenth century: the way in which these forms of automation actually enslave us to the machine, because we're adaptable and the machine isn't. If you look at railways, you all have to turn up at the station on time — you didn't even have a universal time across the United Kingdom until there were railways. So every previous generation of automation requires us, as the flexible entities, to adapt to the machine. And I think what people expect they're getting from an AI is an entity that will adapt to them.

I think that's a fallacy, but maybe I'm wrong. And maybe what we're seeing now is that when you have machines that can digest the whole of human history and culture, they can mimic adaptation to a certain extent — it will feel a bit more like that. I don't know if we'll truly feel that in two or three years' time; I think people are feeling it today. I can have a conversation with a machine about my software that feels like it is adapting to who I am and what I like.

But when it comes to applying that in critical decision-making — a lot of what we care about in society is what one might think of as consequential decisions being made by people about us and who we are — I think there's a problem with this point of view. Because the thing I started noticing — and it's also been written about; I'm not the first to notice it, and people use different words — is that everyone in machine learning talks about wanting decisions that are fair. But when they say they want fair decisions, I've noticed that people mean a couple of different things. In fact, it was Carl Rasmussen — we were out for a run once — because I was talking about the complexity of a fair decision and how difficult it is. If I'm going to make a nuanced decision about some individuals in front of me, from different backgrounds, who all appear qualified to enter the University of Cambridge, and I'm taking into account all their different backgrounds, what they've done up until now, et cetera — because they can all manage the degree — then I'm using an immense amount of my understanding of these individuals as a human being, trying to find out who they are.
And it seems fair that I should do that, because I'm taking into account these individuals' experience of life and the struggles they've had to get through to get there. But there's another form of fairness that says: no, the rules should be clear — basically, you should have to get these grades, and then you should be accepted into university. And the same thing applies to any consequential decision-making, like giving loans or, you know, court cases. This is now reflected in the General Data Protection Regulation, which says that there are certain protected characteristics, and you should not ever decide about someone on the basis of those characteristics.

But what I find interesting about this is that it's almost a political axis — different politics errs to different sides. To me, both of these extremes are dystopias. The extreme where all you're doing is considering everyone's history — if only they'd been given the chances — and giving everyone the same opportunity is one dystopia; and the extreme where you say, no, this is the fixed rule, and only if you pass this rule do you get the beneficial outcome or the negative outcome, is another dystopia. In practice, we're always operating somewhere between the two.

And I was struggling to think about this, because it's really important: how can you have something that is procedurally fair and simultaneously nuanced and substantively fair? Because for something to be perceived to be fair, we all have to understand it. I don't think the tax code is fair, because no one understands the tax code — so if you pay someone a large amount of money, they can find a hole in the tax code and you can pay less tax. That's not procedurally fair, even though it's a process; it's not procedurally fair because it's difficult to understand. So for something to be perceived as fair, it has to be clear to everyone. But simultaneously we want to take into account all these nuances. How can we have something that takes into account nuances and is also simple?

Well, the answer goes back to people. If you gather a group of people together, if you convene them really well and teach them how to make decisions, you can have a decision-making process that is simultaneously nuanced and procedurally fair. Because you can have people — I mean, the most complex things in this room are the other people in this room, but we all think we have a deep and intuitive understanding of those people, and to an extent we do, because we co-evolved together, right?
So that goes right to the heart of who we are as humans. I call this a marvellous resolution between these two extremes: the only way you can bridge them is to have a procedure where you've got well-trained human beings — and of course, very often we're not well trained; we have biases or whatever. But if you train humans well in good decision-making, and you also have a procedure that says, yes, you have to get these marks, and then you'll go before a committee, and that committee will make the final decision — everyone says, sure, that sounds fine. Whereas if you say the machine looks at your entire history and does really complicated calculations about what's going on, that's very hard for people to accept as a procedure. It is a procedure, but one that we don't feel an intuition about, unless we're assuming the machine is operating like us. So for me, this says that for consequential decision-making, in order to have decisions that can balance between these two forms of fairness, we must always have humans in the loop.

But what does that mean? Because we know machines can be a great help in these decisions.

Okay. So the other side to that — and I love these lectures; I listened to them again recently; they're the Reith Lectures from 2002 — is Onora O'Neill, because what O'Neill is talking about is a question of trust. She's talking about a world where everything has become more procedural and we're calling everything to account through processes, and she's talking about the way that erodes our faith in professional individuals who are paid to make good decisions. Now, there are clearly cases we all know about of professional individuals who have made corrupted decisions — they're in the papers all the time. But what O'Neill is saying is that by increasing accountability — scoring them and marking them and making measures — you're actually eroding that professional duty. And what she talks about a lot is that all rights come with duties: you can't have rights without duties. And what you will increasingly hear — certainly if you listen to those lectures again — is that that's the way people talk about things. Certainly even in academic politics, I very often don't hear my colleagues talking about what their duties are towards their students, but I often hear them talking about what their rights are around their intellectual property. And it would be good if they were focusing a bit more on what their duties are.
And the point O'Neill is making is that you can't have rights without duties. Now, the machine can't have duties, because the machine itself does not participate in society — it can't be trusted in the same way. And that's the point Onora O'Neill makes, and it's a very strong point that sticks with me, because the first time she made it to me was when I was interviewing her for the Royal Society report and I asked her if we could trust the machine. She told me very specifically that you couldn't. And listening to her ideas of why that is the case — it's really about this same thing: you need humans, with a human understanding of what professional duty is, to be in the loop around this decision-making.

So what does that look like? She actually has a quote that relates to the example I was using before about university admissions: universities are to admit students on the basis of ability and promise, but they're also supposed to admit a socially representative intake — and there's no guarantee that the process meets the target. Now, she doesn't say this, but what I'm trying to say is that the only way you can do that is if you have trusted individuals who are balancing those things as part of the process. And the only way you can trust those individuals is if they're well trained and well convened, et cetera, and they're held to account by their position in society. You can't trust the machine to do that, because it doesn't have a position in society.

So what does this mean for what we're doing with artificial intelligence today? Well, I think that, as a result, artificial intelligence can only ever be seen as a tool. If anyone is telling you it's a replacement for us, I think that's a ridiculous notion — that always has to be a ridiculous notion. So what follows? What does it mean for it to be a tool? The thing I want to talk about is the National Advisory Committee for Aeronautics at Langley Field, who were involved in testing aircraft in the Second World War and before. One of the things they did — a guy called Bob Gilruth wrote it — is the handbook on how an aircraft should handle. He worked with the test pilots; they had various aircraft — I don't remember which ones, but I've read a lot about them.
When they got a new plane, they went up and flew it, and they moved the stick around and the plane did things, and the pilots would say it flies beautifully, or it flies like a brick, or whatever. But they didn't know what that meant in terms of how they should change the plane to fix it. So what Bob Gilruth did is write a paper — you can go and read it online somewhere — that talks about how a plane should respond when you put ten pounds of pressure on the stick. So if you put ten pounds of pressure to the left and it's a fighter, it should roll at a certain rate; if you put ten pounds of pressure on the stick and it's a bomber, it should roll at a slower rate. As you approach a stall, the stick should judder on the run into the stall to warn the pilot. He quantified all of this. So when these pilots were flying planes, they could feel the plane, and the plane was communicating in a way that was well understood.

Now, that control stick — well, the other thing about the National Advisory Committee for Aeronautics is that it became a founding group of NASA. And the people who did this work also designed the crewed capsules: the Mercury capsule, the Gemini capsule and the Apollo capsule. They went to great lengths to provide that feedback and that control to pilots on vehicles that were entirely controlled by computers. So the Apollo guidance computer was the thing that was firing the thrusters, but Neil Armstrong is moving the stick. In that case you've got this really interesting situation where Neil Armstrong is not in control, because the computer's in control, but Neil Armstrong feels like he's in control. And that's the kind of balance we're looking for, because for the things that Neil Armstrong cares about, he is in control. The same is true for the pilot of an A380 today: when they move that stick, that's going into a computer, and the computer is deciding what to do — and sometimes the computer overrules the pilot. It's also true for fighter pilots today, where the aircraft would not be capable of flying, obviously, without the computer systems. So the notion of control we're looking for is something akin to this, where you want the intent of the human being to be understood and projected and enhanced by the machine, but the machine itself is doing things that are beyond the capability of any given human.
458 00:42:37,060 --> 00:42:38,379 Now, this is incredibly difficult, 459 00:42:38,380 --> 00:42:46,870 and I find it really ironic that the control stick was first developed by the Wright brothers in 1904, or whenever they flew their Flyer, 460 00:42:47,530 --> 00:42:50,740 and it just went forwards and backwards, because they could only go up and down. 461 00:42:50,950 --> 00:42:54,160 And within 36 years we had characterised that interface. 462 00:42:54,640 --> 00:42:57,430 The interface we're faced with now is one where 463 00:42:57,820 --> 00:43:09,040 we just talk to our machines, and our ability to characterise that interface between the computer and the human is minimal: 464 00:43:09,880 --> 00:43:16,060 in terms of how we can be manipulated, or how we should present uncertain information, or how 465 00:43:16,060 --> 00:43:22,360 we share an understanding of a patient with a machine, or how it should share it back, it's minimal. 466 00:43:23,200 --> 00:43:28,659 We don't have the equivalent of "ten pounds of pressure to the left should lead to this", and we urgently need to 467 00:43:28,660 --> 00:43:33,070 get on top of that, because so far we've been manipulated in fairly simplistic ways. 468 00:43:33,250 --> 00:43:40,809 Like, here's a like button, or here's an image you'll like, or click on this, or here's a "you'll never believe what so-and-so looks like now 469 00:43:40,810 --> 00:43:44,200 compared to when you knew them in the 1980s", and apparently you'll click on that. 470 00:43:45,670 --> 00:43:46,630 That's simplistic. 471 00:43:46,640 --> 00:43:53,320 But now there are all sorts of opportunities for machines to interact with us in highly complex ways that we haven't quantified. 472 00:43:53,770 --> 00:44:00,459 So I kind of think that the main thing we need is the equivalent of a proving ground where, when we're looking at these models, 473 00:44:00,460 --> 00:44:04,270 we can understand how they are being deployed in practice: 474 00:44:05,440 --> 00:44:09,040 understand the nature of the tool, what its potential is, what its pitfalls are. 475 00:44:09,370 --> 00:44:12,370 And that's the way we can build that capability. 476 00:44:12,880 --> 00:44:18,190 There are going to be loads of disruptions, but there are loads of things we need to do and loads of ways we can be empowered. 477 00:44:18,520 --> 00:44:22,420 So the question is: how do we get this understanding as widespread as possible? 478 00:44:23,050 --> 00:44:31,030 Like I said in the introduction, one of the things that has been enormously inspirational to me is working on deploying data science solutions in Africa. 479 00:44:31,030 --> 00:44:35,200 So Data Science Africa is a bottom-up initiative for capacity building 480 00:44:35,200 --> 00:44:42,010 in data science and machine learning on the African continent. And what I love about it is that you're always trying to deploy end-to-end solutions. 481 00:44:42,280 --> 00:44:47,229 So you don't sit in an office and imagine what a farmer might want or a clinician might want; 482 00:44:47,230 --> 00:44:51,430 you actually go out to the health centre or the field. And you don't imagine 483 00:44:51,430 --> 00:44:54,579 what the Ministry of Health might want or the Ministry of Agriculture might want;
484 00:44:54,580 --> 00:45:01,629 you actually go and talk to the Ministry of Agriculture. And you find that most of the things we think about in machine learning are utterly irrelevant, and 485 00:45:01,630 --> 00:45:08,800 most of the challenges that people are genuinely experiencing are not being properly researched, because they're all at the interfaces with people. 486 00:45:09,730 --> 00:45:13,389 So the thing we urgently need to do is get on top of working with those people. 487 00:45:13,390 --> 00:45:23,350 So there's an awful lot there. Data Science Africa was first set up in 2015, 488 00:45:23,350 --> 00:45:29,510 and you'll find all these notes and everything. Another aspect to this is: how do people maintain power? 489 00:45:29,530 --> 00:45:32,770 How do people maintain control of the system in a societal way? 490 00:45:33,130 --> 00:45:36,670 Well, actually, all this information is coming from us as individuals. 491 00:45:37,060 --> 00:45:40,060 It's our data, collectively, that is being put into these machines. 492 00:45:40,240 --> 00:45:41,980 And we have rights over that data, 493 00:45:42,130 --> 00:45:46,330 whether they are intellectual property rights or whether they're coming from the General Data Protection Regulation. 494 00:45:46,720 --> 00:45:49,000 But one challenge is that there's an asymmetry of power. 495 00:45:49,030 --> 00:45:53,470 So if I want to call up Google and say, can you stop doing this with my data, they're not going to answer my call. 496 00:45:53,860 --> 00:45:57,849 So the notion of a data trust (and there's a whole family of these now) is an 497 00:45:57,850 --> 00:46:03,520 entity which collectivises people's data rights and operates in this ecosystem to say, 498 00:46:03,520 --> 00:46:09,070 well, basically, you can't do that with these models, because we want you to delete our data if you're going to do that. 499 00:46:09,430 --> 00:46:17,780 So there's a lot of work that's been going on. We've got reports in this area: 18 months of the data trusts initiative, 200 or so experts, 500 00:46:17,830 --> 00:46:23,260 and we're now funding three exciting real-world data trust pilots, 501 00:46:23,260 --> 00:46:30,159 one of which is a sort of local data trust, and two of which are associated with medical data: 502 00:46:30,160 --> 00:46:36,220 one with people who opted out of the general practitioners' recent data-sharing scheme, 503 00:46:36,310 --> 00:46:40,420 and another for the Born in Scotland cohort study, which is just starting out. 504 00:46:41,620 --> 00:46:45,790 The other thing I think is vitally important, and this is, 505 00:46:45,970 --> 00:46:53,590 we've also had funding here from Schmidt Futures for a project, for four years now, the Accelerate Programme for Scientific Discovery, 506 00:46:53,860 --> 00:47:02,530 where the main focus is to go out to scientific domain experts and talk to them about their scientific problems 507 00:47:02,680 --> 00:47:06,650 and try and work out what the solutions they need from a machine learning perspective are. 508 00:47:06,880 --> 00:47:11,230 Because it's only by understanding the problems they face that you can help. It's entirely inspired by the 509 00:47:11,410 --> 00:47:16,390 talk-to-the-farmer-in-the-field approach that Data Science Africa takes.
510 00:47:17,020 --> 00:47:22,000 But working with scientists (and we go beyond science to engineering and the humanities in the project), 511 00:47:22,010 --> 00:47:26,680 as with the serologist friend I mentioned, to try and understand: what is it they're looking at? 512 00:47:26,690 --> 00:47:29,110 How would they like their lives to be better? 513 00:47:29,290 --> 00:47:35,020 Rather than building a neural network and imposing it on people, it's talking to people about the problems they are facing. 514 00:47:35,260 --> 00:47:40,690 And it's exactly inspired by the fact that nurses aren't getting to spend more time with patients through computers; 515 00:47:40,690 --> 00:47:46,330 they're getting to spend more time with computers. And the reason is that the computer solution is imposed from the centre, 516 00:47:46,510 --> 00:47:54,790 rather than arrived at by talking to the nurse about what they need. And these new capabilities, in terms of our ability to code and communicate with machines, 517 00:47:55,060 --> 00:48:02,200 offer an opportunity to turn that around. And this stuff about how there won't be any jobs: there are a lot of jobs in doing that. 518 00:48:02,380 --> 00:48:09,420 We have utterly failed to deliver on this for, like, the last eight years, as we've just seen from what people asked for in the Royal Society poll. 519 00:48:09,700 --> 00:48:12,340 Of course jobs will be different, and of course it will be disruptive. 520 00:48:13,450 --> 00:48:20,439 So the final thing is that we're trying to sort of bring that together with the AI@Cam flagship mission. 521 00:48:20,440 --> 00:48:24,759 And do feel free to read it: we published this report before we got the mission approved, 522 00:48:24,760 --> 00:48:30,790 because the thing I hate about grant proposals is when they reject it and you don't get to do anything with your proposal. 523 00:48:30,790 --> 00:48:35,950 So I said, well, we're going to write a report first, publish it, and then you can decide whether to fund it or not. 524 00:48:36,400 --> 00:48:42,910 Part of the reason for that is because, rightly or wrongly, people look to institutions like Cambridge and Oxford to provide leadership. 525 00:48:43,210 --> 00:48:48,430 So this report brought together a number of these strands of thinking from across the university, 526 00:48:48,470 --> 00:48:52,420 what people expect to see from AI, and, you know, we hope other people will copy it, 527 00:48:52,420 --> 00:48:57,310 because what it's trying to do is start from what the societal challenges are, admit 528 00:48:57,460 --> 00:49:00,970 that Cambridge is not positioned to solve all societal challenges, 529 00:49:01,210 --> 00:49:07,900 but that it is positioned to be a partner in helping people solve those challenges, and try and get people to think about how we contribute in that way. 530 00:49:08,080 --> 00:49:12,580 Because it's up to leading institutions like Oxford and Cambridge and Imperial and UCL and 531 00:49:12,580 --> 00:49:17,920 Sheffield and Manchester to really stop all those coins falling off the side of the step. 532 00:49:19,840 --> 00:49:23,829 So a lot of what follows from that is just convening, and I love it. 533 00:49:23,830 --> 00:49:27,730 What's so cool about this is that, with relatively small amounts of funding, 534 00:49:28,150 --> 00:49:32,350 if you're funding the right people (and it tends to be young people, in an interdisciplinary way), 535 00:49:32,350 --> 00:49:38,860 you can just see beautiful things happen.
I think we had this whole meeting where we just funded these groups to the tune of £10,000, 536 00:49:38,860 --> 00:49:42,130 and I was thinking, yeah, that's about the cost of a business class flight. 537 00:49:42,820 --> 00:49:51,940 And yet some of these groups had produced workshops, working papers, discussions that were just thinking in ways that only young people can. 538 00:49:51,940 --> 00:49:56,110 The older people who've been stuck in this area for too long are just not seeing the possibilities. 539 00:49:56,260 --> 00:49:59,470 And so it's really vital that we see funding at that level. 540 00:49:59,920 --> 00:50:07,930 And with that, I'll just say thanks. I don't have any conclusions, other than that there's an enormous amount of work to do, but we're going to do it together. 541 00:50:08,080 --> 00:50:13,960 And if we do it together, well, I think it can lead to a society that is much better for everyone, 542 00:50:14,110 --> 00:50:18,250 not just the professionals, but also all those people on the edge. 543 00:50:18,400 --> 00:50:19,120 Thank you very much.