I'm going to talk about the Moral Machine experiment, the big project. It's a project that started in 2016, and we'll also talk about some other projects as well, hopefully if time allows. Please do feel free to interrupt me, especially if you have clarifying questions that will help you understand what comes after. Critical questions are also always welcome, and if you want to keep those to the end, that's awesome; I'll try to make sure there is enough time at the end for them.

Right. So, as you probably already know, driverless cars are being tested on the roads in different places in the world, and they still have a lot of technical issues, of course. But let's say that in the future, once they tackle all of these technical and social issues, maybe they will become a reality. Some people have made calculations saying that once they have been perfected in terms of safety, they will be able to eliminate 90% of the current crashes caused by humans. Regardless of whether that is accurate or miscalculated, let's say it is 90%: there is still the remaining 10%, and some of that 10% might be cases where the cars are basically resolving moral trade-offs.

A very stylised, abstracted version of the dilemmas on the table is inspired by what most people here know as the trolley problem, but in the domain of driverless cars. This is the simplest version of it: the car is on the road, ten pedestrians jump in front of it, its brakes fail, and the car is about to kill them. The only way out is to swerve into this passerby and sacrifice them. Should the car do that, or should it just continue and kill the ten people? In another version, the car swerves into a barrier, sacrificing its own passenger. Should the car do that?

This was the study where it all initially started, published in 2016 by my co-authors and my former advisor, and they found that most people actually approve of utilitarian driverless cars: they think it is fine for cars to sacrifice their own passengers to save ten pedestrians. But there was a catch, of course, because when they asked people whether they were willing to buy such self-sacrificing cars, people said they were not willing to do that.

Now, many of us have presented this work for a while, and a lot of people have said, look, driverless cars will never face these trolley-problem dilemmas — which could be true; maybe they will be very rare situations.
But, you know, driverless cars will probably face more trade-offs, and we actually face those kinds of trade-offs in our daily lives and resolve them on a daily basis. Let's take this case, which is less stylised. Imagine you're driving the white car in the middle, and you have this big truck on your right and a cyclist on your left. Maybe you might be inclined to drive closer to the cyclist, because you're conscious of being struck by the truck and you're trying to protect yourself; in that case, you are pushing more risk towards the cyclist. Or maybe you're very conscious of the cyclist, so you drive closer to the truck, shifting more risk towards yourself. Now, if I asked you in a survey online, you would all say, well, of course I'm going to go closer to the big truck, because I'm such a nice person, I'm an altruist, I would sacrifice myself to help everyone else. But then in reality, you might do something different.

Now let's take you out of this and put a driverless car in instead. For driverless cars, these decisions are going to be implemented ahead of time. And actually, it probably would not be that specific — you wouldn't say "in this scenario I would do this or that"; the car would make calculations based on data and some optimisation, and maybe these trade-offs will arise as a side effect of the car minimising the company's liability or maximising user convenience. But one way or another, imagine now you have driverless cars with all of this implemented ahead of time — so there are no spur-of-the-moment decisions — and these decisions are implemented in hundreds of thousands of cars, because the same algorithm is doing the same thing.

Now imagine you have two companies: one of them, their cars go closer to the big truck; the other one's cars go closer to the cyclist. And let's say that after driving hundreds of thousands or millions of miles, for the one driving closer to the truck, the risk materialises in the death of five people, while for the other one the risk materialises in sacrificing one cyclist. Then we kind of get back to the trolley problem, but at a statistical level.

So in a way, this whole discussion is not really about the trolley dilemma as such — about what the car will do when it has to choose whom to kill, actively.
It's more about how these machines will actually change the distribution of risk. And this, in a way, mirrors all the other concerns about AI in general: how would it change the distribution of different values, the distribution of wealth, of fairness — all of these things that are increasing inequality. How would machines change that calculation? So I hope that's kind of clear.

So we wanted to understand what regular people think about this, especially because whenever there is a crash in the news, the emotional salience is always quite big. So we built this website called Moral Machine, which was deployed in 2016. Basically, it generates the kinds of moral dilemmas faced by driverless cars. In this case, for example, the car has one passenger and is headed down the road, and these four pedestrians jump in front of it. Its brakes fail — like in every story — and the car is about to kill them. The only way out is to swerve into this barrier, which means sacrificing this female athlete inside the car. If it does not, it will kill these four pedestrians, who are, by the way, jaywalking, including the dog. Then, as a user, you would say, I think the car should do this or do that; you click, you get the next scenario, and so on for a number of scenarios. At the end, just as a gamification feature, we give people some feedback, a summary: okay, this is how you answered, and this is how you compare to other people who have answered. These summaries are not very scientific, but at least people engage, and it encourages them to be more truthful about what they would do.

In building this website, we tried to include multiple different characters. This has two values: one of them is more engagement, but it also increases the external validity of our findings. We will see later that whatever we're trying to study — say gender or age — is not studied through one specific kind of character but through many different kinds of characters. So you see here adults and kids and seniors, athletes and a criminal and a baby, and so on.

In building these scenes, we created four different components. One of them is whether the car should stay or swerve, which captures omission versus commission.
84 00:08:18,080 --> 00:08:21,350 Also, the relationship of those potential casualties with the car. 85 00:08:21,530 --> 00:08:28,910 So they could be either pedestrians or passenger. And if if they'll be distance, whether they're crossing legally or illegally. 86 00:08:29,240 --> 00:08:35,170 And also some features about those, let's say, passengers or pedestrians. 87 00:08:35,180 --> 00:08:39,770 In this case, we're putting one group as being older than the other group. 88 00:08:40,900 --> 00:08:48,290 Okay. So basically we had multiple factors and this is basically a multifactorial design where we 89 00:08:48,300 --> 00:08:52,320 include multiple attributes and we value them on top of each other with some restrictions. 90 00:08:52,980 --> 00:08:56,910 And then we try to understand the effect of each of these attributes on people's assets. 91 00:08:57,210 --> 00:09:04,800 So the first one, as I mentioned, was interventionism. And then we have this, which you also mentioned and legality and then all of these factors. 92 00:09:04,890 --> 00:09:11,190 Now these like those we chose them because we knew from the literature that they do make a difference on people's decisions, 93 00:09:11,430 --> 00:09:14,190 on the regular people, not because we think that they should. 94 00:09:16,650 --> 00:09:25,350 So we took these factors together and then we put this website up and then we just we Tanzania works out to ten languages. 95 00:09:25,650 --> 00:09:30,750 What happened that suddenly the website was to our surprise, that got a lot of attention. 96 00:09:30,750 --> 00:09:33,270 So we realised we want to from different places in the world. 97 00:09:33,270 --> 00:09:41,880 So we wanted to make sure we also reach, try to reach more audience and as well and, and reach more representative samples. 98 00:09:42,930 --> 00:09:52,889 Very painful does you know because I think a website to try to 4 million users came to the to the 99 00:09:52,890 --> 00:09:59,520 website during the first 18 months they contribute to 40 million decisions during those 18 month. 100 00:09:59,850 --> 00:10:02,850 And there was a optional survey at the end. 101 00:10:02,850 --> 00:10:07,530 So we had like almost like a half a million people who filled this survey, which has lots of people benefits. 102 00:10:07,830 --> 00:10:11,489 Now, of course, it looks like it's still running, so we're still collecting data. 103 00:10:11,490 --> 00:10:15,510 And this 40 millions has become, I think, 100 million. 104 00:10:15,510 --> 00:10:27,270 So we're not doing anything with the data yet. But potentially later on, this is basically a map showing 1.41 location from each one user. 105 00:10:27,270 --> 00:10:33,780 At least one person can. Of course it is representative in some places, but like in other places, like in Africa, 106 00:10:33,780 --> 00:10:39,809 we have a lot of people just for comparison, you can see all the map of the whether it's light or electricity. 107 00:10:39,810 --> 00:10:47,100 So you can see that places where they have access to electricity, somehow the website somehow reached someone. 108 00:10:47,220 --> 00:10:54,820 Someone there. And the results of this all of this were basically published in this Nature Journal paper, 109 00:10:55,240 --> 00:10:58,300 which I'm going to show you now about the results right now. 110 00:10:59,740 --> 00:11:07,070 Right. 
Right. So, in order to understand the results: the x axis is basically a difference between two probabilities — the probability of sparing the characters on one side minus the probability of sparing the characters on the other side. In this case, for example, I take all the scenarios that have humans on one of the sides and see how often people choose to spare them; that gives me the probability of sparing the humans. I do the same thing with pets, and then I subtract these two probabilities, and I get this number.

Then you can do the same thing with the number of lives. In this case, the "one" here means you added one more character, and "four" means you added four more, so that is the case of moving from one to five. As you can see, as you add more lives the preference increases, and the average of that roughly coincides with two.

You can do the same for all the other factors, and you can also compare the different factors. As you can see, sparing humans over pets seems to be the strongest; after that comes sparing more lives, and then sparing the young over the elderly, which, some people would say, is another form of utilitarianism. After these three, which are quite a bit stronger than the others, come two more: the lawful over the unlawful, and higher status over lower status; and then, at the bottom, the preference for inaction over action, which is the omission-versus-commission effect. So we have these nine different numbers, which we can look at at the global level.

Now, sparing the young over the elderly was quite a curious one, because among those top three it is probably the most controversial — even though, as many of you probably know, in the medical domain there are situations where it is justified to use age as a factor in prioritisation. At the same time, we were curious to know whether this is something driven by some specific characters. So we can do the same analysis at the level of each of our characters — remember, we have 20 different characters. Similarly, for the x axis here, I take all the scenarios that include, say, the baby in the stroller and see how often people spare it; then I do the same thing with the adult male and female, get another probability, subtract the two, and I get this x axis. You can see the baby in the stroller is much more likely to be spared than the adult male and female.
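Just to make that difference-of-probabilities calculation concrete, here is a minimal sketch on made-up responses. The column names and the toy records are my own assumptions, and this ignores the conjoint-analysis machinery used in the actual paper — it only shows the subtraction of the two sparing probabilities described above.

```python
import pandas as pd

# Hypothetical responses: each row is one dilemma in which exactly one side
# had the attribute of interest (e.g. humans vs pets), and `spared` records
# whether the respondent chose to spare that side.
responses = pd.DataFrame({
    "attribute": ["humans", "humans", "humans", "pets", "pets", "pets"],
    "spared":    [True,     True,     False,    False,  True,   False],
})

def sparing_preference(df: pd.DataFrame, right: str, left: str) -> float:
    """P(spare the side with `right` attribute) - P(spare the side with `left` attribute)."""
    p_right = df.loc[df["attribute"] == right, "spared"].mean()
    p_left = df.loc[df["attribute"] == left, "spared"].mean()
    return p_right - p_left

print(sparing_preference(responses, right="humans", left="pets"))
```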
Then you do the same thing for all the other characters, and you can see that some of them are on the positive side, which means they are spared more than the adult man and woman, and others are on the negative side, meaning they are spared less. If you look at the top four, you can see that they all involve young characters. So it seems like this preference for age is something quite consistent; we can see it across the different analyses. Similarly, if you look at the bottom, you can see the dog and the cat, with the criminal curiously coming in between, and after those the elderly also come near the bottom. So it seems like people do feel very strongly about age as a factor in whom to spare.

As I mentioned, people came from different places in the world, so we could do the same analysis at the level of each country. We had people coming from over 200 countries, but for this analysis we took the countries that had at least 100 participants. Basically, we calculated the same nine numbers for each country, so for each country you have a nine-dimensional vector; then you can calculate the distance between every two countries and use that to do hierarchical clustering, which can be visualised in this dendrogram. It shows multiple levels: for each country you can see which country is closest to it in its answers, but you can also see broadly how countries congregate. Basically you get three broad branches: the red one, the Western; then the Eastern; and the yellow one, the Southern. The colours of the country names are based on a cultural classification that political scientists have come up with, so it's not our own. Some of these labels might sound like a weird classification — some countries are English-speaking, others are Orthodox or Catholic majority, or Baltic, or whatever. You can see, for example, that the Latin American countries are mostly together, but then the Isle of Man is there, and Myanmar, which is bizarre, so it won't always be exact. That is probably mostly a small-sample issue — that's true, that's true.

Someone asked whether, with this hierarchical clustering, places that are near each other on the circle are actually near each other in their answers.
They are only near in the sense of the tree — if you look at, say, Australia and New Zealand, that branch right there, they are very near each other. Yes, true, that's true. So, of course, this method is quite sensitive to some small changes. You could do statistical tests to show that it is better than random, but it's largely something you spot visually: you can see, for example, that the English-speaking countries kind of congregate together.

Let me talk about these three clusters now. The first one, as I said, we call Western, because it has many Western countries — with some exceptions, of course. You can see the English-speaking countries, and the Protestant and Catholic countries as well. The second one we called Eastern: Islamic countries, Confucian countries, South Asian countries. And the last one, the Southern, is a bit of a weird one. Basically it has two sub-clusters: one of them is largely Latin American, with some exceptions, and the other is, curiously, a set of former French colonies that somehow fit together.

Now, what you can do is try to see how each cluster differs from the other two — this is really just to make comparisons between the clusters. One note to keep in mind: for all nine of the differences that I showed you, the direction is most of the time the same across countries; it is the magnitude that differs. We calculate this as a z-score, so zero means average. The Western cluster doesn't have much interesting stuff, because it has the most countries, so it more or less averages everything out. But you can see, for example, that in the Eastern cluster, sparing the lawful is a bit stronger than average, sparing pedestrians is a bit stronger, and sparing the younger is very, very weak. They do still spare the younger over the elderly, but the difference is much less pronounced in those countries, which is also consistent with anthropological studies showing that they have more respect for the elderly and stronger family ties. And in this other cluster, the Southern, sparing females is much stronger, for example, which is apparently driven by France and the countries that were influenced by it — stronger than in other European countries, for example.
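A minimal sketch of the country-level pipeline just described — nine preference numbers per country, z-scored and then hierarchically clustered — might look like this. The country values below are invented for illustration, and the choice of linkage method is my assumption, not necessarily what the paper used.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# Hypothetical data: one row per country, nine columns = the nine
# preference differences described above (values made up for illustration).
countries = ["DE", "JP", "FR", "BR", "US"]
prefs = np.array([
    [0.58, 0.49, 0.35, 0.42, 0.30, 0.20, 0.15, 0.10, 0.05],
    [0.55, 0.40, 0.20, 0.50, 0.28, 0.18, 0.12, 0.09, 0.04],
    [0.60, 0.52, 0.45, 0.38, 0.40, 0.22, 0.16, 0.12, 0.06],
    [0.62, 0.50, 0.48, 0.30, 0.38, 0.25, 0.18, 0.11, 0.07],
    [0.59, 0.51, 0.40, 0.41, 0.33, 0.21, 0.14, 0.10, 0.05],
])

# z-score each of the nine dimensions across countries (0 = average),
# then cluster countries by the distances between their nine-dimensional vectors.
z = zscore(prefs, axis=0)
tree = linkage(z, method="ward")           # hierarchical clustering (dendrogram-ready)
labels = fcluster(tree, t=3, criterion="maxclust")
print(dict(zip(countries, labels)))
```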
Probably more insightful is exploring some of the associations at the country level. One of the, let's say, anthropological or cultural measures is individualism: some countries are very individualistic, people focus on the individual and see themselves as independent, and those countries usually have strong institutions. At the other extreme you have collectivist countries, where people have stronger family ties and their loyalty is to their extended family rather than to the state. Individualist countries are usually more egalitarian — everyone counts equally — so saving more lives is always better than saving fewer lives, whereas in collectivist societies other factors might come into play; and indeed the preference for sparing more lives correlates with individualism. Another factor, or index, is called the rule of law, which measures how much the law is respected in a country. We can see here that sparing the lawful correlates with it as well: countries like Japan or Germany have a very strong rule of law, and in those countries — again, sparing the lawful was preferred in most countries — the difference becomes bigger.

I'm going to talk more about the implications of what I just mentioned. I just want to make clear that — as many of you of course know — there is a distinction between normative ethics and descriptive ethics, and this work falls under descriptive ethics. Normative ethics is about what we should do; this is about what the general public thinks we should do, or what we actually do, what our cultural norms tell us to do. So this was more descriptive, trying to get at what people usually do or think we should do.

But there have been some efforts on the normative side — a lot of efforts, though usually, I think, at the level of countries. Germany was the first country that actually brought in a committee, with representatives from different domains: law, ethics, engineering, and a Catholic bishop as well. This committee sat together for a few months and eventually came up with a set of rules. Of course — and this is from hearing from one of the members of the committee — there was a lot of fighting, and there were things people did not want to sign their names to until they were resolved. So at the end there were some contradicting statements, because everyone wanted to be represented in the statements.
What I'm going to do now is just list the top five preferences that we found in people and make a connection to what the committee actually said.

The first one is humans over pets. There is a line in that report saying it is fine to give priority to humans over animals or objects, if that is the only way to save the human. The second factor is sparing more lives. This is where the committee was a little bit conflicted, because there is a sentence saying it is fine to program the car to reduce the number of lives lost, but there is another sentence saying you should not be comparing lives and sacrificing some lives for others — in a way, this was just to appease the different parts of the committee. Then, the younger over the elderly. This one was, curiously, in complete disagreement with the committee, who said that discriminating on the basis of age is unacceptable. The next one, the lawful over the unlawful, is not in complete disagreement, because the committee had statements saying, well, if the whole dilemma came about because somebody was not following the rules, then the people who are following the law should not be the ones to be sacrificed. Notice how that sentence is phrased. Does that mean the car should punish the people who jaywalk? That sounds a bit harsh, so no, they don't say that. But anyway, it is not very controversial; you can think of it as not a complete disagreement. And of course, this last one is clearly in disagreement.

Now, to come to the second part of the implications: I mentioned the level of countries, and we found that, for example, some countries are more in favour of sacrificing the jaywalkers than others. So let's say we have some driverless cars in some country, and they have reached some level of safety — say around 20% better than regular cars — which is enough to have them on the road. And let's say somebody comes up with an update, and this update will result in safer driverless cars overall, but this safety is conditional on everyone crossing legally: for those crossing illegally, things would actually be even worse. Should the company implement this update?
In a way, it seems like in countries like Germany and Japan that would be seen as the moral thing to do, the right thing to do — of course we should have those safer cars on the road. But in other countries, maybe in the Middle East, where I am from, or in Latin America, people would find it outrageous to make cars that could be more harmful to jaywalkers; it would feel like an outrageously harsh punishment.

So those were the results of the main paper I just mentioned, but there has been some further work on the Moral Machine data. One of them is a collaboration with Justin Wong's group at MIT, who work on cognitive science and computation. This is basically a proof of concept: building a computational model using Bayesian modelling. In this case, we assume that people have latent preferences over the different characters, and that based on these latent preferences — which we don't know — they have answered the Moral Machine questions. What we try to do is reverse-engineer how they have done it. We take their answers, and we encode the characters in each scenario into, let's say, a vector, which tells you whether a character exists in the scenario; then we reduce the dimensionality of those features by grouping them into concepts — for example, the pregnant woman and the baby are conceptually similar, so they map to the same concept. From those answers, you then try to calculate a utility, which you can think of as a combination of the weight of each of these features and whether each one is present or not. The weights we don't know; they are what we try to uncover, or reverse-engineer, from people's answers. Once you uncover the weights for a person, then if I give you a new scenario, you can multiply the weights by whether each character is present, and you can say: for this person, the utility of saving this group is higher than that of saving the other group — and as a result, you can predict what they would have done.

Another project was with people at CMU, which basically builds a collective decision-maker. Again, this is also a proof of concept, so we don't really recommend doing it. But let's say you have a group of people who have answered some of these scenarios; then you can model what they would have answered on all of the other possible scenarios.
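A minimal sketch of that "weights times presence" utility model, and of the prediction step it enables, is below. The feature grouping and the weights are hypothetical; in the actual work the weights are what gets inferred from each person's answers rather than being hand-set as they are here.

```python
import numpy as np

# Hypothetical grouped features (e.g. pregnant woman and baby both count as "baby").
FEATURES = ["baby", "child", "adult", "elderly", "pet"]

def utility(weights: np.ndarray, counts: np.ndarray) -> float:
    """Utility of sparing a group = sum over features of (weight * how many are present)."""
    return float(weights @ counts)

def predict_choice(weights: np.ndarray, group_a: np.ndarray, group_b: np.ndarray) -> str:
    """Predict which group this person would spare, given their (inferred) weights."""
    return "A" if utility(weights, group_a) > utility(weights, group_b) else "B"

# Toy example: weights that value the young highly (normally these would be
# reverse-engineered from the person's Moral Machine answers).
w = np.array([2.0, 1.6, 1.0, 0.6, 0.2])
group_a = np.array([1, 0, 1, 0, 0])   # one baby + one adult
group_b = np.array([0, 0, 2, 1, 0])   # two adults + one elderly person
print(predict_choice(w, group_a, group_b))
```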
Then, somehow, you can have some kind of collective decision-maker — an aggregation function that aggregates all of this together — and it gives you one big function that tells you what the whole society would have answered in a given situation. Again, this is probably not a good idea to actually deploy, because you're obviously formalising whatever people happen to think. But maybe, let's say, you take a subset of the scenarios — ones that the experts think are defensible or acceptable but that the experts disagree over — then maybe the public could be used as a tiebreaker in some of those.

Okay. So now let's forget about the driverless cars and the Moral Machine for a moment. This is a project that also piggybacks on the Moral Machine, because when the Moral Machine got all this attention we had a lot of people coming in, and we decided this was also a chance for us to study the original, classic trolley problem. So we opened another page and put up the classic trolley scenarios — we only took three. This is, I guess, the classic, most famous one. For those of you who don't know it: the trolley, or whatever you want to call it, is going to kill those five workers, and you are the man in blue standing next to the switch. If you pull the switch, the trolley goes to the other side and kills this one person. What should the person do?

And thanks to the Moral Machine's popularity, we again had people coming from different places in the world. We basically used three different versions. The next one, which you probably know — it's probably the second most famous — is the footbridge: again, the trolley is going to kill five people, you're standing on a bridge, and next to you is a large person whom you could push, instead of pulling a switch. If you push them, they go down — it doesn't appear in the picture, but basically, if they fall in front of the trolley, their body is big enough to stop it; only, they would die. So in a way, in both cases you're sacrificing one person to save five. But somehow people still find it more acceptable to sacrifice the one person in the switch case than in the footbridge case. People have come up with different theories, and there are two main differences that have been hypothesised to matter. One of them is that in one case you're pulling a switch, while in the other you're actually pushing someone with your hands, so there's physical contact — an emotional factor.
But there is also a more subtle difference: in the switch case, if you flip the switch and this one person actually ends up running away, you'd be happy — that's the best outcome. Their death is more like a side effect of you trying to save the five. Whereas in the footbridge case, you really, literally want to hit the person; if the trolley misses them, it's going to go on and kill the five people. This is what's called the doctrine of double effect. Some people came up with a different variation — a man wearing a knapsack, where it's the knapsack that stops the trolley — but that's there for other reasons.

People then also came up with this middle dilemma, the loop, which tries to combine elements of both. In this case, if you do nothing, the trolley will continue towards the five people, just like in the other cases, and if you pull the switch it goes onto a side track and kills this one person. But unlike the original switch case, if this person runs away, the trolley loops back into the five. So, similarly to the footbridge, you need the trolley to hit that person — it's similar to the footbridge in that respect and similar to the switch in other respects. And, as theorised, people indeed find it more acceptable to sacrifice one person in the switch case than in the loop case, and in the loop case more than in the footbridge case.

Most of the studies done before were done mostly in the Western world — sometimes they would include people in China or Japan — but we were able to collect data from about 40 countries. Interestingly, we found that in all 40 countries the order was the same: in all these countries the switch is more acceptable than the loop, and the loop more acceptable than the footbridge. Interestingly as well, there was some variation, and this variation was much bigger in the case of Asia.

So the second question was: is there any factor that could help us predict why we find these differences? One such factor is an index called relational mobility, which describes the fluidity and ease with which people can make new relationships.
In some societies it's much easier for people to make new relationships — in places like, say, London or bigger cities — and in other places, maybe in the suburbs, it's much harder; but then your relationships last longer, so you have these more persistent relationships. In places with higher relational mobility, because it's easier to make new relationships, you have more self-expression: you're more likely to find like-minded people, so you're less likely to hide your opinions, and you're less worried about the social signals you send to your friends — unlike in places with low relational mobility.

The people who established this measure also talked about why they think it came about: in places with a lot of stress, people became more co-dependent, so relationships became stronger and longer-lasting; and some subsistence styles, like growing rice, require a lot of people working together. Anyway, that's a side note, for those who are interested in learning more.

Now, why is this relevant to this story? Apparently, some moral decisions send a negative social signal and others don't. I'm referring to some of the work — I'm sure you know Molly's work here — showing that making a utilitarian decision sends a more negative signal than adopting the deontological kind of decision. In these cases, sacrificing one person to save the many sends a negative social signal to your friends. But if you're in a society with higher relational mobility, you don't care as much, because you can make new friends — you can find like-minded friends. Whereas in the other places, not only would you not tell people what you actually think, you might even internalise those views yourself. And we do find a correlation between relational mobility and these three dilemmas — it is of course driven by the Asian countries — but there was a follow-up paper that did a more stringent analysis on this data, and they showed that the only correlation that survives is this one.
So in a way that does make sense, because the footbridge is really the one where pushing a person is quite a strong social signal.

Okay, I'm going to move on — we still have time, about 15 minutes. Okay. So I'm going to move on to the COVID-relevant stuff. As you all know, the last two years were some tough times, and among the first cases in the media were reports from Italy, where doctors had to start making decisions about who should get the ventilators, who should be prioritised. In some countries they have rules and codes for this; in other countries they created committees, or officials made the decisions, separate from the doctors who were treating the patients.

People have talked about different factors. There are always the notorious criteria, like first come, first served — the people who arrive first should get the ventilators first — and another one is that if a group of people arrive at the same time, then you should probably use a lottery. Other people talk about different possibilities: prognosis — who is most likely to survive; quality of life — whose quality of life is going to be better in the long run; age — giving priority to the younger over the elderly; social value in the past — like doctors who became sick because they had been treating patients; and social value in the future — like medical students and nursing students who will probably be able to help in the future.

So we considered these five factors and those two baselines, and we created this small survey. It was not at all like the massive pictorial Moral Machine setup — just a few short questions asking people, on a scale from 0 to 100, how much they think each factor should be considered. We then put this survey in two different places. One of them: we basically hired Google — as you probably know, the kind of company you pay money to. The good thing about them is that they can get you nationally representative samples, but because those samples are quite expensive, we had to choose four countries. Remember I told you about the three clusters in the Moral Machine? We tried to choose countries from each one of those, so we had the US, Japan, Brazil, and France.
We also put the survey on the Moral Machine website — again, capitalising on its traffic. So we put it in these different places, and people took the survey.

Now, to talk about the results: take, let's say, prognosis. If a person thinks prognosis should be considered, we compare that with what they thought of first come, first served and the lottery; if they give it a higher score than both, we say this person prefers prognosis over the baselines. And the same with the other four. So we have five factors, each with two possibilities, which gives 32 different profiles — two to the power of five. Among these 32 there are two extremes: one where people rank all five factors above the two baselines, and one where they rank none of them above the baselines, with everything else in between. What we found interesting is that in the four countries where we had nationally representative samples, the two most common profiles were exactly those two extremes — all five, and then none — and whatever comes after that differs. So these are the four countries. We also had about 20 other countries from the Moral Machine; those were not nationally representative samples, but we had more countries, so why not. And we can see that these same two are the strongest profiles in that data as well.

These are also the most extreme positions, so in a way there is some polarisation among people in those countries: some of them would not approve of any factor we proposed, while others would approve of all of them. We also looked at whether any of the factors is more palatable than the others, and prognosis seems to be the most agreeable, or most acceptable, of all of them.

Other work we did — and here I'm jumping to a different project that I'm not going to talk about much, but it's also inspired by the Moral Machine, in a different domain, charitable giving. Those of you who were in the group meeting today saw the presentation of the results that we are about to submit.
Basically, your $100 could either go to charity A, which would help 20 women in North Africa, or to charity B, which would help four women in North America — and let's say you are in North America — by, for example, providing clean water. You would then click and say, I think we should do this or do that. And we also used the literature to explore different factors; again, some of these have been well studied.

Finally, I'm going to talk about this paper, which is an opinion piece, but for me it ties together most of the stuff that I presented today; we think about it as more of a research agenda. It was co-written with other scholars around the world, and we call it computational ethics. We were basically inspired by David Marr, who is very respected in the cognitive science community and who wrote this book about vision, suggesting that vision should be studied at the computational and algorithmic levels. As a result, the study of vision was revolutionised: we are able today to build machines that have human-level capacity for vision, but we also became better able to understand how humans see. There was this exchange between the two: trying to put vision into algorithms made it easier to understand some of these processes.

Inspired by this, we argue that there should be a research agenda focused on formalising ethics in algorithmic terms. When we say algorithms, we are pretty broad here — it could be anything you can think of: a set of rules, machine learning, neural networks, or even just an equation or a representation. The idea is not only formalising ethics, but also using this formalisation either to engineer ethical AI systems or to understand human moral decisions — it could be either goal. From one side, you could use whatever we understand about human moral decisions to help us engineer ethical AI systems; but also, going through the exercise of engineering such a system helps us better understand our own moral decisions. So it might be that those ventilator allocation officers were following some rules.
But at the same time, there were some very specific cases where they had to act on their own, and maybe in some of those cases there were factors that were not included in the rules, or maybe they were not very consistent. Consistency is quite important here, so the idea of computational ethics is to try to make things more systematic and more coherent — the key concept is coherence. And of course, when you talk about coherence in morality, reflective equilibrium comes to mind.

For those of you who don't know it — I suppose everyone probably does — you usually start with some moral intuitions about what you should do in a specific situation, and from there you go on to build a theory, which you could call moral principles, some kind of general rule. For example, you have the intuition that this person killing that person does not seem like a good thing to do, and you come up with a general rule saying: killing is wrong. Then you bring this theory and try it on some use cases: you see how the theory would work in those cases and test it against your intuitions. And then you run into cases like: what if I kill someone while defending myself — would killing be wrong in that case? So you go into another round of revising your theory, until, hopefully, at some point you reach some kind of coherent set of rules.

To put that into computational terms, this would simply be trying to use data science tools to quantify those moral intuitions, formalising those moral principles by putting them into mathematics or algorithms, and then testing them on use cases. So it's basically the same thing, just with a computational twist — a toy sketch of this loop is included below. And then we can talk about what it would mean to formalise this in terms of emerging, or let's say existing, research fields that have already been operating: some of them focus on formalising normative ethics, using logic to formalise rules, and others use different tools to formalise descriptive ethics.
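Here is that toy sketch of a computational reflective-equilibrium loop: intuitions as labelled cases, candidate principles as functions, and coherence as simple agreement between the two. Every detail here — the case encoding, the candidate rules, the coherence score — is an illustrative assumption rather than a proposed method.

```python
# Toy reflective-equilibrium loop: score candidate principles against moral
# intuitions about cases and keep the most coherent one. Purely illustrative.

# Each case is (description, features, intuition about whether the act is wrong).
CASES = [
    ("kill for gain",         {"kills": True,  "self_defence": False}, True),
    ("kill in self-defence",  {"kills": True,  "self_defence": True},  False),
    ("harmless act",          {"kills": False, "self_defence": False}, False),
]

# Candidate principles: functions from case features to a verdict ("is it wrong?").
PRINCIPLES = {
    "killing is wrong": lambda f: f["kills"],
    "killing is wrong unless in self-defence": lambda f: f["kills"] and not f["self_defence"],
}

def coherence(principle) -> float:
    """Fraction of cases where the principle's verdict matches the intuition."""
    return sum(principle(f) == intuition for _, f, intuition in CASES) / len(CASES)

best = max(PRINCIPLES, key=lambda name: coherence(PRINCIPLES[name]))
print(best, coherence(PRINCIPLES[best]))
```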
I've mentioned some of the work I've done that could be described in this way, formalising descriptive ethics. Or you could have lots of different values you want to balance, and you want to build a program that is able to balance these different values. And then there is evaluation: evaluating the performance of the machines and evaluating human behaviour. Again, you can map these questions onto the reflective-equilibrium picture: evaluation is basically the stage of testing against your intuitions, formalising the principles is where you think about the theories, and trying things out on use cases is where you test them.

If you have people working on these questions together, you can get some synergy, and some new questions, or let's say emerging fields, might come up. For example, some research lines might emerge on how we could effectively shrink the difference between normative and descriptive ethics — whether by educating the public about the normative side, or by trying to understand more about what the public thinks, which could in turn inform normative ethics. Another is human-machine emergent morality, because there are so many situations where humans and machines will, in the future, act next to each other, and some of these behaviours could emerge in unexpected ways. Imagine driving next to driverless cars: you come to an intersection and realise those cars will have to stop anyway, so you kind of bully them — you jump in front of them and they have to stop. As a result, maybe you become more aggressive in your driving, and cars that are defensive turn everyone into more aggressive drivers. Or maybe vice versa: the cars might bully us, and we will become more defensive and, as a result, more cautious. To study these questions you could use, I guess, game theory and algorithms — domains that have worked on similar questions. But this is really more of a research agenda in general.