Thank you very much for the introduction, and thank you to the previous speakers. As a political theorist who did her DPhil here many years ago, I would have loved to have had a presentation like this at the very beginning of my own degree. My task this evening is simply to fit into a couple of minutes an explanation of what the OII is, and then I'll hand over to my colleagues, who will go into much greater depth to give an indication of what type of research we do.

For those of you that don't know us, we're relatively new. I do enjoy it every time a new centre is set up, because it makes me feel like an old hand. We've been around since 2001, and we were established as a multidisciplinary centre based within the social sciences, focussing on the societal implications of digital technologies. It's funny looking back, actually: the questions that we are asking now are in many ways the same questions we were asking in 2001, but perhaps applied to new and innovative technologies. So much of what the new institute is going to do is hugely exciting to us, and I'm frankly delighted to see significant investment in this field and new opportunities for us to collaborate, both with the humanities and actually right across the University, as the chart presented earlier showed. So, yes, congratulations to those that helped make this happen.

Our work in this area is, I would say, roughly threefold. The first is, if you like, in the very broadest sense. The array of different topics and areas covered at the very beginning is just fundamental to what we do: big, broad questions about innovation, about the development of new tools and new ways of using artificial intelligence to develop new products, and about the forms and functions those take in everyday life; questions about what this means for how we regulate and govern these technologies. This is just our everyday business, and it's something that all of my now nearly 50 faculty are frankly engaged in. However, there is a much narrower sense in which the questions of ethics and AI arise in a number of very specific research portfolios, which you're going to hear about tonight. Since 2014, I'm pleased to note, we've had in-house philosophers.
From the very early days we've had Luciano Floridi, with his work around information ethics, addressing questions like artificial evil and the morality of artificial agents; that strand has been with us for five or six years now. But we no longer have only philosophers looking at these questions. We have brought in several more philosophers, one of whom you're going to hear from shortly. We also now have lawyers, like Sandra, who is going to speak to you; we have political scientists; and we have data scientists whose work is fundamentally concerned with questions of ethics and AI writ large. In those cases, the work covers issues such as what constitutes fairness or unfairness, discrimination, if you like, in the uses of AI; political questions about, once you have identified principles of fairness, how you might go about regulating for them or holding companies to account; and applied questions such as, if you identify unfair practices or sources of a lack of diversity in data collection, what that means for innovation, for data collection, and for privacy. So the very fact that we are multidisciplinary means that we aren't approaching this purely from a philosophical angle; we are taking some of these core philosophical questions and playing them out across different disciplines and different topics, all with the aim of improving not just research outputs and research understanding of these issues, but also, if you like, societal understanding and societal practices.

So we have that broad focus, and this much narrower focus in specific projects, which again you'll hear about shortly. I also just want to flag that we have a focus on this in our teaching. We have four graduate degrees: two master's programmes, one in Social Data Science and one in Social Science of the Internet, and two corresponding DPhil programmes. In each of those, these broad questions around ethics and AI arrive in different places. We have a pure philosophy paper, for example, in one of our master's degrees; in another, we embrace the core questions about what things like fairness and transparency would look like in the practice of social data science. So again, I really applaud the desire to ensure that the work of the new institute will also include content for new courses and new option papers; certainly we find it an immensely satisfying experience to deliver those to our students.
I'm happy to take questions later on about what the OII is doing in this area, but for now it's my great pleasure to hand over to my colleague.