Hello, everyone, and welcome to the third event in our public seminar series this term. My name's Rebecca Owen and I'm an associate professor here at the department, and I co-convene the New Technologies Research Group with Professor Winters. Before I introduce our speaker today, I've just got to do a little bit of housekeeping. If the fire alarm does go off, it won't be a drill, so please leave by the marked exits; the gathering point is at the car park at 34 Gardens.

Okay, so with the housekeeping out of the way, I'm really delighted to introduce Dr Sandra Leaton Gray, an associate professor in education at the Department of Curriculum, Pedagogy and Assessment, University College London. Her research focuses on the social and ethical issues surrounding contemporary identity, biometrics, AI and algorithms. Within this domain, Sandra has written numerous articles and books, including most recently Invisibly Blighted: The Digital Erosion of Childhood, and also Curriculum Reform in the European Schools: Towards a 21st Century Vision. Sandra is a member of the Privacy Expert Group of the Biometrics Institute, she's a senior member at Wolfson College at the University of Cambridge, and she's on the advisory board of defenddigitalme, which is a child privacy advocacy organisation. What's also very exciting is that she's chair of the relatively new Artificial and Human Intelligence Special Interest Group, which is part of the British Educational Research Association. This special interest group is really exciting, and I really do encourage you all to have a look at it, take part in it and see all of the events that they've got coming up.

Within the Learning and New Technologies Research Group here at the department, we really try to apply a critical stance towards the use of technology in learning and education, regardless of life stage, context or, indeed, technology. Sandra really shares that commitment in her work, and the combination of her teaching background and her application of psychological theory makes her work really important and, I think, particularly exciting. So given this, I was just so excited when she accepted our invitation to come and talk with us about her work, and she's here today to talk about AI and social relations in schools. So thank you so much, Sandra.

Thank you. I'm really, really delighted to be here. It's a really busy time in the AI and education world, because all of a sudden people are seeing the light a little bit.
So until now, there's been a lot of technological development, but there's been less of an emphasis on things like the ethics and philosophy that underpin the use of AI in an educational context. And suddenly a lot of things have converged at once: we have industry being more interested in pursuing that, and we certainly have academia looking to do more of that kind of work. So it's a very fortunate constellation, and a very good time, I think, to start asking ourselves some really tough questions about what's going on and what kind of future we want AI and education to give us. That's going to be the thrust of what I'm talking about today.

So I'm going to do a sort of intellectual mapping of the types of AI that are out there, and I'm going to try to relate them to different functions in education. If there are any technologists in the room, you'll find a lot of flaws in this, because of course the terms that we use for AI overlap so much. So it's a rather artificial division, but it's a kind of taste of the smorgasbord of delights that artificial intelligence in education can offer, while also allowing us to start thinking: OK, what does this mean in terms of social relations, and what do we have to do to make sure that we are steering it and it is not leading us? That's the main framework of what I'm trying to talk about. And then I'll theorise a little bit more and think about some of the big sociological questions about the human condition in the light of these categories.

So the main point of trying to think about artificial intelligence is trying to decide whether we're doing right by today's children, because we have a huge responsibility to the next generation. And for most of us in this room, I imagine, we don't get up in the morning to think about how to make their lives worse; we get up in the morning to think about how to make everybody's lives better, for people to have a more enriching and fulfilled future, and for society to be a better place.

So social relations is a really good lens for understanding how this can happen. And the reason that we need to do it, particularly for artificial intelligence, is because no technology can ever be properly neutral. Everything has got either a cognitive bias, or a set of preconceptions, or a social bias, or a social context for its use, and every development that we have in terms of technology has significant repercussions for the future of society. And I mean that in various contexts —
the political, economic and cultural aspects of society.

One of the things that we perhaps don't talk about enough in relation to this is the idea of trust. Now, trust is starting to become a more frequently used term in tech industry developments and so on, and it's something that I'd like us to talk about a lot more. We're talking about mutual trust here, amongst different stakeholders in the educational process. So not just developers, and not necessarily just clients, but also pupils, teachers, parents and everybody who's got a vested interest — every stakeholder, which is pretty much all of us.

But sometimes that word trust gets really lost in technological translation, so it can end up meaning something fairly bland and prosaic: whether I've given you a 36-page version of the terms and conditions document that you're supposed to have read, and you tick that you've read it. And do you trust me — in that document that you're only really pretending to have read — that I won't do anything nefarious with your data? It also means the idea of operating technological systems reliably and transparently and with everybody's best interests at the centre, so it's about more than the sustainability or growth of an individual business. That's something I'll talk about later as well. And boyd and Crawford, in their work, described this as a form of empowerment: the idea that there's a sort of collective human endeavour, and that within that, individual actors — or people, however you want to describe them — have got some kind of ability to influence outcomes and events and direction and so on. To think about technology like this is very different from thinking about technology solely in terms of making things cheaper or easier for other people.

So one of the things I'm very rude about at work, for example, is anonymised master's dissertations. It seems to me that we've gone from a system where somebody gave me a nice pre-bound thing that I read and marked — very nice and tidy; people have been doing that for hundreds of years — to one where suddenly I downloaded the assignment, I printed the assignment because I can't mark that much on screen, I found the assignment, I then had to sort of pretend I didn't know who'd written it even though I'd seen them once a month for the whole academic year, and then I have to upload the marks — it's what I call 'platform writers'. Somebody had the great idea that making all of this paperless was going to make the world a better place.
But now the students are anxious about whether it's been uploaded, and I'm anxious about whether I've done all the uploading. And you can multiply this by the dozen or so platforms I encounter in daily work — the sort of shadow-work aspect of being a university lecturer.

And the problem with this sort of thing — and I think we're in the very early days of this kind of technology — is that somebody said this would be a good idea, but there hasn't been any kind of conversation about what really is involved in the job, what really is involved in being a student, what we're all trying to do, and whether it makes life difficult for people, for example if they have a visual impairment or muscular problems and can't spend a long time looking at computers. It may be that something we think is very fair — like having a platform that doesn't necessarily know which student submitted what, apart from a digit string — isn't really fair at all, because if it's making life difficult for particular groups and we haven't tracked that or asked ourselves that question, then perhaps we haven't done our job properly in looking beyond just adopting the first technology that comes to light, purely on efficiency or intellectual-augmentation grounds. Another example might be tracking students in particular ways, or trying to offer courses in new ways, or doing university governance or business governance and so on. We end up with a situation where there's a lot going on, but we're not giving ourselves the time to think through what it means at the societal level to be doing these things. I'm going to talk quite a lot about that in a minute.

So I suppose this talk is really a manifesto to urge people to think about adopting the best characteristics of humanity when thinking about new technologies, for the broadest possible societal benefit, rather than just thinking about commercial interests. And I suppose this would particularly relate to things like surveillance technologies in schools. But that sounds very negative, and this is actually really about an enormous moment of opportunity. We're finding that, for the first time, machines have got an immense power to influence learning and teaching and everything that goes on, in ways that operate many, many orders of magnitude beyond even the most enormous group of human minds.
So the human capability of the operators has been completely superseded, because the scale of it is just so enormous, particularly when we're talking about very large-scale datasets. We no longer have to work on a hypothesis-based approach to research, testing our ideas; the machine could perhaps find it for us. That's quite scary for a researcher — at what point do they replace me with a robot? A lot of people have talked about the datafication of education in this sense: the idea that you've got this huge scale of operations, and it doesn't matter how much effort the human species put into trying to replicate it, all that mind power can't hold all of that data. It's the first time in history that has really happened. Until now, we could manually do a lot of operations if we chose to — very expensive and time-consuming, and it needs a lot of people — but we've surpassed that now; we surpassed it quite a while ago. So we can't replicate it, and they're on their own. We now have systems that are interrogating each other, and that brings new questions about whether they are deciding the direction of things differently from what we would decide. This is why the human is so important in the middle of all this. No longer are machines just responding to a set of instructions that we give them; they're thinking about their own instructions, and that's the essence of AI.

This is a scaling up of data collection and analysis that is absolutely unprecedented. It's particularly piloted on a lot of children and young people, which is why it's of particular concern to educators. It's relatively uncontrolled and unregulated, and a lot of us are trying to do things about that. And the commercial sector is at the centre of this development — not the university, and not the kind of altruistic, societal wing of it. There's a difference there. So we have a commercial thrust, and a commercial push towards monopolisation in particular, and again, that's something I'll be talking about in a minute. We have really big impacts on privacy because of this, and the humans are being taken out of the equation. You have massive implications for child and teacher relations, and the idea that you have a cohesive national or international education system is being compromised by this. This is a profound change that I can't overemphasise, and it's a key theoretical focus of what I'm talking about today.

So what I thought I would do now — once I've got rid of whatever this is — is to talk a little bit about the basics of AI and just take you through it.
I've tried to formulate a framework that is simplistic in some senses, but gives us ways of thinking about different categories in an educational context. Now, I'm starting with this very interesting press release extract from Pearson. Pearson are pushing really, really hard to be the monopoly provider in education. So, as it says, unlike other sectors, education is yet to fully realise the benefits of digital and advanced AI techniques, and there are great opportunities to improve learning outcomes and to enable better teaching. Well, that's invoking a sort of rhetoric of virtue. Pearson is committing to transforming the learning experience and becoming the digital winner in education. So there's obviously only one winner — and that spells monopoly.

So I'm going to try to play a film clip that a number of you may have seen, from the BBC programme Years and Years. It's quite scary, isn't it? I've just got a little clip here to remind you.

[A clip from Years and Years is played, in which a teenager tells her parents she is not transsexual but 'transhuman': she is not comfortable with her body, doesn't want to be flesh, and wants to upload her brain to the cloud so she can live forever as information.]

She acts it so beautifully, doesn't she? So, you know, we're at a time of major change, really.
And the question is — I mean, for me, that's a really interesting example, because you've got the Snapchat filter, you've got virtual reality, you've got so many layers that people are not really understanding what's going on, not understanding the social change, and so on. So I'm going to break this down now into a really basic set of definitions.

The first thing I wanted to talk about is the idea of predictive analysis. Now, every bit of AI is going to have a degree of predictive analysis in it, so it's an artificial division, but I've called this the digital school bursar. It's the idea that you have something going on behind the scenes, assessing things like admissions and finance and resourcing and so on. It's a very general term. It dates back to the Good-Turing probability estimate: we have Turing decoding messages in the Second World War by assessing the likelihood of whether a word is what you think it's going to be, and it was later refined in the sixties by Good to allow for incomplete datasets. So that's the maths of it. What it allows us to do is to predict future events by working out probabilistic statistical calculations.

So where do we see it used at the moment? Well, it's happening a lot, even if you're not aware of it. It's in the background of a lot of financial products that you'll be receiving or using, and it's also used a lot in the insurance sector — if you phone up for car insurance, it's in the background working out what you're going to have to pay, through various sets of algorithms. It will also be on your phone, perhaps, when it's predicting likes or making recommendations for things you might want to see on social media or buy online, with sets of algorithms in the background there as well. It's part of the autocomplete function on your phone — and you can see how well that works when you send a text and you've put entirely the wrong word in it, which is always amusing — and it's very prominent in things like crime and health trend prediction. In education, which is what we're all here to think about, it assesses things like the likelihood of young people dropping out of courses, whether they'll succeed on different academic programmes, and things like that. And it's early days in that respect. Now, that might seem a very worthy way of doing things, because you don't want young people going on the wrong course or having to drop out and getting upset — you don't want human misery. But on the other hand, as we say: junk in, junk out.
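To make that concrete, here is a minimal sketch of the kind of probabilistic calculation a 'digital school bursar' might run — a toy dropout-risk predictor. The features, the handful of data points and the use of scikit-learn's logistic regression are all invented for illustration; a real system would be far larger, and only as good as the data and assumptions it is configured with.

```python
# Minimal sketch of a probabilistic "dropout risk" predictor.
# The features, data and model choice are invented for illustration only.
from sklearn.linear_model import LogisticRegression

# Each row: [attendance_rate, average_grade, assignments_missed]
X_train = [
    [0.95, 72, 0],
    [0.60, 48, 5],
    [0.80, 55, 2],
    [0.40, 35, 8],
    [0.90, 65, 1],
    [0.55, 42, 6],
]
y_train = [0, 1, 0, 1, 0, 1]  # 1 = dropped out, 0 = completed

model = LogisticRegression().fit(X_train, y_train)

# Probability that a new student drops out, given the same three features.
new_student = [[0.70, 50, 3]]
risk = model.predict_proba(new_student)[0][1]
print(f"Estimated dropout risk: {risk:.2f}")

# How this probability is used (the threshold, who gets flagged, what happens
# next) is a design decision, not something the mathematics settles.
```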
So it depends on how you've configured the algorithm, which determines how successful it's going to be at doing what you want it to do. If you make a mistake in that, it might start screening out the wrong kinds of people — which is exactly what happened when organisations such as Google used it for human resources functions, where it was effectively saying: this person is female, and most developers are male, therefore if a female is applying for a developer job, she's not likely to match what we think a good developer looks like. So you get these inadvertent forms of discrimination, and that could happen in education too. We call this adverse selection, a technical term from the insurance industry, where deprived or vulnerable groups are deliberately rejected from a group through automated decision-making.

Another example of this in education is the intelligent zoning engine, which has been used since 2017 in the Berlin district of Tempelhof-Schöneberg for German school admissions and catchment areas. What the algorithm tries to do is look at time and travel distance and determine the most efficient way of getting kids to school — the least use of resource. And this being Germany, of course, they would want it very efficient; I can say that, being half German myself. So they're doing something that sounds very worthwhile — very good at guarding resources and so on. But the risk of a system like this is that it might inadvertently entrench disadvantage, because you're putting children's educational futures at stake depending on what they happen to live next to. If you're in an area of poor social housing, you'll always be at school with children who live in poor social housing, and if you're in an affluent suburb, you'll always be at school with kids from the affluent suburb, and there's no mixing — which you could say is not very helpful if you're trying to design the perfect system. So it's kind of scuppering other aspects of education policy, where mixing is thought to be a bit more desirable.

Another example might be the Open University. They have a system called OU Analyse, and they track students through the university to see which students are likely to need extra tutorial intervention. That sounds a very clever thing to do, particularly on the large-scale basis the Open University would need. And this is fine — they're very pleased with it and it seems to be working very well — but it's trained on an atypical university population.
209 00:23:24,990 --> 00:23:31,050 What we see is that a typical university population, so if you see that, you know, in a typical university, 210 00:23:31,050 --> 00:23:39,690 you might have half the students being full time undergraduates, for example, who have left school a year or two previously. 211 00:23:39,690 --> 00:23:44,480 If it's trained on mature part time students, the big mistake would be to take that saying Oh, 212 00:23:44,480 --> 00:23:51,570 that works brilliantly in university and then say, Come and use it at UCL, where I work, where the population is completely different. 213 00:23:51,570 --> 00:23:58,230 So you get so many false referrals and you'd be missing a lot of people that are having problems because you've tried it on the wrong population. 214 00:23:58,230 --> 00:24:05,250 So it's really important to try to to match that as closely as possible that it's likely that if you did buy a product like that, 215 00:24:05,250 --> 00:24:07,500 you'd have to retrain it completely. 216 00:24:07,500 --> 00:24:16,080 And we know this because there's been work done in, for example, screening in child protection cases where there's notable accuracy problems, 217 00:24:16,080 --> 00:24:21,480 where it's been done on the wrong population, so it becomes racially biased or assumptions are made of stone. 218 00:24:21,480 --> 00:24:33,170 So that's absolutely key to doing these things well. So the next excitement in artificial intelligence is the idea of deep learning. 219 00:24:33,170 --> 00:24:38,850 Another word that we bandy about a lot that is sometimes vague in its use. 220 00:24:38,850 --> 00:24:49,110 Colonel, that was the last slide. And I've labelled this is the digital music writing room, so this allows for sort of creative acts. 221 00:24:49,110 --> 00:24:52,390 And I find this quite exciting, actually. 222 00:24:52,390 --> 00:25:01,470 So the algorithm itself will keep looping to calculate higher order features, yet high level features, so it will try to, for example, 223 00:25:01,470 --> 00:25:08,800 number two faces and then it might look at what an AI is and might look at different kinds of face it tries to refine itself over time. 224 00:25:08,800 --> 00:25:14,590 So it has a lot of layers to it, moving from pixels to a precise image. 225 00:25:14,590 --> 00:25:16,510 And sometimes it's quite good at that. 226 00:25:16,510 --> 00:25:24,910 So you can try on the internet and see if you're trying to find something like an image that you've already got, and it can be quite accurate. 227 00:25:24,910 --> 00:25:32,290 Or sometimes it isn't. So I tried this recently with a cutlery set and it came came up with images of two, 228 00:25:32,290 --> 00:25:37,030 and they look very similar because there's lots of things in rows in a box, but it got confused with those, too. 229 00:25:37,030 --> 00:25:45,350 So it's quite hard for computers to learn these things, but they can do it and it becomes more precise over time. 230 00:25:45,350 --> 00:25:53,010 So it works in some certain circumstances, but just as with the toolkits and the cutlery boxes, it can be problematic. 231 00:25:53,010 --> 00:26:01,190 So they're often biased towards white men because they work and sometimes Asian men, because they tend to be the developers that the testing is. 
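One simple way to surface that kind of skew — sketched here with invented labels and predictions standing in for the output of a real face-recognition model, and made-up group names — is to report accuracy per demographic group rather than a single overall figure.

```python
# Sketch: evaluating a classifier per demographic group instead of overall.
# The "true" labels, predictions and group names are invented for illustration;
# they stand in for the output of a real face-matching system.
from collections import defaultdict
from sklearn.metrics import accuracy_score

y_true = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
y_pred = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
group  = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

by_group = defaultdict(lambda: ([], []))
for truth, pred, g in zip(y_true, y_pred, group):
    by_group[g][0].append(truth)
    by_group[g][1].append(pred)

print("overall accuracy:", accuracy_score(y_true, y_pred))
for g, (truths, preds) in by_group.items():
    print(f"group {g} accuracy:", accuracy_score(truths, preds))
# A respectable overall number can hide a group on which the system barely works.
```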
And so it's a bit like the training problem from the Open University: if you were to train and test it on one type of face, it's going to struggle with, for example, Black faces. There's been a lot of scandal in the news about Google Images, and also sometimes passport biometric recognition systems getting this very wrong. People have tried to upload a photograph of themselves to renew their passport and been told their eyes are closed when they aren't. You'd be very surprised, but this was quite a shocking story that came out recently. So these are very flawed algorithms, and people are quite rightly objecting to this.

So that's the downside. But the upside is that you can have rudimentary art and music works, where the algorithm would look at thousands and thousands of, say, Van Gogh and similar pictures and then try to create one of its own. And that's where it gets really good — you can't see it very well, but that's AI artwork in the background of that slide. It's also worked with music and literature. And you can use it for things like facial surveillance as well — it can find a school uniform in a crowd, which is a bit nerve-wracking. But then, it could also design the perfect school uniform if it had enough of them to look at.

So what can we do with it in education? We could look at it as a way of encouraging new forms of curriculum interaction, so we could have pupils developing artworks using this, which I think might be an interesting curriculum development. You can have things like spatial augmented reality in schools — there's a project called Future Gems that used this with autistic children who are non-verbal, to try to get them to interact with their surroundings, so something might appear there and they might interact with it, and so on. And there's that kind of option. You've also got — I think I've got a quote here, yes — AI trying to write a novel. So that's from the first AI novel, 1 the Road: 'It was nine seventeen in the morning, and the house was heavy,' which I think is a brilliant sentence, and it makes me want to read the rest. So you can imagine some kind of literary stimulation for pupils' work and so on. In that sense, I think it's a nice convergence of science and art and humanities and performance art and all sorts of other things, which could be very fertile ground for students at school.

So then we have the digital school inspector — machine learning. And again, I must emphasise, all these terms overlap; they're not perfect.
Machine learning is a term developed by Samuel, and it was first used for a program that played a computer version of the board game checkers. We see it used in different contexts now: the idea that you have algorithms and statistical models doing pattern recognition, allowing decisions to be made with little human intervention — and they're getting awfully good at that. So we have AlphaGo, we have these fancy chess engines and that kind of thing. Quite intimidating.

So how do we use it in education? Well, we have school inspection. They do things like mapping attainment trends to identify schools in need of inspection, to save a human having to go through all of the data. But the problem with this is that it assumes the data are neutral and that everything is equally weighted, and of course that's a really major problem. Having been a school governor, I know we used to tear our hair out if we'd had a bout of illness — a sort of epidemic — just before the SATs exams and the results were dreadful. That would make us look very bad, and the danger with a system like this is that it would then trigger some sort of inspection. What happened was the kids got ill; it didn't mean anything about the way teaching and learning were happening in the school — just bad luck. Because this would work quite quickly in the background — and Ofsted are using techniques like this — you might be wasting resources through false positives, just because of blips that happen. Is it a consequence, or merely subsequent? Is it correlation or causation? There are different questions that need to be asked. So unless you're questioning the intrinsic value and analysis of the data really thoroughly, then I think it's of limited use.

The next one is the idea of a neural network, which I call the digital prefect, because it was starting to get a little bit hard to think of all the educational terms. It's the idea that you have a computer system emulating a biological system like the human brain. We get very muddled in our thinking about human brains and computers: we often try to think of human brains in a computational sense, and with computers we say, oh well, we need to build a computer that can act like a human brain — and they're very, very different things. So I try to think of a divide, like wet and dry intelligence: different intelligences. The group we've set up for the British Educational Research Association is called Artificial and Human Intelligence because we don't think one necessarily emulates the other; they're separate things.
But when you're trying to build a pattern of one, you have the idea of nodes, or artificial neurones, and they're taught to recognise whether something is true or false by a system of algorithms. The system can then train itself over time to do this better and more frequently — a bit like the facial recognition systems I've already talked about. So you could recognise individual pupils in crowds or in social media images; indeed, the Chinese social credit system uses this quite a lot. But in practice, a system needs enormous numbers of samples to do this very well, so it's not realistic in all but the most extreme circumstances, when you've got enormous datasets. We know that sometimes scores are gamed by individuals — this has happened in the Chinese social credit system — and we know that there's also the potential for significant privacy violations, because people don't necessarily know that they're being tracked if you're collecting a big enough dataset to do anything useful with it. So that's a problem.

Another problem is inferring things like attentiveness from video streams or live observation of classroom activities. What I could do, if I was mean, is make you wear one of the BrainCo headsets that have been in the news. Then, if any of you weren't paying full attention to my talk, I could make a note for later. If this were a school, I could get the system to tell your mum and dad that you hadn't been attending in class, and then if you didn't do very well later on, say: well, there were twenty-six incidences where you weren't sitting upright and your brain wasn't responding properly. I could use emotion analysis to try to see whether you were being attentive. But it would be a waste of time, because as we all probably know, it doesn't really matter what you look like when you're learning, and people do this in different ways. The evidence is so weak for the idea that a particular kind of impedance level on a forehead relates to some kind of deep learning taking off that will be meaningful in terms of the wider school experience. The evidence is hopeless. So it's a bit of a tech-led innovation — hence the combination of doubts about its efficacy and Chinese parents' suspicion of it. It has gone a bit too far, even for relatively relaxed parents, to seem useful, and it's been withdrawn this week from the schools it was used in. That says quite a lot, really. So there's a lot of experimental work like that going on, but of dubious potential, I think.
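As an aside, the 'nodes taught to recognise true or false' idea can be sketched in a few lines. Everything here — the two made-up features, the labels, the learning rate — is invented purely to show the shape of the training loop; real networks stack millions of such nodes in layers.

```python
# Sketch of a single artificial "node": weighted inputs, a threshold, and a
# training loop that nudges the weights when the answer is wrong.
# The task and all the numbers are invented for illustration only.
import random

def predict(weights, bias, inputs):
    activation = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation > 0 else 0  # 1 = "true", 0 = "false"

# Toy training data: two made-up features per example, with a 0/1 label.
data = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.1, 0.2], 0), ([0.2, 0.1], 0)]

random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(2)]
bias = 0.0
learning_rate = 0.1

for _ in range(20):                      # repeat over the samples
    for inputs, label in data:
        error = label - predict(weights, bias, inputs)
        bias += learning_rate * error    # nudge towards the right answer
        weights = [w + learning_rate * error * x for w, x in zip(weights, inputs)]

print(weights, bias)
print([predict(weights, bias, x) for x, _ in data])  # hopefully [1, 1, 0, 0]
```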
There have been other attempts to use neural networks to look at the data side of school and link that to data on school, family and demographic variables, to try to see whether you could reduce school dropout rates — in places like Turkey, for example. There's superficial promise, but again, it depends on what you're telling the algorithm to do, and you have to be very nervous about discrimination and accuracy. You have to ask whether certain categories are being conflated — ethnicity and social class, for example — and whether they need to be separated. So again, it's about the quality of the questions that you ask determining how well each dataset can be deployed. So that's just a summary.

The next one is the digital tutor — the expert system — and that's something we've probably all come across a lot, because it's been around a long time. It's been around since the 60s, developed by somebody called Feigenbaum as part of the Stanford Heuristic Programming Project. It's a software programme that emulates human decision-making processes by using databases to make expert decisions. So it's very useful for online tutoring, for remediating and accelerating student knowledge, and generally in training contexts as well. However, it's not perfect, because it's limited by its ability to update its own knowledge, and if that isn't frequent enough, then you don't have a dynamic enough system — whereas a human being is often being inadvertently updated on a daily basis. It's also culturally bound, so the knowledge needs embedding before you start using it, and then that's it: unless you get a complete new version, you're stuck with what you've got.

We see it used quite effectively in many schools for routine things — things like Education Perfect, with products like Language Perfect and their maths products, and they've now got something called Spellodrome. It's really useful for drill and for identifying where pupils are having particular problems with curriculum content. I think there are great ambitions to use it for bigger things. For example, you had the whole AltSchool movement in Silicon Valley, where they tried to have a highly personalised curriculum, using this sort of resource to try to identify, on a pupil-by-pupil basis, what each pupil needed to know next, and to test things and so on. And it was quite ineffective, so a lot of the teachers ended up doing quite a reasonable amount of direct instruction to compensate for the shortcomings of that.
And indeed, the schools started closing; the parents were losing confidence, and so forth. Which raises a question: if you're going to have these higher-level functions, with schools needing to buy expensive subscriptions, you're talking about less affluent schools not being able to afford them and perhaps being left behind digitally. That's of great concern.

The next one — I would have liked to have brought a robot, and I did have one to bring along, but apparently it was broken, so I couldn't — is social robotics: the digital classroom assistant. We're talking about something that can be used as a substitute for humans; it might not look like a human, but it's something that can do that. One particularly fascinating place where this has been used in education — I used to do this with two-year-olds a very long time ago — started in the 1960s with the idea of the Logo programming language, which then developed into little turtles that could break out of the screen and move around the room, and the kids do subversive things and they get under your feet, that kind of thing. They were meant to be really subversive, and there's been a fantastic paper on this about how they were supposed to disrupt a relatively conservative curriculum framework for schools and save the world, that kind of thing. Whether they did that or not is another question, because people started to corral the knowledge and try to turn it into something that could be measured and so on. So the anarchy left, and with it, perhaps, the creativity.

They're not around so much now, but we see robots and robotic knowledge in social robotics as a kind of revolutionary technical fix. In sci-fi, you'd probably see it as a classroom assistant replacing a human — and sometimes that can be seen as a threat. Indeed, when the European Commission looked into this and interviewed lots of people about what they thought, people were quite scared about robots in general; too many Terminator movies, I think, myself. So what do we actually see? We see slow progress towards social robots through neuromorphic computing techniques — the idea of emulating the human brain, similar to neural networking — but they're likely to be quite limited to particular domains or particular tasks, simply because the public resist them.

And one really interesting thing we found: there's a school in Cambridge that has just been set up — a state comprehensive school — but it decided to be the most high-tech school imaginable on the budget it had.
One of the things they commissioned was a really nice kind of virtual reality cat. So you go to an open day and you look on the screen, and you realise you're actually standing next to a cat that you can't see. It's all in 3D and it's all terribly exciting, and the cat does lots of interesting things. It goes around the school and pops up on computers around the school — all the kids have a laptop — and on the different platforms, and the children are supposed to feed it and engage with it. So if you're a bit bored in class, you can kind of virtually stroke the cat, which is very nice. Except this got a bit nasty, really. Some of the teenagers started saying, well, let's starve the cat and see what happens; what happens if we're mean to the cat? How far can we go with the cat? And people would say that was horrendous, and I'm thinking, that's called being a teenager — you're learning what the boundaries are, right? You can't really do this to a real, live thing, but you could do it in a virtual world. But then we get into questions like: do people have to be taught to say please and thank you to Siri? Do we educate people not to be mean to the virtual reality cat? At what point do people start confusing the two? So there are some really big questions about having this kind of stuff flying around in schools and how we approach it — it's got to be about more than the gadget.

Then there's potentially cloud robotics, or networked robots. You could imagine them being usefully used to help with educational assessment in a routine way, or reception, that kind of thing. Maybe an internet of things, where basic things are connected via embedded devices and they talk to each other. On the other hand, this leads to a growth in individualised learning, attendance trackers, school environmental controls and so on — although currently personalisation is quite business-driven, so we've sort of allowed it to happen, but nobody's asking the question: what do we actually want to happen? It's quite warm in here, isn't it? In an ideal world, you'd have some kind of internet-of-things social robotic thing going on, and some little white thing would have slid over and adjusted the climate, because it would have known optimally what everybody needs to be able to really feel comfortable in the room. But it's not something I can order on Amazon yet, is it? So it's a complicated set-up.
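For what it's worth, the 'things talking to each other' loop being imagined there is conceptually simple. This sketch invents the sensor, the actuator and the comfort band purely to show the shape of it; nothing here corresponds to a real product or protocol.

```python
# Toy sketch of an "internet of things" climate loop of the kind imagined here.
# The readings, comfort range and adjust() behaviour are all invented;
# a real deployment would involve actual sensors, actuators and a protocol.
import random
import time

COMFORT_RANGE = (19.0, 23.0)  # assumed comfortable temperature band, in degrees C

def read_room_temperature():
    # Stand-in for a real sensor reading.
    return random.uniform(17.0, 27.0)

def adjust(direction):
    # Stand-in for telling a heater or vent to do something.
    print(f"adjusting climate: {direction}")

for _ in range(5):                      # a real loop would run continuously
    temp = read_room_temperature()
    low, high = COMFORT_RANGE
    if temp < low:
        adjust("warm the room")
    elif temp > high:
        adjust("cool the room")
    else:
        print(f"{temp:.1f} C is within the comfort range; doing nothing")
    time.sleep(0.1)                     # poll interval, shortened for the sketch
```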
So where we have to think about the theory a little bit more is in relation to forms of power, and things like globalisation and accountability towards learners, and the impact on social relations generally. You have the scaling up of data to a massive degree, changing the nature of teacher professionalism, so that it's mediated at several removes from the pupils, teachers and parents who are at the forefront of learning. You have globalisation, with these really big commercial players and supranational education organisations starting to develop in this area, and I'm not sure what accountability there is towards learners, except in the most general sense — I certainly don't see much accountability towards individual learners. So it's a greater-good argument that these organisations are harnessing, and that's perhaps a little bit difficult to reconcile with social relations at an individual level. We see forms of power shifting as the corporations and the supranationals become increasingly dominant and monopolistic. And yes, we're trying to have new forms of regulation, such as the GDPR, which came in last May, but it's always running behind and perhaps needs to become stronger.

So if we want this to be a really positive future for artificial intelligence and education, what do we do? Well, I've been looking back in order to look forward, really, when I've been doing work on this. I've been thinking about Bernstein and the idea of pedagogic rights, which comes towards the end of his book Pedagogy, Symbolic Control and Identity. It's something that perhaps we haven't given much attention to for a while, but it certainly works here. He talks about the ideas of enhancement, participation and inclusion in education, at a very personal level, and I think that's something we can embed in our debates about the way forward for artificial intelligence, as a way of bringing the human back into it, really.

And then I've been looking even further back, at Homans and social exchange theory, and I've been writing about this in different contexts. I've been trying to think about how we know whether an exchange is democratically equivalent or whether it's imbalanced, and I've been using the terms authentic and inauthentic for this — which are uncomfortable for a lot of technologists, because 'authentic' sounds a bit too much like a technical term. But it's the idea of: do people have buy-in?
Do they have stakeholder engagement with the different things that are happening, or is this something that's done to people? If you've got stakeholder buy-in, that's authentic; if it's done to people, it's inauthentic. And what have we actually got? Monopoly situations growing up where people have very little agency. So there are really big questions to ask there.

So, to conclude, I think we need to think more about forms of power, and we need to think more about things like the nature of consent. If something is chosen by a school or institution, do we have to ask whether consent should be more individual? And things like transparency of processing: how much should providers have to tell us about how they use their algorithms, given that they may argue these are commercially sensitive? This is a really thorny question. Under the GDPR they're required to do this, but they can argue that it would compromise their business model, so that's not very well worked out. There's the question of how far pupils should give their time and data for free so that these big business models — which all of the products I've described today rely upon — can be built, and what rights people have towards privacy of data in that respect. It requires some very careful balancing tests that aren't happening at the moment, because it's just a bit of a Wild West of development. We have a vast and ungainly educational landscape and a lot of things competing for our attention. Meanwhile, everything is being refined and reimagined, but not necessarily with a lot of consultation, even though the users are being rendered increasingly governable by these very remote commercial systems.

So we need to think about the future of the society in which this is happening. If you think back to the Pearson quote — there are great opportunities to improve learning outcomes and enable better teaching, transforming the learning experience — nobody can really argue with that. But I would contend that we need to do this together, for it not to be a commercial project of dominance, and to keep enhancement, participation and inclusion right at the centre of it. It's in this grey area between the authentic and the inauthentic, the personal and the commercial, that the future needs to be mapped out. And if we don't do this, then we've killed this thing at birth, really, because it'll never do what we want it to do, and it will be so discriminatory on a social basis that it will cause more problems than it solves.
This is why Bernstein's pedagogic rights framework is particularly interesting as a discipline for thinking about it: by promoting the involvement of users at all of these levels, it allows us to build a more robust future. And government needs to go beyond rhetorical flourishes about rights and consent and so on, and take a more solid regulatory position, so that it really represents the things that need to happen. If we get this right, then it will be very exciting; if we get it wrong, we'll have increased social fragmentation, which is very dangerous. So we need to choose our path very wisely. Thank you.