I just learned that we built this very sophisticated artificial intelligence to deeply understand students' technical competencies, which in and of itself didn't seem to be that predictive of anything. So you need to know how to program to be a programmer, sure. But it turns out that's not what predicts a great programmer.

Not having enough work. Climate change. Funding... Large-scale social disorder. Antibiotic-resistant diseases. Education. Employment. Private capital for social good. Quantum computing.

Hello and welcome to the Future of Business podcast from Saïd Business School. I'm Emily Baron, and each week I host a conversation on a topic that will define the future of business and wider society, speaking to experts from Oxford University, leading businesspeople and entrepreneurs. I spoke to Dr Vivienne Ming after her keynote at the annual Social Impact Careers Conference. She's a theoretical neuroscientist and entrepreneur who describes herself as a professional mad scientist and demented do-gooder. She's built applications for finding orphans in refugee camps and for taking the bias out of hiring. I started our conversation by asking her why she's an optimist about AI.

Well, I'm not an arbitrary optimist. I think we have some profound choices in front of ourselves as scientists, as business leaders, as policymakers. And what I'm optimistic about is that we are not inevitably headed towards something awful, but that we can make good choices and change the direction. I am pessimistic in that, right now, most of the choices we make are not great, right? But I'll explain what I mean.

So I have been involved in a lot of different projects, broadly defined as being in the artificial intelligence space, or, probably more accurately for any geeks listening to this, in machine learning. Way back when, when I was a full-time academic, we did computational and theoretical neuroscience, building systems that learned how to hear or learned how to attend to the world. We tried to understand how the brain works by starting from first principles, like information theory, and then building machine learning systems that learned how to hear or see or attend. And from that we would discover all of these fascinating things about the world, more in line with why than with what or how. Why do we hear the way that we do? Why do we make choices the way we do? So it's a fun and amazing field of neuroscience. And if it still seems abstract to you, just substitute the word "theoretical" with "lazy". I come from the field of lazy neuroscience, where we don't actually have to run experiments.
So from there I started to feel like there were insights that we could take out of that field to something higher, bigger. One of the biggest insights in our field is that brains respond differently to rich natural stimuli than they do to artificial lab stimuli. Well, if you think about education, we have this rich learning environment: kids playing, spending time with parents, reading books, exploring the world. And then we have the artificial lab, where you give them a test. And I thought, could we understand a child learning in the natural environment? Could that not just be the learning experience, but the assessment itself? Right, them just learning.

My wife, who happens to be a learning sciences researcher herself, and I thought about this problem for a little while, and then we thought about a potential solution, I guess you could say. We looked at each other and we thought, is this an experiment or is it a start-up? She had never done one; I certainly had not done a tech start-up before, and it certainly was an education for me. This was before Coursera and Khan Academy, and it turns out raising money for educational technology was brutally difficult. All the more difficult, if I may be so blunt, raising money as a woman. And, you know, we ended up exploring other potential markets, but I quickly got tired of them because I realised I wasn't there to build a good business that would pay back my investors. I was there to actually solve a problem.

So we immediately sold the company, and literally the next day we started an education company right in the same offices. Except this time we didn't take any money; I funded it myself. We built a system, did the research and published papers showing that, in fact, we could listen to little kids in the classroom. We could monitor college and university students and MBA students just chatting with each other online and do better than the formal assessments at understanding what they knew or didn't know.

Wow.

But more importantly, we could do it at week three of the class rather than at the end of the class, so as what is called a formative rather than a summative assessment. If we could tell you, as the professor, "hey, these five kids are going to get this question wrong on your hypothetical final exam", then one, why waste time on the exam at all? And two, why not change, right now, what you're doing so that they don't fail?
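To make that idea concrete, here is a minimal, purely illustrative sketch of the kind of formative prediction being described: flagging, from students' informal online discussion in the first weeks of a course, who looks likely to miss a concept later. The data, feature choices, student names and threshold are all hypothetical; this is not the system discussed in the interview.

```python
# Purely illustrative sketch: use early-course discussion text as a formative
# signal, predicting at week three who is likely to miss an exam question.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data from past cohorts:
#   chat_history[i] -- what student i wrote in course chat during weeks 1-3
#   missed_q7[i]    -- 1 if that student later got exam question 7 wrong
chat_history = [
    "i think recursion just calls itself forever, not sure why it stops",
    "the base case is what stops the recursion, then the calls unwind",
    "loops and recursion are basically the same thing, right?",
    "you return once n hits zero, that is the base case doing its job",
]
missed_q7 = [1, 0, 1, 0]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(chat_history, missed_q7)

# At week three of the current cohort, surface at-risk students to the
# professor early enough to change the teaching, not just grade the outcome.
current_chats = {"student_42": "still not really sure why the base case matters"}
for student, text in current_chats.items():
    risk = model.predict_proba([text])[0, 1]
    if risk > 0.5:  # arbitrary illustrative threshold
        print(f"{student}: likely to miss question 7 (p={risk:.2f})")
```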
I'd love to go back to that point about the choices we're making and where we're making good choices and bad, but just to dive into this very quickly: I've seen a few education start-ups trying to do this in a very low-tech way, with pen and paper. Has that approach been adopted anywhere in the States? I mean, we're certainly not doing formative assessment at Oxford University, which is some way from even considering that as a form of assessment.

So there are many groups working in this space, and it is fairly transformative, because this is what's called competency-based education. The point isn't that you study for a fixed period of time and then pass an exam; it's that you study until you understand the material, and then you're done. And it's even more different than that, because it means your job as a tutor, as a teacher, as a professor, is now to understand that student and what they need to become competent in the material. So there are things like this going on in the world today.

But the funny thing is, along the way to building that system and publishing those papers, I had a little detour as the chief scientist of a company called Gild, where we built AI-driven systems to take bias out of the hiring process. And you know what I didn't see amongst the literally hundreds of thousands of data points we looked at and the many, many variables we analysed? The things that were not predictive of the highest-quality work: your grades, your test scores, even the university you went to. The most elite universities had positive but very modest predictive validity. In other words, a bachelor of computer science from Stanford was genuinely a positive predictor of someone's ability as a software developer, but a pretty modest one, whereas their motivation, right, correctly assessed, was a substantially bigger predictor; it swamped it out. So I jokingly tweeted one day that Carnegie Mellon is better than Berkeley, and MIT is better than CMU, and Caltech is better than MIT, and Stanford is better than Caltech, and motivation is much bigger than any of them. But we only hire by looking at your name, your school and your last job. That's the starting point for every hiring process at almost any company.

So think about what that means for my work in education. I just learned that we built this very sophisticated artificial intelligence to deeply understand students' technical competencies, which in and of itself didn't seem to be that predictive of anything. So you need to know how to program to be a programmer, sure. But it turns out that's not what predicts a great programmer. Interestingly enough, as a small aside, we find that social skills, though less common in programmers than, say, salespeople, are just as predictive. Which, by the way, goes against a somewhat notorious former Google engineer's claims about software engineers.
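For the geeks, "predictive validity" here is the standard question of how much a feature improves out-of-sample prediction of the outcome. Below is a minimal sketch of that kind of comparison; the features, synthetic data and effect sizes are entirely made up to mirror the pattern described, and this is not Gild's model.

```python
# Illustrative predictive-validity comparison; all data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 2000
# Hypothetical, standardised candidate features.
school_prestige = rng.normal(size=n)
test_scores = rng.normal(size=n)
motivation = rng.normal(size=n)

# Hypothetical outcome "produced high-quality work", generated so that
# motivation dominates and credentials contribute only modestly.
logit = 1.2 * motivation + 0.2 * school_prestige + 0.15 * test_scores
high_quality = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

feature_sets = {
    "credentials only": np.column_stack([school_prestige, test_scores]),
    "credentials + motivation": np.column_stack(
        [school_prestige, test_scores, motivation]
    ),
}
for name, X in feature_sets.items():
    auc = cross_val_score(
        LogisticRegression(max_iter=1000), X, high_quality, cv=5, scoring="roc_auc"
    ).mean()
    print(f"{name}: cross-validated AUC = {auc:.2f}")
```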
So we really wanted to rethink what education actually means, and this was very much a why, why, why process for us. Why do we have a formal education system, globally, in America, in the Western world, however you want to look at it? And to me it's, I hope, obvious that it's not about producing good grades or high marks. It's not about going to university. It's not even about getting a great job, though any of those might well be mediators of what I really care about. The purpose of education is to produce happy, healthy, impactful lives and let society reap the benefits. And in that case, it means anything we might do in education is only valid if it supports that, and it's only valid if it supports that for this child.

That gives you a pretty wild ambition, then: educate people based on a predictive model of how it will impact their long-term life outcomes. And so that's what we did. Not because we knew it would work, or because we even know it will work today, though I think we've done enough research and real-world application, and now a couple of years of actually being live in the world, that we can see it sure looks like it's having an impact. So we revisited education and we thought, well, who owns the longest-term outcomes for a child? It's their parents and other caregivers: grandparents, foster parents, whoever it might be. So what if we built a tool to help parents, and that tool gave them activities they could do with their kids? Like, how about something you can do with your kid tonight? Do you have 20 minutes? What's the single thing you could do with your child that would have the biggest impact on their long-term outcomes?

I do want to jump in there, because you said something earlier which I thought was fascinating. You mentioned growth mindset, which has been, you know, so important for lots of educators' understanding of how to do the best by the kids they're looking after. And you said the biggest predictor is the parental growth mindset, which I thought was a really kind of cool phrase.
So when you have these predictive models, which can tell you, you know, if you were at X, Y, Z stage when you were 11, we know what's going to happen when you're 25: how do you balance the power in that knowledge with giving parents the right tools at the right time, but not to the extent that you, you know, give them a preconceived notion of what that kid's going to achieve? Because you could most certainly lead them down the wrong path.

So when I say predictive models, this is the language of my field, and, in all fairness, I think it can often get misinterpreted as perfect crystal-ball predictions of the future, none of which is a reality in our work. The simple idea, probably an easier if less engaging way to describe them, is that they're actuarial models. What we're trying to do is say: if we look at enough different factors at a given moment in a kid's life, what could we do, with the highest confidence, to minimise any potential downsides and add, whatever, I'm going to make up a magical thing, a life outcome score. What if it was 20% higher? We're not talking about maxing out everyone. I can't do that. Gosh, I wish I could. But can I make a positive difference?

And it turns out the one, not easiest, but most obvious thing I could do, the thing that many parents would love to have, is the exact worst thing to do, which is tell them who their child is, or who you predict they will be. As soon as you tell them that, it's a cursed crystal ball and you've just made everything worse. Even if your predictions are positive, hey, your child's going to win the Nobel Prize, I just made it less likely to happen, because it will actually decrease your growth mindset about your child. Now you're worried, oh god, I don't want them to not do this, and you become more fragile and more brittle in your approach.

How do we focus on the things that will make a long-term positive difference? Some of these are phenomenally reliable: literacy interventions are universally good, resiliency interventions, appropriately applied growth mindset interventions. Yes, it is possible to run a resiliency intervention where you set the bar so high that the kid just fails and all they learn is failure, and you don't get anywhere. But you can do it right. And these things, you know, I was once interviewed for the BBC, and the interviewer was like, you can't know what people will need in 20 years, how could you possibly say that? And I said, you're right.
You know, if you asked me, does any of my empirical research show that kids will need to be resilient in 20 years? Well, I don't happen to own a TARDIS, so no, I haven't been to the future, and so I don't know. But if you can imagine a world where being resilient isn't helpful, then you have an imagination I do not. It is such a ubiquitous predictor of positive life outcomes today, and for such obvious and intuitive reasons, though we can study it more deeply than that, that it's fundamental.

We do find there are other things that are more nuanced, and our goal is to own the responsibility of making certain that the sorts of things we recommend are safe, reliable and will make a difference. There are things, particularly with older children, where they become a little more differentiated, right? One child may be very resilient emotionally, and so getting tough feedback works perfectly well for them, but not everybody is. And one of the things we focus on is not simply "here's your child and here's how they compare to everyone else". That's awful; we absolutely don't tell a parent that. And "here's who they will be" is just as bad. "Here's who they could be" is what we want to communicate.

But even internally in our system, when we measure all these different constructs that we look at, when we focus on delivering an activity to a parent, it's not because this child has a strength in that area and we're augmenting it, and it's not that the child has a weakness and we're remediating it. What we actually look at, again for you nerds and geeks, is the gradient on that vector. We want to know where they're changing. Where is this child the most plastic? No one's really done this research before, so we're kind of naive: is working on a strength better than helping a weakness? It's not entirely clear. So what we figured is that, at any given moment, as far as we can tell, everyone is plastic in some part of themselves, certainly relative to other things about them. Focus on that thing, or maybe those few things.

So, you know, the concern that you're articulating, which is that you could get it wrong, you could mislead the parents, all of that is very reasonable, more than reasonable. We are obligated to take all of that deeply seriously, and we build it into the models. It doesn't mean that it's perfect, but the whole thing is a grand experiment, and we learn and literally iterate on a daily basis.
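For the nerds and geeks, here is a toy version of that "gradient on the vector" idea: rather than targeting the highest or lowest construct score, pick the one whose trajectory is currently changing fastest. The construct names, scores and activities below are invented for illustration; this is not the actual recommendation system being described.

```python
# Toy sketch of gradient-based activity selection; all values are invented.
import numpy as np

# Hypothetical weekly scores per construct for one child (oldest week first).
scores = {
    "resilience":     np.array([0.42, 0.44, 0.43, 0.45]),
    "literacy":       np.array([0.60, 0.66, 0.71, 0.78]),  # changing quickly
    "growth_mindset": np.array([0.55, 0.54, 0.56, 0.55]),
}
activities = {
    "resilience": "a hard-but-doable puzzle to finish together",
    "literacy": "20 minutes of shared reading with open-ended questions",
    "growth_mindset": "praise the effort on tonight's homework, not the result",
}

def plasticity(trajectory: np.ndarray) -> float:
    """Mean magnitude of the week-to-week gradient of the score trajectory."""
    return float(np.abs(np.gradient(trajectory)).mean())

# Recommend an activity for whichever construct is most plastic right now,
# not the child's biggest strength or biggest weakness.
target = max(scores, key=lambda construct: plasticity(scores[construct]))
print(f"Most plastic construct: {target}")
print(f"Tonight's 20-minute suggestion: {activities[target]}")
```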
So one of the things I wanted to pick up on there, just as an aside: one of the kids I knew in my previous life before the MBA, a kid from, you know, a French background in South East London. I remember him saying that his life ambition was to be a labourer, because the richest person he knew was someone who earned $2,000 a month and had a dog. And for him, in his worldview, that was it. One of the things you said earlier, which really struck a chord with me, was the phrase "belief-based utility", and I'd love to understand whether there's any work you're doing now to use these kinds of tools to try to help people who don't have that.

Very much so. It's one of the constructs, very explicitly. Belief-based utility as an idea is a fairly new one out of behavioural economics, so we've sort of adopted the language and the knowledge base of people like George Loewenstein and others who have done research in this space. It was something I had intuitions around even before then. Several years ago, maybe five years ago, I read a paper that described a pretty well-known phenomenon in the States, and a variant of it exists everywhere in the world. Students from underrepresented backgrounds who are high performers, you know, elite students in secondary school, get full scholarships to go to Harvard, to go to Stanford, and they turn down those scholarships at rates higher than a traditional student, sort of, read: white male. I mean, the traditional students essentially don't turn down these scholarships, except for peculiar reasons, so it's almost a totally different behaviour. But they do it, and at fairly high rates.

And for a hundred years, armchair philosophers in America have said, well, you know, black families are different culturally, Hispanic families, it's the economics. Finally, a few years ago, someone did what seems like the most obvious experiment: some of them go and some of them don't, so what's different between them? And it turned out there was nothing systematically different between them in terms of culture or economics. In fact, of the dozens of variables they looked at, only three were meaningfully predictive, and the number one, far and away: they knew someone from their neighbourhood who had gone before them.

And, you know, I just told that story in terms of race, and that probably is how many people were interpreting it even before I brought it up. But as I began to think about this, I realised this was my dad, right? He grew up on a farm in the south-central part of Kansas. And I mean a tiny farm.
Like, if you wanted to go visit the town he grew up in, you'd have to push the tumbleweeds out of the way. After three years of secondary school he had already graduated, and he got a full five-year, non-performance-based scholarship to MIT. And he didn't go. The reason he didn't go was, admittedly, a little less about him than about my grandparents, his parents: what's the point of going to MIT if you're just going to end up back on the farm, right?

And yes, to me, one single generation later, it seems irrational. What could his life have been if he had gone? And there was its own pressure on me, then, you know, to try and be the kid who kind of lives out the dreams that he had had. I mean, his life was not ruined. He ended up becoming a doctor; I grew up in California. You know, it was an amazing experience. But he always felt that he hadn't... he always described himself as, you know, some small-town doctor, and it was clear he always thought he could have done more than that. And he made it very clear to me to live a life of substance, one that really had an impact on the world. That was something he built deeply into me, so even when I went through very hard times in my life, when I came out the other end, it was still there.

So you see, this is a really broad-based thing. I'm willing to bet that in the UK there's probably a very strong north-south gradient, where equally performing students from the north are less likely to attend elite universities, even on invitation, than those from further south, perhaps with an interesting border effect. And, you know, that's a tragedy. If you look at the work of people like Raj Chetty, an economist at Stanford, he had a set of papers a few months ago that got a lot of press, about "lost Einsteins". He found, for example, that a student from the lowest economic quintile but the top math performance quintile was about as likely to hold a patent in their life as a student from the top economic quintile and the lowest math quintile.

Wow.

And that means we have all lost. And of course, when you look across those tiers, look at how many of those kids we're talking about. It is profound. It is self-destructive that we perpetuate these sorts of things. So, yeah, when you look at belief-based utility and you say, what do I think is really going to pay off in my life, what do I think is plausible, is possible? Role models have a huge impact.
They're not the only factor, but they have a huge impact. Peer role modelling, parent role modelling. It turns out role modelling by elite performers, less so. You know, the idea that there was a guy who was the president and whose dad was from Kenya is not necessarily very motivating to a kid growing up on the streets, whose family has lived in the rough part of any number of cities around America for a long time and who has experienced discrimination his whole life. It's easy to look at President Obama and say he's an outlier. Or even someone who came seemingly from a similar background, like Oprah Winfrey: you still look at her and say, sure, she's a billionaire because she's exceptional, and it's hard to believe that could be you. One needs that personal connection.

So a big part of our work is: how can we actually help to engineer some of these experiences? Both the personal experiences, but also engineering them at a kind of epidemiological level. If we can go in and create some of these differentials, as we call them, lift experiences, even if it's for a substantial minority of the kids in these neighbourhoods, then they become the kid that all the ones around them know. They become the one from the neighbourhood who came to Oxford or, if they had to slum it, Cambridge. And so looking at that and thinking we could actually make a difference is a huge motivator in this.

And it's not as much as I might want it to be. It's just never going to be, in the near term, how do we change 8 billion lives? That may be our ambition, but realistically speaking there are hundreds of millions, even potentially billions, of lives that are within the scope of what we can change. And this very simple idea, I mean, it is such a subtle one. Not simply that anyone can be amazing; that's sort of an ethos of Silicon Valley: anyone can be amazing, but if you're not, it's not my fault, right? We hire the amazing people. If everyone can be amazing, and that's true, and someone isn't, you own that. You don't own all of it. I'm not saying you're a bad person because you're drinking a latte walking past a homeless person, but you do own some of the moral responsibility to try to make a difference in the lives around you. Because it's in your self-interest to do so, right?

So this has been a big part of it. I just happen to deal with numbers; I happen to work with algorithms and entrepreneurship and brains and these sorts of things. But ultimately, it's a very human story.

Thank you for listening to this episode of the Future of Business podcast.
Next time I'll be talking to Courts of Paris, an MBA student and expert on climate change. We discuss two-degree warming and the impending carbon bubble. Please do subscribe to this podcast on iTunes or wherever you get your podcasts. This episode was brought to you by Paris, Michael and Brady, Patrick and Emily. Thank you very much for listening, and goodbye.