1 00:00:00,210 --> 00:00:03,570 [Auto-generated transcript. Edits may have been applied for clarity.] Okay. Thanks, everybody, for coming tonight. 2 00:00:03,960 --> 00:00:12,210 I'd like to welcome our guest speaker for the Behavioural Sciences and Complex Interventions module for the MSc in Translational Health Sciences. 3 00:00:12,580 --> 00:00:17,580 I'm the course director for the MSc, which focuses on the social science of innovation. 4 00:00:17,580 --> 00:00:25,020 And we're interested in innovation, technology and humanity more generally, which makes this a perfect talk for this module and the wider MSc. 5 00:00:25,530 --> 00:00:28,740 So I'd like to introduce Scott Clark, who's our guest speaker. 6 00:00:29,130 --> 00:00:35,480 He is a behavioural economist by training. 7 00:00:36,450 --> 00:00:40,920 So not a behavioural scientist as such, but closely aligned, I would say. 8 00:00:41,160 --> 00:00:43,860 He's a former academic at Queen Mary University, 9 00:00:43,860 --> 00:00:49,589 focusing largely these days on the relationship between technology and humanity, which is right up our alley. 10 00:00:49,590 --> 00:00:54,930 I'm sure you'll agree. He currently works for Infosys Consulting, 11 00:00:54,930 --> 00:00:58,230 and he'll have a lot of interesting things to say. So I will turn it over to Scott. 12 00:00:58,410 --> 00:01:06,709 Thank you so much. Thanks for the kind introduction and for the invitation to talk to you this evening. 13 00:01:06,710 --> 00:01:08,000 It's a pleasure to be here. 14 00:01:08,570 --> 00:01:19,310 It occurred to me as I was driving in this evening that the last time I spoke at Oxford was in 1994, about Britain's future in Europe. 15 00:01:19,340 --> 00:01:22,630 We actually did predict the UK would be out of Europe by 2020. 16 00:01:22,640 --> 00:01:26,570 So it's probably one of the last things I've actually got correct about our future.
17 00:01:27,470 --> 00:01:31,790 But it did take the university approximately 30 years to bring me back again. 18 00:01:31,790 --> 00:01:35,120 So I'm not sure what impression I made the first time round. 19 00:01:35,390 --> 00:01:38,090 Hopefully a better impression this time. 20 00:01:38,720 --> 00:01:52,220 So, listen, our world is at a unique juncture in its history, characterised by increasingly complex and uncertain trajectories. 21 00:01:52,760 --> 00:02:03,920 Seismic shifts in our sociological, technological, ecological, political and economic landscapes are transforming the way we live, 22 00:02:04,640 --> 00:02:10,640 the way we work, the way we communicate, the way we perceive reality. 23 00:02:11,550 --> 00:02:13,880 So what does this mean for our future, 24 00:02:14,420 --> 00:02:23,390 and what does it mean for the future of humanity as technology continues to proliferate and mature at an unprecedented pace? 25 00:02:24,110 --> 00:02:30,380 So I'm going to take you on a little bit of a journey this evening, a journey into 2030. 26 00:02:31,430 --> 00:02:39,950 Am I qualified to take you on that journey? That's a very good question, and you can answer it for everybody at the end of the lecture. 27 00:02:40,310 --> 00:02:44,370 I'm a behavioural economist, which I think is a behavioural scientist, actually. 28 00:02:44,390 --> 00:02:53,420 Behavioural economics is really trying to understand the complexities of human behaviour and decision making at the intersection of economics, 29 00:02:53,420 --> 00:02:57,840 psychology and sociology. I began my career as a behavioural economist. 30 00:02:57,860 --> 00:03:05,360 I have spent the last 30 years being dazzled by the bright lights of business and technology consulting, 31 00:03:06,020 --> 00:03:13,160 and what I do is work with companies around the world to help them understand what's next, what lies around the corner,
32 00:03:13,500 --> 00:03:18,920 at this intersection of uncertainty, sociological and technological change, 33 00:03:19,280 --> 00:03:25,009 and then helping those companies design innovative business models and formulate 34 00:03:25,010 --> 00:03:29,780 strategies to help them thrive and be relevant in that changing paradigm. 35 00:03:30,350 --> 00:03:34,129 It's a great job. The great thing about being a futurist is you can never be wrong, 36 00:03:34,130 --> 00:03:37,730 because by the time you get to the future, people have forgotten what you said in the first place. 37 00:03:37,730 --> 00:03:43,710 So it's a wonderful place to be. But I think it's increasingly important to start thinking forward. 38 00:03:43,950 --> 00:03:49,830 We are not going to be able to meet the uncertainty of technological change by focusing on the now; 39 00:03:49,840 --> 00:03:54,389 we've got to start to think more creatively about what lies ahead as we start 40 00:03:54,390 --> 00:03:59,670 to explore interventions that are enhancing, or designed to enhance, human life, 41 00:04:00,000 --> 00:04:04,800 while also preserving the things that we value at the essence of humanity. 42 00:04:05,100 --> 00:04:06,420 It is that set of trade-offs. 43 00:04:08,620 --> 00:04:18,520 Before I get too cocky about my predictions, my prophecies for the future, I think it's important to 44 00:04:18,550 --> 00:04:27,430 reflect on the words of the great American economist John Kenneth Galbraith, who clearly wasn't a fan of trying to predict the future. 45 00:04:27,910 --> 00:04:32,440 I'll add another from the great Danish physicist of the last century, Niels Bohr, 46 00:04:32,860 --> 00:04:37,960 who said that prediction is extremely difficult, particularly when it's about the future. 47 00:04:41,050 --> 00:04:46,240 Why do we struggle to imagine the future? 48 00:04:47,020 --> 00:04:50,110 The inconvenient truth is we're just not very good at it.
49 00:04:50,590 --> 00:04:55,060 And there are a number of reasons why we're not very good at imagining what lies ahead. 50 00:04:55,840 --> 00:05:04,990 I think at the core is the fact that as companies, and I think as individuals as well as organisations, we have a very difficult time admitting 51 00:05:05,940 --> 00:05:11,640 that the things that got us to where we are today are not the same things that will get us to where we want to go. 52 00:05:13,150 --> 00:05:21,450 As Jim Collins said in his book Good to Great a number of years ago, what stops good companies from becoming great is the fact they're already good. 53 00:05:21,460 --> 00:05:25,360 It breeds complacency. We have a difficult time letting go. 54 00:05:27,080 --> 00:05:33,020 We're also very rooted in the past. We apply a past-forward view of the future. 55 00:05:33,860 --> 00:05:37,790 We take the things that we understand from the past and project them forward, assuming 56 00:05:38,210 --> 00:05:43,690 it's not going to change. Well, of course it does change. The second reason 57 00:05:45,070 --> 00:05:54,100 is that most companies continue to respond to technological change with surface-level adjustments. 58 00:05:55,240 --> 00:06:02,800 And if the 2020 pandemic taught us anything, it's that companies can't compete on surface-level changes. 59 00:06:04,970 --> 00:06:09,500 We have to rethink how we operate as businesses in the context of that change. 60 00:06:10,700 --> 00:06:15,860 We have to rethink how we create, how we capture and how we distribute value. 61 00:06:18,840 --> 00:06:22,380 Uber doesn't compete on the quality of its cars. 62 00:06:23,130 --> 00:06:26,730 Nor does Amazon compete on the quality of its website. 63 00:06:27,820 --> 00:06:32,710 It's the things that lie under the surface that help those companies disrupt an industry. 64 00:06:34,740 --> 00:06:40,470 But there's another consideration at play, which is that we have a tendency as humans
65 00:06:42,380 --> 00:06:49,580 to overestimate the amount of change that is likely in the next two years, in the short term, 66 00:06:49,820 --> 00:06:56,180 and underestimate the amount of exponential growth or change that's going to happen in the long term, 67 00:06:56,180 --> 00:07:01,010 and there are a number of explanations, I think, for the former. Why do we overstate the short term? 68 00:07:01,850 --> 00:07:05,110 You can blame it on optimism bias. All right. 69 00:07:05,150 --> 00:07:08,750 It's the same complex that we have at the start of the new year. 70 00:07:08,750 --> 00:07:13,670 When we start working out at the gym, we believe we're going to continue to do those things continuously throughout the year. 71 00:07:13,670 --> 00:07:18,590 And of course, we don't. Again, back to this past-forward mentality. 72 00:07:19,220 --> 00:07:24,110 We tend to overstate the amount of change that has happened to our lives in the last year or two years, 73 00:07:24,110 --> 00:07:26,900 and we think the next couple of years is going to be very much the same. 74 00:07:27,440 --> 00:07:33,590 But it's very difficult for human beings to think forward, to think beyond that 2 or 3 year period. 75 00:07:34,070 --> 00:07:44,180 If I said to you that your laptop is going to double in its processing power in the next two years, you could get your head around that. 76 00:07:44,180 --> 00:07:49,730 But if I said it's going to continue to double at that rate for the next ten years, you'd really struggle to understand what that means. 77 00:07:50,690 --> 00:07:58,490 And one of the shortcomings of humans is that we struggle with the exponential function. 78 00:07:59,090 --> 00:08:03,260 We have a really difficult time understanding this concept of doubling. 79 00:08:04,180 --> 00:08:10,450 But that's exactly what we have to do as we start thinking forward in time and start to imagine the futures. 80 00:08:11,830 --> 00:08:13,750 So how should we be thinking about the future?
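As an aside, the doubling arithmetic the speaker describes can be sketched in a couple of lines. The two-year cadence is the laptop example from the talk; the function name is just for illustration.

```python
# Repeated doubling: the growth multiple after `years` of doubling every
# `doubling_period_years`. The 2-year cadence is the laptop example above.
def doublings(years: float, doubling_period_years: float = 2.0) -> float:
    """Return the growth multiple after `years` of repeated doubling."""
    return 2 ** (years / doubling_period_years)

print(doublings(2))   # after 2 years: 2.0x, easy to grasp
print(doublings(10))  # after 10 years: 32.0x, the part intuition misses
```

Two doublings feel linear; five doublings already give a 32-fold increase, which is exactly the gap between short-term and long-term intuition the talk points at.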
81 00:08:15,130 --> 00:08:23,440 First of all, as individuals, as society, and as businesses, we should be saying to ourselves, the future is not predetermined. 82 00:08:24,280 --> 00:08:33,550 We have the opportunity to build it, and to push past whatever the pundits are telling us, whatever the experts are telling us about what lies ahead; 83 00:08:34,480 --> 00:08:40,900 we have the opportunity to design the future that we want, not the future that somebody else is telling us will be. 84 00:08:42,050 --> 00:08:49,129 Second of all, as we're designing that future and identifying the interventions that we want to prioritise in 85 00:08:49,130 --> 00:08:54,170 changing human behaviour and changing organisational behaviour in the context of that future, 86 00:08:54,740 --> 00:09:00,680 we must be doing things differently, not because they're different, but because they're better. 87 00:09:01,930 --> 00:09:05,110 We can't keep chasing the shiny object. Right. 88 00:09:05,120 --> 00:09:09,099 We've seen instances of that so many times over the last few years. 89 00:09:09,100 --> 00:09:15,470 Think of it from a technological perspective. Every organisation I work with says they have to get going on blockchain. 90 00:09:15,490 --> 00:09:18,920 They have to start to invest in the metaverse. Now it's Gen AI. 91 00:09:18,940 --> 00:09:28,089 There's always some new technology, and if they're not grasping it, if they're not understanding and leveraging it effectively, they risk falling behind. 92 00:09:28,090 --> 00:09:33,520 And the message that we deliver to those companies is: don't chase the shiny object. 93 00:09:33,580 --> 00:09:37,660 Understand what problems you need to solve for and why. 94 00:09:38,020 --> 00:09:42,430 And walk back from that to figure out the technologies that will help get you there smarter, 95 00:09:43,580 --> 00:09:48,120 faster and more profitably. And third of all, in defining the future:
96 00:09:48,140 --> 00:09:52,280 Ask not, what do I think it will be? Ask, what do I want it to be? 97 00:09:52,790 --> 00:09:57,230 That's how we, as organisations and as individuals, should be thinking about the future. 98 00:09:59,160 --> 00:10:05,230 When we think about forecasting and we think about futurism, 99 00:10:05,860 --> 00:10:15,640 what we are in need of is a much more structured, systematic and organised way of looking at the future. 100 00:10:15,910 --> 00:10:25,330 Strategic foresight is the embodiment or convergence of multiple approaches that help us understand what's changing around us, 101 00:10:26,140 --> 00:10:30,520 what that's going to mean for society and for the organisations that we are part of, 102 00:10:32,280 --> 00:10:36,690 what futures are likely to evolve, and how we can effect change. 103 00:10:37,960 --> 00:10:40,990 It's about futures thinking, but it's also about systems thinking. 104 00:10:41,440 --> 00:10:49,630 It's thinking holistically about all the different macro and micro trends that can either accelerate or disrupt the future that we want. 105 00:10:50,620 --> 00:10:52,180 And it's about exponential thinking. 106 00:10:53,440 --> 00:11:03,190 It's thinking about the convergence of technologies and the convergence of trends, and how that will transform things beyond our expectations. 107 00:11:04,460 --> 00:11:12,640 So when we talk about prediction, it is less about identifying where we're going to land and more about designing the future we want. 108 00:11:12,650 --> 00:11:23,410 And I think that's the critical distinction. So when we talk about how we look forward in time, how we start to design or think about the future, 109 00:11:23,800 --> 00:11:29,380 it's really the coming together of two approaches: forecasting and designing.
110 00:11:30,250 --> 00:11:33,130 Forecasting is a data-driven, 111 00:11:33,700 --> 00:11:41,200 rigorous set of calculations based on the things that are happening around us and what we think is going to ensue as a result of that. 112 00:11:42,100 --> 00:11:45,970 And that is important. We need to understand those trends and how they're going to play out over time. 113 00:11:46,180 --> 00:11:52,210 But equally important is the concept of designing the future, right? 114 00:11:52,240 --> 00:12:00,930 It's our ability to imagine the future we want, not relying on data to tell us the future that is most likely to ensue. 115 00:12:01,740 --> 00:12:04,830 So we help clients think about that future. 116 00:12:05,100 --> 00:12:08,249 We do it in the context of four different modes, 117 00:12:08,250 --> 00:12:14,610 which is really the coming together of two dimensions: projection, which is data-driven, versus imagination, 118 00:12:15,980 --> 00:12:23,870 and value-neutral versus value-led. As behavioural scientists we're told to adopt a value-neutral approach, 119 00:12:23,870 --> 00:12:28,850 not to allow our biases to interfere with our assessments, with our calculations. 120 00:12:29,210 --> 00:12:36,020 Certainly, as behavioural economists, that's what we're taught to believe. But the reality is, this is our future. 121 00:12:37,200 --> 00:12:46,460 We have a responsibility, a moral responsibility, to allow those values to start to define what future we want and why, 122 00:12:46,470 --> 00:12:52,080 and then to walk back from that future to figure out the things we need to invest in to get there intelligently. 123 00:12:53,060 --> 00:13:04,960 So I'm actually proposing a method, or a set of methods, that brings together data-driven approaches with imagination-led approaches, 124 00:13:04,970 --> 00:13:09,830 right, to make sure that we are addressing alternative futures, multiple futures,
125 00:13:10,640 --> 00:13:22,459 so we are prepared for what lies ahead. And I think there's a real opportunity to find an effective integration of 126 00:13:22,460 --> 00:13:27,830 strategic foresight and behavioural science as we start to map out our futures. 127 00:13:28,370 --> 00:13:33,560 Neither one of these disciplines is as effective alone as it is together with the other. 128 00:13:34,520 --> 00:13:40,520 The challenge with strategic foresight is it often overlooks or downplays the human dimension. 129 00:13:40,940 --> 00:13:48,320 It relies too much on economic and technological and sociological trends, but doesn't really get at the human element of change. 130 00:13:50,140 --> 00:13:55,780 And I think the challenge of behavioural science is it's far too rooted in the present. 131 00:13:56,290 --> 00:14:01,210 It doesn't look forward enough in terms of what might be, what's around the corner. 132 00:14:02,770 --> 00:14:13,599 And it lacks the breadth of understanding of the different dimensions likely to impact what lies ahead. 133 00:14:13,600 --> 00:14:20,440 So by bringing these two disciplines together, we're able to achieve a number of things that we can't when we separate them. 134 00:14:20,800 --> 00:14:26,170 The first is the ability to calculate the human risks resulting from possible change. 135 00:14:27,320 --> 00:14:32,360 The second is the ability to ensure that future scenarios incorporate human needs, desires, and aspirations. 136 00:14:32,870 --> 00:14:38,180 And third of all, to create interventions that are more likely to be adopted and sustained. 137 00:14:39,980 --> 00:14:42,860 So it's not about choosing one approach over the other. 138 00:14:42,860 --> 00:14:50,450 It's finding opportunities to converge these very powerful approaches in order to create futures that are realisable, 139 00:14:50,570 --> 00:14:58,140 but also futures that map to our desires. So with that said, welcome to 2030.
140 00:14:58,920 --> 00:15:05,280 And to give you some context: it's been approximately 12 months since Donald Trump left office. 141 00:15:08,200 --> 00:15:16,000 The countries of Portugal, Spain and Morocco are gathering to host the 2030 World Cup 142 00:15:16,330 --> 00:15:20,680 between those three nations. It's the 50th anniversary 143 00:15:21,820 --> 00:15:25,690 of Michael Jackson's Thriller album; 20,000 144 00:15:26,230 --> 00:15:34,060 Michael Jackson fanatics dressed as goblins and zombies, we imagine, will be gathering in Times Square to do a flashmob dance. 145 00:15:34,900 --> 00:15:40,720 And it's also the 100th anniversary of Clyde Tombaugh's discovery of the planet Pluto. 146 00:15:41,410 --> 00:15:44,740 All right, that's where we're going to be in 2030. 147 00:15:44,800 --> 00:15:55,360 And when we talk about what we can expect in 2030, it's a combination of relative certainties, probabilities and possibilities. 148 00:15:55,390 --> 00:16:03,350 So let's start with the relative certainties. First of all, the planet is going to get a little more crowded. 149 00:16:04,490 --> 00:16:08,810 The population in 2030 is going to reach 8.6 billion. 150 00:16:10,010 --> 00:16:14,510 Now the overall average annual population growth rate 151 00:16:15,750 --> 00:16:19,770 is dropping from 1.1% per year 152 00:16:20,840 --> 00:16:24,200 over the last decade to just under 0.8%. 153 00:16:26,470 --> 00:16:29,570 However, here are some interesting elements. 154 00:16:30,080 --> 00:16:35,630 India became the largest single population on the planet last year. 155 00:16:35,660 --> 00:16:42,020 It will continue to widen that gap by 2030, reaching 1.5 billion people. 156 00:16:43,080 --> 00:16:49,830 And what's particularly interesting is where we're going to start to see the largest population increases.
157 00:16:50,490 --> 00:17:02,760 Sub-Saharan Africa is going to experience population increases between now and 2030 ranging from 2.6% a year to 4.7% a year. 158 00:17:04,040 --> 00:17:07,460 That's significant. By 2030, 159 00:17:07,700 --> 00:17:12,020 1 in 5 people on the planet will reside in sub-Saharan Africa. 160 00:17:13,190 --> 00:17:20,360 And by 2050, sub-Saharan Africa's population will be 90% greater than it is today. 161 00:17:21,910 --> 00:17:26,830 Why that's important is most of sub-Saharan Africa is not food self-sufficient. 162 00:17:27,920 --> 00:17:38,450 It is going to put a tremendous amount of burden on already flawed healthcare, economic and housing systems within those countries. 163 00:17:40,230 --> 00:17:45,360 So how do we start to prepare those countries for this enormous population boom? 164 00:17:48,050 --> 00:17:53,870 The second is that our inhabitants are going to get quite a bit older than they are today. By 2030, 165 00:17:55,290 --> 00:18:01,020 16.5% of the global population will be over the age of 60; by 2050, 166 00:18:01,050 --> 00:18:05,160 21.5%. That's significant. 167 00:18:07,150 --> 00:18:14,920 It's significant because you can imagine the implications that has for existing health systems and social systems. 168 00:18:17,060 --> 00:18:26,360 And being able to introduce interventions for early disease diagnosis and better care of the elderly is not just a moral question. 169 00:18:27,230 --> 00:18:37,620 It will become an economic question as well. Third of all, global inequities will continue to widen. 170 00:18:43,590 --> 00:18:46,800 The richest 10%, by 2030, 171 00:18:47,900 --> 00:18:56,820 will own 80% of global wealth and take in 65% of global income. 172 00:18:57,720 --> 00:19:10,250 Or if you prefer, 1% of the global population will own 64% of global wealth and take in 39% of total income.
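The compound-growth arithmetic behind the sub-Saharan population figures quoted above can be sketched in a few lines. The 2.6%-4.7% range is from the talk; holding the rate constant over a 26-year horizon (roughly 2024 to 2050) is a simplifying assumption, not a demographic model.

```python
# Constant compound annual growth applied to the quoted rate range.
def grow(start: float, annual_rate: float, years: int) -> float:
    """Project a quantity forward under constant compound growth."""
    return start * (1.0 + annual_rate) ** years

# Normalise today's population to 1.0 and project 26 years forward.
low = grow(1.0, 0.026, 26)   # lower bound: ~1.9x, close to the "90% greater" figure
high = grow(1.0, 0.047, 26)  # upper bound: ~3.3x
print(round(low, 2), round(high, 2))
```

Even the bottom of the quoted range roughly doubles a population within a generation, which is why the "90% greater by 2050" claim is plausible arithmetic rather than an extreme scenario.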
173 00:19:10,760 --> 00:19:23,090 The richest 1% is experiencing a wealth increase of approximately 6% on average year on year, which is double that of the other 99% of the population. 174 00:19:24,080 --> 00:19:29,930 We are seeing this widening gap between the super rich and everybody else. 175 00:19:30,530 --> 00:19:35,450 That's going to create an enormous amount of social and political tension within these 176 00:19:35,450 --> 00:19:39,500 countries and across the borders of these countries. 177 00:19:42,470 --> 00:19:45,620 And climate change will feel even more real. 178 00:19:46,220 --> 00:19:49,990 Despite the drop in temperature in the UK in the last couple of days, 179 00:19:50,000 --> 00:19:58,400 the reality is that by 2030, yes, the planet is going to be warmer and the air and water are going to be more polluted. 180 00:19:59,510 --> 00:20:04,160 And not only will this have direct implications for human life, 181 00:20:05,440 --> 00:20:10,510 but because of the lack of food self-sufficiency that I commented on earlier, 182 00:20:11,230 --> 00:20:17,110 indirectly it's going to affect our ability to increase food production to meet this 183 00:20:17,430 --> 00:20:23,500 population growth, particularly in sub-Saharan Africa and the developing world. 184 00:20:25,730 --> 00:20:29,060 Now I have some people say to me, oh, don't worry about it, AI is going to sort this out. 185 00:20:29,780 --> 00:20:33,260 All right, AI will sort this out. Well, first of all, I don't think AI will sort this out. 186 00:20:33,530 --> 00:20:37,759 It does require as much human intervention as technological intervention. 187 00:20:37,760 --> 00:20:41,210 But the interesting thing about AI is there's a paradox when it comes to the environment. 188 00:20:42,490 --> 00:20:46,850 AI is not a clean technology. Right. 189 00:20:46,980 --> 00:20:54,060 In order to train a single AI algorithm, not to deploy it, but just to train it with data,
190 00:20:55,850 --> 00:21:03,080 it produces the same amount of carbon emissions as 120 petrol vehicles over a 12-month period. 191 00:21:03,590 --> 00:21:12,420 And then there are the data centres required to power the AI algorithms. 192 00:21:13,120 --> 00:21:19,680 You'll need more and more of these data centres around the globe in order to be able to achieve some of the expectations around AI. 193 00:21:20,190 --> 00:21:30,150 But running just one data centre consumes 300,000 gallons of water a day, which is the equivalent of 100,000 homes. 194 00:21:31,020 --> 00:21:37,800 So there are some environmental consequences tied to our technological evolution. 195 00:21:38,040 --> 00:21:46,520 It's not a saving grace. And technology will increasingly disrupt our lives. 196 00:21:47,480 --> 00:21:53,090 That is an inevitability. Now what's interesting, many of you may have heard about Moore's Law. 197 00:21:53,780 --> 00:22:00,410 Gordon Moore, the co-founder of Intel, predicted in 1965 that 198 00:22:00,530 --> 00:22:05,810 computer processing power would essentially double every 18 months to two years. 199 00:22:06,380 --> 00:22:09,710 And from 1975 to about 2012, he was right. 200 00:22:10,070 --> 00:22:20,070 That was an accurate prediction. However, since 2012, AI computer systems have essentially been doubling every 3.4 months. 201 00:22:20,940 --> 00:22:25,440 We have seen an enormous eruption of technology resulting from that. 202 00:22:25,440 --> 00:22:34,230 And if we start to look at this mapping of technology innovation over the last ten years, a number of things stand out. 203 00:22:34,740 --> 00:22:40,800 One is the sheer volume of new technology that's starting to come our way, that we're starting to get our heads around. 204 00:22:41,810 --> 00:22:46,280 Second is the rate of adoption is increasing significantly.
205 00:22:46,850 --> 00:22:51,450 Technology that enters as a pilot or as a concept 206 00:22:51,450 --> 00:22:54,170 is going mainstream, moving much more quickly. 207 00:22:54,860 --> 00:23:04,040 Third of all, the amount of technology that is entering our landscape, entering our world, is transformative in nature. 208 00:23:04,250 --> 00:23:09,110 When I say transformative, I mean not just transforming businesses, but transforming human lives. 209 00:23:09,980 --> 00:23:13,640 And the other piece, which I think is really important and which we often forget about technology: 210 00:23:14,120 --> 00:23:17,830 it's not these single technologies and the impact that they have, right? 211 00:23:17,840 --> 00:23:22,370 It's how they converge to create new realities and new outcomes. 212 00:23:23,540 --> 00:23:27,650 It's the adjacent possibilities that result from technologies coming together. 213 00:23:27,920 --> 00:23:32,360 What happens when AI and augmented reality converge? When AI and blockchain 214 00:23:32,360 --> 00:23:38,590 converge? What are the possibilities? Think of it like the wheel or even fire, right? 215 00:23:38,630 --> 00:23:43,940 As a single innovation, versus what it becomes when you start to add all those adjacencies to it. 216 00:23:44,870 --> 00:23:49,400 So I think when it comes to technology, it's fair to say we ain't seen nothing yet. 217 00:23:49,430 --> 00:23:52,639 We're at the very beginning of this transformation, 218 00:23:52,640 --> 00:23:58,340 and we have to understand how technology can be leveraged in order to drive positive impact on human lives. 219 00:24:01,600 --> 00:24:07,000 And this technology is essentially driving four primary industry shifts, 220 00:24:08,030 --> 00:24:14,300 which is going to be extremely important as we start to understand the world we'll be living in in 2030 and beyond. 221 00:24:15,200 --> 00:24:23,350 The first is rapid digitalisation, powered by a transformation built on data,
222 00:24:24,540 --> 00:24:32,350 AI and the cloud. Data is the foundation of every modern business. 223 00:24:33,700 --> 00:24:41,080 To give you an idea of how important data is, and some of you may know this: who in this room has an iPhone, an Apple iPhone? Show of hands. 224 00:24:41,920 --> 00:24:49,510 So most of you. So you might know that that iPhone came with Google as its default search engine. 225 00:24:49,510 --> 00:24:53,520 Are you aware of that? Google pays Apple 226 00:24:54,940 --> 00:25:01,540 just over $20 billion a year to run that search engine as its default search engine. 227 00:25:02,800 --> 00:25:07,510 And that last year was 17.5% of Apple's total profits. 228 00:25:08,380 --> 00:25:12,880 So why is Google paying Apple for the privilege of running that search engine 229 00:25:14,120 --> 00:25:19,310 on Apple's operating system? It's because Google understands the power of data. 230 00:25:19,640 --> 00:25:30,500 Google understands how important it is as a business to understand how we as humans make decisions, interact, communicate and live our lives. 231 00:25:31,280 --> 00:25:35,780 That's the importance of data. Data is the fuel. 232 00:25:35,800 --> 00:25:44,110 Data is the foundation of every business. AI is the additive to that fuel to help organisations react quicker 233 00:25:45,960 --> 00:25:52,400 and more efficiently. Now AI is not just a fad. 234 00:25:53,030 --> 00:25:57,710 AI is not going to enter this trough of disillusionment and go away for 2 or 3 years. 235 00:25:59,190 --> 00:26:04,010 It is already transforming every industry that we interact with. 236 00:26:04,020 --> 00:26:09,690 It's impacting your lives and has been impacting your lives for many years already. 237 00:26:10,200 --> 00:26:16,499 And we're seeing it in banking with loan approvals. We're seeing it in utilities with smart grids.
238 00:26:16,500 --> 00:26:21,480 We're seeing it in healthcare with AI-enabled remote diagnostics. 239 00:26:21,990 --> 00:26:29,130 We're seeing it in the oil industry with the optimisation of oil drilling. 240 00:26:30,100 --> 00:26:33,570 To give you an idea of where we're headed with AI, 241 00:26:34,110 --> 00:26:42,240 there's a great book by Mo Gawdat called Scary Smart, if you know it. Mo Gawdat was the chief business officer of Google X. 242 00:26:43,810 --> 00:26:47,760 He said, listen, there are three inevitabilities when it comes to AI. 243 00:26:48,930 --> 00:26:52,560 Number one, it ain't going anywhere. It's here to stay. 244 00:26:53,010 --> 00:26:58,140 Get used to it. Find a way to coexist with it. Number two, 245 00:26:59,560 --> 00:27:08,470 AI is going to be smarter than human beings. In fact, it's going to reach the level of human intellect by 2029. 246 00:27:09,510 --> 00:27:16,110 And by 2049, 25 years from now, it'll be a billion times smarter than the average human. 247 00:27:17,160 --> 00:27:19,260 Which means, metaphorically, the relationship, 248 00:27:19,260 --> 00:27:30,600 the intellectual relationship, between AI and humanity in 25 years' time will be that of an average human and a housefly 249 00:27:30,810 --> 00:27:35,430 in today's society. That's scary, right? 250 00:27:36,810 --> 00:27:41,190 And the third? Get ready for this. It's going to make mistakes. 251 00:27:42,940 --> 00:27:46,180 How big those mistakes are is up to us. 252 00:27:47,170 --> 00:27:53,280 Because AI today is just entering its adolescent period; it's no longer an infant, no longer a child. 253 00:27:53,290 --> 00:27:57,879 Think back to when you entered your adolescent years. 254 00:27:57,880 --> 00:28:03,580 How comfortable were you in your own skin? Think about the decisions you made. Were they good decisions or bad decisions?
255 00:28:04,360 --> 00:28:09,470 Probably a lot more bad decisions, your parents would say, right? And that's where we are. 256 00:28:09,500 --> 00:28:14,500 We have a responsibility to parent this technology. That's gonna be difficult, 257 00:28:15,520 --> 00:28:19,000 because we don't agree on what the basic ethical guardrails should be. 258 00:28:19,720 --> 00:28:24,070 Not within societies and not across societies. But that's where we are 259 00:28:24,100 --> 00:28:32,870 as far as AI is concerned. And if AI is the additive, the cloud will make all this possible. 260 00:28:33,230 --> 00:28:38,810 Data clouds allow us to connect products and deliver innovation at scale and with speed. 261 00:28:41,180 --> 00:28:46,309 And it's interesting. I work with automotive clients who have said, you know, 262 00:28:46,310 --> 00:28:53,450 for years, they're in the internal combustion engine industry, and who are now saying we're no longer in the internal combustion engine industry. 263 00:28:53,810 --> 00:28:58,310 We're in the cloud and connected products industry. 264 00:28:58,650 --> 00:29:02,570 They're recognising the importance the cloud will have as a differentiator. 265 00:29:04,800 --> 00:29:08,940 The second shift is that every business will become a software business. 266 00:29:11,560 --> 00:29:24,270 In the past, software was very clunky and had a very complex and somewhat limiting distribution model. 267 00:29:26,120 --> 00:29:29,560 But today, software is simple. 268 00:29:30,980 --> 00:29:38,360 It's configurable. It's assembly-led. It's no-code, and its distribution is infinite. 269 00:29:39,260 --> 00:29:48,560 Software is becoming ubiquitous in every business, and every business is competing on its ability to create impactful software. 270 00:29:50,920 --> 00:29:59,740 Since 2021, the non-tech sector has hired more software professionals than the tech sector.
271 00:30:01,330 --> 00:30:08,440 Every company, every industry is recognising the importance that software will have for the future of its business, the relevancy of its business. 272 00:30:09,460 --> 00:30:16,480 Volkswagen announced a couple of years ago that they missed the next generation of automobiles because they couldn't hire enough software engineers. 273 00:30:18,670 --> 00:30:21,820 There's a massive talent gap that we have to be able to address. 274 00:30:24,870 --> 00:30:28,500 The third is the physical and digital convergence. 275 00:30:29,430 --> 00:30:35,280 For years we talked about the need to find ways to move experiences and 276 00:30:35,280 --> 00:30:39,300 data from physical to digital and back again in the most seamless way possible. 277 00:30:39,480 --> 00:30:46,950 And most companies operate, and all our lives are operated, in a physical and a digital capacity. By 2030, 278 00:30:47,610 --> 00:30:53,850 I don't think that distinction will exist. There will just be our lives, and our physical lives will be connected. 279 00:30:55,050 --> 00:31:01,950 And they'll be driven and defined by, um, connected experiences and by technology. 280 00:31:03,470 --> 00:31:06,860 And those worlds will need to become more human, right? 281 00:31:07,040 --> 00:31:12,240 So basically, technology will become ambient. It will weave its way into the fabric of our lives. 282 00:31:12,260 --> 00:31:16,790 It'll be very difficult to distinguish between what's physical or virtual and what is real. 283 00:31:18,770 --> 00:31:22,190 And the last is industry migration and diversification. 284 00:31:22,760 --> 00:31:33,319 Every company is looking to achieve greater relevance and distinction based on a trusted brand, an existing, 285 00:31:33,320 --> 00:31:36,080 installed customer base, and a digital platform. 286 00:31:37,520 --> 00:31:45,090 It's very difficult to determine what industry any business is in today as they start to find these new forms of relevance. In the US,
287 00:31:45,120 --> 00:31:51,780 Amazon, you all know, is a big-box retailer moving into banking and health care. 288 00:31:52,170 --> 00:31:56,159 Amazon is already in banking and health care, and is now looking to get into smart homes. 289 00:31:56,160 --> 00:32:01,740 They see that as their next frontier. Philip Morris, up until about eight months ago. 290 00:32:01,770 --> 00:32:07,560 You all know Philip Morris, the tobacco manufacturer committed to a smoke-free future by 2030. 291 00:32:08,070 --> 00:32:16,110 They acquired the chewing-gum firm, um, Fertin Pharma, um, a pharmaceutical company, because they wanted to move from tobacco into health care. 292 00:32:17,820 --> 00:32:27,060 Why? Because they looked at IQOS, their heated tobacco device, as a perfect opportunity for inhaled therapeutics. 293 00:32:28,700 --> 00:32:34,820 Shareholders were able to convince them to pull out of that for now, but they're still committed to finding a new opportunity to get into healthcare. 294 00:32:36,650 --> 00:32:43,379 Utilities companies are starting to move into retail because they recognise, with the, um, 295 00:32:43,380 --> 00:32:49,690 expansion of electric cars, that people aren't going to be spending three minutes at the pump anymore. 296 00:32:49,700 --> 00:32:53,150 They'll be spending 45 minutes at the, the charging station. 297 00:32:53,390 --> 00:32:57,320 If they want to get into the charging business, they need to develop retail stores that go along with that. 298 00:32:57,620 --> 00:33:03,410 And we're seeing pharmaceutical companies spinning off their consumer health businesses as well. 299 00:33:03,710 --> 00:33:09,410 So every company is trying to find new forms of relevance and new ways to make an impact by diversifying. 300 00:33:12,600 --> 00:33:20,370 So what does this mean for human lives? This is Charlotte. Charlotte graduated from UCL with a BA in psychology in 2027.
301 00:33:21,570 --> 00:33:30,870 Now 23, she's just got a job as a brand manager for a sports retailer and just started renting her first flat in Clapham Junction. 302 00:33:31,110 --> 00:33:36,060 She is a very typical Gen Z, or zoomer, if you prefer. In 2030, 303 00:33:37,210 --> 00:33:40,500 this is Charlotte living her life. Is it much different from today? 304 00:33:41,250 --> 00:33:47,310 Well, first of all, Charlotte wakes up in a very relaxed way because she's had an AI-enabled sleep, 305 00:33:47,850 --> 00:33:53,009 um, where, um, connected, uh, devices and sensors within her, 306 00:33:53,010 --> 00:33:59,820 her bedroom have managed to adjust the lighting and the sound and the air quality, um, 307 00:34:00,270 --> 00:34:07,320 all based on delivering the highest quality of sleep experience. 308 00:34:09,480 --> 00:34:14,440 While she's asleep, her AI agent plans out her day. 309 00:34:15,720 --> 00:34:23,010 Right. It creates a daily schedule, exercise routine, wardrobe and diet. As she gets up, 310 00:34:23,310 --> 00:34:29,219 her smart bathroom mirror evaluates her biometric vitals in 311 00:34:29,220 --> 00:34:33,750 order to be able to recommend health and skin products for that particular day. 312 00:34:34,530 --> 00:34:38,399 While she's having a shower, her smart shower head notifies the coffee dispenser 313 00:34:38,400 --> 00:34:42,270 downstairs that she's awake and coming downstairs, and the coffee starts to brew. 314 00:34:44,240 --> 00:34:49,850 3D printing is going to be mainstream by around 2028 or 2029. 315 00:34:50,060 --> 00:34:53,990 We believe that Charlotte will have a 3D food printer in her kitchen, 316 00:34:55,070 --> 00:35:00,560 and that 3D food printer is preparing her breakfast based on her real-time personalised nutrition plan. 317 00:35:01,670 --> 00:35:10,100 Now, the problem with Charlotte is, um, she received a notification from her doctor recently advising her to cut out salt from her diet.
318 00:35:10,370 --> 00:35:15,680 She loves the taste of salt and hates food without salt. This is where a just-invented device, spoon tech, comes in. 319 00:35:16,370 --> 00:35:21,049 Spoon tech is a device that was actually piloted at the Consumer Electronics Show last 320 00:35:21,050 --> 00:35:28,900 year that uses ion sensory technology to create artificial flavours from real food, 321 00:35:28,910 --> 00:35:33,080 so she's able to get the sensation of the food she can't eat 322 00:35:33,320 --> 00:35:42,440 using this technology. After she has breakfast, she needs to do some shopping, 323 00:35:42,830 --> 00:35:48,170 and she relies on her AI agent, an augmented reality digital shopping assistant, to 324 00:35:48,170 --> 00:35:52,790 provide a very highly personalised shopping experience for her at the supermarket. 325 00:35:53,690 --> 00:35:59,500 Now, I think this is a very conservative estimate of what life is going to be like in 2030 from a shopping perspective. 326 00:35:59,520 --> 00:36:03,710 I'll tell you why. Most of you, I assume, 327 00:36:04,190 --> 00:36:07,250 certainly since the pandemic, um, have purchased from Amazon. 328 00:36:07,370 --> 00:36:16,790 Correct? Most of you. And when you're shopping on Amazon, you experience their real-time offers, right? 329 00:36:16,790 --> 00:36:28,070 They have a personal recommendation engine. That personal recommendation engine last year had a 9.6% efficacy across all products, 330 00:36:28,760 --> 00:36:34,070 which means that one in ten products that Amazon recommended to you last year, you actually clicked on and purchased. 331 00:36:35,300 --> 00:36:39,770 Note that that's a pretty good success rate; most companies operate around 3.3 to 3.6%. 332 00:36:40,840 --> 00:36:46,500 What's particularly important is that, through the maturity of its AI platform,
333 00:36:47,040 --> 00:36:56,079 Amazon believes that by 2027, the efficacy of its recommendation engine will have gone from 9.6% to around 34 334 00:36:56,080 --> 00:37:01,440 or 35%, which means 1 in 3 products it recommends, you click on and purchase. 335 00:37:02,310 --> 00:37:08,220 Why is that important? Because once Amazon crosses that 30 to 33% threshold, 336 00:37:08,700 --> 00:37:16,290 it will be economically viable for Amazon to shift from a click-and-deliver model to a deliver-and-click model. 337 00:37:16,830 --> 00:37:21,840 It will send you your products before you request them. You get home from work or from school, 338 00:37:22,020 --> 00:37:24,599 and you've got your toilet paper, you've got your skin cream, 339 00:37:24,600 --> 00:37:30,840 you've got your mustard or whatever else you need for dinner that evening, waiting for you on your doorstep. 340 00:37:31,710 --> 00:37:33,450 And you'll have an easy return policy. 341 00:37:33,840 --> 00:37:40,890 But every time you keep those products or return them, Amazon learns something new about you, and that efficacy will continue to increase. 342 00:37:41,220 --> 00:37:46,560 So that's the shopping experience that we'll expect Charlotte to be experiencing by 2030. 343 00:37:48,640 --> 00:37:51,820 She orders a, uh, jacket for an evening event, 344 00:37:52,120 --> 00:37:59,170 um, also relying on her virtual agent to find the perfect jacket, because it knows her personal taste and size and fit. 345 00:37:59,590 --> 00:38:03,280 And that jacket is delivered within two hours by a flying drone. 346 00:38:04,150 --> 00:38:13,900 Research shows that by 2030, the average British citizen will be receiving between 4 and 5 packages a week by flying drone. 347 00:38:15,540 --> 00:38:21,849 All right. Charlotte has dinner, again planned by her virtual assistant.
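[Editor's note: the "30 to 33% threshold" logic above can be sketched as a back-of-the-envelope expected-value model. The cost figures below (an £8 margin on a kept item, £3.50 to handle a return) are illustrative assumptions chosen so the break-even lands in the band the speaker quotes; they are not figures from the talk or from Amazon.]

```python
# Sketch: at what recommendation hit rate does pre-shipping an item
# ("deliver-and-click") break even versus waiting for the order?
# margin and return_cost are illustrative assumptions, not real data.

def expected_profit_per_shipment(hit_rate, margin=8.0, return_cost=3.5):
    """Expected profit of pre-shipping one recommended item.

    hit_rate    -- probability the customer keeps (buys) the item
    margin      -- profit earned when the item is kept
    return_cost -- cost absorbed when the item is sent back
    """
    return hit_rate * margin - (1 - hit_rate) * return_cost

def break_even_hit_rate(margin=8.0, return_cost=3.5):
    # Solve hit * margin = (1 - hit) * return_cost for hit.
    return return_cost / (margin + return_cost)

if __name__ == "__main__":
    for rate in (0.096, 0.34):  # today's quoted efficacy vs. the projection
        profit = expected_profit_per_shipment(rate)
        print(f"hit rate {rate:.1%}: expected profit {profit:+.2f} per pre-shipment")
    print(f"break-even hit rate: {break_even_hit_rate():.1%}")
```

Under these assumed costs the break-even sits at roughly 30%, so a 9.6% hit rate loses money on every speculative shipment while a 34% hit rate turns a profit, which is the economic intuition behind the threshold the speaker describes.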
348 00:38:21,850 --> 00:38:27,370 And this is a a out of home, uh, mixed reality technology dining experience, 349 00:38:27,670 --> 00:38:35,020 which is something that we are seeing already in the market as she relaxes at home that evening with content that is personally curated for her, 350 00:38:35,560 --> 00:38:36,910 by her art assistant, 351 00:38:36,910 --> 00:38:46,810 by this is how Charlotte will live her life in 2035 and in a way that is interacting with with technology in a much more meaningful way. 352 00:38:50,970 --> 00:38:54,450 So what does this mean for other aspects of life? 353 00:38:54,480 --> 00:39:06,660 Let's talk about work. This is the big elephant in the room. We've all heard the narrative that AI is going to eat away at white collar jobs. 354 00:39:06,960 --> 00:39:09,750 It's going to kill the middle class. Destroy the middle class. 355 00:39:10,620 --> 00:39:18,600 We're going to have this barbell society with the hyper rich, the unemployed, and most of us are going to be sitting around all day at home, 356 00:39:19,470 --> 00:39:28,830 drinking dandelion and burdock, eating Doritos, watching countdown, and waiting for our universal basic, uh, income check to arrive. 357 00:39:31,310 --> 00:39:34,340 Quite candidly, that is absolute nonsense. It's not going to happen. 358 00:39:35,150 --> 00:39:37,190 I is not going to steal our jobs. 359 00:39:37,730 --> 00:39:47,420 In fact, there is a counter intuitive trend in place, and there is a job gap that is emerging because of the maturity of AI. 360 00:39:48,800 --> 00:40:01,879 In this year, the UK employment has showed there is a deficit of around 550,000 um uh qualified um technicians and engineers to be able to, 361 00:40:01,880 --> 00:40:07,010 to advance um, our, um, our work in the AI space. 362 00:40:08,210 --> 00:40:14,330 By 2030, that number will have ballooned to more than 4.5 or 5 million. 363 00:40:15,710 --> 00:40:19,280 So new jobs are being created all the time by AI. 
364 00:40:19,730 --> 00:40:26,510 Yes, AI will require us to rethink the skills we need in employment, and jobs are going to go through a process of reskilling. 365 00:40:26,510 --> 00:40:33,520 In fact, more than 69% of jobs are going to have to be reimagined or reskilled in order to coexist with AI, 366 00:40:33,520 --> 00:40:37,160 but AI is not taking away jobs as much as creating new jobs. 367 00:40:39,320 --> 00:40:48,230 In fact, by 2030, globally, we expect there to be a deficit of around 85 million skilled people on the planet 368 00:40:48,530 --> 00:40:52,490 needed to be able to capitalise on the tech revolution, 369 00:40:52,790 --> 00:41:04,520 and there will be more than $8.5 trillion of lost revenue if we can't address those STEM, um, uh, capabilities in the workforce. 370 00:41:06,050 --> 00:41:09,020 So it's not taking jobs, it's just redefining jobs. 371 00:41:09,530 --> 00:41:16,310 If you need some convincing of that, let's go back in history and look, from an economic standpoint, 372 00:41:16,640 --> 00:41:19,670 at the first three industrial revolutions: 373 00:41:19,670 --> 00:41:29,180 the first being the, uh, steam engine in the late, uh, 18th century, followed by the assembly line in the early 20th century, 374 00:41:29,360 --> 00:41:35,740 the digitalisation revolution of the 1980s, and now AI platforms. These are creating jobs, 375 00:41:35,750 --> 00:41:44,860 have always created jobs, because they create economic expansion. We have to rethink the relationship that we as humans have with this technology 376 00:41:45,750 --> 00:41:51,420 as we look forward. And it's really, really interesting. I've often said that the printing press, 377 00:41:52,450 --> 00:41:57,040 to a large extent, democratised information. The internet, 378 00:41:57,040 --> 00:42:04,180 on the other hand, democratised knowledge. AI is democratising expertise and competency.
379 00:42:04,840 --> 00:42:10,870 I think that's something we need to remember as we start to look at the nature of work in 2030 and beyond. 380 00:42:11,710 --> 00:42:18,460 From the technology perspective, we're going to see AI and humanoid robots coexisting with humans, 381 00:42:18,460 --> 00:42:24,910 taking away the routine, mundane tasks to allow humans to focus on the creative skills, the social skills, 382 00:42:25,270 --> 00:42:33,400 that AI does not possess, that technology does not possess. We're going to see digital twins allowing people to be more productive in the workforce. 383 00:42:33,580 --> 00:42:35,490 We're going to see AI, uh, 384 00:42:35,590 --> 00:42:42,760 productivity coaches and digital wellness coaches allowing people to be at their best and their happiest in the workplace. 385 00:42:43,630 --> 00:42:48,300 If we look at it from a human perspective, we're going to start to see more remote working. 386 00:42:48,310 --> 00:42:51,220 I think that's expected. We're going to see more automated working. 387 00:42:52,800 --> 00:42:57,750 But I think what's going to be really interesting going forward is this concept of designer careers. 388 00:42:58,290 --> 00:43:03,009 All right. We've all signed up for this template, right, that we have one career. 389 00:43:03,010 --> 00:43:06,510 And I'd say sometimes we're lucky to have maybe a second, and some may have a third. 390 00:43:07,020 --> 00:43:15,709 But it's a rarity. What's stopping somebody from beginning their career as an engineer and then deciding, 6 or 7 years later, 391 00:43:15,710 --> 00:43:18,500 they want to pivot and become a designer, and then become a lawyer? 392 00:43:18,950 --> 00:43:25,729 And I think with the maturation, the continued expansion, of AI capability, 393 00:43:25,730 --> 00:43:30,650 it will be increasingly possible for people to make those adjustments in their career.
394 00:43:30,890 --> 00:43:35,270 It's going to be much easier to become an expert in something through AI. 395 00:43:35,960 --> 00:43:43,830 And we expect this to be a massive shift. I talk to clients about the impact that technology is going to have on their businesses in the coming years. 396 00:43:43,850 --> 00:43:48,140 The question they always come back to is not which technology should we be investing in; 397 00:43:48,500 --> 00:43:57,110 it's which humans should we be investing in. And more importantly, the question that gets asked is, how do we ensure that in 2030 and beyond, 398 00:43:57,380 --> 00:44:06,350 our company and our industry is equipped to attract and retain and motivate the highest talent that's available? 399 00:44:07,430 --> 00:44:11,210 A client at a leading FMCG company, a leading packaged goods company, 400 00:44:11,600 --> 00:44:19,490 um, you know, says, listen, I'm worried, because our business was formed on the ability of smart humans. 401 00:44:19,910 --> 00:44:22,010 If I look forward to 2030 and beyond, 402 00:44:22,010 --> 00:44:29,010 I'm not thinking about, do I invest in AI or blockchain or quantum computing or the next big technology innovation? 403 00:44:29,030 --> 00:44:36,410 My question is, how do I create a working environment that's going to attract and inspire and retain the top talent? 404 00:44:36,740 --> 00:44:48,920 So we have to think of it in those terms. And any skill that can't be automated is going to be particularly valuable in this new paradigm. 405 00:44:49,860 --> 00:44:53,960 I think that's important to continue to emphasise. Right. 406 00:44:54,330 --> 00:45:02,400 The economy for years was based on our physical abilities, and then it moved to our intellectual capabilities.
407 00:45:02,790 --> 00:45:10,620 I think as we start thinking about the role that AI will play in our future economy and the future of our industries, 408 00:45:10,950 --> 00:45:13,080 it's going to be social abilities that prevail. 409 00:45:13,770 --> 00:45:22,469 And, not surprisingly, what this chart shows is where there's going to be the biggest shift in skill requirements in 2030 compared to 2022, 410 00:45:22,470 --> 00:45:25,520 in both the US and in Western Europe. 411 00:45:25,530 --> 00:45:29,700 Not surprisingly, we're seeing a drop in physical, manual skills, um, 412 00:45:29,700 --> 00:45:36,540 and basic cognitive skills, while we're seeing a significant increase in higher cognitive skills like creativity, critical thinking and 413 00:45:36,540 --> 00:45:43,080 complex information processing. We're seeing a significant increase in entrepreneurial skills, leadership skills, 414 00:45:43,470 --> 00:45:48,510 individuals with resilience and flexibility who can work effectively with that technology. 415 00:45:49,100 --> 00:45:56,160 Right. This is what's going to be important. What about the future of learning? 416 00:45:57,450 --> 00:46:03,430 Education is going to go through enormous shifts in the next 5 to 6 years. 417 00:46:03,470 --> 00:46:08,580 It's interesting. I mentioned earlier about every business becoming a software business. What's particularly important 418 00:46:08,850 --> 00:46:18,270 is that the industries that have fallen behind, that have not been quick enough in adopting software as a core asset, are the ones that, 419 00:46:18,270 --> 00:46:23,790 um, continue to deliver their services at the highest price point relative to inflation. 420 00:46:23,940 --> 00:46:28,560 And that includes health care and education. They have not moved quickly enough in terms of software. 421 00:46:29,010 --> 00:46:32,730 But I think when we look at education, it has historically been very episodic.
422 00:46:33,240 --> 00:46:36,420 It will now become very continuous and very personalised. 423 00:46:37,350 --> 00:46:40,679 I believe we'll move away from what I've been calling just-in-case 424 00:46:40,680 --> 00:46:44,610 education to just-in-time education. 425 00:46:46,180 --> 00:46:54,460 Why? Most of us spend the first 25 years of our lives learning skills that we hope will apply to the professions that we then choose to take on. 426 00:46:55,180 --> 00:47:00,250 The reality is, the skills we're learning today become obsolete within 2 or 3 years. 427 00:47:00,610 --> 00:47:06,490 It's impossible to teach somebody the skills they'll need that will continue to be relevant five years from now, ten years from now. 428 00:47:07,220 --> 00:47:11,799 So we need to rethink our educational system and the platforms that lie beneath it in order 429 00:47:11,800 --> 00:47:17,770 to ensure that people continue to evolve as demands shift and new skills become relevant. 430 00:47:17,800 --> 00:47:23,980 So the first shift is continuous, flexible, on-demand, lifelong learning, both formal and informal. 431 00:47:25,550 --> 00:47:29,000 Then AI-driven personalisation. The content, the context, 432 00:47:29,000 --> 00:47:36,770 the methodology of teaching is going to be adapted to the individual needs of the student, and as those needs change over time, 433 00:47:37,190 --> 00:47:40,310 as we learn more about how that student reacts to different stimuli, 434 00:47:41,480 --> 00:47:47,390 the content and context will shift as well. Education will become immersive and interactive. 435 00:47:48,310 --> 00:47:55,570 Imagine being a marine biologist and learning about, um, your discipline in a virtual underwater setting, 436 00:47:55,570 --> 00:48:02,110 or a palaeontologist, um, learning about dinosaurs on a dinosaur island, or an astronomer
437 00:48:02,110 --> 00:48:07,480 that's interacting with, with asteroids in a virtual space environment. 438 00:48:08,890 --> 00:48:12,650 It's going to become gamified. It's going to be fun. It's going to be interactive. 439 00:48:12,650 --> 00:48:21,490 The ability to bring people together globally on a common platform: language will no longer be a barrier, because AI will provide instantaneous, 440 00:48:21,490 --> 00:48:27,910 real-time translation, the ability to interact with people not just in other countries, 441 00:48:27,910 --> 00:48:32,709 but people who do not speak your language as a first language, and to neutralise that barrier. 442 00:48:32,710 --> 00:48:42,130 So you're experiencing a greater cultural diversity, um, in the way that the content is being delivered to you. 443 00:48:44,610 --> 00:48:49,080 And continuous monitoring and assessment. Most of you've probably heard about 444 00:48:49,320 --> 00:48:54,390 brain-computer interfaces, the ability to actually communicate directly from the brain. 445 00:48:54,960 --> 00:49:01,350 Nothing has to be written; it's just thought to thought. The ability to understand students, 446 00:49:01,350 --> 00:49:09,700 how they respond to stimuli, and how the student-teacher relationship is unfolding, using brain-computer interfaces, 447 00:49:09,700 --> 00:49:17,219 is going to be important in terms of being able to create this baseline and understand how we educate people most effectively. 448 00:49:17,220 --> 00:49:22,910 So education is going to go through an enormous shift. And how can we not talk about health care? 449 00:49:23,510 --> 00:49:33,380 I think when we talk about the expected shifts in health care by 2030, I am reminded of the quote from the Canadian, 450 00:49:33,530 --> 00:49:39,740 uh, science fiction author William Gibson, who said that the future is already here, it's just unevenly distributed.
451 00:49:40,910 --> 00:49:51,160 To a large extent, a lot of the things I've identified here do exist, in a non-mainstream, more, um, test-and-learn sort of way. 452 00:49:51,940 --> 00:49:55,959 Um, but digitalisation will enable P5 health care at scale. 453 00:49:55,960 --> 00:50:04,570 I think most of you are familiar with P5: the idea of health care being preventative, being personalised, being predictive, 454 00:50:05,140 --> 00:50:10,510 being participatory, being precise. Those are the goals of the future health care system. 455 00:50:10,510 --> 00:50:15,100 How can technology help deliver that with greater ease and greater efficacy? 456 00:50:15,880 --> 00:50:20,380 The first is that consumers, if not already, will become the CEOs of their own health. 457 00:50:21,230 --> 00:50:27,560 I've got friends who are doctors today who say the biggest challenge is patients coming in in the morning already having self-diagnosed. 458 00:50:27,590 --> 00:50:30,220 Nobody ever comes into a doctor's office saying, can you help me, doctor? 459 00:50:30,230 --> 00:50:33,260 It's: I've got this problem, and I found it on Google, and tell me why I'm wrong. 460 00:50:33,740 --> 00:50:37,309 And it puts them in a moral dilemma, because then you say, well, you know, 461 00:50:37,310 --> 00:50:42,380 do I tell the patient they're wrong and risk losing the patient, because they'll go somewhere else and get the treatment they want, 462 00:50:42,830 --> 00:50:48,380 um, or do I, um, exercise my professional standards in the way that I believe is right? 463 00:50:48,950 --> 00:50:53,540 But the ability for consumers to be masters of their own domain, 464 00:50:53,540 --> 00:50:59,210 to be CEOs of their own health ecosystem, is going to become increasingly, um, uh, 465 00:50:59,270 --> 00:51:05,270 feasible due to the emergence of wearable devices, smart sensors, smart technology. 466 00:51:05,750 --> 00:51:08,750 All right.
This is already here. There was a study about a year ago 467 00:51:08,750 --> 00:51:18,260 where 4,000 Swedish adolescents agreed to have a small microchip placed under the skin that provided all of their biometric vitals, cholesterol level, 468 00:51:18,350 --> 00:51:21,650 sugar level, all of this being fed into one single database. 469 00:51:22,160 --> 00:51:26,810 Is that something we'd be willing to do, compromising our data privacy, to get a better health outcome? 470 00:51:28,150 --> 00:51:33,820 I think we're all going to find that these, these, um, technologies are going to allow the 471 00:51:33,820 --> 00:51:40,270 consumer to be much more empowered and much more knowledgeable about their health options. 472 00:51:40,510 --> 00:51:42,790 We have to meet them in that context. 473 00:51:43,540 --> 00:51:52,180 Smart IoT devices in the home will converge to provide these personalised health experiences and outcomes in a much more preventative way. 474 00:51:53,200 --> 00:51:54,880 We've all seen telehealth, right? 475 00:51:55,120 --> 00:52:00,730 We know that, going forward, it's inevitable that health care is delivered in a more remote fashion. 476 00:52:01,270 --> 00:52:09,510 Right? Instead of going to a dermatologist to have an image taken of a, um, a mole or 477 00:52:09,520 --> 00:52:16,420 freckle on the skin, you have the ability to take the photograph with your iPhone and have it assessed remotely. 478 00:52:16,840 --> 00:52:25,310 Right. Telesurgery, we already see, is becoming a significant game changer in the health care industry. 479 00:52:25,330 --> 00:52:32,350 This is a $5.9 billion industry growing at a CAGR of 16.1% year on year. 480 00:52:32,380 --> 00:52:38,940 This is going to be massive by 2030.
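[Editor's note: the telesurgery figure quoted above, a $5.9 billion market growing at a 16.1% CAGR, can be projected forward with simple compound growth. The base year of 2024 is an assumption for illustration; the talk gives only the size and the growth rate.]

```python
# Sketch: compound a market size forward at a constant annual growth rate.
# base year (2024) is an assumption; $5.9bn and 16.1% CAGR are from the talk.

def project(value, cagr, years):
    """Project a value forward by compounding annual growth."""
    return value * (1 + cagr) ** years

if __name__ == "__main__":
    base, rate = 5.9, 0.161  # $bn, CAGR
    for year in range(2025, 2031):
        print(f"{year}: ${project(base, rate, year - 2024):.1f}bn")
```

At that rate the market roughly reaches the mid-teens of billions of dollars by 2030, which is the sense in which the speaker calls it "massive".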
By then, AI will be changing all aspects of the health care, uh, 481 00:52:38,950 --> 00:52:44,439 ecosystem and value chain, from early disease detection all the way through to the 482 00:52:44,440 --> 00:52:48,509 use of genomic data to predict disease risk and provide personalised prevention plans. 483 00:52:48,510 --> 00:52:53,200 All of this is going to become mainstream by 2030. 484 00:52:56,010 --> 00:53:01,320 And listen, when we're talking about understanding the relationship between humans and technology, 485 00:53:02,430 --> 00:53:07,290 I think we have to go to the very fundamental dimensions of what it means to be human. 486 00:53:07,860 --> 00:53:12,030 As behavioural scientists, we need to understand people in the context in which they live, 487 00:53:12,030 --> 00:53:15,690 and we assume certain things are fixed or static and others somewhat dynamic. 488 00:53:16,140 --> 00:53:24,800 But the reality is, in this new world, this new paradigm, I think the things that we assumed were static are not. Right? For me, 489 00:53:24,810 --> 00:53:36,450 you know, time was a given: there are 86,400 seconds in, in a day and, you know, 5,280, uh, feet in a mile, right? 490 00:53:36,750 --> 00:53:44,400 That was a given. But when we look at perception, our perception of time and space is changing because of our relationship with technology. 491 00:53:44,800 --> 00:53:50,410 There's a time-space compression in how we relate to time and space. 492 00:53:51,880 --> 00:53:59,530 People who are heavy users of devices process information more quickly and see time moving more quickly. 493 00:53:59,530 --> 00:54:06,040 In fact, 15% quicker than people who are low users of devices, or infrequent users of devices. 494 00:54:07,120 --> 00:54:12,870 Think about it in terms of attention span. It has implications for many of the things we've talked about. 495 00:54:13,650 --> 00:54:19,110 Our attention span in the last 25 years has dropped from 12 seconds to 8 seconds.
496 00:54:20,890 --> 00:54:25,420 Our attention span is actually less currently than that of the average goldfish, 497 00:54:25,420 --> 00:54:29,379 and I'm not sure if any biologists in the room know how you figure out the attention span of a goldfish, 498 00:54:29,380 --> 00:54:32,800 but apparently it's higher than the average human's today. 499 00:54:32,980 --> 00:54:37,120 So attention span is shifting. Our memory is changing as well. 500 00:54:37,900 --> 00:54:42,430 Our ability to store information is becoming more and more difficult because 501 00:54:42,430 --> 00:54:45,490 of the way in which we gather and process information through scrolling. 502 00:54:46,030 --> 00:54:49,930 But our relationship to the present is also changing. What is the present today? 503 00:54:49,940 --> 00:55:00,519 We live in such a strange, um, version of what is the now, because when 504 00:55:00,520 --> 00:55:07,300 we're scrolling, what we're seeing has actually already happened. Right. These strange perceptions of space and time need to be completely redefined. 505 00:55:07,660 --> 00:55:10,860 And then our relationship to the truth. Right. 506 00:55:10,980 --> 00:55:15,760 We all grew up in a society where there was a version of the truth, and you agreed with it or not. 507 00:55:15,760 --> 00:55:21,570 But now there are multiple versions of the truth, and there's our ability to distinguish between what is real and what is virtual. 508 00:55:23,410 --> 00:55:32,680 And there's our tendency, if not propensity, to use psychological filters to reinforce our own personal version of the truth that we want to project. 509 00:55:33,130 --> 00:55:37,540 All of this is made possible through technology. Or take our social environment. 510 00:55:38,350 --> 00:55:46,570 We assume that the internet and social media have provided this wonderful ability to connect with people in greater magnitude, and it has.
511 00:55:47,260 --> 00:55:49,870 But at what expense? And the expense is that 512 00:55:49,870 --> 00:55:59,520 we are now seeing greater instances of isolation and alienation in society, because people don't have those same meaningful relationships; 513 00:55:59,530 --> 00:56:03,910 they have very superficial, um, uh, surface-level relationships. 514 00:56:06,300 --> 00:56:13,080 Think about the work environment: how we relate to people and work in a remote environment does change who we are as human beings. 515 00:56:14,780 --> 00:56:19,640 Right. And then our own identity. How is that changing through technology, right? 516 00:56:19,670 --> 00:56:27,800 We've talked for years about the accident of birth, and how much of a life is defined by the things outside of our control: our gender, our appearance, 517 00:56:28,850 --> 00:56:37,880 our ethnicity. But now we're able to manufacture or create the version of ourselves we want to be, or at least project to others. 518 00:56:38,330 --> 00:56:43,610 The version that we create on social media is probably not the same version that exists in the physical world. 519 00:56:44,390 --> 00:56:47,900 Yeah, we've talked to clients in the past about, when you're marketing to an individual, 520 00:56:48,110 --> 00:56:53,810 what version of that individual are you marketing to? The version that exists in the physical world, 521 00:56:54,830 --> 00:57:00,260 right, the one you would identify as a friend or a brother or sister or parent? Or are you 522 00:57:01,010 --> 00:57:04,880 marketing to the version of themselves they've created on social media? 523 00:57:05,810 --> 00:57:08,930 Right. The version they want to be seen as. 524 00:57:09,710 --> 00:57:12,890 And this relationship to beauty, I think, is interesting as well.
525 00:57:12,920 --> 00:57:21,889 There was a study last year amongst American plastic surgeons which found that 48% of people who went for 526 00:57:21,890 --> 00:57:28,520 facial cosmetic surgery cited looking better in selfies as the main motivation for wanting cosmetic surgery. 527 00:57:28,520 --> 00:57:32,210 Right. Again, it's almost the cult of the self. But we have to rethink 528 00:57:33,270 --> 00:57:38,729 the relationships that we have to this broader environment, because as behavioural scientists we rely on traditional 529 00:57:38,730 --> 00:57:44,790 assumptions of time and space and relationships and self as we start to think through interventions. 530 00:57:45,960 --> 00:57:51,930 So let's start to walk back from the future to figure out how we deliver a future that does 531 00:57:52,290 --> 00:57:56,730 align with our desires and our aspirations. 532 00:57:57,450 --> 00:58:00,210 The first point is that we're not going to get there through single approaches. 533 00:58:00,810 --> 00:58:05,670 So I call this the quintet of change: the ability to bring together data and innovation and digital, 534 00:58:05,670 --> 00:58:10,370 supported by strategic foresight and behavioural science, to create this forward-thinking culture. 535 00:58:10,380 --> 00:58:16,080 We can only address the complexity of change with these multiple dimensions working together. 536 00:58:17,850 --> 00:58:21,030 We've got to move to more of an ecosystem view of the world. 537 00:58:21,120 --> 00:58:28,440 For years we talked about winning in the platform economy. The reality is the world is becoming a network of ecosystems. 
538 00:58:29,130 --> 00:58:36,540 So if you're working in the health care industry today, your ability to understand the interrelationships that exist between payer, provider, 539 00:58:36,540 --> 00:58:42,990 patients and other stakeholders, and to create experiences that unite these different groups and create new sources of value, 540 00:58:43,830 --> 00:58:46,360 using data and intelligence to create these 541 00:58:46,460 --> 00:58:54,330 new intersections between the different constituents that make up that particular ecosystem, is going to be particularly important. 542 00:58:58,060 --> 00:59:01,480 We need to start thinking about innovation differently, right? 543 00:59:01,510 --> 00:59:10,900 The vast majority of innovation is looking at incremental change, addressing single problems one after the other, which delivers very limited return. 544 00:59:11,710 --> 00:59:17,410 If we really want to start to drive change and leverage innovation in a much more impactful way, 545 00:59:17,680 --> 00:59:21,490 we've got to start to think of innovation in a much more holistic and disruptive manner. 546 00:59:23,360 --> 00:59:28,250 And innovation really is a number of different possibilities coming together. 547 00:59:28,790 --> 00:59:32,060 There are multiple kinds of innovation, and the companies who are doing innovation 548 00:59:32,810 --> 00:59:39,800 with the greatest efficacy and scale, I think, are the ones who are able to deliver innovation across many of these different modes. 549 00:59:40,700 --> 00:59:44,450 Merck is an interesting example, right? 550 00:59:45,380 --> 00:59:52,400 About eight years ago, Merck was looking to grow its vaccine business from 5 billion to 10 billion. 551 00:59:53,530 --> 01:00:00,429 Its initial strategy for doing that was to introduce more vaccines into mature markets, 552 01:00:00,430 --> 01:00:05,140 but it realised it couldn't get any further than 8.5 billion on that strategy. 
553 01:00:05,650 --> 01:00:12,370 The only way to get to 10 billion was to start to explore opportunities to introduce vaccines in developing markets. 554 01:00:12,700 --> 01:00:18,850 So it focused on two vaccines: Gardasil for cervical cancer and RotaTeq for rotavirus. 555 01:00:19,960 --> 01:00:25,960 The problem was that in the countries it wanted to introduce these vaccines into, 556 01:00:25,960 --> 01:00:31,090 sub-Saharan Africa in particular, countries like Côte d'Ivoire and Senegal and Nigeria and Ghana and so forth, 557 01:00:32,350 --> 01:00:35,670 vaccine-preventable diseases 558 01:00:36,030 --> 01:00:40,110 were not high up the hierarchy of need. 559 01:00:40,740 --> 01:00:41,970 So Merck had two options. 560 01:00:42,570 --> 01:00:49,500 The first option was to continue to educate the governments of those countries as to why they should embrace or adopt vaccines for preventable diseases. 561 01:00:49,980 --> 01:00:54,750 The other was to help those countries address the things that were higher up the hierarchy of need. 562 01:00:55,080 --> 01:01:00,300 Things like food, sanitation and clean water and security and transportation. 563 01:01:01,080 --> 01:01:07,799 Right. And data integration. The types of issues that were facing those countries. The problem was that Merck didn't have those capabilities. 564 01:01:07,800 --> 01:01:14,310 So Merck actually created an innovation lab that would deliver those interventions, that would be able to bring those 565 01:01:14,310 --> 01:01:19,620 interventions into those countries in order to tackle the issues that were high up the hierarchy of need. 566 01:01:19,860 --> 01:01:25,560 And in doing so, take vaccines, and their vaccines in particular, higher up the health schedule. 567 01:01:25,890 --> 01:01:28,500 Right. So an interesting way of addressing innovation. 
568 01:01:29,040 --> 01:01:35,100 Then there's Royal Canin, if you guys know Royal Canin, which is the pet food brand belonging to Mars Petcare. 569 01:01:35,640 --> 01:01:41,610 Royal Canin started to lose market share, largely on the basis of new digital natives, 570 01:01:41,940 --> 01:01:47,160 new entrants into the category that were stealing some of its customer base. 571 01:01:47,520 --> 01:01:54,540 So Royal Canin decided to shift from being a pet food provider to being a pet service provider, 572 01:01:54,930 --> 01:02:02,010 recognising that as a pet parent, you have many decisions to make in terms of the breed of dog or cat, 573 01:02:02,040 --> 01:02:09,150 being able to find the right breeder, the vaccine schedule, which is quite complicated if you're a pet owner, 574 01:02:09,510 --> 01:02:17,740 and how you keep up with the schedule that follows, finding a dog walker, 575 01:02:17,770 --> 01:02:22,110 finding a groomer. All these things were challenges for the average pet parent. 576 01:02:22,110 --> 01:02:29,400 So they actually created a platform that would allow pet parents to make much smarter decisions across the entire value chain, 577 01:02:29,610 --> 01:02:36,150 delivering innovations at each of those points along the way to create a much more meaningful value exchange with the customer. 578 01:02:36,390 --> 01:02:44,610 So just interesting ways that companies have started to rethink their business model and rethink their operating model in order to deliver 579 01:02:44,790 --> 01:02:49,200 new areas of value in existing or new markets. 580 01:02:52,100 --> 01:03:03,050 I think at the core, at the heart of everything we think about around technology, innovation and change, is our ability to adopt a product mindset. 581 01:03:03,650 --> 01:03:06,799 Instead of asking the question "what technology should I be investing in?" 
582 01:03:06,800 --> 01:03:10,430 or "do I chase this technology or that technology?", start with the problem. 583 01:03:11,330 --> 01:03:14,600 What problem am I solving for? Why does it matter? 584 01:03:15,900 --> 01:03:24,210 How is it impacting the lives of the people that I care about, and what are the different options for delivering against that particular problem? 585 01:03:26,360 --> 01:03:32,450 Which technology is the most feasible? You start with the problem and work back. 586 01:03:33,260 --> 01:03:36,830 Chances are you can deliver outcomes that deliver the value that you aspire to. 587 01:03:38,310 --> 01:03:44,070 From a behavioural science perspective: behavioural science is really about 588 01:03:45,160 --> 01:03:53,560 iteratively getting better and better at understanding the curve of human behaviour, and the application of behavioural science 589 01:03:55,020 --> 01:04:01,340 is being able to bend that curve. As we look forward in time to 2030 590 01:04:01,350 --> 01:04:06,760 and beyond, we need to be asking ourselves: how do we bend that curve? 591 01:04:07,330 --> 01:04:16,860 And for whom? That's the most important question we ask ourselves as behavioural scientists, and a number of things need to follow from that. 592 01:04:18,160 --> 01:04:25,570 First of all, begin with empathy. Focus on how designs and interventions improve human lives. 593 01:04:25,580 --> 01:04:34,340 Think bigger about the impact you can deliver. Address individual biases and interventions in the context of a complex social ecosystem. 594 01:04:36,040 --> 01:04:39,390 In order to drive constructive, systematic and lasting change, 595 01:04:39,400 --> 01:04:46,960 don't look at individual problems or individual circumstances of human life. 596 01:04:46,990 --> 01:04:52,720 Look at the broader holistic ecosystem in which humans exist and live their lives, and want to live their lives. 
597 01:04:53,350 --> 01:04:59,770 Integrate approaches and methods and disciplines in areas with which you may be less familiar: 598 01:05:00,010 --> 01:05:05,050 neuroscience and computer science and philosophy. Disciplines that bring different lenses to the same problem. 599 01:05:06,580 --> 01:05:12,070 Establish ethical guardrails to ensure that behavioural interventions are managed responsibly. 600 01:05:13,670 --> 01:05:18,630 And target opportunities for how we can nudge individuals and society in a more positive direction. 601 01:05:18,680 --> 01:05:21,860 That is the opportunity, if not the responsibility, 602 01:05:21,860 --> 01:05:27,950 of behavioural science as we start to navigate the most effective path to the future that we want. 603 01:05:30,830 --> 01:05:36,020 So what matters most? It's interesting: for years 604 01:05:36,020 --> 01:05:39,980 in business we would say the most important attribute was ability. 605 01:05:40,790 --> 01:05:45,860 We wanted to hire people who could make decisions quickly. 606 01:05:48,260 --> 01:05:54,740 I don't think that's what's going to get us to where we want to go. I think what's more important than ability is agility. 607 01:05:55,460 --> 01:06:01,170 It's our ability to hire people who aren't afraid to admit they were wrong, 608 01:06:02,990 --> 01:06:11,780 who are quick to change their mind. Because the people with agility are the ones who, facing the next pandemic or the next technology disruption, 609 01:06:12,770 --> 01:06:17,150 are going to anticipate the problems and the risks ahead of time and lean around the corner, 610 01:06:17,720 --> 01:06:22,910 who have the courage to lead that change and not be a victim of it. 611 01:06:25,430 --> 01:06:29,960 And at the heart, I think, side by side with agility, is creativity. 
612 01:06:30,860 --> 01:06:32,599 I think one of the greatest tragedies of 613 01:06:32,600 --> 01:06:39,860 digital transformation is that as the technology itself has become more magical and more impactful in our lives, 614 01:06:39,890 --> 01:06:42,380 we as human beings have become increasingly complacent 615 01:06:43,770 --> 01:06:48,210 about the problems we want to solve for, and why. We're waiting for technology to give us the answer. 616 01:06:48,900 --> 01:06:52,230 Technology is not going to give us the answer. It may help us get there quicker. 617 01:06:52,710 --> 01:06:55,140 We have to figure out the answers we want and why. 618 01:06:56,160 --> 01:07:05,160 Humans think in eleven dimensions; AI is a long way from being able to think in such a holistic way, to create ideas at the source. 619 01:07:05,170 --> 01:07:11,050 Lastly, remember: it's not just about the world that we're changing, 620 01:07:11,320 --> 01:07:18,110 however noble that is. This is your future, and you have to design the future that you want, 621 01:07:20,120 --> 01:07:24,720 the journey that you want to embark upon. As you embark upon that journey, 622 01:07:26,010 --> 01:07:30,510 be radically curious. Challenge convention. Challenge the new. 623 01:07:31,630 --> 01:07:37,950 Push back. Be bold, be courageous, be wise and have fun. 624 01:07:39,430 --> 01:07:42,490 And build an extraordinary future that you'll be proud of. Okay. 625 01:07:42,850 --> 01:07:43,240 Thank you.