Thank you. Thank you very much. And thanks for coming along and staying on to hear me rather than just for the free sandwiches, as impoverished students.

So I'm Al Brown and I work at an organisation called DCDC, which is described as the MOD's independent think tank. Because we're an organ of government, semi-independent is probably fairer. Our job is to think about stuff, study trends, try and work out what it means for the future, and then provide advice so that governments can form strategies and work out what to do about these things. And we do that by plagiarising: we steal anything we think is relevant, which is one of the reasons I'm in the room. So I will get at least as much from you, from the questions, as you think you're going to get from me. There's a very long list there and you don't get to read all of it; it's meant to just look impressive as you scan across the people that we steal from.

I wrote a blurb for this, and Robert and I went back and looked at it and realised that I did promise to talk about some strongly held certainties that we have about the future and why they might not be safe — or rather why they might not be accurate, let's put it that way — and I'll put them up here. If you recognise any of these, or have heard any of them before, then that's good. I hear them all the time, and when I ask, "What do you mean by that?", frequently the conversation gets a bit awkward and embarrassing. Hopefully this will skim over some of those bits, because there's a lot to cover, so we have to go fairly quickly and not terribly deep on everything. Do you want to say something about caveats, so nobody misconceives who I speak for? That's a good point. So although I don't mind lecturing here, I am not an organ of state delivering government policy. You are picking the brains of the bloke Al Brown — that's all you should be expecting. This is just me talking.

It's always good to start with a definition, in academic thinking. In this case, I would say that artificial intelligence is something that sits at the top of the Gartner hype curve — the peak of inflated expectations — but simultaneously manages to be something that is frequently underestimated and surprises us. That's quite an achievement. Something that's always better and simultaneously worse than we think is kind of worth understanding. And I use that here as a sort of bucket, because you can find a million different definitions and they're not all terribly well put together.
But the useful way I've found of thinking about it is that it consists of three things, really. One is algorithms, which are essentially just maths, much of which has been around since the fifties or sixties, but we didn't have the other components that allowed us to do anything with it. Then in 2012 we had a revolution in the next leg, which is processing power, particularly graphics processing chips. They allowed us to use some of those algorithms, but they also only became useful once we had the data to exploit. And this is where the internet comes in and proves itself to be enormously valuable: not only is it a tool for finding data, it has also become a system which generates data, which makes a difference.

One of the things we do at DCDC is study trends, which means all of our presentations are full of graphs. So I'm going to tell you some things that you probably know already, but that's fine, because Olympic athletes use priming — listening to particularly inspiring music or watching films before they go and do an event. I'm going to do the same to the Olympic-level audience in this room and show you some of the things we know already, to get you thinking in the right way. This is Moore's Law. People talk about it running out, and it kind of depends on how you choose to measure it: there is a point where the size of the features on a chip versus the size of an atom becomes a problem, but if you measure it in terms of calculations per thousand dollars, it continues to grow strongly. And if you look at data generation, that is increasing exponentially. If you look at why that is, again it goes back to the Internet: connectivity matters. I think we had about 1 million devices connected to the Internet in 1992, we're around a billion now, and 200 billion is one estimate for 2050 — I'm probably slightly off on these figures, maybe it's 50 billion, but it kind of doesn't matter; it's huge growth.

But if I just tell you those numbers, they won't stick in your brain. So if I show you another image, to do that sort of framing thing, you will instinctively recognise these on an emotional as well as an intellectual level. So we talk about data being generated and accruing automatically — that's kind of where we are — and that presents us with the first problem, which is that there's an idea somehow that in warfare you will just be able to opt out of all this. That has its place: if you want to be a successful terrorist, behaving like a 12th-century man makes you quite difficult to find on the Internet. But that doesn't necessarily work for warfare.
An example I'd give you: a typical sortie generates between 20 and 40 laptops' worth of data, which comes back and is presented to a human analyst, who then sits there and tries to make sense of it all. At the tempo that warfare occurs, that necessarily means the vast majority of that information ends up on the cutting room floor. And if you're making decisions on a fraction of the data available to you, you are necessarily taking the risk that you are not making the best decisions that you could. So automation, and ultimately how we process information, is going to be fundamental to warfare, but also to all other aspects of life, I would say. Those graphs you saw about the increasing rate of data availability are true for every part of the economy and society as well.

In 2015 we had an interesting development which hints at the response to that, and this is where artificial intelligence comes in: pretty much for the first time, on a standard benchmark test, machines became better than humans at recognising faces and images. You can see the error rates going down, and, unsurprisingly, the error rate for humans has not changed, because we're not evolving at that pace.

So our ability to automate processing is going to depend on some of those technologies. I described the three constituent elements before; I'd say there are two underpinning elements that make a difference as well. The first one — the image of the guy on the left — is the subject matter experts who are able to do this. At the moment they are a relatively small population, and that makes them highly sought after. In the future there will be economic drivers which mean everybody is going to be interested in doing more education, more research and creating more specialists, but right now they are a rare breed. And that ties into the second picture, which is to do with investment. This is probably best described by a statistic in Wendy Hall's report for the government, when they were looking at artificial intelligence and machine learning. They highlighted a process called acqui-hiring, which is where, as a large company, you acquire a small start-up in order to hire its specialists, and the going rate is £5 to £10 million per person, which is a very significant figure. You will all have heard of the amazing amount that the tiny little company DeepMind was bought for. It means there is ferocious competition.
It's interesting as well, because if you are Google, you won't necessarily just go out and buy the start-up that fills a gap in your stable. You will also buy things you may already have, because you can deny them to your competition. Quite often there are lots of routes to achieve the same end, and if you own all of them you start to move towards being a natural monopoly.

The reason I put up a picture here of the Manhattan Project is that I think this leads towards an interesting phenomenon, which I call the Manhattan problem. In the Manhattan Project, Oppenheimer is running a programme where the state has a monopoly on all the constituent ingredients. You can't go to the shops and buy uranium, and you don't have big stables of scientists wandering around freely. Oppenheimer and a number of people are all working in a very closely confined environment with walls of secrecy built around them, because they don't want the ideas getting out: to be good at building an atomic bomb, you need to build one, and you need to bounce the ideas around those small groups of specialists in order to accelerate the programme and get further. The state had a monopoly on that, which is one of the reasons, of course, that the U.S. got there first, and there's a gap of years before other major world powers start to catch up. In this instance, the reason I call it a problem is that the constituent ingredients are no longer really owned by states; they are owned by technology giants. Take the data: say 90% of all searches on the Internet are done through Google; the nearest competitor, Bing, I think has about the same number of searches as are conducted just through Google Maps. So when you start hoovering up that sort of data, controlling it becomes important, and it belongs to the commercial sector. Indeed, there are regulations that prevent states in certain countries from holding or acquiring data. You'll have heard of GDPR and how that is radically altering what people are allowed to hold, particularly state institutions. And there's a piece of legislation in the UK called RIPA, the Regulation of Investigatory Powers Act, which stops the state acquiring large amounts of data on its citizens just because it can. And that sort of purchasing power doesn't really belong to states either; it largely belongs to commercial organisations, because they can exploit it better as well.

And you will have seen Zuckerberg in front of the Senate recently.
You will have seen the press coverage of Cambridge Analytica, and part of how society feels about this is emerging in that field. But one thing — and this goes back to one of those certainties I said I'd be slightly cautious of — is if you ever hear someone say, "other people will be prepared to do things that we really wouldn't". Well, that's always true. That's just us having a different moral framework from those around us, even in this room. It's a fact of life, and different social groups having different ethical frameworks is a natural state of affairs. I'd be very careful about any exceptionalism — assuming you will always occupy some moral standing which precludes you from doing something the other guys would do. You tend to paint the enemy as alien, when really they just have different views. I would suggest that if you were in North Korea, for example, your comfort with the concept of autonomy is very different from that in a Western state. If you live in a system where individual autonomy isn't socially acceptable, and where you can control where people's families are, every piece of education they receive and what kinds of behaviour are expected, then the idea that you would grant machines a vastly disproportionate level of autonomy, with none of those same levers or institutions of control, is perhaps to paint the opponent in a negative light rather than actually trying to understand them.

You also have to be careful about the idea of some sort of fixed ethical framework. We change our ethics all the time. You can go back and look at things like in-vitro fertilisation, or genetic screening for things like Down's syndrome, and you will find that a lot of the furore tends to disappear in retrospect as we change our ethical framework, as we become comfortable with changes in society and technology. And indeed we're not even that stable now: one third of Britons surveyed in this newspaper report said they feared the rise of the robots, but nearly a fifth were prepared to have sex with an android. That's not a very coherent position — emotionally speaking, let alone intellectually. One of the reasons we tend to hold these images is that we're doing what's sometimes called attribute substitution. We're not familiar with these systems, we can't see either of them at work, and neither the algorithms nor the computers mean anything to us on an intellectual or emotional basis. So we substitute in what we do know, which is sci-fi.
And this particular picture — and I do mean this particular picture — is the most common image people have for the rise of the robots, because it's the one most often licensed by newspapers. The Daily Telegraph printed four stories in four months using the same picture and the same story. So you find that your population starts to develop a mental image of what the future looks like, and it looks a bit like this. For most of us, the newer exposures — say the Zuckerberg hearings and that sort of fear of big data — are one form. This is from a video called Slaughterbots, made by the Future of Life Institute, which comes out of the campaign against killer robots. And if you wanted to run an influence campaign in Britain, this is definitely how to change public opinion — it's really good. The idea is that at the beginning you have a sort of Steve Jobs-like tech figure up on stage. He's created quadcopters with little shaped charges in them. They go out and kill some people nearby — it's never really clear who they are — and then they end up assassinating thousands of students based on things like their Facebook profiles, singling students out because they seem to be too right or left wing. It's a terrifying image and it's quite emotionally powerful.

But one of the reasons it hasn't led anywhere is that it has frustrated the campaign against killer robots and others in the UN who are looking at this, in a body called the GGE, which is considering whether to ban lethal autonomous weapon systems. They are really struggling, and not because there are evil people trying to derail it; they're struggling because of the challenge of working out what it is you write the law about. Nobody particularly wants to get rid of a heat-seeking missile, because we know how those have been used, and nobody wants the sci-fi version with swarms killing scores of thousands of people. But to write a law you need to find a boundary, and all of the things in that video kind of exist in the real world already, so there's very little in it that is new. You couldn't pick any one of those things and say "this is the thing we ban". You can't say facial recognition technology shall never be used again, because otherwise queues keep getting longer and your friends stop being recognised and recommended to you on Facebook, and things like that. So there's no single constituent technology to legislate against, and autonomy is kind of an emergent term.
You tend to think of autonomy as the ability to leave a system alone to get on with its job without being supervised. The problem is that that's contextual: it's based on your assessment of risk, on familiarity, and on the predictability you've seen in a system in the past, which makes generating a law particularly difficult. There are issues that are very pertinent — things like assurance and trust, and how you do collective training — which work more like car insurance or airline assurance in some ways. Those will probably get tackled first, and they will probably be the way this starts to resolve itself, in the same way as you have legislation around airlines which we now broadly agree on. So I would be very wary of fixed concepts or views.

And there's another problem, I'd say. This is a system that sits on the border between North and South Korea, facing north. It's made by a subsidiary of Samsung. It's a remote weapons station, and it has the capability to do automatic detection and targeting. There are variants as well which can do voice recognition and facial recognition. Right now it's linked back to a guy in a control room, so there's a bloke with a joystick deciding whether to shoot or not, but fundamentally it's a software patch away from not needing that. And it does pretty much everything a sentry does — which is great, because it's not sitting in a little box getting rained on, it's more vigilant, it doesn't get distracted playing on its iPhone, and it doesn't doze off because it's tired and had a big lunch. None of those things happen.

And then there's something similar created by a man in America who doesn't like things getting onto his lawn. This is one of the fundamental problems with the idea of a ban: this guy has put this together, and it's essentially the same system, more or less, yet he's running it off a laptop — no different in effect from the thing over there — using some off-the-shelf actuators and some software he's downloaded from the Internet. So one of the problems we've got is that the idea that there's a Steve Jobs-like figure out there leading all of this technology simply isn't the case. Going back to that Manhattan problem idea, this is very much being developed in the civil, commercial world; these technologies are accelerating in that world first. And here's another example: somebody who's got a really old rifle.
If you were to say they're going to put a sighting system on it that has the world's most advanced military accelerometers, links to a GPS mapping system, is also a communications device, does azimuth, has a digital compass and can get live environmental updates on the weather, it sounds really cool. And then you realise I'm talking about strapping a smartphone to the side. You're able to generate some quite complex effects — quite complex technological things — by using quite simple off-the-shelf technologies. The U.S. Marine Corps have a great phrase for this: they call it the democratisation of precision. And it's going to lower the cost of entry. Precision warfare used to be a first-world nation's preserve, and it isn't going to be for very long.

But don't just think about this in terms of technology; think of old actors being able to do new things. You have to think about it in terms of social structures as well. Take self-driving cars — there will be millions of them out there. If I were a member of the Animal Liberation Front, one of the things I'd really like to do is attack Huntingdon Life Sciences scientists and try to kill them. And in the Animal Liberation Front I have no tolerance for being a martyr, or even being harmed, or even being arrested. Right now, that limits me to sending crappy parcel bombs in the mail to try to get at someone at Huntingdon Life Sciences — which, frankly, I think is really stupid — and it does limit what you can do. If in the future I can move my bomb to the target, I no longer need somebody pressed into service to deliver it; I will have access to what has been described as the poor man's cruise missile. So the nature of violence is going to change, and it's almost impossible to regulate all of these things away.

Now, some people think this just leads to a completely flat playing field where everybody has the same things. You will still have economic divergence. Again, this is a bit of footage from now rather than the future: a quadcopter from the world's cheapest retailer, with basically a small grenade attached to a silly shuttlecock-like tail structure, flying over and dropping it on their opponents. And the ability to do that is not the same as the capability shown here. There are other variants — you can look up LOCUST or Perdix on the internet, which are American programmes; this is pretty similar, but developed by a Chinese university in consultation with their defence forces.
The idea here is to build a swarm of systems which are able to interact. So there isn't going to be a one-size-fits-all solution either. I would also say that some of these things show the simplicity of the ideas, and also natural compromises. This is a little remote-controlled aircraft that can land itself on a wall and then take off again and fly somewhere else — I forget which university developed it. The idea is that if I'm trying to look for people from an aircraft, my time — my ability to survey the area — is limited by battery life. If I can park, I can extend the duration of my sensor. But parking on the floor is difficult, because everything has fallen over; it says earthquake here, but you could easily imagine the same somewhere that had been fought over. Walls, though, are self-clearing — nothing stays on them — so being able to perch there is actually quite useful. And the battery life could be extended: another university is looking at turning the propeller that sits on the top into a small wind turbine that recharges the battery as it hangs on the wall, so the designs start to change as they perch. Some of these ideas also come from other directions — you will see stuff that's biomimetic. So again, one of the ideas I'd put to you is that it's going to be a plethora of different things. We tend to think about robots in that single sci-fi picture, but they'll be all sorts of shapes and sizes, and they won't all be fixed, or even all solid. I also like to put this one up because, as I look across the audience, I can try to work out which one in five of you it is, from that newspaper report.

Some people think that the proliferation of these systems across the future battlespace means there isn't going to be any more surprise — everything will be seen and anticipated. But sorry, I've missed a bit. One of the other things that I think artificial intelligence enables goes back to what's on the screen right now. The other pull on power that's quite significant is that permanently broadcasting a signal drains my batteries quite quickly, and in a military sense being noisy has always been dangerous: emitting across the electromagnetic spectrum is like walking around in the dark with a torch on. So the ability to compress that data — so that I'm not sending you full motion video but instead just the word "tank" and a location — means I can send a lot less data, I can be a lot quieter, and I can conserve my battery life.
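To make that bandwidth point concrete, here is a minimal sketch of the kind of thing being described — my own illustration, not something from the talk's slides. It assumes a hypothetical onboard detector has already produced labelled detections, and simply compares how small the transmitted report is against streaming the imagery itself; the message fields and the video bit-rate figure are assumptions for illustration.

```python
import json

# Hypothetical onboard pipeline: a detector has already run on the current
# frame; instead of transmitting the frame, we transmit a tiny text report.

def describe_frame(detections, position):
    """Reduce a frame's detections to a few bytes of structured data."""
    return json.dumps({
        "pos": position,                      # e.g. a grid reference from GPS
        "objects": [
            {"label": d["label"], "conf": round(d["confidence"], 2)}
            for d in detections
            if d["confidence"] > 0.7          # only report confident hits
        ],
    }).encode("utf-8")

# Rough size comparison: one second of compressed full-motion video versus
# one text report (the bit-rate figure is purely illustrative).
VIDEO_BYTES_PER_SECOND = 500_000
report = describe_frame(
    [{"label": "tank", "confidence": 0.91}],
    position="grid 12345 67890",
)
print(f"{len(report)} bytes instead of ~{VIDEO_BYTES_PER_SECOND}")
```

The design point is simply that the expensive recognition step happens on the platform itself, so only a few bytes ever need to cross the electromagnetic spectrum.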
So the ability to develop these algorithms and put them on those systems is going to change how the communications work, and that will help with that volume of data — not just getting it to the individual, but processing it before it ever crosses the airwaves. The electromagnetic spectrum is already too full, and we have to get information to the person as well. So this isn't just about how it gets into the robots; it's about how it gets to the human making decisions. This is an example from Aero Glass. The idea is that rather than pilots having to read or memorise their folder before they take off, they can have all of that data put into the very cool set of glasses you can see the operator wearing, and it presents the data in a way that's fast and intuitive — because speed of decision matters in all forms of life, but definitely in warfare. You can see the immediate read-across to how people conduct warfare and could make better decisions with better information: it's not just a robotics thing, it's a question of how you integrate with the people.

Going back to the idea of those little sensors being everywhere meaning there's no more surprise in war: I don't think that's true. The first example I'd give you is chess. You have a completely glass battlefield — you have perfect situational awareness, you know where everything is, you know all of the rules and the capabilities of the pieces, and you know the tempo as well: your opponent takes a turn, you take a turn. And yet if you play against a better player, you will still find yourself being surprised.

The other thing I'd show you, with a quick quiz: does anyone know who that is? Well, Google's facial recognition algorithm is absolutely certain that that is Milla Jovovich. And the reason it is certain is that he's wearing a slightly psychedelic-looking pair of glasses — glasses again seem to be a feature of this presentation. What they're doing is something called an adversarial example, and the best description of it is on the top left here. The picture of the panda on the top left is the original source picture; the crazy static-looking box is a set of changes to the colours of that picture that are really subtle. Add them together and you create the picture on the right. What that does is amplify the signals the computer sees so as to give a different outcome. You can see the scores below — again from Google's recognition system — so it's 57% confident that the original is a panda.
Then you and I see what looks like the same picture, and yet the machine is 93% confident that it is a gibbon. It's that sort of subtle pixel change that is at work in the glasses that make the computer think this is Milla Jovovich. And it's not just linked to that sort of thing either; there are simpler versions out there. The crazy-looking picture on the bottom left is a pattern developed by an artist called Adam Harvey, who has a sort of Big Brother theme to a lot of his fashion designs. It's a fabric pattern, and what it does is present the set of shadows, gaps and distances that facial recognition software is looking for. As the folds of your clothes change while you move around, those distances change, and when you walk past a facial recognition camera it doesn't simply fail to spot you. Instead it reports your name buried in a list of thousands of other names, and if the computing power behind it isn't good enough, it now has hundreds of spurious faces to sift through to find you among all of those other names — which probably correlate to real people as well. So deception is alive and well. I like this one: it shows the same thing in real time for something that's three-dimensional. You can see the live picture there — it's a 3D-printed turtle with some of those adversarial patterns on it, and you can see it consistently being recognised as a rifle. So the prospect of surprise disappearing, or of machine judgement being beyond deception, seems to me extreme.

These four people are all interesting — does anyone know what they have in common? Well, the answer is that none of them exist, and none of them ever has. These are all generated by computers that are trying to create, from their own imagination if you like, their best sense of what a human looks like, and you can see the quality has increased quite radically. So the prospect of deception isn't just that we can fool machines; increasingly, machines will allow us to fool each other as well. Another example I'll give you is live, real-time video stitching. Some of you will know this from slightly less savoury corners of the internet — I'll leave that to your own morals — but you can see here that somebody is able, in real time, to change the expressions, and there are other videos of people swapping faces entirely. If you stare at George Bush's mouth on this, it looks a little bit grainy, but it's worth remembering that it will never be this bad again. This is only going to get better.
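For anyone who wants to see what the panda example above actually involves, here is a minimal sketch of the fast gradient sign method, which is the standard way such perturbations are generated — my own illustration, assuming PyTorch and an off-the-shelf pretrained classifier rather than whatever model produced the original figure; the epsilon value is likewise just an assumption.

```python
import torch
import torchvision.models as models

# Minimal fast-gradient-sign-method sketch: nudge every pixel slightly in the
# direction that most increases the classifier's loss for the true label.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

def fgsm(image, true_label, epsilon=0.007):
    """image: (1, 3, 224, 224) tensor scaled to [0, 1]; returns a perturbed copy."""
    image = image.clone().requires_grad_(True)
    loss = torch.nn.functional.cross_entropy(model(image), true_label)
    loss.backward()
    # The per-pixel change is tiny, so the picture looks unchanged to a human,
    # but it is aligned with the model's gradient, so the prediction can flip.
    return (image + epsilon * image.grad.sign()).detach().clamp(0, 1)

# Usage sketch: x is a preprocessed photo, y the correct ImageNet class index.
# adv = fgsm(x, torch.tensor([y]))
# print(model(adv).argmax())   # often no longer the correct class
```

The point is that the attack needs nothing more exotic than the gradients every deep-learning framework already computes, which is part of why these tricks have moved so quickly from papers into fabric patterns and 3D prints.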
So our ability to say "seeing is believing" on the Internet is going to diminish, and this is an area that is rapidly increasing in capability, with a lot of work ongoing. DeepMind is a fascinating organisation to be speaking to about a lot of this. One of the things that's interesting, I think, is in the world of cybersecurity: one of the things professional cybersecurity analysts are worried about is how some of these techniques transpose. DeepMind's next trick is to try and win a game called StarCraft. The interesting thing about StarCraft from a machine learning perspective is that you don't get to see the whole battlefield; the other side is shrouded in fog until you go there and look. They aren't the same as you — they don't necessarily have the same pieces. You play in real time, so there's a tempo element, and there's a long-term versus short-term strategy component: do I take all my guys and punch all your guys in the face at the very beginning, or do I go a bit slower and build an arms factory so that when I come back I have big tanks, or maybe do something in the intermediate term? There's a series of judgements that need to be made. And the thing that's interesting to the cybersecurity world is that this could be readily transposed to what are called advanced persistent threat techniques. How that works in cybersecurity is this: you have a shield wall in front of you protecting yourself, and I am a hacker. I come up and probe your shield wall until I find a gap — I'm looking for a weakness in your system. But more than that, I might do something which causes you to move one of your shields or responders. I'm not just trying to smash through: if I keep pressing here, I notice you move this over there, I think that exposes a weakness, and then I can get in over here. Right now that requires very skilled hackers and a lot of time and expertise, because they're manually reading code to find their way into the system. But if you can do that at the cyclical rate of machine learning, your tempo becomes vastly faster. And this means that not just in offence, trying to hack in, but in defence too, you're going to find that systems like this probably come to dominate cyber attack and cyber defence. The other thing is that you don't have to understand it from the ground up, in the same way as a pilot doesn't have to be able to build an aeroplane to be able to operate it — you just need to learn to fly it.
You don't need to be able to code this from first principles: if you can take a tool, point it at a target and send it on its way, it lowers the threshold for those sorts of attacks.

Another factor that's going to change the economics of war, I think, is the idea of huge headquarters and massive logistics supply bases sitting safely in the rear. The reason I say that is the idea of what are called parasitic warheads. The more accurately you can target something — the more precisely you can pick your spot — the less explosive you need to destroy it. So if I can send a small unmanned drone to fly to Camp Bastion looking for the wings of aeroplanes, and it just puts a small charge on them, I'm starting to make some really expensive aircraft unserviceable at relatively low risk. There's a great quote which says that lining up all these aircraft on the runway might be the next version of lining up the ships at their moorings. So there are some old lessons we'll have to relearn: in the Cold War, a whole chunk of the army wasn't just sitting in some big garrison somewhere, it was trained to move and hide in forests. One of the lessons I think is coming out of Ukraine, where both sides are using drones to detect things, is that both sides have to behave as if the other side has air superiority, because whoever's air force is dominant no longer determines whether somebody is looking down at you from the sky. And if you are seen, obviously you can be targeted. So the concept of a benign sky, I think, is going to disappear, and it opens up other tactics as well — the idea of cost asymmetry, for instance. A great example here is David Perkins, the U.S. general, who pointed out that a U.S. ally in the Middle East had fired a $3 million Patriot missile to destroy a drone which cost about $200. And it was successful. But Frank Hoffman describes this as a losing strategy, where my defensive countermeasure costs vastly more than your offensive measure. And Perkins went on to say that if he were the guy who had bought the drone, he'd be delighted: he'd now go out and send as many of those cheap drones as he could, knowing it costs you $3 million every time he does it.

So the economics of overwhelming things is going to change. And one of the other things people tend to say is that it will all be swarms. I don't think it's as simple as that either. There are environments where it makes loads of sense to be a swarm — say, where one defending missile can only take out one small element of it.
But there is real-world physics waiting to intrude. The reason we have things like AWACS is early warning: if you are looking over a border, you need a really big radar dish to see a long way. A swarm of small things suddenly stops being early warning — it just tells you what's crossing the border right now. So it's going to be a mix of the big things we recognise now operating alongside new systems.

And there are other practicalities that people skate over. Some people have described the advantages of autonomous armoured vehicles like tanks. I don't think simply swapping the humans out for robots is going to be the way this plays out, and the example I'll give you is this: a robot tank has been designed and created, and it could be part of a platoon of, let's say, four tanks where three of them are robots and the crewed tank sits at the back. That sounds fantastic — the robots go forward and do the dangerous stuff with the humans at the back making the tactical judgements — until one of those tanks has the armoured equivalent of a flat tyre, which is throwing a track. Now the only people who can get out and put the track back on are the crew, and you only have one crew. So I've gone from having four tanks to having, in effect, no tanks, whereas when they were all crewed I'd have gone from four tanks to three. Sometimes those small real-world practical moments matter. Again, this was shown to a prominent tank commander, and he hated the idea — and it wasn't the thrown-track thing, it was the rearming, because his perspective was that at the end of the battle the crews are going to spend the next two days exhausting themselves just feeding ammunition back into the weapon systems. So small, pragmatic realities are going to make a difference in understanding how these things play out.

I'll run a video here — you'll see it comes with Tom Cruise dialogue, which is fun. This is from a film called Edge of Tomorrow, and it goes back to this idea that the machines are going to be super-brilliant. It's worth understanding that Tom Cruise has the same arc in this film: he starts out rubbish at fighting this war, in the same way that AlphaZero was rubbish at playing chess the first time. And yet within four hours it was at grandmaster level. But, like Tom Cruise, it is fighting the same battle again and again and again. There are changes within it, but it's essentially always the same battle.
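As a toy illustration of why that repetition matters — my own sketch, nothing to do with DeepMind's actual systems — here is a tiny tabular learner on a fixed "battle": a six-square corridor it must cross. It only becomes competent because the identical problem can be replayed thousands of times at no cost.

```python
import random

# Toy illustration of learning by repetition: a six-square corridor the agent
# must cross. The environment never changes, so replaying it thousands of
# times is free -- which is exactly the luxury a real war does not offer.
N, GOAL, EPISODES = 6, 5, 3000
q = {(s, a): 0.0 for s in range(N) for a in (-1, 1)}   # tabular action values

def run_episode(epsilon):
    s, steps = 0, 0
    while s != GOAL and steps < 50:
        # Mostly follow the best known action, occasionally explore.
        a = random.choice((-1, 1)) if random.random() < epsilon \
            else max((-1, 1), key=lambda act: q[(s, act)])
        s2 = min(max(s + a, 0), GOAL)
        reward = 1.0 if s2 == GOAL else -0.01
        q[(s, a)] += 0.1 * (reward + 0.9 * max(q[(s2, -1)], q[(s2, 1)]) - q[(s, a)])
        s, steps = s2, steps + 1
    return steps

for episode in range(EPISODES):
    steps = run_episode(epsilon=max(0.05, 1.0 - episode / 1000))
    if episode % 1000 == 0:
        print(f"episode {episode}: reached the goal in {steps} steps")
```

Change the corridor between episodes and the learned values are worth very little.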
One of the reasons we always say that we're ill-prepared for the next war is that we've never fought the same war twice; they're always different. So your ability to plug a machine in and have it optimise depends on your ability to generate the data it needs, or to have a repetition rate that allows it to learn alongside you. Another analogy I'd give you is fruit flies. People study genetics in fruit flies because they breed fast, mutate fast and die fast. If we had studied genetics in elephants, the experimenters would die faster than the experiments finished — so there's a fairly good reason we picked the ones we did. I think the same applies to understanding where artificial intelligence can learn and get smarter: the types of problems are going to shape where we can actually use these things, and it plays out in funny ways.

When we talk about the economics we also tend to talk about mass unemployment, but which jobs are automatable is interesting. There is quite often an assumption that the lowest-paid jobs will be the ones to go first, because they can't be that hard given how little we pay people to do them. But it's wrong. The example I'd give you: on the left is the autopilot. It is able to fly a 747, because it's doing things we think are complicated but which are just maths, and you can automate that. Whereas the cabin crew member on the right has to do all sorts of things we find really difficult to codify — dealing with troublesome customers, working out who really did order the vegetarian meal, all of that stuff — and those are very poor situations for computers. Which means there are counterintuitive examples: in legal case reviews and in diagnostics there's loads of automation happening, and it's affecting people regarded as among the brightest and best that society has educated, and yet the fruit pickers and office cleaners are proving remarkably difficult to automate, at least economically. And at the top of the pile, going back to that idea of what you can codify: if Thucydides could describe fear, honour and interest as being central to strategy, we still struggle to articulate what those things are to ourselves. The possibility that we can reduce them to maths seems a very distant prospect, so the odds of machine learning producing what we would be happy to call strategy seem quite remote — although machines will certainly affect strategy, because they will be tools that matter. So understanding relative strengths becomes important. Machines are faster than us.
They are already proving better at detection and recognition. They are typically more efficient at the things we can focus them on, and they are better at optimising in all sorts of ways, especially on problems we have a lot of data for and can cycle through quickly. They don't get tired, sleepy or distracted. But they are poor at context. For a robot to walk into a room and make a determination as to whether that's a child who has just picked up a rifle or a child soldier is extremely difficult, whereas it could well be an intuitive, almost unconscious judgement by a soldier walking into the same room, because the context is what matters. Right now machines are pretty bad at that, although there's a big push to try and change it. The ability to take lessons from one environment and reapply them in another is a very powerful human trait — it is no mistake that all of the longest-running religions are books of allegory, because lessons written about worlds that no longer exist can still be applied by humans successfully, which is perhaps part of why they are still followed. That is not within the reach of the machine when something is new, for the obvious reason that there's no data: if it's the first time something has been seen, the machine struggles. Then there's versatility. A human can do all sorts of things moderately well, if not perfectly, but quite often taking a machine and putting it onto a new problem requires re-engineering, and there's a thing at the moment called catastrophic forgetting: if I teach a machine to be amazing at recognising objects and then teach it something else, it forgets much of what it had learned. There is some progress being made on putting a sort of elastic band on the memory, but it's all pretty new, and right now such a machine can end up as bad as if you had started with a completely blank slate.

And one area people assume will be among the first to be automated is complex manual handling. If you want an idea of how difficult something is that we have evolved over thousands of years to do really well — so well that we take it for granted — I'd suggest you go home, sit on your hand until it goes numb (the story isn't going where you think it's going), and then try to strike a match. You'll probably go through the whole box before you get anywhere near doing it successfully, because you're relying on unconscious feedback, pressure modulation and a whole raft of other things. That, right now, is why machines are squashing soft fruit rather than actually packing it.
Firms like Ocado have managed to automate almost everything else in that supply chain apart from the irregular handling of soft objects. And again, the picture at the bottom: these systems still make mistakes that seem completely alien, mistakes we would never make.

So it's going to be a cocktail of human and machine; it isn't just going to be the machines taking over. And as it becomes more economical to have lots of cheap points of presence, human consciousness naturally and logically must become proportionally scarcer on the battlefield compared with the number of points of presence you can field. You then get to a place where you can't afford to waste it on filling in Excel spreadsheets or other stuff that is essentially mechanistic; you have to preserve it for the context and judgement tasks where the machine is weaker. And to give you an example, it won't just be one bloke and one machine. You will have teams that do this — this is obviously my own symbology, kept deliberately simple — and the brackets show where those people are putting their consciousness and their intelligence, what they're actually thinking about. There will be some cases, like the one on the far right, where you have a single drone, and maybe it's flying near the French president, so you really don't want any emergent bright ideas from the machine that you weren't expecting. That's one person sitting there very deliberately flying that one system, and if it's complex to do, multitasking diminishes — which is one of the reasons you shouldn't text while you drive — so you focus your consciousness there. Alternatively, you might find you are controlling one thing which is in fact a swarm. You'll do it in concert with a machine: "take this swarm over there" is an instruction you can give, but the machine has to do the coordination, because there's no way you can fly fifty aircraft at once. It will also be about how you put together a package of things. Say, in the little bracket on the left, we want a surveillance system that isn't carrying any heavy metal because we want it to stay up for a long time, and we want something with a gun down low so that we can actually do the shooting, and we want them working in concert — a team of things designed to maximise different kinds of strengths. But you probably want one person directing that whole package, so that the decision-making is concentrated.
435 00:40:09,590 --> 00:40:15,740 And then there'll be other things where, for example with the AWACS in the top left, you don't want to have to spend any time thinking about it at all. 436 00:40:16,190 --> 00:40:18,440 You just want to sit there and say: right, 437 00:40:18,630 --> 00:40:23,270 your only job is this, and I'm not going to pay attention to you, because staring at empty sky is a waste of that consciousness. 438 00:40:23,570 --> 00:40:27,380 You will go there, you will fly around in this area and you will shout when something happens. 439 00:40:27,590 --> 00:40:29,510 Otherwise, just let me know when you're about to run out of fuel. 440 00:40:30,020 --> 00:40:33,890 So it's going to be about how you balance those things, and it isn't all going to arrive overnight. 441 00:40:34,010 --> 00:40:37,880 I think the way it arrives will be a series of overlapping phases. 442 00:40:38,060 --> 00:40:40,900 And I can't give you a timeframe; I can't tell you that in ten years it will be this. 443 00:40:41,540 --> 00:40:49,940 And I think if anyone does, they're claiming to be very, very, very clever. It's probably going to play out in the same way as air power did. 444 00:40:50,120 --> 00:40:53,810 At the beginning you had battleships, which were the kings of the sea. 445 00:40:54,200 --> 00:40:59,000 In the First World War that was what gave you control of the oceans. 446 00:40:59,570 --> 00:41:08,210 Between the two wars, you spent enormous amounts of money to get incremental improvements in what were essentially the same systems. 447 00:41:08,510 --> 00:41:12,520 So by the time the Second World War starts, they've got better armour, 448 00:41:12,530 --> 00:41:16,690 they can shoot further and more accurately, but you've paid a fortune for that. 449 00:41:17,060 --> 00:41:19,820 Meanwhile you have the little box-kite type things that were as likely to fall out of 450 00:41:19,820 --> 00:41:23,120 the sky in the First World War and kill their pilots as the other side was to shoot them down. 451 00:41:23,570 --> 00:41:27,920 They'd had tens or hundreds of dollars spent on them, for much bigger jumps in technology. 452 00:41:28,280 --> 00:41:34,950 In the three pictures you see there, the first is a battleship with a tiny little plane being flung off a catapult in what looks like a fairly suicidal arrangement. 453 00:41:36,170 --> 00:41:40,790 The middle picture is a Sopwith taking off from HMS Pegasus, a World War One aircraft carrier. 454 00:41:41,480 --> 00:41:46,100 And then by 1942 the battleship is still the most formidable thing in a close-range fight, 455 00:41:46,190 --> 00:41:51,310 but it's become largely irrelevant, because you can never get that close: the aircraft carrier is simply able to outrange it. 456 00:41:51,740 --> 00:41:53,330 So I think we'll see the same sort of thing here. 457 00:41:53,330 --> 00:41:59,719 The first things will be stuff that augments what we already do, because we're not yet in a position to understand these 458 00:41:59,720 --> 00:42:06,440 technologies well enough to know which are definitely going to change things, so the old and the new will operate in parallel. 459 00:42:06,710 --> 00:42:11,480 And then I think, eventually, some things will simply become redundant because they're no longer needed. 460 00:42:12,290 --> 00:42:15,979 But I also don't think it's going to be just one box. The reason there's a picture of a radio up there 461 00:42:15,980 --> 00:42:19,610 is that this is more of a component-level technology in some ways.
462 00:42:20,060 --> 00:42:24,170 In the same way, the French arguably had the best tanks at the start of World War Two, 463 00:42:25,580 --> 00:42:33,049 but it was actually the combination of armour, artillery, transport and people, all melded together 464 00:42:33,050 --> 00:42:37,780 through the medium of the radio, that allowed Blitzkrieg to be as successful as it was at the start of the war. 465 00:42:38,840 --> 00:42:45,530 Now, I've spoken to a number of people who tend to get very depressed because they see all of the threats in this. 466 00:42:45,860 --> 00:42:50,410 I don't think there is one natural, automatic winner, which brings me to my next point. 467 00:42:52,310 --> 00:42:57,980 I think the future is neutral. I don't think anyone naturally owns this. 468 00:42:57,980 --> 00:43:02,810 There are places that have natural advantages because of the research that is going on there. 469 00:43:03,440 --> 00:43:05,509 If you look at scientometrics, 470 00:43:05,510 --> 00:43:13,399 the study of things like patent filings, the three leading countries in the world for filing patents are currently the US and China, 471 00:43:13,400 --> 00:43:16,850 whose order changes depending on how you measure it, and then the UK. 472 00:43:17,300 --> 00:43:23,780 Being in that bracket is certainly very healthy. But in understanding how these things will play out, the fact that it's going to be driven largely by 473 00:43:23,780 --> 00:43:27,740 the commercial sector means that quite a lot of it is going to be available to everyone. 474 00:43:28,160 --> 00:43:32,600 So it's going to be the proactive experimenters who make the most of it, 475 00:43:32,610 --> 00:43:39,800 in the same way, I think, as was true for the emergence of air power. And speaking of proactive, inventive people, 476 00:43:40,430 --> 00:43:46,940 I'm going to show you this: it's a test I was doing yesterday, one of those prove-you're-not-a-robot checks. 477 00:43:49,340 --> 00:43:57,110 Now, if I'm labelling these images, I think I'm really working for Google. 478 00:43:57,470 --> 00:44:05,450 They were trying to build self-driving cars, so am I really proving I'm not a robot, or am I a slave doing data labelling for them, 479 00:44:05,450 --> 00:44:13,430 especially since the old CAPTCHA process was beaten, I think in 2017, by a robot? 480 00:44:13,790 --> 00:44:17,060 So there are robots out there that can beat earlier forms of this test. 481 00:44:17,540 --> 00:44:29,380 It's all going to be about how clever you are in getting everything you need. And on that, I'll hand back and see how much time we have left.