Good afternoon. It's a pleasure and an honor to give the last talk in this fantastic conference that Ursula and her team have organized.

Because we are celebrating the 200th anniversary of Ada Lovelace, the conference looks back at history. But part of what makes Ada Lovelace really significant today is what happened over those 200 years. The fact that information technology today so thoroughly permeates and influences life makes a contribution from 200 years ago look so significant. And since it is up to me to close this conference, I will be looking into the future. And looking into the future, we should use the Ada Lovelace criterion, which is: are we doing things for the use of mankind?

So this is a wonderful quote, and I'm grateful to Betty Toole that she closed with it yesterday and gave me a fantastic opportunity to open with it today. Ada Lovelace, in a letter to Babbage, writes (I am shortening it): far be it from me to disclaim the influence of ambition and fame. I wish to add my mite (I presume this is the spelling of the time; she's not talking about insects) to the most effective use of mankind, or for mankind.

So the question I want to put forward is: does the technology that we're developing really, ultimately, benefit mankind? And the lens through which I want to examine this is the lens of jobs.

We're just coming out of a recession that economists now call the Great Recession, the deepest recession we've witnessed since the Great Depression. To appreciate the depth of that recession, look at the plot here, which shows on one dimension time from the beginning of each recession, and on the other dimension the loss of jobs. You can see that the recession we just emerged from was the longest and the deepest since World War II. Really, if you want to see an equivalent, you have to go all the way back to the Great Depression.

But what I want to do here is not talk about what happened over the last few years, but to take a longer look at the issue of jobs. And for that we're going to look at what economists have been calling the great decoupling.
What drives economic growth overall is productivity. If you look from roughly post-World War II (this data starts in 1953), labor productivity consistently increased, and it produced benefits for everyone. GDP grew; with the growth of GDP, jobs were created; and family income rose. These are four separate indicators, but they go up together in harmony, and they did so over a period of 30 years. So much so that we came to think: well, they must go together in harmony; this is an economic law.

But it turns out that this is not the case. What has happened since roughly 1980 is becoming known as the great decoupling. Now we can see that productivity can continue to grow, and as productivity continues to grow, the economy grows (this is GDP), but somehow the benefits are not spread uniformly. You can see the two bottom lines: one is private employment, that is, non-government jobs, which flattens; and even worse is median household income. To sharpen this, you can zoom in, and you can see how things are really breaking apart. There is this phrase that many economists used to believe in: a rising tide lifts all boats. If the economy grows, everybody will benefit. And now it seems that the tide and the boats have parted company.

This is leading to a furious debate among economists, and the question is: what is causing this? On one hand, there is a group that calls itself, almost proudly, the neo-Luddites, and as the name suggests, they blame technology. So, Brynjolfsson and McAfee from MIT: productivity growth and employment growth started to become decoupled from each other; we're creating jobs, but not enough of them. Sachs and Kotlikoff: what if machines are getting so smart, thanks to their microprocessor brains, that they no longer need unskilled labor to operate? Or Krugman, Nobel prize winner and famous New York Times columnist: can innovation and progress really hurt large numbers of workers, maybe even workers in general? The truth is that it can,
and serious economists have been aware of this possibility for almost 200 years. So much for the neo-Luddites.

On the other side, you have the people who call themselves neo-classical economists, and they say: hogwash. Since the dawn of the industrial age, a recurrent fear has been that technological change will spawn mass unemployment. Neo-classical economists predicted that this would not happen, because people would find other jobs, albeit possibly after a long period of painful adjustment. By and large, that prediction has proven to be correct. So why should we think that this time it will be otherwise?

That is the nature of the debate. The neo-Luddites are saying: this time it's going to be different. And the neo-classicals are saying: no, why should it be different this time? So who is right? Economists cannot agree, and I'm not even an economist, so I want to take the computer-science view of this question.

To get this view, we go back to the birth of artificial intelligence, attributed widely to Alan Turing, who in a very famous 1950 paper (which I think is his most cited paper, even more cited than the paper in which he introduced Turing machines) discusses the question: can machines be intelligent? The paper is very famous for the Turing Test, but the Turing Test, in my opinion, is the weakest part of the paper. The strong part of the paper is the philosophical analysis of the feasibility of intelligent machines. And Turing comes out very strongly: yes, it is feasible. He writes: I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.

In the paper he goes through all the different kinds of objections, and one of these objections came to be known as the Lovelace objection. So what is the Lovelace objection? Going back to her famous notes, where she writes: the Analytical Engine has no pretensions whatever to originate anything; it can do whatever we know how to order it to perform. And most people say: yeah, you write a program, and the program will only do what you want it to do.
Except that Turing already understood that even when we write every line of a program ourselves, predicting how that program will behave is, in general, impossible. This is really the essence of his 1936 paper: deciding how a Turing machine will behave is undecidable. And he writes: machines take me by surprise with great frequency. Lovelace's argument did not appeal to Turing. The reality is, neither Lovelace nor Babbage ever actually executed programs. [LAUGH] Anybody who wrote a program and then let it run knows that it never behaves [LAUGH] the way you want it to behave. We're surprised every time we run a program.
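[A minimal sketch, not from the talk: the diagonalization behind Turing's 1936 undecidability result, written in Python for concreteness. The names contrarian_factory and halts are illustrative; halts stands for any claimed perfect predictor of program behavior, which the argument shows cannot exist.]

def contrarian_factory(halts):
    # Given any candidate predictor halts(program, argument), assumed to
    # return True exactly when program(argument) eventually stops, build
    # a program that the predictor must misjudge.
    def contrarian(program):
        if halts(program, program):
            while True:   # predicted to halt: loop forever instead
                pass
        else:
            return        # predicted to loop: halt at once instead
    return contrarian

# Feed contrarian to itself. If halts says contrarian(contrarian) halts,
# it loops; if halts says it loops, it halts. Either way the predictor
# is wrong on some program, so no perfect predictor of program behavior
# can exist. That is why programs keep taking us by surprise.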
Now, Turing was a techno-optimist, and you see this techno-optimism reflected very much in the early age of AI. Simon, 1958: within ten years a digital computer will be the world chess champion. Minsky, 1967: within a generation the problem of creating artificial intelligence will substantially be solved. Okay, a generation, take it as 30 years: 1997. Well, here we are in 2015, and we've not quite solved it yet. [LAUGH]

This early optimism was followed, as happens very often in AI (somehow it's endemic to the field), by periods that came to be called, especially by people in the field, AI winters. And what is an AI winter? One, it's slow progress. But slow progress we can tolerate. Loss of funding, absolutely not. That is very bad. We should never tolerate it. And there have been at least two such periods: the second half of the 1970s, and then from the mid-1980s to the mid-1990s.

But in the late 1990s, AI seemed to have turned a corner. An event that was deeply influential was IBM's Deep Blue beating Kasparov in chess. Kasparov was then the reigning world champion, still considered by many to be the best chess player of all time. The competition took place in New York City. IBM invited me, and paid travel expenses and hotel expenses; I couldn't say no, even though I'm not a big chess fan, and my original conception was that watching a chess game that takes several hours would be like watching the grass grow. [LAUGH] But they actually made it very interesting, and I watched the first game.

In the first game, Kasparov was white and Deep Blue was black. And Kasparov was in a very disadvantaged situation, because while Deep Blue could study Kasparov, Kasparov had no idea how Deep Blue plays; this was a fresh player he had never seen. Nevertheless, Kasparov played a brilliant defensive game and won. And I remember thinking to myself: well, one day computers will win in chess, but the time has not come. I had left my wife in Texas, so I went back to Texas, and I skipped the second game. >> [LAUGH] >> Done. In the second game, Deep Blue was white. Deep Blue went on the offensive and beat Kasparov, and Kasparov was so shocked psychologically that he never recovered, and he lost the six-game match.

So this was one milestone. Then in 2005 there was the DARPA Grand Challenge, which was about having an autonomous vehicle drive through the desert. In 2004, the first time they ran it, no car went more than, I think, seven miles, and everybody said: oh, this is too difficult. But in 2005 the Stanford car was able to complete the 132-mile course through uncharted desert, completely autonomously. Two years later they held the Urban Challenge, and the CMU team was able to drive 55 miles in an urban setting while obeying all traffic laws and avoiding hazards.

And then in 2011, another really shocking milestone: again IBM. Watson beat the two greatest Jeopardy players of all time, Brad Rutter and Ken Jennings. Decisively, decisively. And this is answering all kinds of clever clues; the clues can be very clever, and seemingly you really need to be intelligent to respond. If you look at the clues, it's quite amazing. And yet you can do this purely by brute force, with nothing like what we think of as intelligence.
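[A toy sketch, not IBM's code: the brute-force core of computer chess is exhaustive minimax lookahead, which Deep Blue scaled up with specialized parallel hardware and a hand-tuned evaluation function. The game interface below (moves, play, is_over, score) is a hypothetical stand-in.]

def minimax(game, state, depth, maximizing):
    # Exhaustively search depth plies ahead and return the best score
    # achievable, assuming both sides play optimally.
    if depth == 0 or game.is_over(state):
        return game.score(state)
    results = [minimax(game, game.play(state, move), depth - 1, not maximizing)
               for move in game.moves(state)]
    return max(results) if maximizing else min(results)

# Nothing here understands chess beyond legal moves and a scoring
# function; playing strength comes purely from searching enormous
# numbers of positions.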
And today we see that robots continue to make progress. This is a robot called BigDog. [SOUND] This is not new, it is ten years old, and it is something, you can imagine, the military would very much like to have: mechanical mules. [SOUND] And you can see what happens when somebody gives it a kick. Almost like a real animal, it somehow finds its balance again. And we have all heard now of the Google driverless car.

[SOUND] [MUSIC] Look ma, no hands. [MUSIC]

So we can go back to Turing's question: what are the prospects for intelligent machinery? I would say very good. AI is making steady progress, and you can go back to Turing's analysis; it's a very compelling analysis of why it's hard to argue that one cannot have intelligent machines. And there are lots of intelligent machines around us; they just happen to be biological machines. Some think that is because we were made in God's image, but if you take the metaphysics out of it, we are all existence proofs that intelligent machinery is possible. Maybe silicon is not the answer; maybe we'll find a different way to build machines; but there can be intelligent machines.

In fact, if you go right now to news.google.com and try a search for something like "robots jobs", you will see that every time, something new has happened. Robots are performing more and more jobs that people used to do. Being a pharmacist: the biggest challenge of being a pharmacist is reading the prescription; after that, the rest is mechanical. Deboning a chicken, which your mother used to be very proud of, turns out to be mechanizable; I cannot do it, but it seems it can be mechanized. Prison guards, sedation, bartending: more and more jobs today we are able to mechanize.

So the fear of intelligent machines goes back a long way. In some sense it goes to the question of the essence of humanity; it goes back to Frankenstein, and to the invention of the word robot by Karel Čapek. And this is very much in the news today. In fact, just around us, Oxford has the Future of Humanity Institute, and one of the risks they worry about is the risk coming from strong AI. Cambridge has the Centre for the Study of Existential Risk, where people think of AI as some kind of existential risk. And recently we have Stephen Hawking, Elon Musk, Bill Gates; everybody is talking about the risk of AI. But rather than think about this existential risk, I want to be much more precise, and talk about the impact on jobs.
234 00:16:08,260 --> 00:16:12,613 And because we went through a recession that had such a significant impact 235 00:16:12,613 --> 00:16:16,840 on jobs, this really got people attention back to the concept of jobs. 236 00:16:16,840 --> 00:16:22,770 And in 2009, Mountain Fold published a book, The Lights at the End of the Tunnel. 237 00:16:22,770 --> 00:16:25,720 And I think the metaphor has to come with, when you are walking through a tunnel, 238 00:16:25,720 --> 00:16:30,660 and you see lights at the end, is it the exit light? 239 00:16:30,660 --> 00:16:34,230 Or are these the lights of a truck coming towards you? 240 00:16:34,230 --> 00:16:38,030 And so he writes, it's it possible that auxilurating computer technology 241 00:16:38,030 --> 00:16:41,750 was a primary cause of the current economic global crisis and 242 00:16:41,750 --> 00:16:45,350 that even more disastrous impacts lie ahead. 243 00:16:45,350 --> 00:16:48,620 When the book came out it did not receive much attention, 244 00:16:48,620 --> 00:16:50,870 partly he's not a brand name. 245 00:16:50,870 --> 00:16:57,170 He doesn't come from an institution with a pedigree, he was a software entrepreneur. 246 00:16:57,170 --> 00:17:00,400 But in 2012, two really blue chip economists, 247 00:17:00,400 --> 00:17:05,214 Brynjolfsson and McAfeee at MIT publish a book, the Race against the Machine. 248 00:17:05,214 --> 00:17:09,110 In there they made the case very powerfully, technological progress 249 00:17:09,110 --> 00:17:14,660 is accelerating innovation, even if it leaves many types of workers behind. 250 00:17:14,660 --> 00:17:17,310 And so now the debate between the new Luddite and 251 00:17:17,310 --> 00:17:20,810 the new classical is raging on. 252 00:17:20,810 --> 00:17:26,130 And the new Luddites are marshalling a huge amount of data on their side. 253 00:17:26,130 --> 00:17:31,440 So I want to spend a few minutes just going through hardcore economic data. 254 00:17:31,440 --> 00:17:34,790 So one of thing is manufacturing. 255 00:17:34,790 --> 00:17:37,860 This is US data and 256 00:17:37,860 --> 00:17:41,600 we all heard how all of manufacturing went to China, China is doing all 257 00:17:41,600 --> 00:17:45,600 the manufacturing and manufacturing in United States has been destroyed. 258 00:17:45,600 --> 00:17:48,740 Turns out this is completely false. 259 00:17:48,740 --> 00:17:53,645 If you look at manufacturing volume and here you'll see data from 1950 to 260 00:17:53,645 --> 00:17:59,011 2010 of course it zigs and zags, but the general trend, it goes up all the time. 261 00:17:59,011 --> 00:18:05,521 The US is a manufacturing giant, but employment peaked somewhere around 1980. 262 00:18:05,521 --> 00:18:08,320 But since then it's been going down. 263 00:18:08,320 --> 00:18:10,460 Okay, all the jobs have gone to China. 264 00:18:10,460 --> 00:18:14,930 If I'll put here a Chinese data, it would do something similar. 265 00:18:14,930 --> 00:18:19,670 Except what you'll is that the peak of manufacturing employment is not in 1980, 266 00:18:19,670 --> 00:18:21,090 but is maybe around 2005, and 267 00:18:21,090 --> 00:18:27,900 then it peaked, and now it's going down also in China. 268 00:18:27,900 --> 00:18:35,351 A DOS almost pride itself in being as opposed to UHaul, a job creation machine. 269 00:18:35,351 --> 00:18:39,047 And in fact if you look at the 1940s, the 1950s, 270 00:18:39,047 --> 00:18:43,720 the 1960s the US really generated many, many jobs. 
But then you see the job machine starting to slow down around 1980: job growth goes down to 20% in the 1980s, and another 20% in the 1990s. And if you look at the first decade of the 21st century, the whole decade shows net job losses. Overall, between the boom and the bust and the recession, we have not created new jobs in the United States.

Now, a couple of years ago Piketty's book came out; I assume that if you are walking on this planet you have heard of Piketty's book, because inequality is a very hot political topic. You can see that if you look at income, the top 1% is way up there compared to everyone else. Or if you look at share of income, you see the same picture. Of course the numbers always zig and zag, especially at the top; the top 1% are in fact much more sensitive to the economy, in percentage terms, than people at the bottom. But if you look at the share, through all the zigzags, only the top 10% are gaining; everybody else has lost in share of income.

Now, I assume that most of us think of ourselves as the 99%, looking way up there at the 1%. But even if you slice up the top 1%, you see the same phenomenon. Here you see the bottom 90%, which has just flatlined, right? Above it you see the top 1%; above that, the top 0.1%; and way above that, the top 0.01%. So even within the 1%, people can complain about the top 1% of the top 1%. >> [LAUGH]

>> The economy periodically contracts and expands, and you can ask: when the economy expands, where do the benefits go? You can see that in earlier periods, most of the benefits went to the bottom 90%. But this has been decreasing, and now most of the benefits go to the top 10% when the economy grows.

Now you could say: well, this is class war. Why are you complaining that other people are doing better? Just focus on what you have, and if other people get richer, what's the big deal? It turns out that you can tie inequality to all kinds of problems in society. For example, social mobility.
We would like to think that even if not everybody is born equal, everybody is born with equal opportunities. So economists measure how easy it is for people to move between different socioeconomic strata. And it turns out that where you have higher inequality, you have lower mobility. You can see here, for example, that the US and Spain have very high inequality and very low mobility, while the Scandinavian countries, at the other end, have lower inequality and much higher mobility.

Higher inequality also means lower growth. Again, the points are all over the place, but generally, where inequality is higher, economic growth is lower.

The middle class is considered to be the mainstay of liberal, democratic society. In this chart, which tracks the middle third of incomes, you can see how, starting from 1971, the middle class is gradually shrinking.

Typically, when we look at unemployment, the question we ask is: how many people are looking for jobs and not finding them? But more significant is what economists measure as the percentage of people 25 and older who are actually working. In 1980, over 80% of people over 25 were working. These numbers have been going down, and now the figure is somewhere around 75%; fewer and fewer people are working. So if they are not working, what are they doing?

Again, we see wages as a share of GDP going down. Long-term unemployment, people unemployed for, I think, more than two years: it zigzags, but overall this number keeps going up; more and more people are unemployed for more than two years.

So we saw all these economic indicators telling us that something wrong has happened in the economy over the past 30 years. What exactly happened? Because these are hard economic data points, in 2013 Gartner, a very mainstream, business-oriented consulting company, wrote a report with the title Smart Machines Will Have Widespread and Deep Business Impact Through 2020. So it is not just the economists arguing this; this is Gartner, a very, very reputable company, and they write that most business
and thought leaders underestimate the potential of smart machines to take over millions of middle-class jobs in the coming decades. Job destruction will happen at a faster pace.

And a 2014 survey among about 1,000 active economists posed the question: information technology and automation are a central reason why median wages have been stagnant in the US over the past decade, despite rising productivity; agree or disagree? About 43% agreed, 30% were uncertain, 25% disagreed, and only 4% strongly disagreed. That means over 70% (those who agree plus those who are uncertain) think it is at least quite possible that this economic situation is caused by information technology.

And I now follow the newspapers all the time to see what they say about it. These are headlines from the mainstream media, and you will find them on a regular basis: more jobs predicted for machines, not people; will a robot [INAUDIBLE] your job; marathon machines; man versus machine, the robots are winning; the rise of the robots. This one is from last year; it is a cover of The Economist, a very solid, very business-oriented magazine, and the theme of the issue is the rise of the robots.

So, is this time going to be different? It is true that if you go back to the beginning of the Industrial Revolution, technological change always destroys some kinds of jobs, but it also creates new jobs. So what is different this time? What is different is that we, the computer scientists here, are working on building machines, and the goal is to build machines that will be able to out-compete us in anything we can do. I mean, just imagine that we are extremely successful.

Right now, AI, information technology, and robotics still face real challenges. We cannot yet emulate the physical dexterity and the situational awareness of people, or even of animals. Okay? Just take a juggler, and think of the eye-hand coordination of a juggler; we cannot yet do that with machines. And we cannot yet do with machines the high-level cognition that is required in many, many jobs.
But there are many, many jobs that do not require such physical dexterity on the one hand, and do not require high-level cognition on the other hand. Just as you go about your day, look at the jobs around you, and think what kind of intelligence and physical dexterity they require. That is why more and more of these jobs are being automated.

Just last month, a study came out, here from Oxford. It looked at the question: okay, technology destroys jobs; is technology also creating new jobs? So they wanted to look at the new jobs. They looked at the United States and asked: what percentage of jobs in 2010 were jobs that did not exist in 2000? And the answer is half a percent. It is not as if there is some massive new job engine. I mean, IBM still has probably about 400,000 employees; IBM is the so-called IT legacy company. Facebook is the new company. How many employees does Facebook have? I think it is under 10,000. The new technologies are simply not creating the kinds of jobs that we need them to create to keep the economy humming.

And it is very useful to imagine a conversation between two horses somewhere around the beginning of the 20th century. One horse is a pessimist. He is watching the horseless carriages come out. He says: oh my god, now they have horseless carriages. They have this automobile (what we would now call an autonomous car, they called an automobile). We have this automobile and all this new machinery; what will happen to horses? And the other horse is an optimist; he is a neo-classical horse. >> [LAUGH] >> And he says: look, yes, technology always destroys jobs for horses, but technology always creates new jobs for horses. Not to worry. There will be jobs for horses as far as we can imagine.

As it happens, 1915 was the peak of employment, so to speak, for horses. Because for horses, if you don't have a job, why should you exist, okay? >> [LAUGH] >> You are consuming resources. So today horses exist mostly, with few exceptions, as pets. Okay?
And so, fortunately for horses, they found a niche employment: >> [LAUGH] >> being a pet. So it turns out that all the arguments that say don't worry, technology will create new jobs for people, are completely analogous: you could equally have said that technology will create new jobs for horses.

So the question, I think, is really one of the most fundamental questions I can think of about the future of humanity, going beyond what happens when AI takes over; something coming much nearer in the future. If machines can do all the work, or almost all the work, or even 50% of the jobs that we used to do, what will people do?

Now, some other analyses ask what will happen in 100 years, in 200 years. What makes that so difficult is that it is beyond our imagination; we are talking about our great-great-grandchildren, and it is a bit too hard to wrap your brain around that. But 30 years we can imagine: it is as if, in 1985, we were making a projection for the next 30 years. And if we don't think we are going to be around in 30 years, we think our children are going to be around, our grandchildren are going to be around. Thirty years is something we are concerned about. One of the problems with global warming is that it is going to happen around 2100; but 30 years, you say, that is almost here, it is coming. So let us talk about what happens in 30 years.

People have tried to come up with many answers, including that we will emigrate to other planets, which at this point probably seems like a fanciful answer. But the most common answer is: wow, actually, what you are describing is a wonderful future. We won't have to work anymore. We will have a new type of slaves, machines, who will do all the work. We will be able to engage in meaningful leisure activities. This is the leisure answer.

[SOUND] There are really two big issues with this. One is that we will need a huge economic adjustment to move from the current economy to such a world, and this is unfolding over 30 years; 30 years is not a long time for huge economic change. Look, for example, at the Industrial Revolution. It started in the 18th century.
When did we fully adjust to that revolution? I would argue that the full adjustment was the creation of the modern social-democratic state, with a social safety net. And when did that really happen? Around World War II. So it actually took us at least 200 years to adapt to the Industrial Revolution, and now we are talking about changes that are going to unfold in 30 years. Just adapting economically to such a thing is going to be very, very challenging. Just imagine: we saw that the labor force participation rate has dropped to about 75% now. What happens when it gets to 50%? 50% of people not working, and the number continuing to drop?

But beyond the economic issue, I think there is a deep philosophical question: what is the meaning of the good life? The good life is a philosophical term that goes back to the classical Greek philosophers. They wanted to know: what is a life worth living? And in fact, there are some periods in history when people did not work for a living. If you go back to certain periods in the Roman Empire: either you were a patrician and had a lot of money, or slaves were doing all the work, and the plebeians somehow lived on subsidized bread and entertainment. And that seems to me a dystopia.

It turns out that this question, what will people do if they don't have to work, was raised in a beautiful play by Theodor Herzl. If you have heard of Theodor Herzl at all, you have heard of him as the founder of modern political Zionism. But he was also a writer, and he wrote a play in 1904 called Solon in Lydien. So, Solon is in Athens, and everybody thinks he is very, very wise. The Athenians ask him to devise a new legal system. He writes a new legal code and says: you must follow it for ten years; you cannot change it for ten years. And he knows the adjustment will be hard, so he leaves Athens for ten years. He goes to visit Lydia. Lydia is a very rich kingdom; to this day we still say "rich as Croesus". And Solon arrives, and Croesus says: oh, thank you for coming here. I can use your wisdom. I need some advice.
This young inventor called Eukosmos has invented a way to make flour without actually having to grow the wheat. So I am trying to figure out how I should reward Eukosmos; I am thinking of giving him my daughter for a wife. And Solon says: no, no, no, to the contrary, you should kill him. And Croesus says: you are out of your mind. This is a wonderful invention; people won't have to work. Get out of here, I don't want to listen to your advice. So Eukosmos is given permission to go and deploy his new invention, and now you can get flour for free.

My guess is the women still had to work to make bread out of the flour, but the men did not have to work anymore. So the men became lazy and restless; they got impatient, started complaining about everything, and became quarrelsome. Eventually they started complaining about Croesus, and there was an actual rebellion brewing in the ranks of the population.

So again there is an urgent meeting in the palace, with Croesus and Eukosmos and Solon. And Solon says: well, you, Eukosmos, are you still of the opinion that you can make the people happy? Do you hear the commotion down there? Do you still want them to have bread without worry, without work? And he gives him a chalice of poisoned wine, because it is not enough to stop the invention; it is in your head, and people will get it out of you. The only way to destroy this invention is for you to kill yourself. And he says: drink it, for the welfare of mankind. And Eukosmos drinks the wine. That is the end of the play.

So this is really the question: if machines can do almost everything that humans used to do, what will humans do?

Turing, in his paper, which is all about the possibility of machine intelligence, was actually very excited about that possibility. He writes: we may hope that machines will eventually compete with men in all purely intellectual fields. We can see plenty there that needs to be done. He is like a researcher: wow, there is going to be lots of funding, lots of opportunities; this is great.
And there is no trace in the paper of any worry about what this implies, no notion of social responsibility, no asking: is this a good direction to go in? Now, I think this is comparable to probably the biggest technological change of the Industrial Revolution: we found a source of energy that is neither human labor nor animal labor. This has enabled us to live, if you look at how we live today, in a way that is simply incomparable to the way people lived 300 years ago. But now we all realize that we are paying a price: climate change. And getting off the fossil-fuel train is incredibly hard. Now there is the Paris meeting; there was the Kyoto meeting, the Copenhagen meeting; they come and go, and we as a society have an incredibly hard time dealing with a change when it is about to happen. Maybe this time we need to think about it before it happens, not after it has already happened.

In closing, I want to again quote Brynjolfsson and McAfee, who say: there is no economic law that says that everyone, or even most people, automatically benefit from technological change. There is no such economic law. It happened to hold over 300 years, but we do not know what is causing the current problems in the economy, and there is a very good possibility that they are caused by technology.

Levy and Murnane, researchers from MIT and Harvard, wrote a report called Dancing with Robots, trying to think about how we will adjust as a society. They are using robots as a metaphor for computing technology in all of its forms. And they write: the central domestic policy challenge of the 21st century is how to ensure middle-class prosperity and individual success in an era of ever-intensifying globalization and technological upheaval.

Michael Cameron, in a very nice article, summarized it, I think, very succinctly: will a world without work be heaven or hell? Now is the time to think it through.

Thank you very much. >> [APPLAUSE]