1 00:00:04,360 --> 00:00:11,260 Good afternoon, everyone, and welcome to the second in a series of conversations that the Martin School is organising this term, 2 00:00:11,260 --> 00:00:16,900 exploring some of the ways that the pandemic has changed, or may change, our thinking 3 00:00:16,900 --> 00:00:21,610 about the major issues facing humanity. Before introducing today's guest, 4 00:00:21,610 --> 00:00:25,780 let me welcome our virtual audience and encourage you to ask questions. 5 00:00:25,780 --> 00:00:32,830 If you look at the bottom of your screen on the right, there's a button that you can press to ask a question. 6 00:00:32,830 --> 00:00:35,200 And there is also the facility to vote 7 00:00:35,200 --> 00:00:43,480 for the questions that you want me to put to the speaker; please do that, as it is very helpful for me in deciding which questions to ask. 8 00:00:43,480 --> 00:00:45,670 So it's an enormous pleasure to welcome today's guest, 9 00:00:45,670 --> 00:00:52,900 Martin Rees, Lord Rees, who is Professor of Astronomy and Cosmology at the University of Cambridge. Over a long career, 10 00:00:52,900 --> 00:00:58,270 Martin has made extraordinary contributions to many areas of astronomy and cosmology, 11 00:00:58,270 --> 00:01:03,130 and has received a very large number of prizes and honours. 12 00:01:03,130 --> 00:01:11,800 He has been President of the Royal Society, was made a life peer in 2005, and is very active in science policy in the UK Parliament's upper chamber. 13 00:01:11,800 --> 00:01:18,550 Martin is Astronomer Royal, a position held by Flamsteed, 14 00:01:18,550 --> 00:01:22,630 Halley and Maskelyne, and I believe you have had an asteroid named after you, 15 00:01:22,630 --> 00:01:24,580 if that's correct. 16 00:01:24,580 --> 00:01:36,430 Martin has been a long-standing friend of the Martin School here in Oxford and has sat on our advisory council right since the foundation of the school.
17 00:01:36,430 --> 00:01:45,280 In addition to his fundamental science, Martin writes wonderful books for a non-technical audience, and of the many books 18 00:01:45,280 --> 00:01:52,330 Martin has written, some are about cosmology, and I'm particularly fond of a book from, I think, about 10 years ago, 19 00:01:52,330 --> 00:02:00,820 Just Six Numbers, which gives a fabulous insight into some of the magic of modern astrophysics. 20 00:02:00,820 --> 00:02:05,590 But especially in recent years, for example in his last book, On the Future, 21 00:02:05,590 --> 00:02:10,930 Martin has written a lot on the major challenges facing humanity in the current century, 22 00:02:10,930 --> 00:02:16,570 and he has established at Cambridge the Centre for the Study of Existential Risk. 23 00:02:16,570 --> 00:02:24,970 And we are thus hugely fortunate to have Martin here today to explore some of these existential risks, and whether the world, having experienced a major, 24 00:02:24,970 --> 00:02:33,880 if not existential, crisis in the pandemic, is better or worse positioned to address them. 25 00:02:33,880 --> 00:02:37,810 Martin, could we talk a bit about some of the technological risks? 26 00:02:37,810 --> 00:02:41,020 In your writing, you've talked about two categories: 27 00:02:41,020 --> 00:02:51,910 one is sort of around AI and robotics and nanotechnology, and the other is around bioterrorism and biological risks. 28 00:02:51,910 --> 00:02:56,590 Could you tell us a little bit about what concerns you most in those areas? Yes. 29 00:02:56,590 --> 00:03:05,290 I think the point is that those are both areas where equipment which is widely available 30 00:03:05,290 --> 00:03:12,730 (biological equipment in the case of bio risks, and computer terminals) can be used by 31 00:03:12,730 --> 00:03:19,570 error or by design to cause an effect which could be serious and even cascade globally.
32 00:03:19,570 --> 00:03:23,590 It's not like nuclear, which, of course, was the first major technological threat, 33 00:03:23,590 --> 00:03:30,100 and is still one that looms over us, because you can't build a nuclear bomb without massive 34 00:03:30,100 --> 00:03:35,830 special-purpose equipment, and it's easy to monitor worldwide when that's happening. 35 00:03:35,830 --> 00:03:41,980 But I think we do need to worry, as we are getting more and more dependent, for instance, 36 00:03:41,980 --> 00:03:52,620 on the internet and everything like that, about the possibility of either breakdowns or, even worse, intentional cyber attacks on it. 37 00:03:52,620 --> 00:03:59,850 And in the case of bio, as we'll get to later when we talk about COVID-19, of course, 38 00:03:59,850 --> 00:04:12,200 one thing which is even more scary is if it becomes possible for individuals or small groups to engineer viruses or release them intentionally, 39 00:04:12,200 --> 00:04:18,140 and that would be even worse than the natural pandemics we are confronting now. 40 00:04:18,140 --> 00:04:31,990 So with the issues around cyber threats, do you see them as existential risks or just as major systemic risks to infrastructure? 41 00:04:31,990 --> 00:04:38,840 The latter, I think. An existential risk is something that is going to wipe us all out, 42 00:04:38,840 --> 00:04:44,770 and I think there are very few of those that are at all likely to happen. 43 00:04:44,770 --> 00:04:52,430 But the way I look at it, the possibility of these risks is going to give us a bumpy ride through the century. 44 00:04:52,430 --> 00:05:01,460 It's going to get more and more easy for individuals or small groups to have an effect that cascades very widely; 45 00:05:01,460 --> 00:05:07,550 as I put it in my book, the global village will have its village idiots, and they will now have a global range. 46 00:05:07,550 --> 00:05:08,990 And that's the problem.
47 00:05:08,990 --> 00:05:19,670 And that's why, I think, we already know that cyber attacks can cause disruption, and we know that there can also be plain failures, 48 00:05:19,670 --> 00:05:31,650 mechanical failures, electrical failures, which make the grid go down in a country and cause other effects which can disrupt our lives. 49 00:05:31,650 --> 00:05:37,820 So much depends on technology that I think we're getting more vulnerable, and therefore there's more that can go wrong. 50 00:05:37,820 --> 00:05:48,980 And worse than that, the cyber instance is one where things can go wrong by evil intent as well as by failure. 51 00:05:48,980 --> 00:05:53,930 And is there an analogy with the global financial system? 52 00:05:53,930 --> 00:05:59,390 We saw what happened in 2008 and the degree to which there was contagion 53 00:05:59,390 --> 00:06:04,280 and spread of a problem, given the sort of infrastructure that we have now. 54 00:06:04,280 --> 00:06:11,990 Well, that's a good example, because what it tells us is that any problem in one part of the world is going to go global, 55 00:06:11,990 --> 00:06:22,670 whether through the financial system, the internet, the chains of delivery for manufacturing, or air travel and all the rest. 56 00:06:22,670 --> 00:06:34,190 We are so interconnected. And you know, in Gibbon's classic Decline and Fall there was a collapse of civilisation on one continent. 57 00:06:34,190 --> 00:06:41,910 Now it would not be possible to have such a collapse on one continent without it going global, because we are so interconnected. 58 00:06:41,910 --> 00:06:47,310 And so that's something which is new about this century and makes us more vulnerable. 59 00:06:47,310 --> 00:06:56,190 And also, of course, let's take the electric grid: if it were to fail in London or some other area, 60 00:06:56,190 --> 00:07:02,160 then the lights going out would be the least of our problems, because of the complete breakdown of computer systems.
61 00:07:02,160 --> 00:07:07,320 In fact, I quote in my book a report from the American Department of Defense, 62 00:07:07,320 --> 00:07:16,920 which talks about the effect of a breakdown in the electricity grid in the eastern United States caused by some cyber attack. 63 00:07:16,920 --> 00:07:26,010 And it says that if they knew where the attack came from, then it would merit a nuclear response. 64 00:07:26,010 --> 00:07:35,660 So that's why it's scary. And so I do feel that we could all suffer from these kinds of breakdowns in our country's infrastructure. 65 00:07:35,660 --> 00:07:38,340 They're pretty routine in, say, India and Pakistan now, 66 00:07:38,340 --> 00:07:43,620 but of course, for that reason, they are not quite so dependent as we are; we are much more dependent on these systems. 67 00:07:43,620 --> 00:07:48,480 Just to leave the Earth briefly again for our solar system: 68 00:07:48,480 --> 00:07:56,370 solar storms and solar flares, with their capacity to disrupt electrical grids and electrical supplies, 69 00:07:56,370 --> 00:08:01,260 are they something that concerns you? Yes, they're quite serious. In fact, they are natural; 70 00:08:01,260 --> 00:08:03,270 but of course, being natural, 71 00:08:03,270 --> 00:08:11,670 they had no effect until we had electricity, whereas now they do have effects, producing surges in the electric grid. 72 00:08:11,670 --> 00:08:19,500 And of course, they could affect the workings of satellites, and we're very dependent on those for satnav and timing and all that. 73 00:08:19,500 --> 00:08:24,660 And so we are newly vulnerable to these solar events. 74 00:08:24,660 --> 00:08:31,800 So, exploring a different category of threat, sort of very severe, possibly existential: 75 00:08:31,800 --> 00:08:39,450 the nexus of threats around our demands on the environment, which depend on our population size and our per capita consumption.
76 00:08:39,450 --> 00:08:48,520 What we're doing to climate, what we're doing to water supplies, and things like that; that must be a major category of threat. 77 00:08:48,520 --> 00:08:57,030 Well, indeed. I'd say the second class of threats are those which are caused by us collectively, because the world population 78 00:08:57,030 --> 00:09:04,440 has doubled in the last 50 years. It's now seven point eight billion and will rise to nine billion by mid-century. 79 00:09:04,440 --> 00:09:08,640 And moreover, we are all more demanding of energy resources. 80 00:09:08,640 --> 00:09:12,630 And of course, we hope that the developing world will indeed develop, 81 00:09:12,630 --> 00:09:16,710 but that's going to mean that they use more energy per capita, 82 00:09:16,710 --> 00:09:24,750 etc. So we do have a greater impact on the planet, and this is doing two things, as you know. 83 00:09:24,750 --> 00:09:31,830 One is affecting biodiversity, and the other, of course, is leading to climate change. 84 00:09:31,830 --> 00:09:36,340 And this is, as we all know, a very serious threat. 85 00:09:36,340 --> 00:09:42,450 But I think if you compare it with, say, COVID-19, it's like a slow-motion version. 86 00:09:42,450 --> 00:09:51,090 COVID-19 is an immediate threat, and the trouble is that things like climate change are in slow motion. 87 00:09:51,090 --> 00:09:55,470 So we're in the position of the frog in the 88 00:09:55,470 --> 00:10:00,930 pan of water being heated: it doesn't realise what's happening until it's too late to escape. 89 00:10:00,930 --> 00:10:06,420 And that's the danger, really, of a lack of adequate response to something which 90 00:10:06,420 --> 00:10:13,350 is going to happen over decades and is already showing serious precursors.
91 00:10:13,350 --> 00:10:22,140 And also, I think we've got to bear in mind the other kind of difference: as I've said, these threats may be unfamiliar, 92 00:10:22,140 --> 00:10:28,380 but there's a good maxim, due to Nate Silver: the unfamiliar is not the same as the improbable. 93 00:10:28,380 --> 00:10:37,710 And I think although these things happen fairly rarely, even one occurrence may be too many, 94 00:10:37,710 --> 00:10:47,480 and we ought to prepare for them. On a bigger scale, that's true of climate and biodiversity preservation. 95 00:10:47,480 --> 00:10:52,580 Yes. As an example of an unfamiliar threat, 96 00:10:52,580 --> 00:10:58,490 what would you suggest in this line at the moment? 97 00:10:58,490 --> 00:11:02,990 Well, I think COVID-19 was one kind. 98 00:11:02,990 --> 00:11:12,020 But it should not have been unexpected, really, because as we know, we've had influenza pandemics, 99 00:11:12,020 --> 00:11:22,740 and we've had the coronavirus outbreaks SARS and MERS, which didn't get the global traction which COVID-19 has. 100 00:11:22,740 --> 00:11:34,050 But this is a case where we should have been far more prepared than we were, because the estimates 101 00:11:34,050 --> 00:11:42,420 are that the cost to the world of COVID-19 is going to be something like twenty-five trillion dollars, 102 00:11:42,420 --> 00:11:47,370 plus, of course, the loss of millions of lives. 103 00:11:47,370 --> 00:11:56,040 How likely was it? Take the notion that a pandemic like that comes along once in 50 years or something. 104 00:11:56,040 --> 00:12:01,710 Then if you compute the insurance premium by multiplying probability by impact, 105 00:12:01,710 --> 00:12:12,140 that means it would have been worth spending several hundred billion dollars in preparation for something like COVID-19.
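The "insurance premium" arithmetic sketched here, probability multiplied by impact, can be made concrete with the round numbers quoted in the conversation. This is a minimal illustration, not precise data: the once-in-50-years rate and the twenty-five trillion dollar cost are simply the figures mentioned above.

```python
# Expected-value ("insurance premium") reasoning for rare, large risks:
# justifiable annual spend ~ (probability per year) x (impact if it happens).

def expected_annual_loss(annual_probability: float, impact: float) -> float:
    """Expected loss per year from a rare event of given probability and cost."""
    return annual_probability * impact

pandemic_impact = 25e12       # ~$25 trillion: the quoted global cost of COVID-19
annual_probability = 1 / 50   # a COVID-scale pandemic roughly once in 50 years

premium = expected_annual_loss(annual_probability, pandemic_impact)
print(f"Justifiable annual preparedness spend: ${premium / 1e9:.0f} billion")
# -> Justifiable annual preparedness spend: $500 billion
```

The result, about 500 billion dollars per year, is consistent with the "several hundred billion dollars" figure given in the answer.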
106 00:12:12,140 --> 00:12:14,270 And of course, we weren't doing that, 107 00:12:14,270 --> 00:12:22,510 and it's understandable, because it's very hard to persuade governments to spend money on things which don't address an immediate need. 108 00:12:22,510 --> 00:12:30,260 But I think one lesson we have learnt from COVID-19 is that we do need to think about how we should 109 00:12:30,260 --> 00:12:38,690 be more prepared for these events, by making the obvious preparations which we should have had for COVID-19. 110 00:12:38,690 --> 00:12:44,390 And also, we've got to realise that there's an important trade-off between two things. 111 00:12:44,390 --> 00:12:50,720 One is efficiency and the other is resilience, and to be resilient 112 00:12:50,720 --> 00:12:58,070 we've got to leave some slack in the system. To give two examples: it's not resilient, 113 00:12:58,070 --> 00:13:05,930 although it's efficient, to have supply chains around the world and not keep any inventories or 114 00:13:05,930 --> 00:13:11,310 stocks, because breaking one link in one chain can screw up a whole lot of manufacturing. 115 00:13:11,310 --> 00:13:18,290 So that's an example where it may be a false economy, as it were, to maximise efficiency. 116 00:13:18,290 --> 00:13:26,870 And another, different example, more relevant to COVID-19, is that it would be better if we had more intensive care beds, 117 00:13:26,870 --> 00:13:32,300 as I think they did routinely in Germany, a country that was more prepared. 118 00:13:32,300 --> 00:13:40,400 So I think one thing we're going to learn from the pandemic is that there should be some slack 119 00:13:40,400 --> 00:13:49,250 in the system, and we should be prepared to spend money on the kinds of things that we needed: 120 00:13:49,250 --> 00:13:58,430 the protective clothing and all that sort of thing. So governments will happily spend very, very large amounts of money on defence.
121 00:13:58,430 --> 00:14:06,860 If one looks at, for example, the defence budget of the United States, it dwarfs what's going into medical research. 122 00:14:06,860 --> 00:14:13,280 And much of that is predicated on potential, ill-defined threats in the future, 123 00:14:13,280 --> 00:14:18,200 whereas other categories of threats, which are still unfamiliar but, exactly as you've just said, 124 00:14:18,200 --> 00:14:25,370 in many ways more certain, are far harder for governments to take action on. 125 00:14:25,370 --> 00:14:31,850 Well, that's a very good example, actually, because I believe it was the case that in 2009 126 00:14:31,850 --> 00:14:43,490 the government stocked up on vaccines for a flu pandemic that never actually came, and some people berated the government for a waste of money. 127 00:14:43,490 --> 00:14:48,000 But of course, you haven't wasted your insurance if your house doesn't burn down. 128 00:14:48,000 --> 00:14:55,010 They were doing the right thing, and we would be doing the right thing if we had better preparations for making vaccines. 129 00:14:55,010 --> 00:14:58,550 And of course, as you say, in a sense, 130 00:14:58,550 --> 00:15:04,670 the whole of the Ministry of Defence procurement budget is in this category, because 131 00:15:04,670 --> 00:15:09,350 we're having all these fighters and things which we hope we never have to use, 132 00:15:09,350 --> 00:15:14,710 and that's on a far larger financial scale than we're talking about here. 133 00:15:14,710 --> 00:15:24,630 A couple more questions on one of the issues around how COVID has changed the world, still thinking about threats. 134 00:15:24,630 --> 00:15:31,920 How concerned are you at the way that global politics is going? It was only 15 135 00:15:31,920 --> 00:15:40,020 or 20 years ago that there was a tacit assumption that the world was going inexorably towards uniform liberal democracy.
136 00:15:40,020 --> 00:15:46,650 And now we've had waves of populism, and we've had the reverse of democratisation in many countries. 137 00:15:46,650 --> 00:15:55,140 And of course, there's the paradox that some of the less democratic countries have found it easier to deal with the pandemic than more democratic ones. 138 00:15:55,140 --> 00:16:05,460 How worried are you about global political trends at the moment and our capacity to address the risks that you talk about? 139 00:16:05,460 --> 00:16:13,980 Well, I'm somewhat less worried after the American election results recently, obviously, but there is a more general concern. 140 00:16:13,980 --> 00:16:18,180 And I think, in the context of populism, 141 00:16:18,180 --> 00:16:28,050 there is something which won't reverse so easily, and that's the effect of social media and fake news and all that. 142 00:16:28,050 --> 00:16:39,260 If we go back more than 20 years, most people got most of their news via radio stations or the press, 143 00:16:39,260 --> 00:16:49,820 so it was filtered through professional journalists of some kind, who in general filtered out or muffled the extremes. 144 00:16:49,820 --> 00:16:55,560 But what happens now is the opposite, because on social media the extreme gets more clicks, 145 00:16:55,560 --> 00:17:04,240 and then something still more extreme gets even more. And so there is a tendency which is hollowing out moderate opinion and making it harder to get consensus. 146 00:17:04,240 --> 00:17:11,960 And I do worry about this as a downside of social media, which could be rather hard to reverse, in my opinion. 147 00:17:11,960 --> 00:17:17,720 It was aggravated, obviously, by Trump, but it's there anyway. And of course, as you say, 148 00:17:17,720 --> 00:17:25,370 it has been one of the things which has made it especially hard for our country and the U.S.
149 00:17:25,370 --> 00:17:32,360 to cope adequately with the recent pandemic, as compared to what has happened in Taiwan, 150 00:17:32,360 --> 00:17:36,900 for instance. What's happened recently, 151 00:17:36,900 --> 00:17:43,140 both with some of the misinformation about COVID on social media and also the fury about 152 00:17:43,140 --> 00:17:50,160 the storming of the Capitol on January the 6th, has brought some of these issues to a head. 153 00:17:50,160 --> 00:17:56,610 Are you optimistic that this may lead to movement in this area, or do you think we'll just go back 154 00:17:56,610 --> 00:18:04,440 to the status quo ante once the pandemic is over and we're back to more normal politics in the States? 155 00:18:04,440 --> 00:18:15,630 Well, I'm not sure. I do worry about the United States, because even though Trump lost the election, more than 70 million people voted for him, 156 00:18:15,630 --> 00:18:23,310 and still more people than that carry guns routinely. So it's a very different social system from what we're used to in northern Europe, 157 00:18:23,310 --> 00:18:27,450 and so I do worry about what's going to happen there. 158 00:18:27,450 --> 00:18:35,040 But I think this pandemic is going to make people aware of the need for collaboration. 159 00:18:35,040 --> 00:18:41,190 And, going back to environmental issues, we are, as you know, 160 00:18:41,190 --> 00:18:50,310 having important international conferences this year, on climate in Glasgow and biodiversity in China. 161 00:18:50,310 --> 00:18:59,780 And I hope these will raise consciousness that these are areas where the world does have to move together, and 162 00:18:59,780 --> 00:19:05,560 where we need to share all the data, etc. And just to quote something on the environment: 163 00:19:05,560 --> 00:19:14,750 I was on a conference call just this morning about space, and small 164 00:19:14,750 --> 00:19:20,850 arrays of satellites can now monitor, every day, every tree in the world
165 00:19:20,850 --> 00:19:35,390 and get all this data, and this can be shared around the world and can check whether countries are conforming to pledges they've made about 166 00:19:35,390 --> 00:19:41,000 not cutting down forests or cutting CO2 emissions and things like that. 167 00:19:41,000 --> 00:19:50,300 So I think technology is going to be very helpful in that sense, in that it allows everyone to know what the data are. 168 00:19:50,300 --> 00:19:59,900 So one of the most extraordinary things about the pandemic is how science has come so much to the fore over the last year. 169 00:19:59,900 --> 00:20:03,740 As someone who's taught what the R number is for 30 years, 170 00:20:03,740 --> 00:20:09,500 it's extraordinary that now everyone knows what that number is and largely gets it right. 171 00:20:09,500 --> 00:20:21,500 Chris Whitty and Patrick Vallance are household names; Private Eye's edition before Christmas even took its title from them, in fun. 172 00:20:21,500 --> 00:20:28,680 Overall, what do you think will come of this? 173 00:20:28,680 --> 00:20:33,200 Do you think this is going to be temporary, or do you think it will have lasting effects? 174 00:20:33,200 --> 00:20:37,880 Will the lasting effects be positive, in that people see the value of science? 175 00:20:37,880 --> 00:20:41,390 Or could there be a reaction against science? 176 00:20:41,390 --> 00:20:46,850 We've seen Chris and Patrick being called Professor Gloom and Professor Doom, and so on. 177 00:20:46,850 --> 00:20:54,030 Well, I think we've got to say that those particular scientists, and all the others under huge pressure, have done a wonderful job, 178 00:20:54,030 --> 00:20:59,900 and note the way they've preserved their integrity and made things as clear as they can,
179 00:20:59,900 --> 00:21:07,190 emphasising, of course, that there are uncertainties, and that you change your mind as the facts change, 180 00:21:07,190 --> 00:21:18,290 which has happened several times, as we know, in this pandemic. I think scientists shouldn't be too worried about their prestige. 181 00:21:18,290 --> 00:21:25,380 I mean, scientists often bemoan the fact that the public doesn't know any science and doesn't respect science, etc., and I think that's unfair. 182 00:21:25,380 --> 00:21:36,710 On respect first: when opinion polls are done of the extent to which particular professional groups are respected by the public, scientists, 183 00:21:36,710 --> 00:21:41,860 at least academic scientists, come out pretty high; 184 00:21:41,860 --> 00:21:50,230 government scientists somewhat lower, but all way above politicians, journalists, estate agents and the rest of them. So they are respected. 185 00:21:50,230 --> 00:21:55,840 They may be below GPs, maybe below clergymen, but they're still pretty high. 186 00:21:55,840 --> 00:22:03,460 That's one thing. And also, I think it's a bit unfair to bemoan the public's ignorance of science. 187 00:22:03,460 --> 00:22:09,280 I mean, obviously, as you say, they've learnt about the R number and things like that in the last few months. 188 00:22:09,280 --> 00:22:13,960 But there is a lot of interest in science, and biodiversity, 189 00:22:13,960 --> 00:22:24,700 etc., is something which does concern huge numbers of people in this world; and indeed in astronomy, 190 00:22:24,700 --> 00:22:36,790 I mean, I'm always surprised at the large number of people who are fascinated by dinosaurs and space. 191 00:22:36,790 --> 00:22:43,060 Nothing could be less relevant to everyday life than dinosaurs, but young kids especially are fascinated by them. 192 00:22:43,060 --> 00:22:51,870 And I think this is a good thing, and I think scientists shouldn't bemoan
193 00:22:51,870 --> 00:22:57,120 the fact that the public is not expert in science, any more 194 00:22:57,120 --> 00:23:01,920 than they should bemoan the fact that many people in the public don't know the history 195 00:23:01,920 --> 00:23:08,460 of their country and probably couldn't find South Korea or Libya on the map. 196 00:23:08,460 --> 00:23:14,850 I think if you want to have an informed citizenry, then citizens have to have a feel for science, 197 00:23:14,850 --> 00:23:25,590 obviously, because a fair fraction of the issues that come up, on energy, health and the like, have a scientific component. 198 00:23:25,590 --> 00:23:33,810 But also, one can't address these issues properly unless one knows a bit about history and politics. 199 00:23:33,810 --> 00:23:41,700 And that's why one should bemoan the ignorance in those areas, and not particularly in science. Scientists, 200 00:23:41,700 --> 00:23:51,780 I think, moan too much, because there's as much interest in what they do as there is in any other academic area. 201 00:23:51,780 --> 00:24:00,510 And I see that point. And I completely agree that we're silly if we 202 00:24:00,510 --> 00:24:05,460 bemoan the public not knowing the facts about science and the things that we love. 203 00:24:05,460 --> 00:24:13,230 But what about the methodologies of science and the interpretation of quantitative data? 204 00:24:13,230 --> 00:24:20,070 I guess one of the things that's encouraged me in the last year is that people, very different types of people, 205 00:24:20,070 --> 00:24:23,490 the journalist Tim Harford, for example, and your colleague at Cambridge, 206 00:24:23,490 --> 00:24:35,340 David Spiegelhalter, have been given a platform to explain some really quite complex ideas to a general audience.
207 00:24:35,340 --> 00:24:42,580 And those are really transferable skills that help you interpret the data about the pandemic but help you do many other things. 208 00:24:42,580 --> 00:24:49,140 Isn't that something you find encouraging? Yes, and I think that should be a bigger part of school education, 209 00:24:49,140 --> 00:25:00,250 because one can't be an informed citizen unless one can look at the numbers and not be bamboozled by false claims about them. 210 00:25:00,250 --> 00:25:04,380 And that's why those people you mentioned are doing such a good job, because these things are important. 211 00:25:04,380 --> 00:25:10,420 But just going back to science, I think you've got to realise, and of course 212 00:25:10,420 --> 00:25:24,210 Vallance and Whitty and Mark Walport are people who emphasise this rightly, that scientists are not the people who make the final decisions; they are offering advice, 213 00:25:24,210 --> 00:25:29,640 and it's the politicians who in these cases have to make decisions on behalf of all of us. 214 00:25:29,640 --> 00:25:35,280 They obviously do have to take into account the ethics and economics as well, 215 00:25:35,280 --> 00:25:41,850 and in those areas scientists are just citizens, with no special expertise. 216 00:25:41,850 --> 00:25:47,940 I think it's very important that decisions which are made by politicians should be 217 00:25:47,940 --> 00:25:52,530 made on the basis of the best possible understanding of the best available data. 218 00:25:52,530 --> 00:26:00,220 But that's not the only thing they have to consider. They have to consider the political context and the economic context 219 00:26:00,220 --> 00:26:06,220 and all the rest of it. And as you say, people like Patrick Vallance and Mark Walport completely understand that.
220 00:26:06,220 --> 00:26:14,040 My worry is that some people in our community take a rather simplistic, linear view about how science fits into policy, 221 00:26:14,040 --> 00:26:21,450 and do not appreciate quite how hard a job politicians have in integrating all these different things. 222 00:26:21,450 --> 00:26:27,980 I agree, and also some scientists tend to generate a mystique about the scientific method. 223 00:26:27,980 --> 00:26:33,000 It seems to me that what we do when we do science 224 00:26:33,000 --> 00:26:40,890 is not very different from what a detective does, or what other people do when trying to assess evidence, 225 00:26:40,890 --> 00:26:52,410 etc. The modes of thinking which have been exemplified by what the experts have been doing in studying the coronavirus: 226 00:26:52,410 --> 00:27:03,330 there's nothing very special about them. It's just a way of rationally assessing evidence, which would be entirely familiar to any such person. 227 00:27:03,330 --> 00:27:11,280 So scientists shouldn't say that there is something very special called the scientific method. 228 00:27:11,280 --> 00:27:25,280 Nor, of course, should they or academics regard themselves as especially clever and intelligent; they are no more so than many other professions. 229 00:27:25,280 --> 00:27:33,080 Do you think that the pandemic, which is really a challenge that, almost by definition, 230 00:27:33,080 --> 00:27:43,280 requires a societal, communitarian response, may have changed both the discourse in civil 231 00:27:43,280 --> 00:27:51,480 society and also the discussions amongst politicians about collective approaches to other challenges? 232 00:27:51,480 --> 00:27:54,710 I don't want to go over the nuts and bolts of what happened over the last year, 233 00:27:54,710 --> 00:28:03,120 but we do see a tension between people with a more libertarian view and people who are more comfortable with collective action.
234 00:28:03,120 --> 00:28:08,300 Do you think that is changing the mood music in a good way? 235 00:28:08,300 --> 00:28:19,100 Well, I think we are realising that many of the so-called libertarians have really had a damaging impact, in cases 236 00:28:19,100 --> 00:28:29,150 such as wearing masks. For instance, if you don't wear a mask, you increase the danger for other people as 237 00:28:29,150 --> 00:28:34,490 well as for yourself; we wear masks not just to protect ourselves, 238 00:28:34,490 --> 00:28:38,840 but for altruistic reasons, to protect other people just as much. 239 00:28:38,840 --> 00:28:45,650 And so I think extreme libertarians who bridle at wearing a mask have been damaging, and I hope 240 00:28:45,650 --> 00:28:55,320 that the attitude of extreme libertarianism is discredited a bit in some other contexts as well. 241 00:28:55,320 --> 00:29:06,400 So that's good. But I think one thing which is positive is that we have realised that more and more of the important questions 242 00:29:06,400 --> 00:29:09,490 can't be tackled by a single nation alone. 243 00:29:09,490 --> 00:29:20,020 Obviously the pandemic can't, nor can climate, energy supply, biodiversity and all these things. 244 00:29:20,020 --> 00:29:28,090 And so I think it's going to be the case that we are going to have to give up more 245 00:29:28,090 --> 00:29:35,800 of our sovereignty to international bodies of various kinds: maybe a new body, 246 00:29:35,800 --> 00:29:45,100 rather like the IAEA, to monitor compliance with the Paris and Glasgow pledges to cut CO2 emissions, for instance, 247 00:29:45,100 --> 00:29:51,610 and on energy. We probably need some new international body to regulate the internet, 248 00:29:51,610 --> 00:30:00,640 etc., because no single nation can compete with these massive multinational conglomerates that control the internet.
249 00:30:00,640 --> 00:30:05,890 So in all of these contexts, national sovereignty doesn't amount to very much. 250 00:30:05,890 --> 00:30:13,810 You've got to think globally. And I think that's going to be something that will be more and more widely recognised. 251 00:30:13,810 --> 00:30:23,200 And another point, of course, is that communication and travel are now global. 252 00:30:23,200 --> 00:30:34,420 When the whole world is in contact through communications, I think we do have to worry about the fact that, 253 00:30:34,420 --> 00:30:39,550 for instance, people in Africa are pretty desperate. 254 00:30:39,550 --> 00:30:48,550 They now know their fate. The one thing they do have is the internet, and so they know the injustice of it. 255 00:30:48,550 --> 00:30:54,680 And I think any sort of inequality between regions 256 00:30:54,680 --> 00:31:03,680 is going to be very dangerous and will lead to embitterment and mass migration and wars and all the rest, in a 257 00:31:03,680 --> 00:31:10,100 way that wouldn't have happened 200 years ago, when the middle of Africa didn't know what it was missing. 258 00:31:10,100 --> 00:31:18,530 And, perhaps digressing slightly, I know that Paul Collier in Oxford has addressed this question. 259 00:31:18,530 --> 00:31:27,530 We've really got to have something like a mega Marshall Plan, where the northern nations ensure that Africa doesn't fall behind, because, as you know, 260 00:31:27,530 --> 00:31:34,970 it has a rapidly growing population and it can't develop like the Asian 261 00:31:34,970 --> 00:31:39,590 tigers did by undercutting manufacturing costs, because robots do some of that now, 262 00:31:39,590 --> 00:31:45,350 so we need some other way of ensuring that it doesn't fall behind, because if not, its problems become our problems.
263 00:31:45,350 --> 00:31:53,540 And so I think the northern nations need to ensure that Africa's economy develops, and not just for altruistic reasons. 264 00:31:53,540 --> 00:31:58,730 Martin, thank you for mentioning Paul Collier, which gives me an opportunity to say that this time next week 265 00:31:58,730 --> 00:32:04,100 we'll be having a conversation with Paul and John Kay on their recent book. 266 00:32:04,100 --> 00:32:12,530 It is a very good book, and there have been several good books on that sort of thing, actually, on 267 00:32:12,530 --> 00:32:17,870 the toxic nature of extreme inequality and all that. 268 00:32:17,870 --> 00:32:24,140 And I completely agree with you, but I'm perhaps a little less optimistic than you. 269 00:32:24,140 --> 00:32:30,590 Aren't we seeing at the moment that it's becoming harder for countries to pool sovereignty? 270 00:32:30,590 --> 00:32:39,580 I certainly don't want to get into a discussion of Brexit, but we have traded off sovereignty for trade advantage, and we see it in a number of countries. 271 00:32:39,580 --> 00:32:49,100 And even Biden, if you compare him with Obama or Clinton, is talking about a more disconnected America, 272 00:32:49,100 --> 00:32:54,440 though nowhere near as disconnected a US as Trump's. 273 00:32:54,440 --> 00:33:01,160 Are you more optimistic because the pandemic may make us think about the importance of sharing sovereignty? 274 00:33:01,160 --> 00:33:05,750 To the extent that we can do it, yes, but of course there's the matter of Brexit. 275 00:33:05,750 --> 00:33:15,950 I mean, one of the reasons why I was a Remainer was that, given that we can't rely on 276 00:33:15,950 --> 00:33:22,190 America in the way that one did 20 years ago, and we have China and Russia, 277 00:33:22,190 --> 00:33:26,750 this is the last,
278 00:33:26,750 --> 00:33:38,000 the least opportune time to weaken the coherence of Europe as a liberal democracy. And indeed, I would go further and say that, whatever my politics, 279 00:33:38,000 --> 00:33:50,870 I think we in this country should try and learn more from Finland and Denmark and less from the United States. 280 00:33:50,870 --> 00:33:56,870 I think the main problem is that the government is following the United States too much, 281 00:33:56,870 --> 00:34:06,530 and we would do far better with a higher tax regime and greater equality, as is exemplified by the Scandinavian countries. 282 00:34:06,530 --> 00:34:09,680 I'm going to get to questions in about five minutes, 283 00:34:09,680 --> 00:34:17,120 so please do vote for questions you would like me to ask Martin, and add some yourself. Before doing that, 284 00:34:17,120 --> 00:34:22,850 could I ask a couple of more specific biological hazard questions? 285 00:34:22,850 --> 00:34:34,190 So in your book, you write extensively and persuasively about the dangers from bioterrorism, and especially garage biotechnology, 286 00:34:34,190 --> 00:34:46,580 as well as the risks of modifying influenza viruses and things like that. Now, having gone through the pandemic, 287 00:34:46,580 --> 00:34:52,520 do you think that will raise the profile of what was perhaps a little bit 288 00:34:52,520 --> 00:35:00,320 of a Cinderella issue compared with some of the other things we've talked about? Well, I mean, of course, I'm not an expert at all, 289 00:35:00,320 --> 00:35:10,220 but I do know that the influenza virus could be tweaked, even 10 years ago, to make it more virulent and more transmissible. 290 00:35:10,220 --> 00:35:16,820 And the same thing may happen with other viruses. This can be done perhaps by small groups, and by an increasing number of people. 291 00:35:16,820 --> 00:35:21,050 And I do worry very much about this.
292 00:35:21,050 --> 00:35:24,980 I worry about this even more than I do about cyber, because, as you know, 293 00:35:24,980 --> 00:35:32,510 there are international discussions, inter-academy discussions, about regulating all 294 00:35:32,510 --> 00:35:38,930 these new biological techniques, on grounds of prudence and on grounds of ethics. 295 00:35:38,930 --> 00:35:47,160 But my worry is that even if we all agree internationally what the regulations should be, enforcing them 296 00:35:47,160 --> 00:35:52,290 is as hopeless as enforcing the drug laws or the tax laws globally. 297 00:35:52,290 --> 00:35:59,250 And that's, I think, very scary, given that even one bad actor 298 00:35:59,250 --> 00:36:02,300 could be too many. 299 00:36:02,300 --> 00:36:09,960 So that's one of the things I worry about most: how we can avoid these bad actors, when only a few of them are needed to produce some 300 00:36:09,960 --> 00:36:13,890 bio-disaster, and how we can cope with them. 301 00:36:13,890 --> 00:36:22,880 And it's going to be a trade-off between three things: we want to preserve our freedom, our security and our privacy. 302 00:36:22,880 --> 00:36:29,870 In China, they may give up privacy for security, but I don't know what would happen here. 303 00:36:29,870 --> 00:36:36,230 But I think we do have to accept far greater surveillance if we are going to avoid the 304 00:36:36,230 --> 00:36:42,320 possibility of things like that happening, and similar arguments apply to cyber attacks. 305 00:36:42,320 --> 00:36:51,860 Mm-hmm. Let me ask a slightly more parochial question, in the sense of us working in universities. 306 00:36:51,860 --> 00:37:00,260 If you look at the successes and failures of not only this country but also other high-income countries' responses to the pandemic, 307 00:37:00,260 --> 00:37:06,050 then an area that has really worked well is the science response.
308 00:37:06,050 --> 00:37:13,820 Whether it's the big trials of potential therapies, whether it's the vaccines here and in Germany, 309 00:37:13,820 --> 00:37:19,490 for example, or whether it's some of the epidemiological modelling. 310 00:37:19,490 --> 00:37:27,260 And I guess this goes back to a point you made right at the beginning about resilience: we didn't 311 00:37:27,260 --> 00:37:34,130 really think about some of the research groups we had as being a sort of national capability in terms of resilience. 312 00:37:34,130 --> 00:37:41,000 But they were, and they were able to really get going once the pandemic started. 313 00:37:41,000 --> 00:37:48,980 Thinking back to when you were President of the Royal Society, arguing so strongly for more money going into research, 314 00:37:48,980 --> 00:37:56,660 do you think that's going to make the argument of Adrian Smith, the current President, easier to make? 315 00:37:56,660 --> 00:38:06,770 Well, I would hope so. I think this is a case where we have benefited hugely from the very strong research in these particular areas. 316 00:38:06,770 --> 00:38:11,300 And I think it's very important to bear in mind that in this country, 317 00:38:11,300 --> 00:38:16,400 one of the big assets we have is our research base; practically everyone says this. 318 00:38:16,400 --> 00:38:21,090 And the reason is that this 319 00:38:21,090 --> 00:38:25,140 is expertise that can be drawn on when it's needed, 320 00:38:25,140 --> 00:38:39,270 and also, of course, for the economy. And, you know, we will suffer if we don't get smarter, as it were, and take advantage of advances. 321 00:38:39,270 --> 00:38:50,100 In fact, there's one heresy, which is when people say the key thing is to take the ideas coming out of our labs and then develop them.
322 00:38:50,100 --> 00:38:56,880 But of course, even at the best of times, what is discovered in the UK is going to be only five or 10 percent of the world's discoveries. 323 00:38:56,880 --> 00:39:05,880 And so an important reason why we want strong universities covering all specialisms is so that we have people who can be in the invisible college, 324 00:39:05,880 --> 00:39:10,660 in all subjects, all over the world, and can know what's going on 325 00:39:10,660 --> 00:39:19,150 anywhere in the world; and if there's a clever idea, they can run with it. So I would hope that British start-ups won't just be using ideas invented here, 326 00:39:19,150 --> 00:39:28,690 but will be taking advantage of the ideas that emerge all over the world, and have a discerning eye to pick out the good ones. 327 00:39:28,690 --> 00:39:37,780 And also, of course, in universities, we and the Americans still have the concept, 328 00:39:37,780 --> 00:39:42,340 which the Germans have rather less, of having teaching and research together. 329 00:39:42,340 --> 00:39:47,290 And that's because an equally important part of the output of a university is, 330 00:39:47,290 --> 00:39:54,970 of course, the highly educated students, and the better educated they are, 331 00:39:54,970 --> 00:39:59,140 then the better for the prosperity of this country. 332 00:39:59,140 --> 00:40:03,490 That's a platitude, but it applies on a global scale. Thank you. 333 00:40:03,490 --> 00:40:11,710 Martin, I'm going to go to some questions, and I'm just going to read out the first one, which has 10 votes. 334 00:40:11,710 --> 00:40:20,620 It's from Daniel Scharf. If, as the PM has said, climate change represents a greater threat than the pandemic, should he set up regular No. 335 00:40:20,620 --> 00:40:26,260 10 briefings, with ministers flanked by ecologists, meteorologists, behavioural economists, 336 00:40:26,260 --> 00:40:34,970 sociologists, psychologists, etc.,
who go on to address questions from the public, NGOs and the press? 337 00:40:34,970 --> 00:40:41,250 Well, I mean, I think it's a slow-motion process, isn't it? 338 00:40:41,250 --> 00:40:51,030 I think that's the problem: the pandemic happened fast and is so scary that everyone wanted the news every day. 339 00:40:51,030 --> 00:40:58,020 The trouble with climate policy is, of course, that it is long term. 340 00:40:58,020 --> 00:41:09,090 You've got to be thinking about the mitigation of the effects of climate change, and adaptation, on a time scale of up to 50 years. 341 00:41:09,090 --> 00:41:13,470 And I think we want to make sure that this is embedded within government. 342 00:41:13,470 --> 00:41:20,070 But I don't think the public is going to be enthused by daily briefings. 343 00:41:20,070 --> 00:41:25,260 It's like watching trees grow; it's too slow. And so these things are important, 344 00:41:25,260 --> 00:41:33,060 but I think we've got to prioritise, and that is the problem. And to say something more about climate change: 345 00:41:33,060 --> 00:41:42,420 I think it's true that the public perception, the public rating of the importance of climate change, has risen tremendously. 346 00:41:42,420 --> 00:41:49,470 And this is partly the effect of science, but it's more the effect of charismatic figures. 347 00:41:49,470 --> 00:41:56,130 I think we both know people who have been scientific advisers in government, and we know they're frustrated. 348 00:41:56,130 --> 00:41:57,660 Until there's something like COVID-19, 349 00:41:57,660 --> 00:42:07,890 they feel that they don't get much traction, because there are always more urgent issues, and the politicians care about the short term and the parochial. 350 00:42:07,890 --> 00:42:10,470 But let me give you two examples. 351 00:42:10,470 --> 00:42:21,060 The papal encyclical in 2015 was an input which had a very big effect: the Pope got a standing ovation at the UN.
352 00:42:21,060 --> 00:42:25,080 He's got a billion followers in Latin America, Africa and East Asia, 353 00:42:25,080 --> 00:42:32,860 and that had a big effect in achieving the consensus at the Paris conference, because he has those billion followers. And our secular pope, 354 00:42:32,860 --> 00:42:40,120 David Attenborough, has done his bit too, especially in the context of biodiversity and plastics in the ocean. 355 00:42:40,120 --> 00:42:46,530 I mean, the recent BBC programmes and the events going on now are, of course, very important. 356 00:42:46,530 --> 00:42:50,790 But I think Michael Gove, not the most enlightened of politicians, 357 00:42:50,790 --> 00:42:59,250 would not have used any of his political capital on legislation to reduce the use of plastics 358 00:42:59,250 --> 00:43:11,580 had it not been for the scene in Blue Planet II showing the albatross on the nest coughing up the plastic which its young must have taken for nourishment. 359 00:43:11,580 --> 00:43:19,480 That's an iconic picture, rather like the polar bear on the melting ice was for the climate campaign. 360 00:43:19,480 --> 00:43:28,260 And so I think if politicians know the public cares, then they will take these long-term decisions. 361 00:43:28,260 --> 00:43:33,870 And that's why it's very important that those members of the scientific community who are good at 362 00:43:33,870 --> 00:43:39,480 this, and those whom they can enlist because they're better at it, should be banging on about these issues, 363 00:43:39,480 --> 00:43:44,490 because only then will voters care and politicians act. 364 00:43:44,490 --> 00:43:52,440 It is also a fascinating thing that the government actually had legislation and plans for reducing marine plastic, 365 00:43:52,440 --> 00:43:55,050 which they had been considering for some years, 366 00:43:55,050 --> 00:44:02,040 and it was the stimulus of that single television programme that gave them the window and the opportunity.
367 00:44:02,040 --> 00:44:12,920 And I think it's a lesson for all of us interested in science and policy to have plans ready to go for when the window of opportunity exists, 368 00:44:12,920 --> 00:44:19,110 and to encourage the charismatic figures and encourage young people to demonstrate and all these things. 369 00:44:19,110 --> 00:44:27,480 Because if these things are in politicians' inboxes and in the press, then they will rise up their agendas. 370 00:44:27,480 --> 00:44:35,730 And Marcus Rashford raising the profile of child food poverty is another recent example. 371 00:44:35,730 --> 00:44:38,670 Let me read out another one, and this is from Max Nugent. 372 00:44:38,670 --> 00:44:48,870 In 2019, the Global Health Security Index ranked the UK second in the world in pandemic preparedness, and the US first, 373 00:44:48,870 --> 00:44:56,250 above China, Germany and New Zealand; and we now have the highest death toll in Europe. 374 00:44:56,250 --> 00:45:05,790 Is it more important to have decisive and effective leaders in a crisis than to have a high level of preparedness? 375 00:45:05,790 --> 00:45:13,530 Well, of course, decisive leaders are crucial, and of course the willingness of the public to accept strict lockdowns, 376 00:45:13,530 --> 00:45:18,000 etc. But I think I would contest the statement that we were well prepared. 377 00:45:18,000 --> 00:45:25,580 If you look at the National Risk Register, we were prepared for an influenza pandemic; that was high on it. 378 00:45:25,580 --> 00:45:33,750 But if you look at the 2017 published version of it,
379 00:45:33,750 --> 00:45:39,690 a new 'Disease X' pandemic was not rated as likely to kill more than a few thousand people, 380 00:45:39,690 --> 00:45:45,270 and so we were not prepared for the distinctive features of the coronavirus, 381 00:45:45,270 --> 00:45:53,440 namely the need for huge amounts of protective clothing, and the fact that it might be very difficult to develop a vaccine, 382 00:45:53,440 --> 00:45:58,590 unlike for the flu, etc. So I think we were not prepared in the way we should have been. 383 00:45:58,590 --> 00:46:06,720 But of course, it's the political decisions and the willingness of the public to accept regulations which are also crucial, 384 00:46:06,720 --> 00:46:12,330 and we weren't prepared despite having had wake-up calls with SARS and MERS and the like in the last 20 years. 385 00:46:12,330 --> 00:46:24,150 That's, of course, one reason why the East Asian countries did better. Thank you. We have a couple more questions that are sort of on this topic, 386 00:46:24,150 --> 00:46:33,890 about the need for legislation on climate change and how important choosing the right leaders will be. 387 00:46:33,890 --> 00:46:41,160 So I guess that's: how important is legislation, on climate change and more generally 388 00:46:41,160 --> 00:46:45,090 on some of the issues that we've talked about? 389 00:46:45,090 --> 00:46:51,720 One can approach it in different ways. One can regulate. One can legislate. One can try and have incentives within the market. 390 00:46:51,720 --> 00:46:59,880 Or one can try and ask consumers to drive change by, for example, picking low-carbon foods and things. 391 00:46:59,880 --> 00:47:01,740 I know it's a terribly broad question, 392 00:47:01,740 --> 00:47:12,480 but just in that sort of spectrum, from government action through incentives to allowing consumers to make the change:
393 00:47:12,480 --> 00:47:22,380 do you have a broad feeling about what works best, and whether we have the balance right at the moment? And forgive the terribly broad question. 394 00:47:22,380 --> 00:47:30,180 Well, of course, it's very hard to persuade people, in the interest of long-term benefits, to 395 00:47:30,180 --> 00:47:37,200 give up things they like doing now. But let me say what I think the win-win policy is on climate change, 396 00:47:37,200 --> 00:47:47,640 and that's to bear in mind that we in this country produce between one and two percent of the world's CO2 emissions. 397 00:47:47,640 --> 00:47:54,230 So even if we reach our 2050 targets, that's making only a one or two percent difference to the world. 398 00:47:54,230 --> 00:47:59,670 But we can aspire to having much more than one or two percent of the world's clever ideas and innovations. 399 00:47:59,670 --> 00:48:06,300 So I think what we ought to do is to focus our R&D effort, 400 00:48:06,300 --> 00:48:11,700 the government-supported part of it, on clean energy: batteries, 401 00:48:11,700 --> 00:48:19,230 smart grids and all these things, so that we can make them more efficient and bring the cost down, so that, 402 00:48:19,230 --> 00:48:31,250 for instance, India, where they need more energy per capita than they have now, 403 00:48:31,250 --> 00:48:37,550 and Africa, with its rapidly growing population, can benefit. If we can help them 404 00:48:37,550 --> 00:48:47,810 to leapfrog directly to clean energy, just as they leapfrogged directly to mobile phones and never had landlines, then we will have helped them. 405 00:48:47,810 --> 00:48:49,100 We do this by collaboration. 406 00:48:49,100 --> 00:49:00,950 And of course, we can then do far more in reducing the world's CO2 emissions than the one or two percent reduction achieved by our own cuts alone.
407 00:49:00,950 --> 00:49:09,950 So if we can help India to develop without needing to increase its contribution to CO2 emissions, we'd be doing a good deal. 408 00:49:09,950 --> 00:49:17,390 And it's hard to think of a more inspiring goal for young engineers than to provide affordable clean energy, 409 00:49:17,390 --> 00:49:24,590 not just for ourselves but for the developing world. So that's my win-win situation; I think we should prioritise that. 410 00:49:24,590 --> 00:49:37,820 And, to turn to your subject, we have expertise in food science and plant science and all that, and feeding the world in 2050 411 00:49:37,820 --> 00:49:46,220 while preserving biodiversity is another area where expertise is needed, and what we 412 00:49:46,220 --> 00:49:52,830 can do in those areas as a country can make a great difference in the world. 413 00:49:52,830 --> 00:50:01,670 So I think those are two really important areas where the UK, through its academic strength, 414 00:50:01,670 --> 00:50:06,890 which we talked about earlier, could really make a disproportionate difference. 415 00:50:06,890 --> 00:50:16,160 I absolutely agree with you. Let me ask this question, perhaps with your legislator hat on rather than your science hat. 416 00:50:16,160 --> 00:50:23,780 What concrete steps could we take to convince governments to invest more in resilience? 417 00:50:23,780 --> 00:50:30,930 Well, I think there are ways. I'm sure that funding the health service 418 00:50:30,930 --> 00:50:37,480 more generously would obviously allow a bit more slack in the hospital system. 419 00:50:37,480 --> 00:50:47,250 That's one thing. And perhaps there could be some system to ensure that manufacturing companies keep inventories, et cetera. 420 00:50:47,250 --> 00:50:58,560 And I think,
mainly, it's to put these industries under less pressure, and also to recognise the value of the public sector, because the 421 00:50:58,560 --> 00:51:08,090 one thing we've seen in the pandemic is that the public sector has done better than certain private contractors have. 422 00:51:08,090 --> 00:51:18,770 That may not answer your question, but I think those are lessons we can learn, and we can then think not just about the next pandemic, 423 00:51:18,770 --> 00:51:25,880 but about dealing with these other crises which are less imminent but are more important in the long run. 424 00:51:25,880 --> 00:51:43,040 Thank you; that question was from Rock Paper. And Cooper asked a question about how automation, mechanisation and AI may affect jobs, 425 00:51:43,040 --> 00:51:48,440 and they worry that this will hollow out the middle classes. 426 00:51:48,440 --> 00:51:58,460 And if I extend the question a little bit: you will get a small fraction of the population able to make a lot of money, 427 00:51:58,460 --> 00:52:03,080 so the returns on capital will get so much greater than the returns on labour. 428 00:52:03,080 --> 00:52:07,670 How much does that sharpen some of the economic concerns we've been talking about around inequality? 429 00:52:07,670 --> 00:52:12,320 How much does that worry you, and how much should we in the science community 430 00:52:12,320 --> 00:52:17,960 take some responsibility for addressing this issue? 431 00:52:17,960 --> 00:52:27,200 Well, it's certainly true that there's going to be a redeployment of labour needed, and it's not just blue-collar jobs 432 00:52:27,200 --> 00:52:34,400 that are going to be replaced; gardening and plumbing are rather hard to mechanise. 433 00:52:34,400 --> 00:52:48,990 But of course, many professional jobs are going to be, if not replaced, then supplemented by machines: accountancy, legal work and medicine as well. 434 00:52:48,990 --> 00:52:55,820 And that's fine.
But I think the simple answer to the question is that clearly we need redistribution, and this 435 00:52:55,820 --> 00:53:02,480 is going to be achieved by proper taxation of the big companies that own the robots. 436 00:53:02,480 --> 00:53:09,180 And of course, as you said earlier, that's impeded by the fact that multinational companies can get around the rules. 437 00:53:09,180 --> 00:53:14,480 But I think, having done that, 438 00:53:14,480 --> 00:53:20,540 this money should be used to create a far larger number of 439 00:53:20,540 --> 00:53:27,440 dignified, properly paid jobs in areas where the human element is crucial, 440 00:53:27,440 --> 00:53:34,340 particularly carers for old people, now in immensely short supply, and teaching assistants, 441 00:53:34,340 --> 00:53:42,110 and all the other key people recognised in the pandemic as being crucial to us, and appreciated more than they were in the past. 442 00:53:42,110 --> 00:53:46,730 So I think there's a huge demand for more people in those professions. 443 00:53:46,730 --> 00:53:58,460 And of course, if those people now in mind-numbing jobs in the Amazon warehouses or in 444 00:53:58,460 --> 00:54:08,000 the call centres, for instance, are replaced by machines, and those people get dignified jobs 445 00:54:08,000 --> 00:54:13,100 looking after young and old, then that's win-win. And so this, again, 446 00:54:13,100 --> 00:54:21,020 is where we've got to learn that the mantra of a low-tax economy is not the way to happiness; better the 447 00:54:21,020 --> 00:54:28,850 Scandinavian way of having a proper welfare state, where there are plenty of people to do these jobs as carers, 448 00:54:28,850 --> 00:54:36,410 custodians, etc. And, quoting David Goodhart, whom I know we've both read, more parity of esteem for head, heart and hand jobs. 449 00:54:36,410 --> 00:54:43,550 So I'm going to finish with a question.
450 00:54:43,550 --> 00:54:56,060 I can't resist doing so. And this is from Olli Steadman, who asks: how can we address risk by picking careers that are future-proof? 451 00:54:56,060 --> 00:55:03,050 And he gives some examples: AI policymakers, gene-tailoring ethics, pandemic crisis management. 452 00:55:03,050 --> 00:55:13,160 What jobs do we need more of? And then, he says, apart from astronomers, smiley face, I'm already one of those. 453 00:55:13,160 --> 00:55:19,850 Well, I mean, I think the answer I would give is that many jobs are changing so fast 454 00:55:19,850 --> 00:55:23,990 that the case for lifelong education is strengthening. 455 00:55:23,990 --> 00:55:30,500 And one good development now is that there's more public support for the idea that instead of everyone spending three 456 00:55:30,500 --> 00:55:39,050 years at university between the ages of 18 and 21, it might be better to give people three years of study 457 00:55:39,050 --> 00:55:43,620 which could be spread anywhere through their lives, because they'll have to learn a lot and change their careers. 458 00:55:43,620 --> 00:55:51,920 So I think that might be beneficial. As to where we are going to need the extra jobs, 459 00:55:51,920 --> 00:56:01,190 of course, we can't predict far ahead. There'll be no bigger need for astronomers than there is now, of course. 460 00:56:01,190 --> 00:56:13,370 But there'll be more chance, actually, for amateur scientists. I think as more people do acquire wealth, 461 00:56:13,370 --> 00:56:21,890 then I think we may go back to a situation where there are more people like Lord Rayleigh and Darwin in the 19th century, 462 00:56:21,890 --> 00:56:29,720 when some of the best science was done by people like that. And of course, there are some people in this century, like Lovelock, 463 00:56:29,720 --> 00:56:41,870 etc., who work independently. And I think the rise of the independent scientist may be something which becomes more attractive.
464 00:56:41,870 --> 00:56:48,470 And of course, some people may prefer to take that route rather than spend the whole time teaching in a university. 465 00:56:48,470 --> 00:56:53,120 I think there's going to be a change in higher education, 466 00:56:53,120 --> 00:57:05,960 and with the pandemic, the balance between distance learning and real lectures is going to change much faster than it would have done otherwise. 467 00:57:05,960 --> 00:57:09,890 And a question which I know you get asked a lot as an astronomer: 468 00:57:09,890 --> 00:57:18,020 where is the future of humanity? Is it on Mars? No, it isn't. 469 00:57:18,020 --> 00:57:26,090 Of course, as an astronomer, I get asked this. If I want to be left alone on a plane or somewhere, 470 00:57:26,090 --> 00:57:34,490 I say I'm a mathematician, but if I say I'm an astronomer, then I get asked these questions, like, do aliens exist, 471 00:57:34,490 --> 00:57:43,190 et cetera. Now, of course, there are some people who think that we will go to Mars. 472 00:57:43,190 --> 00:57:48,240 I personally don't think there's much of a case for 473 00:57:48,240 --> 00:57:53,760 human spaceflight at all; the case is getting weaker with every advance in miniaturisation and robotics. 474 00:57:53,760 --> 00:58:02,870 So my take would be that human spaceflight will be done only as a sort of expensive spectator sport, 475 00:58:02,870 --> 00:58:09,770 paid for by Messrs Musk and Bezos. And we should cheer them on. 476 00:58:09,770 --> 00:58:16,550 Good luck to them; some of them may go to Mars, and that would be fine. 477 00:58:16,550 --> 00:58:28,840 But it's a dangerous delusion to imagine, as Elon Musk does, and as my colleague Stephen Hawking did, that there will be mass emigration to Mars, because 478 00:58:28,840 --> 00:58:39,580 terraforming Mars is far, far harder than dealing with climate change here on Earth, and there's really no Planet B for all you risk-averse people.
479 00:58:39,580 --> 00:58:43,690 So let's cheer on people like Elon Musk, who says he wants to die on Mars, 480 00:58:43,690 --> 00:58:53,020 but not on impact. Follow him if you want to, but let's keep the Earth as the unique place to which we are adapted. 481 00:58:53,020 --> 00:58:57,890 But of course, one reason to cheer on these people is that, 482 00:58:57,890 --> 00:59:00,950 although we're going to regulate all these new technologies, 483 00:59:00,950 --> 00:59:09,770 cyborg technology and genetic modification, on Earth, for prudential and ethical reasons, those guys on Mars will be away from the regulators. 484 00:59:09,770 --> 00:59:21,320 They'd be in a very hostile environment. So good luck to them if they want to adapt themselves and their progeny to live in an alien and hostile world. 485 00:59:21,320 --> 00:59:26,930 And so if there's going to be any sort of post-human species, it will evolve not by natural selection 486 00:59:26,930 --> 00:59:33,710 but by a sort of secondary intelligent design, and it will, I think, start on Mars, where there's the incentive and freedom from regulation, 487 00:59:33,710 --> 00:59:41,570 rather than here on Earth. I would love to go on talking for another couple of hours, but I'm afraid we have to stop there. Just before thanking you, 488 00:59:41,570 --> 00:59:53,420 could I just say that, as I mentioned beforehand, the next in the series is Paul Collier and John Kay talking about the economic issues ahead. 489 00:59:53,420 --> 00:59:56,870 Martin, I want to thank you for two things. 490 00:59:56,870 --> 01:00:05,360 First of all, it's a pleasure to acknowledge in public everything you've done to help the Oxford Martin School over the last 15 years. 491 01:00:05,360 --> 01:00:07,460 And I know Lillian Martin, who I think is on the call, 492 01:00:07,460 --> 01:00:16,820 would join me in thanking you so much for what you've done. And thank you very much for such a wonderful conversation this afternoon.
493 01:00:16,820 --> 01:00:23,930 I've really enjoyed it. And, just looking at the messages in the chat, I think a lot of other people have as well. 494 01:00:23,930 --> 01:00:27,119 So thank you very much indeed. Thank you, Charles, and Virginia.