1 00:00:04,700 --> 00:00:08,930 So if you'd just like to start by saying your name and your title and affiliation. 2 00:00:09,710 --> 00:00:14,990 So I'm Professor Julia Hippisley-Cox, Professor of Clinical Epidemiology and General Practice 3 00:00:15,260 --> 00:00:19,370 in the Department of Primary Care Health Sciences at the University of Oxford. 4 00:00:19,670 --> 00:00:24,139 Lovely. Thank you. And without telling me your entire life story, because we haven't got enough time for that, 5 00:00:24,140 --> 00:00:31,760 could you just give me a little outline of how you got from your first interest in medicine or medical science to where you are now? 6 00:00:32,360 --> 00:00:41,899 Okay. So I trained as a doctor at Sheffield University, and I did academic training alongside training to be a general practitioner. 7 00:00:41,900 --> 00:00:47,360 I work as an NHS GP in Oxfordshire. And I love doing research. 8 00:00:47,370 --> 00:00:53,390 Basically I've spent over 20 years doing research, initially at the University of Nottingham. 9 00:00:53,690 --> 00:00:58,070 And then in the last three years I moved to the University of Oxford, where I'm now based. 10 00:00:59,710 --> 00:01:06,670 Okay. That's lovely. So if you had to define a sort of single big question that gets you out of bed in the morning, what would you say it was? 11 00:01:07,390 --> 00:01:14,200 Um, so how can we better inform the decisions that doctors and patients need 12 00:01:14,800 --> 00:01:20,740 to make about their health care, to take account of their preferences, 13 00:01:20,740 --> 00:01:27,160 but also the different risks and benefits, so that we can enable people to make decisions about their health care. 14 00:01:28,760 --> 00:01:34,340 Okay. So what was your main area of research focus before COVID came along?
15 00:01:34,640 --> 00:01:44,840 Okay, so I had been working for some years in electronic health records, so I've developed a few very large electronic record databases, 16 00:01:44,840 --> 00:01:51,500 one called the QResearch database, which takes general practitioner information, anonymized data about patients, 17 00:01:52,430 --> 00:02:00,229 from their patient records, and it's linked to other healthcare data, so hospital admissions, mortality, 18 00:02:00,230 --> 00:02:08,059 cancer registry, and I was using that data to answer questions about how common different diseases were, 19 00:02:08,060 --> 00:02:11,870 what the outcomes were for patients when they got those diseases, what treatments they got, 20 00:02:12,620 --> 00:02:20,209 but particularly who's at risk from different conditions and different diseases and who might therefore benefit from specific interventions. 21 00:02:20,210 --> 00:02:24,620 So that was one theme: predicting risk, modelling risk. 22 00:02:25,070 --> 00:02:32,510 And then the second, broadly, was actually looking at the safety of medicines, looking at when a new medicine is being introduced. 23 00:02:32,540 --> 00:02:37,520 In medical practice, we normally know that 24 00:02:37,880 --> 00:02:41,540 it's effective because it's been tested in randomised controlled trials, 25 00:02:42,140 --> 00:02:47,600 but there's generally limited information about safety until the medicines are actually used in large populations. 26 00:02:48,320 --> 00:02:55,040 So actually looking at the safety of medicines as well as their effectiveness has been a key interest. 27 00:02:55,040 --> 00:03:01,100 And both of those two things, both the risk modelling and also the safety of medicines, 28 00:03:01,520 --> 00:03:09,440 have become major themes of the work in the COVID pandemic that I've been doing in the last couple of years.
29 00:03:11,010 --> 00:03:16,800 So let's just look a bit more closely at the whole question of these health databases. 30 00:03:18,780 --> 00:03:22,169 I mean, I remember, I forget which year it was, but a decade or so ago, 31 00:03:22,170 --> 00:03:27,810 there was a lot of fuss in the media about a national scheme called care.data that was eventually dropped. 32 00:03:28,800 --> 00:03:34,560 But nevertheless, there are these big databases. Can you explain a little bit more about how they work? 33 00:03:35,070 --> 00:03:39,479 Yes. Okay. So we set the QResearch database up about 20 years ago. 34 00:03:39,480 --> 00:03:45,690 And it was the first time that individual patient-level data had been extracted from the EMIS system, 35 00:03:45,690 --> 00:03:49,590 which is the biggest and the most commonly used computer system in general practice. 36 00:03:50,280 --> 00:03:54,870 And all of the data that is extracted is anonymized data. 37 00:03:54,870 --> 00:04:00,510 So we can't tell who individual patients are. We don't have names and addresses, and we can't work out who people are. 38 00:04:00,990 --> 00:04:05,940 But we can see a lot of information about what illnesses they've got, 39 00:04:05,940 --> 00:04:09,120 what blood tests they've had and what treatments they're taking. 40 00:04:09,840 --> 00:04:18,870 And this database was set up as part of a sort of philanthropic venture, really, with the company that makes the computer systems, 41 00:04:19,470 --> 00:04:26,230 because it's a unique and incredibly valuable source of information about health care and 42 00:04:26,410 --> 00:04:32,460 the treatments that people get, what people need, and then how they're used and how safe they are. 43 00:04:33,240 --> 00:04:35,340 And that has been in place for some years.
44 00:04:35,340 --> 00:04:42,450 And I think the people running care.data could see the value of these databases for research, 45 00:04:42,450 --> 00:04:46,380 but could also see a wider value for the delivery of health care. 46 00:04:47,210 --> 00:04:51,870 And that, I think, 47 00:04:51,870 --> 00:04:58,010 did become quite controversial, and subsequently did as well, partly because of the assurances 48 00:04:58,020 --> 00:05:01,500 that we've got in universities that we're only using data for research purposes. 49 00:05:02,190 --> 00:05:12,030 A university like the University of Nottingham, or now the University of Oxford, is a very trusted research environment for taking and 50 00:05:12,030 --> 00:05:17,489 using these data sources, and we're only able to use 51 00:05:17,490 --> 00:05:23,910 the data for research purposes. Some of the concerns, I think, related to the use of the data for 52 00:05:23,910 --> 00:05:28,260 wider purposes, where it wasn't clear at the time exactly what the scope of those purposes was. 53 00:05:28,530 --> 00:05:36,860 But by focusing really specifically on research, on questions of medical research, we managed to continue and expand the work that we do. 54 00:05:37,500 --> 00:05:45,780 And we have now got quite broad patient representation as well, throughout all the stages of that endeavour, so that we make 55 00:05:45,780 --> 00:05:52,770 sure that we take the public with us, in terms of the data being used appropriately to answer questions of benefit to patients. 56 00:05:55,900 --> 00:06:01,809 So you may have mentioned a number, a figure, earlier, and apologies if you have, but what sort of size are we talking? 57 00:06:01,810 --> 00:06:07,000 How many patients are we talking about? So we've actually got some 35 million patient records on the database. 58 00:06:07,390 --> 00:06:10,540 Now, those patient records go back to 1990.
59 00:06:10,540 --> 00:06:18,489 So that's some years ago now, tracking over 30 years, and it includes people who've left their GP practice, 60 00:06:18,490 --> 00:06:21,950 people who've sadly died and also new patients that have come along. 61 00:06:21,970 --> 00:06:30,190 So as a snapshot, it covers about a quarter of the current population, although, as I said, we don't know the names of those people, 62 00:06:30,190 --> 00:06:34,329 we just have anonymous records of those patients. 63 00:06:34,330 --> 00:06:40,510 But it is one of the largest, if not the largest, such dataset in the UK. 64 00:06:41,140 --> 00:06:49,360 And it's been widely used for research over the last 20 years. 65 00:06:50,840 --> 00:06:55,030 And you actually set up your own company, called ClinRisk. 66 00:06:55,040 --> 00:07:01,249 Is that right? So back in 2008, we developed some prediction software. 67 00:07:01,250 --> 00:07:07,940 The first type of prediction software, called the QRISK score, basically takes information about a patient: 68 00:07:08,240 --> 00:07:14,060 their height, their weight, their cholesterol values, their family history and their medical conditions. 69 00:07:14,420 --> 00:07:21,590 And it builds a predictive model which identifies the risk of somebody having a heart attack or stroke in the next ten years. 70 00:07:22,460 --> 00:07:29,240 And in order to implement that software, we needed to set up a company to produce the software and to make sure there was insurance for it, 71 00:07:29,240 --> 00:07:38,060 so it could be used as a medical device and it could be embedded in the NHS systems such that it could be used at the point of care.
72 00:07:38,540 --> 00:07:42,880 And so the company ClinRisk Ltd was set up as a start-up company, really 73 00:07:42,920 --> 00:07:48,170 arising out of the University of Nottingham, to implement the results so that you 74 00:07:48,170 --> 00:07:53,330 could translate the research findings into something which is tangible and 75 00:07:53,360 --> 00:07:58,330 beneficial for patients, and that is used across the NHS, and still is. 76 00:07:59,210 --> 00:08:07,670 And yes, so just to emphasise that point: that software has actually been adopted 77 00:08:08,000 --> 00:08:16,750 throughout the NHS, has it? Yes. So the steps were: we developed the QRISK score initially, 78 00:08:16,760 --> 00:08:22,610 we published the paper in the British Medical Journal, and Oxford University researchers in fact 79 00:08:22,820 --> 00:08:28,280 validated that risk score, to check that it worked to correctly identify the high-risk people. 80 00:08:28,910 --> 00:08:35,120 It was then recommended by NICE, the National Institute for Health and Care Excellence, as part of their guideline, 81 00:08:35,120 --> 00:08:40,040 and it was also adopted by the Department of Health for the NHS Health Checks programme, 82 00:08:40,040 --> 00:08:45,110 which is offered to everybody between the ages of 40 and 74 on a sort of rolling basis. 83 00:08:45,980 --> 00:08:50,000 And then that led to the development of other tools. 84 00:08:50,000 --> 00:08:55,970 Rather than just looking at heart disease, we developed similar risk tools: for diabetes, 85 00:08:56,690 --> 00:09:04,430 for identifying people at high risk of having osteoporotic fractures, at high risk of having cancers, 86 00:09:04,730 --> 00:09:12,650 with the idea that all of these tools could be used to actually identify people in the risk category where we could target something to them.
87 00:09:12,650 --> 00:09:22,750 So target an intervention, like a medication, a preventative measure, or a referral to make sure they haven't got cancer, for example. 88 00:09:23,270 --> 00:09:31,910 And then these risk tools have been built into clinical systems such as the EMIS system that I mentioned, 89 00:09:32,630 --> 00:09:41,510 so that the GPs who provide data to us actually get some benefit back, in terms of utilities that exist in the clinical system, and 90 00:09:41,510 --> 00:09:48,260 therefore, through that, patients can actually get some direct benefit back from the utilities that we've developed as well. 91 00:09:48,830 --> 00:09:59,040 So it's been a really interesting sort of cycle, to be able to take a research idea, really, and then change it into a research question, 92 00:09:59,040 --> 00:10:05,239 then deliver some results in an academic setting, but then translate that into something which is tangible, 93 00:10:05,240 --> 00:10:10,670 which people can then use to help with clinical care on a day-to-day basis. 94 00:10:10,670 --> 00:10:18,290 That's been a really satisfying thing to do as an academic, because sometimes it can be quite far removed from 95 00:10:18,290 --> 00:10:21,710 such tangible outputs, because of the nature of the work that you do. 96 00:10:22,160 --> 00:10:29,360 So I'm really privileged to be in that position. So doctors can essentially call people up and say, 97 00:10:29,660 --> 00:10:35,150 I think we ought to investigate you a little bit further, rather than waiting for people to fall ill and come and see them. 98 00:10:35,840 --> 00:10:37,950 Absolutely. And so with cardiovascular disease, you know, 99 00:10:37,970 --> 00:10:44,480 we've got interventions now that we know work: statins. And again, researchers in the 100 00:10:44,480 --> 00:10:50,420 University of Oxford did the work looking at the effectiveness of statins to reduce the risk of cardiovascular disease.
101 00:10:51,680 --> 00:10:53,569 So we know that that works. 102 00:10:53,570 --> 00:11:00,020 And what the risk score does is actually help quantify what that person's risk is, so that we can then give that information to patients 103 00:11:00,410 --> 00:11:08,000 and say: if you were to take a statin, then this would be the benefit to you in helping you to reduce your risk. 104 00:11:08,620 --> 00:11:17,899 So it's an interesting thing to do, and I think it's opened up a whole raft of similar sorts of endeavours by people. 105 00:11:17,900 --> 00:11:24,830 It was the first time anybody had actually done predictive risk modelling 106 00:11:24,830 --> 00:11:30,979 using these electronic health care records and produced a tool which could then be used in clinical practice. 107 00:11:30,980 --> 00:11:37,640 So it served as a sort of example, really, as to how you could do such a thing in the future. 108 00:11:38,410 --> 00:11:44,180 And are you personally interested in the geeky side of it, the coding and so on, or do you get other people to do that? 109 00:11:44,580 --> 00:11:49,730 Geeks? Yes. So before I went to university, actually, I had a year out. I was supposed to be going to 110 00:11:50,270 --> 00:11:52,160 another university to read physics. 111 00:11:52,700 --> 00:12:04,820 But I changed my mind and I spent a year programming computers for a large company, and I absolutely loved writing computer code at that point. 112 00:12:05,420 --> 00:12:10,650 But I then went into medicine, and 113 00:12:10,870 --> 00:12:16,669 I'm still working in medicine, but actually found that the computer skills and the mathematical skills that I'd got could 114 00:12:16,670 --> 00:12:21,829 be combined with the medical skills, all wrapped up in epidemiology, 115 00:12:21,830 --> 00:12:25,219 really, in the way of being able to implement things. 116 00:12:25,220 --> 00:12:27,620 So.
So I was one of those doctors who code, 117 00:12:27,620 --> 00:12:37,040 quite a small group of people, really, who like that side of things, and quite 118 00:12:37,040 --> 00:12:44,540 a few of them like to be able to take things and then put them into software and make them useful. 119 00:12:46,240 --> 00:12:48,600 So let's finally get round to COVID. 120 00:12:48,610 --> 00:12:58,300 So can you remember where you were or how you came to hear that there was something going on in China and that it might actually become quite serious? 121 00:12:59,410 --> 00:13:07,150 So I can remember back to hearing about the emergence of the virus. 122 00:13:07,180 --> 00:13:17,260 So back in 2005, actually, we set up a scheme called the QSurveillance infectious disease surveillance system, 123 00:13:17,560 --> 00:13:20,830 and that had arisen because of concerns about bird flu. 124 00:13:21,550 --> 00:13:27,670 And at the time there was a real concern that it was going to be a pandemic and that the government would need 125 00:13:27,670 --> 00:13:34,940 very rapid access to information at the point of an outbreak, to alert it to a pandemic and then to be able to manage it. 126 00:13:34,960 --> 00:13:42,610 And so on the back of that we sort of set up a surveillance system covering an even bigger group of patients, 127 00:13:42,610 --> 00:13:51,460 actually nearly 40 million patients, collecting information basically to alert to things like flu outbreaks and infectious disease outbreaks. 128 00:13:52,150 --> 00:13:54,549 And that ran for many years, 129 00:13:54,550 --> 00:14:02,860 basically as a sort of surveillance system, and it had been part of endeavours to look at pandemic planning and things like that in the past. 130 00:14:03,310 --> 00:14:07,320 So my awareness of pandemics was already, if you like, quite heightened.
131 00:14:07,330 --> 00:14:12,820 And so when I heard about the Wuhan virus and heard about the cases, you know, 132 00:14:12,880 --> 00:14:21,340 I was really quite concerned about that, because by the time that you think about trying to contain this, it's probably already spread. 133 00:14:22,300 --> 00:14:31,510 So I can remember that quite clearly. And I can remember being asked by the government really quite early on to look at the surveillance 134 00:14:31,510 --> 00:14:37,270 systems that we'd got in place and to see what we could potentially do to help enhance those. 135 00:14:38,200 --> 00:14:46,390 And then my direct involvement happened, I think it was in early May 2020. 136 00:14:46,780 --> 00:14:57,040 So Chris Whitty, the Chief Medical Officer, identified a list of patients who we thought might be at high risk of getting severe outcomes from COVID, 137 00:14:57,340 --> 00:14:59,440 who might need to have special interventions. 138 00:14:59,980 --> 00:15:06,970 And the way that that was done was basically to look at what we knew about flu and the people who were susceptible to getting flu each year, 139 00:15:07,090 --> 00:15:09,639 again because of the work I'd been doing on surveillance: 140 00:15:09,640 --> 00:15:15,580 for many years we'd been tracking flu outbreaks and flu vaccine uptake and things like that. 141 00:15:16,330 --> 00:15:24,760 And they produced this list, and it was the best we could do given the absence of information and the lack of data at that particular time. 142 00:15:25,390 --> 00:15:33,370 But it became very apparent, particularly to Chris Whitty, but also Jenny Harries, now Dame Jenny Harries, 143 00:15:33,730 --> 00:15:39,840 that the people who were coming through the doors of the intensive care units were not necessarily the people that were in this high-risk category. 144 00:15:40,270 --> 00:15:45,819 It didn't look the same.
So there was a different ethnic mix, and some of the conditions that we thought were 145 00:15:45,820 --> 00:15:50,620 high risk didn't seem to be, and then other things which we didn't think were did seem to be. 146 00:15:51,190 --> 00:15:54,010 And there was obviously a lot of public concern at the time. 147 00:15:54,670 --> 00:16:02,079 And so we were asked to sort of start to look at building on the work that they knew we were able to do, 148 00:16:02,080 --> 00:16:09,450 which is basically to develop prediction models for an outcome, to then see whether we could actually predict that outcome. 149 00:16:09,460 --> 00:16:12,820 In this case it's a poor outcome: 150 00:16:12,820 --> 00:16:17,800 basically somebody being admitted to hospital or intensive care, or somebody dying of COVID. 151 00:16:18,430 --> 00:16:22,389 And so we then started off on some of the quickest research that we've ever done, 152 00:16:22,390 --> 00:16:25,930 which was only really possible because we'd done this quite a lot of times before, 153 00:16:26,230 --> 00:16:33,400 for conditions over the previous ten years, so I was able to assemble new data as it was coming through. 154 00:16:33,430 --> 00:16:41,110 So I think we were one of the first recipients of the dataset from Public Health England, 155 00:16:41,110 --> 00:16:46,179 as it was then called, the infectious diseases source data files, 156 00:16:46,180 --> 00:16:52,630 as it was coming through, and we were able to combine that with data we already had on hospital admissions. 157 00:16:52,960 --> 00:16:57,640 We got new and faster data on hospital admissions that was just becoming available.
158 00:16:58,320 --> 00:17:01,510 And because we already had the infrastructure, 159 00:17:01,510 --> 00:17:09,370 we were able to basically tag these extra things on, and we already had the expertise in terms of modelling to be able 160 00:17:09,370 --> 00:17:18,880 to run and develop really very quickly some new predictive models to look at who was at high risk of becoming seriously ill with COVID. 161 00:17:19,300 --> 00:17:28,180 We did a really extensive patient engagement at that point. 162 00:17:28,360 --> 00:17:32,860 Quite understandably and predictably, there were lots of patient groups who were very concerned about their particular condition. 163 00:17:33,250 --> 00:17:38,590 People, for example, who were not actually recognised on the original shielded patient list: 164 00:17:38,590 --> 00:17:44,620 people with severe renal failure or who'd had a kidney transplant, for example, 165 00:17:44,630 --> 00:17:50,920 or people with severe disabilities. So we were able to sort of work with those through the Department of Health, 166 00:17:50,920 --> 00:17:57,370 and work with large groups of charity-type organisations and patient groups 167 00:17:57,700 --> 00:18:03,880 and professional groups, really to assemble the long list of conditions that we should look at. 168 00:18:04,810 --> 00:18:10,959 And we were able to then cycle through and look at those and develop the underpinning research, 169 00:18:10,960 --> 00:18:14,080 which we then published in the British Medical Journal. 170 00:18:15,460 --> 00:18:22,870 And we developed a model which we were then able to test and validate, not just on the dataset that we had: 171 00:18:22,870 --> 00:18:32,200 we were also able to work with colleagues up in Scotland and Wales and Northern Ireland, and the Office for National Statistics.
172 00:18:32,950 --> 00:18:39,939 So we developed the model and then we got it tested, to make sure that it worked in those different settings, 173 00:18:39,940 --> 00:18:47,320 up in Scotland and Wales, and also across the whole of the national database at the Office for National Statistics. 174 00:18:48,010 --> 00:18:56,229 So this was done, I suppose, with the intention that we would be able to identify quickly people who needed interventions, so 175 00:18:56,230 --> 00:19:02,060 that we could target those most at risk to get the treatments that they needed, and to get them first. 176 00:19:02,080 --> 00:19:07,690 And so one of the prime use cases was this time last year. I think 177 00:19:08,110 --> 00:19:13,380 it was February last year that we took the algorithm that we developed and we applied it. 178 00:19:14,410 --> 00:19:25,330 I say 'we' in a sort of corporate way, if you like. So Oxford University Innovation did all the medical device regulation certification. 179 00:19:25,600 --> 00:19:30,639 Oxford Computer Consultants, which works very closely with the university, developed 180 00:19:30,640 --> 00:19:36,180 sort of industrial-scale software, which was then sent to NHS Digital. 181 00:19:36,190 --> 00:19:43,810 That's the sort of national organisation that organises and hosts a lot of the nation's health data. 182 00:19:44,680 --> 00:19:54,009 And then they applied the algorithm to the national database to sort of stratify, basically put into risk categories, everybody on the database.
183 00:19:54,010 --> 00:20:01,719 And what that did, it happened overnight. Well, it took a month or two to get the software working, 184 00:20:01,720 --> 00:20:05,680 but once it was run, 185 00:20:05,680 --> 00:20:09,670 the NHS was able to write to one and a half million patients nationally 186 00:20:10,000 --> 00:20:14,020 to identify the fact that they were at higher risk of severe COVID outcomes, 187 00:20:14,740 --> 00:20:22,450 who hadn't already been on the original list. And then those patients were given like a priority letter, 188 00:20:23,020 --> 00:20:28,020 and then 800,000 of those patients were able to come and get a vaccine earlier in the programme 189 00:20:28,030 --> 00:20:33,610 than they would have done just by virtue of their age, under the arrangements that were in place. 190 00:20:34,060 --> 00:20:42,760 So that was a very tangible thing that we were able to do, and those patients came forward, and I think there was a broad understanding at the time. 191 00:20:42,760 --> 00:20:44,829 I think, just looking back, 192 00:20:44,830 --> 00:20:53,319 I think society sort of feels that the most vulnerable people that we have should be getting interventions. 193 00:20:53,320 --> 00:21:00,160 And people were pleased to see their grannies coming forward, or people with serious health conditions sort of being prioritised. 194 00:21:00,640 --> 00:21:08,890 And I think one of the reasons that we could do that from the ethical perspective was because we knew that everybody would get a vaccine eventually. 195 00:21:08,890 --> 00:21:13,750 And what we were doing is just actually prioritising people according to their level of need, 196 00:21:14,200 --> 00:21:18,100 but doing it in a way that didn't make a decision, if you like. 197 00:21:18,160 --> 00:21:25,270 It didn't actually make people have a vaccine.
It just basically identified the people who were at risk so they could be prioritised first. 198 00:21:25,930 --> 00:21:34,780 And that probably just reminds me to mention the enormous team effort this was, really, because there were so many dimensions to it, 199 00:21:34,780 --> 00:21:42,999 and one of the dimensions was actually the ethics, and that brought us into contact with the Moral and Ethical Advisory Group, 200 00:21:43,000 --> 00:21:49,110 which I was not aware of before, but which is a government-led organisation, 201 00:21:49,130 --> 00:21:52,720 which at the time was chaired by Jonathan Montgomery, 202 00:21:53,620 --> 00:22:01,210 and which basically was able to look at the different approaches that we could have taken to stratification, because there needs to be 203 00:22:01,270 --> 00:22:08,290 some sort of approach to how you get the vaccines to everybody in the most appropriate way, 204 00:22:08,600 --> 00:22:11,379 and really to help us sort of think through what the issues were and to make 205 00:22:11,380 --> 00:22:16,600 sure it was compliant with what the general public would find acceptable, 206 00:22:16,600 --> 00:22:20,020 but also would be compliant with the legislation, 207 00:22:20,020 --> 00:22:21,909 data protection and lots of things like that as well. 208 00:22:21,910 --> 00:22:30,399 So there was a lot of team working on that, and there was also a lot of team working to actually ensure that the information 209 00:22:30,400 --> 00:22:38,710 that was made available to patients, and to health care professionals, was understandable as well. 210 00:22:39,400 --> 00:22:44,870 So those were the sort of main outputs from that initial piece of work, and then 211 00:22:45,270 --> 00:22:57,950 there was a second output.
212 00:22:58,680 --> 00:23:08,190 So the second aspect was that we developed a clinical calculator, which you can find at www.qcovid.org. 213 00:23:08,190 --> 00:23:16,649 So the risk assessment tool was called QCovid, and that's partly because the database which we've developed is called the QResearch database, 214 00:23:16,650 --> 00:23:20,730 and we generally put a Q at the beginning of things as a bit of a brand. 215 00:23:21,510 --> 00:23:28,680 NHS Digital also produced a risk assessment tool which allows an individual or a clinician, 216 00:23:28,680 --> 00:23:33,930 a doctor or a patient, to be able to put in information about their characteristics: their age, 217 00:23:33,930 --> 00:23:36,300 their sex and ethnic group and information about them. 218 00:23:36,870 --> 00:23:44,040 And then it would calculate a risk for them, and it would compare that risk to other risks, for example 219 00:23:44,280 --> 00:23:49,590 the risk for somebody of the same age and sex who didn't have severe health conditions. 220 00:23:50,070 --> 00:23:54,480 And that possibly goes back to what I said at the beginning, that sort of thing: 221 00:23:54,870 --> 00:24:02,099 trying to provide people with better information about what the risks are that we're trying to avoid, and therefore 222 00:24:02,100 --> 00:24:09,210 what the benefits are of actually having the treatments that are needed, and why they might be prioritised. 223 00:24:09,630 --> 00:24:12,240 So that was the second thing that was produced. 224 00:24:12,240 --> 00:24:24,390 Then in 2021, we were asked to update the risk model that we'd done, because we had now got a vaccinated population. So one of the challenges has been the fact that this is not a static thing.
226 00:24:32,460 --> 00:24:40,690 So unlike cardiovascular risk, which changes slowly over a decade, this is a situation which was changing month by month, really, 227 00:24:40,710 --> 00:24:46,650 in that we could go from an unvaccinated population to a vaccinated population, or a largely vaccinated population. 228 00:24:47,250 --> 00:24:55,260 And the Chief Medical Officer wanted us to find out who was still vulnerable despite having a vaccination. 229 00:24:55,260 --> 00:24:57,120 So we had to build the models again, 230 00:24:57,720 --> 00:25:05,280 this time linking in the vaccine data, so that we could look at people who were still vulnerable at that point despite having been vaccinated. 231 00:25:06,060 --> 00:25:14,130 And that was called QCovid3. QCovid2 is actually the model that we did because we had a new wave, 232 00:25:14,790 --> 00:25:19,290 I think it was the alpha wave looking back, that emerged in the UK, 233 00:25:19,290 --> 00:25:27,269 but QCovid3 was actually the model that did take account of the vaccinated population, and we're just now being asked to look at QCovid4, 234 00:25:27,270 --> 00:25:33,579 which is to take account of the omicron wave. And so, again, it really keeps you on your toes, I think. 235 00:25:33,580 --> 00:25:39,120 And I think one of the real interests for me about this is not just producing a tool that can be used, 236 00:25:39,120 --> 00:25:47,309 but actually when you've got a complex system, so you've got new treatments becoming available, you've got a virus that doesn't seem to obey any rules. 237 00:25:47,310 --> 00:25:52,680 You know, it's just mutating without asking. And we need to keep the models up to date. 238 00:25:53,520 --> 00:25:56,730 So what's the best way to keep the models up to date? How do we do that?
239 00:25:57,180 --> 00:26:06,720 And so we're working quite broadly with other experts in the field to look at dynamically updated models and how best to keep these models up to date. 240 00:26:08,220 --> 00:26:18,750 And then there's more recent work still. So in this phase, into 2022, some of the work that we've been doing has been trying 241 00:26:18,750 --> 00:26:23,909 to prioritise people for new therapeutics, apart from vaccines, that have become available. 242 00:26:23,910 --> 00:26:29,490 So drugs that are called monoclonal antibodies, or mAbs as they're sometimes called. 243 00:26:30,240 --> 00:26:34,049 These are particularly useful for patients who can't make their own antibodies 244 00:26:34,050 --> 00:26:38,850 or who don't mount a response to vaccines, or who might be very sick for other reasons. 245 00:26:38,850 --> 00:26:45,720 And so the work that we've been doing has been helping prioritise patients for those particular treatments. 246 00:26:49,360 --> 00:26:53,259 Great. Thank you. It's been really good talking. 247 00:26:53,260 --> 00:27:00,160 And so I was just going to ask you, going right back to the beginning, the first paper that you did, 248 00:27:01,090 --> 00:27:06,850 what were the main groups that you uncovered that hadn't been picked up in the initial shielding lists? 249 00:27:06,970 --> 00:27:13,600 Yeah. So the main ones: people who'd got significant neurological problems, 250 00:27:13,600 --> 00:27:22,330 conditions like multiple sclerosis, that was one category of people, and people with Down's syndrome. 251 00:27:22,370 --> 00:27:27,970 Again, that was a brand new finding, that they seemed to have particularly high risks. 252 00:27:28,600 --> 00:27:31,809 And that finding has since been replicated in other datasets. 253 00:27:31,810 --> 00:27:39,160 And, in fact, people with Down's syndrome were prioritised for interventions.
254 00:27:39,460 --> 00:27:47,290 And then, as I mentioned, the people with severe kidney disease and those who'd had transplants, they were included. 255 00:27:47,530 --> 00:27:52,450 But the actual model, it isn't so much that it's a group of patients; 256 00:27:52,450 --> 00:27:54,639 it's a cumulative thing. 257 00:27:54,640 --> 00:28:03,700 So it takes account of somebody's age, ethnicity and their medical conditions, and indeed things like their level of obesity. 258 00:28:03,700 --> 00:28:10,059 So some of the research we did was used to basically inform some of the obesity policy, 259 00:28:10,060 --> 00:28:18,250 because as your body mass index goes up, as you become more obese, your risk of having certain outcomes also goes up. 260 00:28:18,250 --> 00:28:27,370 And that is one of the few modifiable risk factors, actually, where people can make a difference themselves. 261 00:28:28,360 --> 00:28:34,629 The other piece of work probably to mention is coming back to the theme of vaccine safety. 262 00:28:34,630 --> 00:28:39,250 So once again, this has been the same sort of sequence in my work over the years, 263 00:28:39,250 --> 00:28:44,290 which is: once you've identified a group of patients who might need an intervention, 264 00:28:44,530 --> 00:28:50,259 you then also need to make sure that the intervention goes to the right people, and that the intervention is safe for those people. 265 00:28:50,260 --> 00:28:54,790 So over the years, where we started with cardiovascular disease and we identified 266 00:28:54,790 --> 00:28:59,890 people at high risk, who might then have been started on a statin prescription, 267 00:29:00,310 --> 00:29:07,660 our next theme of research was actually to go and look at the risks and benefits of statins, to make sure there weren't any unexpected side effects.
268 00:29:07,660 --> 00:29:16,000 And so what we've done over Christmas, really in the run up to Christmas this year, and in the middle of last year, 269 00:29:16,870 --> 00:29:26,860 is to look at the safety of the COVID vaccines. So we were, again, very privileged to get access to the national datasets for COVID vaccination, 270 00:29:27,520 --> 00:29:32,470 which meant that we could look at 46 million people, 271 00:29:33,280 --> 00:29:41,890 all anonymized data, but we could compare who's had the Oxford vaccine, for example, who's had the Pfizer one, who's had the Moderna one. 272 00:29:42,250 --> 00:29:50,649 And we were able to look at the side effects that were being uncovered at the time, and there was a concern, so we moved to that quite quickly. 273 00:29:50,650 --> 00:30:01,059 So looking back to Easter last year, there were some concerns around some of the side effects, the thrombosis risk associated with the Oxford vaccine. 274 00:30:01,060 --> 00:30:09,850 So we had some funding from a priority government funding source to look at that quickly. 275 00:30:10,640 --> 00:30:17,770 We were able to confirm that there was a signal to do with the vaccine and thrombosis risk. 276 00:30:17,770 --> 00:30:26,110 However, we were able to put that in context with the risk of the same outcome if you were to get infected with COVID. 277 00:30:26,950 --> 00:30:30,549 And in fact, actually, although there was an increased risk with the vaccine, 278 00:30:30,550 --> 00:30:34,330 it was nowhere near as big as the increased risk associated with the virus. 279 00:30:34,750 --> 00:30:41,140 And so, whilst ideally there wouldn't be any sort of adverse events, 280 00:30:41,150 --> 00:30:46,420 we know in the real world that there are; the important thing is to be able to quantify that.
281 00:30:46,420 --> 00:30:55,300 So it was still able to be a positive message, which is that it's better to be vaccinated than it is to actually contract the virus. 282 00:30:56,050 --> 00:31:04,360 We were then able to follow that work up as concerns arose about myocarditis, which is inflammation of the heart, 283 00:31:04,660 --> 00:31:10,780 with some of the other vaccines, and again we were able to quantify those risks and 284 00:31:11,020 --> 00:31:15,210 identify the people most at risk within some of those groups as well, 285 00:31:15,220 --> 00:31:19,120 published in papers in Nature Medicine just around Christmas time. 286 00:31:19,120 --> 00:31:29,320 Because the UK used three vaccines, and it used them quickly, and its vaccine programme was very successful at getting coverage, 287 00:31:29,890 --> 00:31:37,720 it also left us, I think, with a responsibility to make sure that we used that data wisely and informatively, so that we could 288 00:31:37,900 --> 00:31:43,360 tell other people, whose programmes in their own countries were less advanced, 289 00:31:43,810 --> 00:31:47,709 what the direct comparisons are between those different risks. 290 00:31:47,710 --> 00:31:51,870 And so that's been an important stream of work that we were just finishing there. 291 00:31:53,250 --> 00:32:00,000 And the message, I'm just going to confirm, the message is that the risk from the vaccines is actually extremely small? 292 00:32:00,360 --> 00:32:08,170 It's not nothing, but, as you say, compared with actually catching the virus, it's much smaller? 293 00:32:08,190 --> 00:32:11,640 That's right. So, for example, we were able to present information.
294 00:32:11,670 --> 00:32:15,299 It might say that, for every million patients that have been vaccinated, 295 00:32:15,300 --> 00:32:22,740 there might be one or two extra people who end up with one of these side effects that might not otherwise have occurred, 296 00:32:23,220 --> 00:32:27,720 but had they had the virus, there might be ten more or 20 more, etc. 297 00:32:27,720 --> 00:32:32,670 So it's that. And we've worked closely with colleagues at 298 00:32:32,910 --> 00:32:38,580 the University of Edinburgh to basically produce nice graphical images, 299 00:32:38,580 --> 00:32:43,290 which allow you to contextualise that information and present it to 300 00:32:43,290 --> 00:32:49,790 people in a readily accessible format, so that anybody, basically, can understand it. 301 00:32:49,800 --> 00:33:00,990 And we've certainly found that that has been a valuable way of communicating science in a sort of pictorial form, 302 00:33:01,840 --> 00:33:08,340 in simple, easy-to-use graphics, really, which can communicate complex findings. 303 00:33:08,760 --> 00:33:13,260 It is complex doing these analyses, but the results are quite simple, you know; 304 00:33:13,290 --> 00:33:22,550 you could summarise them in a picture quite often. And there are some subgroups where this country has changed its policy 305 00:33:22,590 --> 00:33:29,909 because of the effects of some of the vaccines in particular age groups. For example, 306 00:33:29,910 --> 00:33:33,719 the Oxford vaccine is now not used in younger people because of 307 00:33:33,720 --> 00:33:40,350 the thrombosis risk, but there are other vaccines which can be used instead.
308 00:33:40,740 --> 00:33:50,620 So I think it comes back, really, to the whole driving force for all my research, which is trying to better quantify the risks and benefits 309 00:33:50,680 --> 00:33:59,490 that people face from the diseases they might get, but also the decisions that they need to make about what treatments to have, or what to do next. 310 00:34:00,390 --> 00:34:03,990 Because there's always a trade-off, really, that you need to make. 311 00:34:03,990 --> 00:34:08,020 And I think to get consent from people, as a doctor, say, 312 00:34:08,440 --> 00:34:16,560 you need to provide people with reliable information so they can make a decision about what they want to do. 313 00:34:18,250 --> 00:34:27,190 So the ability to use these huge databases has clearly speeded up, and made much more accurate, the kind of 314 00:34:27,610 --> 00:34:32,170 risk stratification you describe, and also the spotting of side effects. 315 00:34:32,500 --> 00:34:37,930 How was that done previously? So it's been done, I think... 316 00:34:39,070 --> 00:34:50,229 So I think, with drug safety surveillance, as we might call it, it's been done at a sort of smaller scale, and not necessarily in a systematic way. 317 00:34:50,230 --> 00:34:58,270 So there's a thing called the Yellow Card system, which is run by the Medicines and Healthcare products Regulatory Agency, the MHRA, 318 00:34:58,780 --> 00:35:06,200 and that's been encouraged for as long as I've been qualified, which is some years now: with a new drug, 319 00:35:06,220 --> 00:35:11,920 if you think that somebody might have had a side effect from it, you fill out a card and you send it in. 320 00:35:11,920 --> 00:35:14,620 And that certainly has got a value. 321 00:35:14,920 --> 00:35:20,460 But what it doesn't tell you is how many people have had the drug and not had the side effects.
322 00:35:20,470 --> 00:35:25,650 You don't have that sort of denominator. So it's good for generating a sort of signal of concern, 323 00:35:25,660 --> 00:35:32,110 and that's partly how some of the concerns about the thrombosis you talked about were raised, 324 00:35:32,500 --> 00:35:36,250 but it's not so good for actually quantifying it, because you don't know what the 325 00:35:36,250 --> 00:35:42,070 denominator is, and you can't work it out in the way that you've just described. 326 00:35:42,700 --> 00:35:45,729 So I think one of the things for the future, which I think is really important, is 327 00:35:45,730 --> 00:35:52,240 that we take the lessons that we've learnt from the pandemic, and we take some of the new infrastructures that we've built, 328 00:35:52,780 --> 00:35:54,910 the new data linkages that we've been able to do, 329 00:35:55,420 --> 00:36:03,190 and that we harness that. Because some of the approaches that we've used statistically, for example, 330 00:36:03,190 --> 00:36:13,599 can be programmed into computers and can run automatically in the background, basically just looking for unexpected side effects. 331 00:36:13,600 --> 00:36:17,440 For example, you could have a systematic 332 00:36:17,710 --> 00:36:26,140 drug safety surveillance system which could automatically, in a scalable way, across millions of people, when we have new medicines being rolled out, 333 00:36:26,140 --> 00:36:30,459 actually look to establish that they're safe, and do that in a reliable way. 334 00:36:30,460 --> 00:36:35,320 That might be quite cost effective for the government to think about as well. 335 00:36:35,590 --> 00:36:46,270 And it would also be independent, both from the companies that make the drugs, who are obviously very interested in what the side effect profile might be, rightly so,
336 00:36:46,280 --> 00:36:48,879 and also from the people who regulate the drugs, 337 00:36:48,880 --> 00:36:59,260 who are having to make decisions more quickly than they hitherto needed to, 338 00:36:59,680 --> 00:37:06,459 and then basically triangulate that; it provides the supporting evidence that they need. 339 00:37:06,460 --> 00:37:13,030 Because when you're in the middle of a pandemic and you're trying to scale a vaccine that, 340 00:37:13,030 --> 00:37:17,140 you know, works, you have to do these things in real time. 341 00:37:17,260 --> 00:37:22,510 There's no point in doing a study two years later and then uncovering something. 342 00:37:22,510 --> 00:37:26,590 You have to have a system in place that allows you to do this as you go along. 343 00:37:26,920 --> 00:37:34,479 And I think one of the other very interesting things, you mentioned there's been a lot of discussion, 344 00:37:34,480 --> 00:37:41,160 and I think there were a lot of concerns for patients around the use of data when they didn't know what it was going to be used for. 345 00:37:41,170 --> 00:37:48,579 But in the context of when we were stratifying patients, some of the feedback that we got from patients is that they were 346 00:37:48,580 --> 00:37:54,460 really keen to make sure that their doctor knew that they had conditions, or that their information was up to date. 347 00:37:54,790 --> 00:37:59,290 They really wanted to know that their data was being used not just for the general good, 348 00:37:59,290 --> 00:38:05,950 but actually to make sure that the information about their condition was being factored into the risk assessment, 349 00:38:06,400 --> 00:38:13,410 and, if they were at risk of needing priority for a particular treatment, that that could be available.
350 00:38:13,420 --> 00:38:20,799 So I think there's been a sea change in how people have understood data, not just around the risk prediction thing. 351 00:38:20,800 --> 00:38:25,840 I suppose there's been a massive increase in health literacy across the population. 352 00:38:25,840 --> 00:38:34,749 I mean, we've now got a population that knows the names of various viruses and their variants, and knows the names 353 00:38:34,750 --> 00:38:41,640 of different vaccines, in a way that would have been unthinkable even two years ago. 354 00:38:41,690 --> 00:38:48,219 I think there's been a big step change in how people want to know about information; 355 00:38:48,220 --> 00:38:55,000 they want to be able to assess their risks. I think there's been a big change as well in how we think about other people around this, 356 00:38:55,000 --> 00:39:00,100 and the need to be considerate about the fact that anybody in the chain 357 00:39:00,100 --> 00:39:05,919 might be somebody who's a vulnerable person, which I think some of the messaging said early on. 358 00:39:05,920 --> 00:39:11,590 And I think, you know, my hope is that some of the work that we've done around being able to 359 00:39:11,620 --> 00:39:17,140 assess people's vulnerability and risk from COVID might help other people 360 00:39:17,370 --> 00:39:22,830 be considerate of a particular vulnerable person, in the sense that, if they know that that person is vulnerable, 361 00:39:23,160 --> 00:39:28,290 they might be able to make special concessions and arrangements, and things like that, to help protect that person.
362 00:39:28,700 --> 00:39:29,700 And as a society, 363 00:39:29,700 --> 00:39:37,290 we can become more considerate about threats, and the fact that our actions might actually have consequences for other people, and that we 364 00:39:37,290 --> 00:39:47,190 could potentially protect them by some of the measures that we've come to know are useful and necessary in these circumstances. 365 00:39:48,450 --> 00:39:59,360 So how was the QCovid work funded initially? So it was funded by the National Institute for Health Research, 366 00:39:59,610 --> 00:40:06,989 the NIHR. So we had to get going with the work as soon as we were asked to do it, 367 00:40:06,990 --> 00:40:13,620 and it took a little while for the funding actually to come through, but we knew that it would come through after some months. 368 00:40:14,310 --> 00:40:18,360 And then that's been funded by the NIHR. 369 00:40:19,050 --> 00:40:23,550 And without that, we wouldn't have been able to do it. 370 00:40:23,580 --> 00:40:30,030 But what did help in the very early stages is that there was a call from the Medical 371 00:40:30,030 --> 00:40:35,570 Sciences Division at the University of Oxford, right at the beginning of the pandemic, 372 00:40:35,820 --> 00:40:38,850 because there was a donor fund being set up. 373 00:40:39,890 --> 00:40:43,110 So we got together very quickly within Oxford, 374 00:40:43,200 --> 00:40:51,950 I think everybody who does this sort of thing, if you like, who works with these data linkages, and we made connections really quickly. 375 00:40:51,960 --> 00:40:59,340 And so we were lucky to get a couple of the early awards that the Medical Sciences Division offered.
376 00:40:59,340 --> 00:41:03,479 And one of those awards, it sounds a small thing, 377 00:41:03,480 --> 00:41:08,940 but actually it enabled us to pay the access fees for some of the datasets 378 00:41:08,940 --> 00:41:12,690 that we needed, because there are always costs associated with administering datasets. 379 00:41:13,140 --> 00:41:18,300 So that meant that we could actually get that dataset immediately; there was no delay in getting it. 380 00:41:19,110 --> 00:41:24,420 If we'd had to wait for the main grant to come through, the first wave would have come and gone before we got there. 381 00:41:24,960 --> 00:41:29,010 And then the second one, 382 00:41:29,010 --> 00:41:36,690 was that it enabled us to buy some serious quantities of computer software, and hardware particularly. 383 00:41:36,690 --> 00:41:41,379 So there's a big server that we've been able to set up for researchers, 384 00:41:41,380 --> 00:41:44,610 and so we've got teams of researchers now that are logging in. 385 00:41:45,000 --> 00:41:48,090 All the data that we have is safely stored 386 00:41:48,330 --> 00:41:54,299 in that environment, where you go to the data; you don't download data. 387 00:41:54,300 --> 00:41:57,600 So nobody downloads the data or emails data out. 388 00:41:58,440 --> 00:42:08,579 Only a few people can access it. But what the funding did was basically enable us to buy some high-powered computing facilities, 389 00:42:08,580 --> 00:42:14,770 which means that we could get more research done and get results out more quickly. 390 00:42:14,850 --> 00:42:18,540 So again, I think we were incredibly lucky 391 00:42:18,590 --> 00:42:23,549 that that enabled us to do that.
And really quickly. And I think, again, 392 00:42:23,550 --> 00:42:28,500 I felt really fortunate to be in Oxford 393 00:42:28,500 --> 00:42:33,659 when the pandemic started; I had met a lot of colleagues, 394 00:42:33,660 --> 00:42:41,070 and very quickly we were working with, for example, a consultant in intensive care, and Professor 395 00:42:41,460 --> 00:42:45,630 Sir Peter Horby, who is the chair of the NERVTAG group for SAGE. 396 00:42:46,770 --> 00:42:52,349 And really, any expertise that we needed was available. Oxford 397 00:42:52,350 --> 00:42:57,690 has a very high concentration of people with very specific expertise, from 398 00:42:57,690 --> 00:43:04,319 policy makers like Professor Anthony Harnden from the Joint Committee 399 00:43:04,320 --> 00:43:09,209 on Vaccination and Immunisation, to people who develop the monoclonal 400 00:43:09,210 --> 00:43:12,600 antibodies that I mentioned, which we're now looking at evaluating. 401 00:43:12,660 --> 00:43:20,819 So it's been a really fruitful place to be when you need to get research done really quickly. 402 00:43:20,820 --> 00:43:25,959 It's been an excellent environment. I had a follow-up question about the funding, 403 00:43:25,960 --> 00:43:32,260 because you mentioned a little while ago that you'd like to continue this approach for other kinds of conditions. 404 00:43:33,280 --> 00:43:39,820 How confident are you, in the light of the success with COVID, that that funding will be forthcoming? 405 00:43:39,910 --> 00:43:45,730 I think it will. I think it will. So one of the things that's happened is that, for NHS Digital, 406 00:43:46,090 --> 00:43:53,200 it was the first time that they had ever done a sort of, what they call, precision health intervention at scale.
407 00:43:53,200 --> 00:44:01,510 So applying a risk stratification system to a national dataset to identify individuals, who were then contacted. 408 00:44:01,990 --> 00:44:06,400 For me, as an epidemiologist, we'd been doing this sort of thing for other diseases, and 409 00:44:06,410 --> 00:44:12,640 COVID was challenging, but it was similar in principle to other diseases. 410 00:44:12,880 --> 00:44:18,910 But for them, it was the first time. And so they've now started thinking about the whole cohort approach. 411 00:44:18,910 --> 00:44:27,820 And so one of the things that's happened, I think indirectly because of the pandemic, is that there's been 412 00:44:28,220 --> 00:44:33,879 a lot of pressure in the NHS on things like early diagnosis of cancer, and on screening: 413 00:44:33,880 --> 00:44:38,120 a lot of the screening processes just ended up being shelved. 414 00:44:38,380 --> 00:44:42,830 I think we stopped screening altogether for some period of time during the pandemic, 415 00:44:43,180 --> 00:44:46,899 and then obviously we started doing those things again, and it comes back. 416 00:44:46,900 --> 00:44:48,639 You can't do everybody immediately. 417 00:44:48,640 --> 00:44:57,280 You have to have some sort of mechanism of being able to identify people who might be sitting on a waiting list, who are at higher risk than 418 00:44:57,340 --> 00:45:01,450 somebody else on the waiting list, and need that diagnosis more quickly. 419 00:45:01,750 --> 00:45:10,810 And so I know that there are initiatives looking now at whether we can develop cancer risk stratification into the national system. 420 00:45:11,060 --> 00:45:16,780 And obviously, research always has to come first, and that's always front and centre: developing the tools, 421 00:45:16,780 --> 00:45:21,430 making sure the tools work, and then that can be engineered.
422 00:45:21,970 --> 00:45:33,150 So I think that funding bodies such as Cancer Research UK, the MRC and others, 423 00:45:33,160 --> 00:45:40,870 I think, are looking into those areas. And I can certainly mention one grant that we were successfully awarded, 424 00:45:40,900 --> 00:45:48,220 from Cancer Research UK just before Christmas, to look at dynamic assessment of cancer risk in primary care. 425 00:45:49,240 --> 00:45:55,900 So that's an interesting one. 426 00:45:56,680 --> 00:46:01,650 Some of it came out of what they call a sandpit. 427 00:46:01,730 --> 00:46:05,830 It's like when you go away for three days with people you've never met before, 428 00:46:06,070 --> 00:46:12,190 but you've all got a central interest around a particular thing, and you might have a particular expertise to offer. 429 00:46:12,190 --> 00:46:18,370 So Cancer Research UK had a three-day event just before Christmas where they got together 430 00:46:18,370 --> 00:46:21,940 people who wanted to basically make a step change in the early diagnosis of cancer. 431 00:46:22,510 --> 00:46:30,880 So they included epidemiologists like myself, and GPs as well, but also people who work in secondary care, hospital doctors, 432 00:46:31,240 --> 00:46:38,020 and engineers and mathematicians and behavioural scientists and psychologists and nursing colleagues, 433 00:46:38,020 --> 00:46:43,239 etc. And then they put you in a room together and basically give you a task, and then you have to come 434 00:46:43,240 --> 00:46:52,750 up with proposals for something which could be a step change in the early diagnosis of cancer. 435 00:46:52,930 --> 00:46:56,320 The outcomes of that haven't yet been announced; 436 00:46:56,650 --> 00:47:01,720 that will be at some time in the next couple of months.
437 00:47:02,830 --> 00:47:14,030 So that's risk stratification for cancer. And potentially it's an approach that could be used elsewhere, 438 00:47:14,350 --> 00:47:21,520 looking at eye screening for people who have got diabetes and need to have more tailored eye screening schedules. 439 00:47:21,520 --> 00:47:26,920 So some people might need to come every year or two years, and some people might need to come every six months. 440 00:47:29,140 --> 00:47:32,590 It could be applied to breast cancer screening too. 441 00:47:32,590 --> 00:47:36,960 Rather than the crude sort of scheme that we have at the minute, 442 00:47:37,030 --> 00:47:43,570 which basically says people between two ages come every so often, maybe every three years, 443 00:47:43,960 --> 00:47:51,010 you could tailor screening intervals, so that people who've got a family history come into the system earlier. 444 00:47:51,280 --> 00:47:54,940 I think some of this is already done, but not in a way that is systematic. 445 00:47:56,590 --> 00:48:00,530 And then, on the drug safety side, I think that that's a little more challenging. 446 00:48:00,530 --> 00:48:04,540 I think there's a big emphasis on developing new medicines, 447 00:48:05,680 --> 00:48:08,620 and there's less emphasis, I think, on actually checking that they're safe. 448 00:48:09,040 --> 00:48:16,209 However, if you don't have a system that looks at safety, then you could end up with some sort of health scare, 449 00:48:16,210 --> 00:48:20,170 which then means that people don't want to use the medicines. 450 00:48:20,170 --> 00:48:23,910 So you have to have that evaluation, and 451 00:48:23,980 --> 00:48:31,240 I think, in order to get public confidence and professional confidence, there has to be a more systematic approach to drug safety.
452 00:48:32,680 --> 00:48:43,899 And so I think that, as ever, there's a need to ensure that the people who fund things know that these are ongoing needs. 453 00:48:43,900 --> 00:48:50,770 We need to build capacity as well in the system, to ensure that the next generation of researchers is coming through. 454 00:48:50,800 --> 00:48:55,720 Some people have had their research completely changed by the pandemic, 455 00:48:55,720 --> 00:49:01,960 they weren't able to do the projects they'd started because the pandemic took over, and there are other people, you know, 456 00:49:02,170 --> 00:49:06,800 whose research has really exploded in a wonderful way, because they've been rushing along, you know, 457 00:49:06,960 --> 00:49:14,680 in a time when they're asking and answering questions that are of international significance, 458 00:49:15,280 --> 00:49:18,100 and they're doing that at that pace, on that scale. 459 00:49:19,120 --> 00:49:28,629 So I think it's important also to recognise that a lot of the research that has been done in Oxford, and probably elsewhere, 460 00:49:28,630 --> 00:49:34,270 has been done quickly and effectively in the pandemic, 461 00:49:34,270 --> 00:49:44,200 but it's building on a track record of many years, or decades even, of advances and investment. 462 00:49:44,200 --> 00:49:49,419 And so I think there is an important moment now for the government to be strategic in 463 00:49:49,420 --> 00:49:56,260 how it thinks about funding, not just the pandemic sciences side of things, 464 00:49:56,260 --> 00:50:00,459 but the other things I mentioned: the wider effects 465 00:50:00,460 --> 00:50:07,990 the pandemic can have, not just COVID itself. So, you mentioned the high quality of research in Oxford.
466 00:50:08,000 --> 00:50:16,000 I mean, one consequence of that that I've perceived is that in normal times it's actually quite competitive. 467 00:50:17,140 --> 00:50:24,970 But you've talked a lot about collaboration, you know, across disciplines and among people with different interests. 468 00:50:25,420 --> 00:50:30,790 Have you found that distinctively different from your previous experience of working in academic research? 469 00:50:30,910 --> 00:50:36,580 Well, I think it's been incredible, really. Because when there's a serious threat to health, 470 00:50:36,580 --> 00:50:45,520 with a lot of unknowns where we need to answer questions, you know, obviously we wish we hadn't had the pandemic, 471 00:50:45,520 --> 00:50:53,160 but it set us a task, and questions which you can all unite around, and very rapidly it's like, 472 00:50:53,230 --> 00:50:59,500 okay, I need somebody who's got this expertise or that expertise. And I've just found people fantastically collaborative. 473 00:50:59,500 --> 00:51:08,380 So, for example, when I wanted to look at the safety of vaccines in pregnancy, 474 00:51:08,560 --> 00:51:14,980 you suddenly look around Oxford and you realise you have people here who 475 00:51:14,980 --> 00:51:24,040 are the world experts in pregnancy and in maternal enquiries into deaths, 476 00:51:24,040 --> 00:51:28,090 for example, or experts in vaccination, etc. 477 00:51:28,450 --> 00:51:32,650 And you ask people, you know, would you be interested in looking at this question, 478 00:51:33,190 --> 00:51:40,209 can we better quantify the risks of vaccination in pregnancy, not just COVID vaccines, and you just find that everybody says yes. 479 00:51:40,210 --> 00:51:45,640 And, you know, when can we meet? And the ideas just sort of keep flowing.
480 00:51:45,640 --> 00:51:49,930 And you just suddenly realise that you've gone from a small project to a really 481 00:51:49,930 --> 00:51:55,210 large programme of work. It's been fantastic from my perspective. 482 00:51:55,570 --> 00:51:59,020 I think there's a competitive, collaborative thing as well. 483 00:51:59,350 --> 00:52:08,499 And so I think, you know, it is good to have a bit of a push to get 484 00:52:08,500 --> 00:52:13,180 your research out there, so that you can actually communicate your findings quickly. 485 00:52:13,180 --> 00:52:16,540 And sometimes you do have to be first to do that, rather than be 486 00:52:16,540 --> 00:52:21,440 the person who's confirmed it for the 15th time, because that becomes less interesting. 487 00:52:21,470 --> 00:52:31,120 So I think the overwhelming need from the public and policymakers to get answers out quickly has sort of tapped into, 488 00:52:31,120 --> 00:52:36,759 I think, the competitive sort of spirit that is probably in all of the Oxford academics, really. 489 00:52:36,760 --> 00:52:40,570 But they've done it in a collaborative way. 490 00:52:40,840 --> 00:52:47,379 And you sometimes find that, you know, you're competing with people and collaborating with them at the same time, and that's been fine as well. 491 00:52:47,380 --> 00:52:55,360 So we've been putting in for grants with people; and because we host a national database, we have to be fair, 492 00:52:55,360 --> 00:53:02,110 and if people want to put in a bid as well, and compete with us with a bid that we're putting in, we support that as well.
493 00:53:02,110 --> 00:53:08,709 So, again, I think it just opens up your eyes to different ways of working, and people 494 00:53:08,710 --> 00:53:15,060 bring different things to the table; they have a different lens and way of 495 00:53:15,400 --> 00:53:20,770 framing questions and answering questions. So, for example, right now we're doing some work 496 00:53:21,460 --> 00:53:26,680 looking at the safety of these new COVID therapeutics, the monoclonal antibodies that I 497 00:53:26,680 --> 00:53:32,680 mentioned, and there were three groups that were commissioned to do exactly the same question. 498 00:53:32,680 --> 00:53:35,350 So we were all just given the question just before Christmas. 499 00:53:36,130 --> 00:53:46,480 There's a group up in Scotland, and there's another group in Oxford as well, led by Dr Ben Goldacre. So we were all given the same exam question, 500 00:53:46,480 --> 00:53:55,330 if you like, and then we all produced protocols, and now we're all meeting on a weekly basis to see what answers we're coming up with. 501 00:53:55,330 --> 00:54:00,070 We've got slightly different datasets, we've got slightly different areas of scientific expertise, 502 00:54:00,550 --> 00:54:04,930 and so there's some work where we're confirming each other's findings, because we're doing the same analyses, 503 00:54:05,260 --> 00:54:07,960 and then there are other things that we're doing which are complementary, 504 00:54:08,380 --> 00:54:15,130 and that will all come together, I hope, in April, in a sort of report that comes from all of us. 505 00:54:15,130 --> 00:54:21,130 So there's this sort of competitive, collaborative thing where everybody basically benefits, 506 00:54:21,130 --> 00:54:29,230 I think, from that sort of working relationship.
And is that something you'd like to see continue in the future, 507 00:54:29,240 --> 00:54:38,540 and how likely do you think that is once the pressure of this unique emergency is lifted? Well, we're not there yet. 508 00:54:39,020 --> 00:54:43,639 I keep thinking that the pressure has gone, and then we've got another variant coming along. 509 00:54:43,640 --> 00:54:46,070 And just from that point of view, 510 00:54:46,070 --> 00:54:52,370 from the policymakers' point of view, I think the questions are still very urgent, I hope. 511 00:54:52,700 --> 00:54:55,879 I mean, doing research is a huge amount of fun. 512 00:54:55,880 --> 00:54:58,459 It's just a really interesting thing to do. 513 00:54:58,460 --> 00:55:06,230 I mean, I've always enjoyed learning new things and asking questions, and answering the questions that are of interest to you, 514 00:55:06,230 --> 00:55:09,020 but also of interest to other people as well. 515 00:55:09,320 --> 00:55:16,490 And I think having an environment like that helps. I sometimes joke about the research questions 516 00:55:16,490 --> 00:55:22,830 in Oxford: one research question generates two more, which generate two more, and then you get a group of people together. 517 00:55:22,860 --> 00:55:31,759 And then you've got this enormous list of questions, and you need to think about what the priorities are and how we can best answer those. 518 00:55:31,760 --> 00:55:38,780 I think the constraints are going to be time, and then it also comes back to funding as well, 519 00:55:39,050 --> 00:55:48,770 and making sure that continues. One of the nice things that happened with funding streams in the pandemic, 520 00:55:49,310 --> 00:55:56,450 one of the things I mentioned, was the fact that Oxford was so quick off the mark at getting people funding in lockdown, basically very quickly. 
521 00:55:56,450 --> 00:56:04,490 And that put us in a competitive position to be able to leverage some of the bigger grants that we then went on to win. 522 00:56:05,450 --> 00:56:12,260 But it's just making sure that that is matched by the sort of metrics 523 00:56:12,820 --> 00:56:18,280 that people are expected to meet for their careers. 524 00:56:18,680 --> 00:56:22,790 There are quite a lot of traditional metrics here, like how many papers have you published, 525 00:56:22,880 --> 00:56:28,220 how much money have you brought in, how many students do you supervise, and how many committees do you sit on? 526 00:56:28,910 --> 00:56:34,790 And those are all very fine, bona fide metrics of academic progress. 527 00:56:35,240 --> 00:56:39,080 But maybe we need to think more about, you know, how many datasets have you shared, 528 00:56:39,380 --> 00:56:44,780 how many patients' lives have you potentially changed through the use of your research? 529 00:56:44,780 --> 00:56:53,599 What impacts have been created? And I think these are questions which are just starting to be asked: how many questions asked by patients 530 00:56:53,600 --> 00:56:58,850 have you actually been able to address in a meaningful way? How many lives have you changed? 531 00:56:59,090 --> 00:57:05,540 I think these are the sorts of things that in the last few years we've become more thoughtful about: measuring 532 00:57:05,540 --> 00:57:15,409 impact, and then making sure that there's that translational mechanism in place so that you can go from a research 533 00:57:15,410 --> 00:57:19,580 question, as I was saying earlier on, to a tool or something which can 534 00:57:19,580 --> 00:57:24,950 be used, being able to actually build that tool and put it into software and systems. 
535 00:57:25,280 --> 00:57:32,850 So I think, you know, again, being in Oxford and having Oxford University Innovation is really helpful from that point of view. 536 00:57:32,870 --> 00:57:44,929 And so I hope that it'll continue, and I hope that we don't revert back to some of the old ways, really, 537 00:57:44,930 --> 00:57:52,010 where it would take over a year to write a grant, and over a year to find out whether you'd won that grant or not, 538 00:57:53,030 --> 00:57:56,330 because certainly in the pandemic situation that wasn't the case. Yes. 539 00:57:58,510 --> 00:58:04,669 So I hope that some of these processes can be speeded up, so that we can get new knowledge 540 00:58:04,670 --> 00:58:08,840 out there more quickly, so that it can be more beneficial to patients. 541 00:58:10,130 --> 00:58:14,660 So I've just got a couple of quick questions about your personal experience of the pandemic. 542 00:58:15,680 --> 00:58:21,200 Did you feel personally threatened by the infection, that you might get it? 543 00:58:21,200 --> 00:58:28,009 And was that something you were fearful about, or that people close to you might? Something like that? Like everybody, 544 00:58:28,010 --> 00:58:34,580 well, in the beginning nobody knew the nature of the infection or how serious it was. 545 00:58:34,880 --> 00:58:40,760 So I took, you know, I took the government guidance extremely seriously, and so did my family. 546 00:58:41,090 --> 00:58:44,629 So when we were all supposed to be sitting in our houses, going out once a day, 547 00:58:44,630 --> 00:58:50,570 that is exactly what we did, and when we were supposed to be wearing masks, we were doing exactly that. 548 00:58:50,570 --> 00:58:55,850 And so, you know, I think that's responsible behaviour. 
549 00:58:56,300 --> 00:59:06,380 I did manage to get COVID over Christmas, having studiously avoided it for two years. Last Christmas? Christmas '21? Yeah, 550 00:59:07,010 --> 00:59:08,870 Christmas '21, the one that's just happened. 551 00:59:09,650 --> 00:59:17,220 And luckily for me, I was already vaccinated by then, so it was a mild illness. But I think, 552 00:59:17,540 --> 00:59:21,019 so I think, like everybody really, 553 00:59:21,020 --> 00:59:25,430 there were concerns about coming into contact with vulnerable people. 554 00:59:25,870 --> 00:59:29,949 You don't want to be the person who might spread an infection to somebody whose 555 00:59:29,950 --> 00:59:32,730 immune system might not be strong enough to be able to cope with it. 556 00:59:33,680 --> 00:59:44,470 So I think, you know, it was just like everybody really, in the face of the unknown, and as you get older the risks naturally increase. 557 00:59:45,400 --> 00:59:50,170 But, you know, I think the good news is that we've now got high levels of immunity in the population. 558 00:59:51,070 --> 00:59:54,340 I think there have been difficult calls that the government has needed to make. 559 00:59:54,340 --> 01:00:02,049 And I think largely, you know, by hook or by crook, we've managed to get to where we are reasonably well. 560 01:00:02,050 --> 01:00:07,960 I mean, I'm sure there will be lessons learned from the pandemic as we look back, 561 01:00:08,680 --> 01:00:18,969 as you would expect of these completely unprecedented times that we've had. And how did the pandemic change the way you had to work? 562 01:00:18,970 --> 01:00:23,620 I mean, in terms of things like, you know, going to the office or travelling; academics tend to do a lot of travelling. 563 01:00:24,070 --> 01:00:36,400 Yeah. So it's completely changed. 
So we've gone from sitting on trains and going down to London to having Zoom meetings. 564 01:00:37,000 --> 01:00:44,139 A very, very efficient use of time, with almost no time in between meetings, and I think 565 01:00:44,140 --> 01:00:49,860 that comes with its own challenges as well, because I think you miss out on 566 01:00:49,870 --> 01:00:55,959 some of the chats that you have walking into meetings or coming out of meetings, or the 567 01:00:55,960 --> 01:00:59,470 moments you have to walk through the beautiful environment in Oxford to the next meeting, 568 01:00:59,470 --> 01:01:04,030 etc. So I think everything has speeded up. 569 01:01:05,140 --> 01:01:12,360 And I think that, you know, we're all looking forward to getting back to the more usual ways of working, 570 01:01:12,410 --> 01:01:20,620 although I don't think it necessarily will happen. I think for some people it's been a welcome change. In my work as a GP, 571 01:01:20,620 --> 01:01:27,640 I've certainly got patients who have actually been very relieved not to be doing a three-hour commute every day and to be able to spend time with family instead, 572 01:01:27,650 --> 01:01:34,450 which has been interesting to see. And I think it's about making sure that people 573 01:01:34,450 --> 01:01:39,790 can work effectively and comfortably, because the pandemic still hasn't finished: 574 01:01:39,820 --> 01:01:47,260 there are still concerns about whether new variants might emerge, whether they might escape the vaccine. 575 01:01:48,280 --> 01:01:54,060 And so I think there's still a need for sort of watchful caution, I guess, 576 01:01:54,130 --> 01:01:57,970 I think. But I didn't ask you about your clinical work. 577 01:01:58,090 --> 01:02:02,379 I think I sort of assumed that you were no longer working as a GP, but you've continued to practise. 578 01:02:02,380 --> 01:02:08,790 Yes. 
So for various reasons I've continued to do my bit of general practice 579 01:02:08,840 --> 01:02:19,450 throughout the pandemic, though I've moved to remote surgeries, speaking to people by phone, for geographical reasons as much as anything. 580 01:02:19,450 --> 01:02:27,969 And I think that the way the health service is delivered has actually changed, not just in primary care but in secondary care too. 581 01:02:27,970 --> 01:02:35,080 I think a lot of follow-up appointments for patients in hospital systems are done by telephone. 582 01:02:35,110 --> 01:02:38,800 I think there are pros and cons to that. 583 01:02:38,810 --> 01:02:40,690 I think we need to get the balance right. 584 01:02:42,490 --> 01:02:49,660 The balance might change over time as to how that's all delivered, and there are capacity issues as well, making sure there is enough workforce and so forth. 585 01:02:49,660 --> 01:02:57,760 So I think there are challenges ahead, and there are other teams in Oxford doing work on the value of remote consultations 586 01:02:57,760 --> 01:03:04,600 and how things have changed because of the pandemic, and what the lessons are. For some patients, 587 01:03:04,600 --> 01:03:13,239 I think it's more efficient not to have to take an afternoon off work and sit in the GP surgery to have 588 01:03:13,240 --> 01:03:19,150 a ten-minute appointment, when what they needed was a repeat prescription, which could actually be sorted at another time. 589 01:03:20,770 --> 01:03:29,050 But, you know, there are pros and cons to these things. I think we just need to wait and see how things work out over time. 590 01:03:30,190 --> 01:03:36,370 And do you think, I mean, I think everybody found the constraints of the pandemic quite difficult, or most people did. 
591 01:03:37,120 --> 01:03:43,180 Do you think the fact that you were able to work on something that was actually addressing the problem supported your own wellbeing? 592 01:03:44,970 --> 01:03:49,960 Well, absolutely. I mean, I absolutely love doing research. 593 01:03:49,970 --> 01:03:55,090 You know, it's a most privileged job; working with everyone is just brilliant. 594 01:03:55,270 --> 01:04:00,340 It's fantastic to work on some of the questions that we've been asking, and I just love it. 595 01:04:00,580 --> 01:04:07,180 So for me, actually, having such a high concentration of research has been immensely, really, really enjoyable. 596 01:04:07,180 --> 01:04:13,090 And it's one of the things I really enjoy. I think so, yeah. 597 01:04:13,180 --> 01:04:19,970 I think it's been a really valuable experience, really concentrated, 598 01:04:20,230 --> 01:04:23,920 I think. In the first year of the pandemic, 599 01:04:23,920 --> 01:04:33,890 I don't think I had a clear week off at home. And I think it's only recently that I've had an auto-reply on with my contact details, 600 01:04:34,550 --> 01:04:37,590 if people need to contact me while I'm trying to take some annual leave. 601 01:04:38,480 --> 01:04:44,930 So I think, yes, it's had its challenges in that way, 602 01:04:45,470 --> 01:04:52,450 particularly if you're in a role where people are expecting you to do something which they need you to do quickly, 603 01:04:53,180 --> 01:04:54,710 and you're uniquely placed to do it. 604 01:04:55,760 --> 01:05:03,070 I think that's been one of the issues: it's meant you just have to keep going, really. Even with COVID I was still sitting on 605 01:05:03,110 --> 01:05:12,600 conference calls trying to help with the next bit of the national policy on things as best I could. And, this is the last question: 
606 01:05:12,650 --> 01:05:21,580 Has the experience led you to think that there are things you'd like to see change in the future in the way research is conducted? 607 01:05:24,860 --> 01:05:31,639 I think one of the amazing things has been about the public engagement with research. 608 01:05:31,640 --> 01:05:35,840 When we were beginning the vaccine work that we had been commissioned to do, 609 01:05:35,850 --> 01:05:43,819 there was a survey of patients done by the organisation called HDR UK, and I think they identified 610 01:05:43,820 --> 01:05:50,960 something like 800 distinct research questions that patients had and wanted to know the answers to. 611 01:05:51,230 --> 01:05:59,630 And I think, again, some of the questions that people have are not necessarily the ones that researchers are going ahead and doing, and that should change. 612 01:05:59,780 --> 01:06:06,049 So I think the opportunity really is to be able to tap into this enormous wealth of questions. 613 01:06:06,050 --> 01:06:12,020 For example, people want to know: if they've had COVID already and then have two vaccines, 614 01:06:12,380 --> 01:06:20,690 is that as good as actually having had three jabs? What about if you had the COVID vaccines and then you had COVID afterwards? 615 01:06:20,690 --> 01:06:22,819 How is your immunity then? 616 01:06:22,820 --> 01:06:28,070 Is it stacking up as well as it would have had you had them in a different order, and things like that as well? 617 01:06:28,400 --> 01:06:35,390 And these aren't necessarily questions that would be immediately popping into people's minds, but which potentially could be answered. 
618 01:06:35,510 --> 01:06:41,420 So I think the thing I'd like to see continue is that close working 619 01:06:41,720 --> 01:06:46,100 relationship with patients, where we can benefit from the questions that 620 01:06:46,100 --> 01:06:50,719 are important to patients and answer those questions in a way that benefits them 621 01:06:50,720 --> 01:06:56,660 as well as being of interest to researchers, and so it gets through to the policymakers. 622 01:06:57,920 --> 01:07:01,460 Okay, that's lovely. I think that's covered everything. 623 01:07:01,540 --> 01:07:04,580 But I won't stop recording just yet. 624 01:07:07,880 --> 01:07:10,880 So what kind of nice and unusual things have happened? 625 01:07:12,140 --> 01:07:19,880 So I had my very first trip to the House of Lords this Monday, as part of the All-Party Parliamentary Group on Medical Research. 626 01:07:20,180 --> 01:07:25,969 And the visit invited the key scientists who contributed to the pandemic response. 627 01:07:25,970 --> 01:07:32,360 And so I went along, having done the work on the risk stratification and also vaccine safety. 628 01:07:32,360 --> 01:07:33,790 And I was really pleased to meet up with 629 01:07:33,800 --> 01:07:41,480 a whole group of other Oxford researchers: people that helped design the vaccine, make the vaccine, test the vaccine. 630 01:07:42,200 --> 01:07:47,000 And of course we were looking at the safety of the vaccine, as well as people who had done the 631 01:07:48,260 --> 01:07:53,000 work to look at the treatments that might be used in the RECOVERY trial. 632 01:07:53,840 --> 01:08:02,780 And so, again, just meeting with various people and going round and basically hearing everybody's story about what they did in 633 01:08:02,780 --> 01:08:10,340 the pandemic and how it all fits together and how they've actually come to be in that grouping. 
634 01:08:10,700 --> 01:08:16,609 And there was a bit of a call, I think, to the government within that, 635 01:08:16,610 --> 01:08:24,319 which was to make sure that it recognised the expertise of the scientists 636 01:08:24,320 --> 01:08:29,360 in this country that has accrued over many years, and that that does require investment, 637 01:08:29,360 --> 01:08:36,620 does require ongoing investment, and can't just be switched off and switched on again. 638 01:08:36,620 --> 01:08:40,600 So there was a sense that part of the purpose of that meeting was funding as well. 639 01:08:40,610 --> 01:08:46,610 So there were people from the MRC, from Cancer Research UK and the Association of Medical Research Charities, 640 01:08:46,610 --> 01:08:52,190 etc., to highlight the importance of what I think often goes on in the background. 641 01:08:52,580 --> 01:08:59,010 So the underpinning foundational research that is going on will always continue to go on, but just to make sure that 642 01:08:59,510 --> 01:09:06,589 that is supported and remains sort of scalable, so that hopefully we won't need quite the speed of the things that we've done, 643 01:09:06,590 --> 01:09:11,090 but we don't know about the future, and there's always the possibility of similar 644 01:09:11,090 --> 01:09:14,720 things happening again, and we just want to be able to leverage that in the future. 645 01:09:17,310 --> 01:09:20,460 That's lovely. So that was a...