So that was my last role, as an academic cat herder: I spent the last 10 years trying to get clinicians and academics talking to each other inside Oxford. And now I'm working on trying to get different disciplines working together across the UK through the National Consortium of Intelligent Medical Imaging (NCIMI). As Mike says, one of the initiatives we're working on is around the ethics of the use of image data in medical imaging. NCIMI involves 15 hospital trusts across the length and breadth of the UK and nine industry partners, so we've got a particular interest in the use of patient and health data by companies, by small players as well as international global players, spread across cancer, cardiovascular medicine, metabolic health, stroke... you name it, we're probably pursuing something with it. We were one of five centres set up through the Life Sciences Industrial Strategy just over a year ago to support AI developments in medical imaging and digital pathology, with the view that these were some of the low-hanging fruit where you could see the use of AI and of healthcare data for innovation and to drive patient impact. So, as I said, we were set up with a range of aspirations: to support companies and academics in progressing particular use cases of AI in healthcare, and to ensure that we're engaging the NHS workforce.
And how are we doing that in an ethical way, ensuring we're including patient consent for both retrospective and prospective data collection? As some of you may have seen in the Guardian over the weekend, the idea of anonymized data, and whether you can really anonymize data at all, is one that's close to the conversations we're having at the moment: how people feel about commercial entities engaging with their health datasets, and how we bring the patient and public perspective in, which is why we've been working with Michael McKenzie and colleagues. So we're supporting a range of different projects, and they map out along a pipeline of development, from areas of unmet clinical need all the way through to companies with solutions that are looking to test and deploy into the NHS. Those present different challenges: whether the data are being prospectively or retrospectively acquired, whether there is informed consent around these studies, and then the aspects around deployment of AI developed by large entities, such as GE Healthcare in the US, into the NHS, and also the question of how we ensure that everybody in the ecosystem is getting value back out of that.
So not only the commercial value proposition, but also societal, academic and other forms of value; we're working with both Mike and the team and some of our external collaborators on how to capture those different types of value. Those are all projects we're progressing through the consortium, and I wanted to give a couple of quick examples. This one involves Caristo, a start-up that spun out of the University of Oxford just over a year ago. This was an academic research group that has used data from cardiac CT images and is progressing a new measure, the fat attenuation index. Of the people who go for a cardiac CT because they're at risk of cardiovascular disease, about 50 percent come back apparently fine, but of that 50 percent, around 25 percent are actually at risk. This new marker is picked up by looking at the same images you were acquiring already, so you can detect new at-risk disease cohorts that you can then do something about more proactively. The group inside the university developed an algorithm for this, and we're supporting them in scaling that up and gathering more datasets to develop it into a solution that could then be deployed into the NHS.

The next one is a quite different cohort: we're working with a charity called Haemochromatosis UK. Haemochromatosis is an incredibly common hereditary genetic disorder which presents as iron overload, but it is chronically misdiagnosed, with a lag of about 20 to 25 years before people are accurately diagnosed. The treatment is giving blood, so it's of no interest to big pharma companies: they're not going to get a big bang for their buck out of trying to cure these patients. But it is an unmet need where imaging can pick up the disease: through an MRI scan you can detect the iron levels in the liver and get people treated much sooner. So we're supporting a company in working out how you would take those standard-of-care MRI images and flag to patients and clinicians that they are at risk of having excessive iron, so they can then be treated by giving blood. Kareena is actually the chair of the charity, and, as you can see, she had symptoms at 20 and it took over 25 years for her to be diagnosed. The charity themselves are very interested in directly donating their image and other patient datasets for research, including commercial research, so we're talking with them about what it means to be a patient who wants to donate their image records directly for research and commercial use.

At the other end of the extreme is the e-Stroke Suite. This is deployed in the NHS currently, and is a new method for picking up, on a mobile device, patients at risk of stroke more rapidly, so that they can get intervention.
And this is about putting that solution into the hands of clinicians, so they can read the scans remotely and then get the patients that need it into the appropriate clinical care pathway. We've been talking about deploying that into eight new hospitals in London, which would generate over 17 million pounds' worth of quality-adjusted life year benefit over the next three years. So there are very significant potential benefits from an app-based solution (I'm just scrolling through these screens).

And then another one is the Critical Care Suite. This addresses what's called a never event in the NHS. Nasogastric tubes are placed in order to feed patients who can't otherwise feed themselves. This tends to be a risky situation in the middle of the night in an A&E department, where you've got a junior doctor who hasn't slept for 36 hours and is trying to get the patient fed as quickly as possible. If the tube is incorrectly placed, it can end up not in the stomach but in the lungs or the chest cavity, and if feeding then begins, you can end up either with severe adverse events or with a dead patient. To avoid those arising, we're working with GE on automating part of this process: not to take the decision away from the clinicians, but to flag the cases where there may be a problem.
So we're trying to help pick up those cases where there might be a risk, because otherwise these are recorded as never events, with significant financial as well as societal and individual patient impacts when this goes awry. Those are three or four of the projects we're working on across the piece. And all of this is underpinned by what we've been doing with Michael McKenzie around what it means to be using these datasets ethically, particularly around image data, which raises a particular question: can you ever truly anonymize the datasets we're working with? Are we, in effect, in a position where you can't work with historic records within the hospitals, because you can't truly anonymize the data without consent? Can you use retrospective datasets, given the very different ways patients versus the general public feel about the use of image and other datasets for both academic research and commercial research? So we're working with the Ethics Centre as well as Future Care Capital and a company called The Behavioural Architects to understand some of the key drivers and levers for having these conversations, so that we can bring patients with us. And we're hoping that by doing this we will have generated not only solutions that can help care in the future, but also imaging and other data resources for future research projects. And we're keen to pull more people and more groups into that.
So if you'd like to come and join NCIMI, then please do get in touch.