Okay, everybody, welcome to our second seminar of the term. And just before I introduce the speaker for today, I want to let you know that the seminar due to happen in two weeks' time has been cancelled because it falls during the strike. So we've got this one today, and then we're not quite sure when the next one is, because I don't know whether the one following the next one is also going to be cancelled, but I'll let you know. Certainly the very next one has been cancelled.

But today we have Professor Simon Cole, who is visiting the centre. You've probably seen him around; he's here until the end of June, so you can harass him after the talk, but also over the next few months. Simon is Professor of Criminology, Law and Society and Director of the Newkirk Center for Science and Society at the University of California, Irvine. He's also the US Editor-in-Chief of Theoretical Criminology, and he's an associate editor of the National Registry of Exonerations. Today, Simon is going to be talking to us about that registry and about the work he does with it. So thank you.

Thanks. Thank you, Mary, and thanks for such a great turnout. I was really excited when Mary asked me to talk about the National Registry of Exonerations, even though she later claimed not to remember doing so, because no one in academia has ever really asked me to talk about it. I get asked to talk about forensic science or the carceral state, but not about this, let alone in criminology. So I'm kind of excited to talk about this, and thanks for such a great turnout.

I was going to say a little bit about myself, since you may not know me and some of my background is relevant to what I'll say later; then I'll talk a little bit about the context of the registry; and then I'll show you the registry. And at that point, three parts in, there will have been essentially no intellectual or research content whatsoever in the talk. So then, in part four, I will try to introduce some intellectual and research content by talking about a research project that I'm doing, which I would be happy to get everyone's feedback on because it's ongoing.

So, a little bit about myself. It's sort of interesting in the context of the Criminology Centre: I work in the Department of Criminology, Law and Society, in which either none or only one faculty member has a degree in criminology. About half of us have a degree in sociology and half are other. I'm part of the other.
I have a graduate degree in science and technology studies, which is, roughly, the social study of the making of scientific knowledge. Bruno Latour is the person whose name you would know in this field if you know anyone, although that's a little sad to say in the UK, because there are a lot of major figures in that field who are from the UK, including Steve Woolgar, who is still here at the business school; and in the early days there was an Edinburgh school of science studies and there was a Bath school, so there is lots of stuff other than Latour.

I worked on forensic science as a project in the making of scientific knowledge. It's very difficult to get jobs in science studies in the US, so I ended up using the forensic science work to move into the field of criminology, law and society. My work on forensic science, which argued that a lot of it didn't seem to have validated itself, got people in innocence research and the innocence movement interested in my work, and I started getting invited to conferences about wrongful conviction and innocence. And you can't really go to such things, where there are exonerees in attendance talking about their stories of being wrongly convicted, and come away not wanting to do something about the problem and get involved.

For a while I wanted to start an Innocence Project at UC Irvine, maybe focussed on forensic science, which I did start, very quietly, because once you start one of these things you start getting mail from prisons, and I didn't want to be someone who let down people's hopes. And it didn't really work: we didn't have enough lawyers, I'm not a lawyer, and we weren't really able to help people.

And then the opportunity came along to work on the National Registry of Exonerations. It was looking for a new home, and that seemed a better fit, both for me and for UC Irvine. It's a research project; we don't actually help people or work with people, so we didn't need to do that, and it's very much an academic project. So I was really thrilled to be able to get involved in the innocence movement this way. That's kind of how I came to it.

So let me tell you a little bit about the context of the National Registry of Exonerations. Wrongful conviction has a long history of research and discussion in the United States, as well as here, and it's nice to talk about this with Carolyn Hoyle here, who has done a lot of research on this topic. But you could go back to 1923. I don't know if you've heard of this judge, Learned Hand; aside from having just a great name, he is often called the greatest US judge never to serve on the U.S. Supreme Court.
And he has this quotation here saying, in effect: we don't really need to worry about wrongful conviction; it doesn't actually happen; it's just a myth. And starting right around that time, not long afterwards, you start getting academics debunking that claim. The first one is probably Edwin Borchard, a law professor at Yale, who publishes this book in 1932. And one of the things these authors do in order to debunk the claim that there are no wrongful convictions is to start compiling lists of wrongful convictions. That's where you get this whole idea of lists. You can walk through the whole history of books about wrongful conviction in the United States, and they almost always have a list; there's always some kind of a list, maybe good, maybe bad, but some kind of a list.

So it's important to note, and this may be obvious, but let me just note it, that when we talk about wrongful convictions and try to make lists and count them, we can't really count wrongful convictions, because we don't know whether people are guilty or innocent. That's the whole point. All we know is whether they've been convicted or acquitted. So we try to count wrongful convictions by counting exonerations, people who have somehow been released by the legal system on grounds of innocence, as a proxy for wrongful convictions. But of course we know that it's a very imperfect proxy, and that there must be lots and lots of people, multiples of the people who are exonerated, who are wrongly convicted and just aren't able to prove it, and nobody finds out about it, so we never learn of them. So exonerations are not the same as wrongful convictions, in both directions: some exonerations may be of guilty people, though I think that's probably a really small number, but most wrongful convictions are probably not exonerations. Exonerations are probably a very small drop in the bucket of actual wrongful convictions that we don't know about and never will know about.

And how do I know this? Well, just read a few exonerations and see how lucky these people were to actually get exonerated; look at the fortuity involved in each of these cases. Then think of the people who didn't get those lucky breaks along the way. Those are the unknown wrongly convicted.

This list-making kind of culminates around 1987, when Hugo Bedau and Michael Radelet publish this list of 350 wrongful convictions in death penalty cases.
And they use this method, in the middle here: cases in which "we believe a majority of neutral observers, given the evidence at our disposal, would judge the defendant in question to be innocent." So: we read the cases, we think these people are innocent, and we think you, the reader, would agree.

And this is the Reagan administration, so the attorney general didn't like this, and he sent these law professors, Stephen Markman and Paul Cassell (Paul Cassell is still around), to, you know, take down this study. And they go and read the cases and they pick the weakest, the ones with the weakest claim to innocence, and they say: we've read this case file really carefully and we don't think this person is innocent; they look guilty as [INAUDIBLE] to us. And therefore, they say, you can't trust these researchers, they're anti-death penalty advocates, so we don't believe any of the 350 cases. You can say what you want about that, but it was a somewhat effective critique.

And then DNA happens, and DNA changes the game. DNA profiling comes from the UK, from Alec Jeffreys at the University of Leicester. The first case it was used in was actually an immigration case, but then it was used in a murder investigation, and it actually averted a potential wrongful conviction, because the person suspected in these murders had confessed to the crime. That confession was then disproved by analysis of the DNA evidence, which identified another person as the perpetrator of these two murders of teenage girls. So that was the first exoneration involving DNA, and it also begins to open up the potential for the use of DNA.

Again, you may know all of this, but I'll just quickly mention: DNA is used in exonerations when there is bodily fluid at the crime scene that probably has to come from the perpetrator; it's not that useful otherwise. And so such cases, especially in the preceding decades when DNA testing was not that powerful and sensitive, the kind of cases that have this biological evidence, tend to be rape-murder cases: murder cases where there's a semen sample because of a rape. So they overwhelmingly skew towards that kind of case.

Then this spreads to the United States, and you have the first DNA exoneration in 1989, of David Vasquez. The case is remarkably similar to the Leicester case: Vasquez, who was mildly mentally retarded, confessed to the crime through a dream confession, pled guilty, and then DNA eventually identified the true perpetrator.
And so this inspires the famous Innocence Project in the United States, where the lawyers Barry Scheck and Peter Neufeld see the potential of using DNA to get people out of prison. But again, in order to do that, they need a very special kind of case: a case with biological evidence from the crime scene that was not tested at the time of the original investigation and that has been preserved. And that last one is a big one; the legal requirements to preserve that evidence are essentially nil, so it's just luck whether the police decided to preserve the evidence so that it can be tested later. Those are the only cases that are eligible to become a DNA exoneration.

But still, they start litigating, and they start getting people out of prison, and they start getting a lot of media attention for doing so. In some sense they shock the criminal justice system by being able to keep finding these innocent people in prison. And people keep saying it's going to taper off, they're going to run out of cases, and that keeps not happening, for 20 years, up to over 350 DNA exonerations.

The importance of this for the discourse over wrongful convictions is that the DNA exonerations become a set of exonerations for which it is almost impossible to doubt factual innocence, because you have DNA, usually from a semen sample from a rape-murder, that doesn't match the person who has been convicted. It becomes kind of hard to believe that they are guilty of that crime. In the early cases there were attempts to reconcile these things, usually by saying, well, the person in prison didn't rape the victim, but maybe they were there and killed the victim while an accomplice raped them. But that argument became less and less tenable as more and more cases piled up, until even what I would call innocence sceptics, the Paul Cassells of the world, had to admit that most of these 350 people were innocent of these crimes. Even if you wanted to say maybe a couple of them were guilty, it's hard to really argue that a large number were.

So for the first time in this compiling of lists, you have this kind of bullet-proof list, as they call it. But as I mentioned already, it's a very skewed and specific set of cases, and if we confine our discussion of wrongful conviction to these DNA exonerations, it will be a very skewed, selective and unrepresentative discussion of both exonerations and of wrongful convictions.
So then you get the National Registry of Exonerations. This project originated from the idea of creating an encyclopaedia of wrongful convictions that Rob Warden, a journalist who is very active in the exoneration movement, and Michael Radelet, a sociologist, came up with. And Sam Gross, who became the founder of the registry, made a comment on the proposal to create this encyclopaedia, which was, I think, in the early 2000s, and said: of course, you might want to have a website associated with this encyclopaedia. That comment turned out to be right, precisely at the point in technological history where, a year or two later, it seemed obvious that this should not be a book that would remain static, but a sort of living database on the web, and that it would be fairly useless as a fixed text.

And so the encyclopaedia evolves into the National Registry of Exonerations, founded by Sam Gross at the University of Michigan and Rob Warden, who's at Northwestern. Northwestern drops out soon thereafter, and it becomes housed at Michigan. Then Sam hires Maurice Possley, a Pulitzer Prize-winning journalist who used to work at the Chicago Tribune, as the senior researcher. And Maurice is kind of a game changer for the registry: he starts turning the case write-ups, which I'll show you in a second, from very threadbare things written by students into professional journalism, and he does it very quickly, and it really changes the registry.

Then Sam starts thinking about retiring and about succession, and UC Irvine, because it has several researchers interested in this topic, emerges as a possible home. It ends up being a joint project of UCI, Michigan State and the University of Michigan.

Okay. So we've talked about lists. What we have, in a sense, are two different approaches to how we would make a list of exonerations. There's what you might call the social science model, embodied by Bedau and Radelet, where the researchers are using their own judgement to decide whether people are guilty or innocent. But as we saw, that can be attacked by anyone who just says: I don't agree with your judgement that this person is innocent. Sam's approach is a deference model, which is meant to avoid that problem, and the key to it is deference to the state: in a sense, the registry's definition of an exoneration is dependent upon the state exonerating the person.
The idea is that we're not making our own judgements about any case; these are cases where the state itself has decided to relieve this person of the punitive consequences of the act for which they were convicted. The problem with that is that the state's judgement might not be the same as my judgement, but I just have to accept that. You could ask questions about why we would rely on the state to expose its own failures, but of course the state does expose its own failures a lot. And of course someone could attack this approach as well if they wanted to. So those are two ways of doing it. I read an article recently suggesting maybe there's another model: instead of counting, we could just do what a judge in Japan did in a comment on the Carlos Ghosn case and say, without counting, that there are as many wrongful convictions as grains of sand on the beach. Then there's no need to count them.

So here's our short definition of exoneration, here's the long definition, and here's the checklist version of the long definition: you need to be convicted of a crime, and you need to be officially cleared of all related charges, based at least in part on new evidence of innocence, without unexplained physical evidence of guilt. That's it.

One way I like to explain this is to go back to the point about the relationship between wrongful convictions and exonerations. Let's imagine that the criminal justice system mostly gets it right, that most convictions are rightful and only a smaller number are wrongful; let's just assume that for the sake of argument. Exonerations are going to capture only the tip of the iceberg of wrongful convictions. There will be lots of wrongful convictions that we never know about and that never become exonerations, huge categories of cases, especially low-level crimes and misdemeanours, where it's almost impossible to go through all the legal process you would need to get exonerated, and probably not worth it to you to do so. And maybe, with our definition of exoneration, there's a much smaller number of actually guilty people who, for whatever reason, meet our definition of exoneration. Our approach is to stick to the definition and accept these two error rates despite ourselves. So if we happen to run across someone whose innocence we're not really that convinced of, but they meet the definition, we put them in. But again, it's much more common to run across cases where we really think the person is innocent.
But they don't meet our definition, and so we can't include them. Those are painful experiences, especially for me, being new at the registry. And yet those are the rules we've set for ourselves.

[The speaker switches to a live demonstration of the registry website; there are some brief technical difficulties with the mouse and the Internet connection.]

Okay, so now let me try to show you a little bit of the registry itself. Bear with me a little as I navigate this. So this is the registry, and this is the heart of it: a table with 2,551 rows, so we have 2,551 exonerees, and we have these basic demographic characteristics of each case. You can sort and filter by all of these variables. So a table and a data set: that's the heart of the registry in a certain sense.

But the other part is that for any one of these cases, I can open it up and get a narrative summary of the case, professionally written by a journalist. As you can see, they're pretty substantive; compared to other lists that have been compiled before, for example by the Innocence Project or the Centre on Wrongful Convictions, these are much more thorough. We have two former journalists writing them, and they're journalists, right? So they write this stuff really quickly and then go on to the next one, because that's what they used to do at newspapers. It's a pace of work that I'm not used to as an academic, and it's really incredible. We're posting about 150 cases per year, so we're posting one every two or three days.
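To make the table-and-data-set side of this concrete, here is a rough sketch of what sorting and filtering a registry-style table can look like in code. This is only an illustration of the idea, not the registry's actual data pipeline, and the file name and column names used here (exonerations.csv, Crime, Contributing_Factors, Year_Convicted, Year_Exonerated) are invented for the example.

```python
# Illustrative sketch only: filtering and sorting a registry-style table of
# exonerations with pandas. File and column names are hypothetical.
import pandas as pd

# Load a hypothetical export of the case table (one row per exoneree).
cases = pd.read_csv("exonerations.csv")

# Filter to cases where a forensic-evidence problem was coded as a contributor.
forensic = cases[cases["Contributing_Factors"].str.contains("Forensic", na=False)]

# Add a rough proxy for time between conviction and exoneration, longest first.
forensic = forensic.assign(
    Years_To_Exoneration=forensic["Year_Exonerated"] - forensic["Year_Convicted"]
).sort_values("Years_To_Exoneration", ascending=False)

# Simple summaries of the kind the sortable, filterable public table supports.
print(len(cases), "exonerations in total")
print(forensic["Crime"].value_counts().head())        # which crimes dominate
print(forensic["Years_To_Exoneration"].describe())    # how long cases took to undo
```

The point of the sketch is simply that the registry's public face is both a browsable website and a coded data set that supports this kind of counting and cross-tabulation.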
I've written a couple of these write-ups myself, just to see what it was like, and it took me like three months before I felt comfortable with the facts and had checked out all the sources and so on. So they're checking all the sources, doing all the research, and doing a much more elaborate social science coding than what I showed you, because we have all sorts of other variables we're coding that are not on the public website, doing all that and putting a new case up on the web once every two or three days. Which I think is impressive, both for what we're doing as the registry, but also for the fact that someone is being exonerated in the United States once every two or three days.

Okay, what else is on my list of things I wanted to show you very quickly? What are the sources for these write-ups? They're almost entirely legal documents, so legal pleadings, and media sources, newspaper articles and so on. We talk to the lawyers; we don't talk to the exonerees. So there will usually be an interview with the lawyer, and we try to get the lawyers to send us briefs, court rulings, legal judgements and so on. Behind all this there's a source document folder for every single case, which varies greatly in quality and thoroughness, but it's almost entirely legal documents and newspaper articles. So not the things that many other people want: I'm interested in forensic science, so I want the lab reports and the transcripts, and we don't have those in most cases.

That brings me to something I was going to say: I sort of knew about the registry, but I didn't really know it very well until I took over running it. One thing is that those 2,551 cases are sort of like geologic strata in terms of how the data was collected. The data collection has gotten better, more rigorous and more reliable the closer we get to the present, and we don't have the resources to go back and rework the old cases. So you find a lot of variety in our data, both in terms of source documents and in terms of the coding of social science variables, depending on when the case was posted. You will see that when you look at the source documents: one document in these old cases and then ten in recent cases. That comes up a lot.

Let me just bring up... it's over here.

[Audience member] In that case, what happened to his co-defendants? It says he was sentenced to death and he was sentenced to life without parole because he was 17. Is that right?
Yeah, I don't know. I don't know, sorry. If he were in the registry, he'd be hyperlinked here as a co-defendant. So maybe they're not exonerated; they might have taken a plea to get out, or pled to one of the charges, or something like that. It's not uncommon with co-defendants that some are exonerated and some are not, and a plea would be a very common reason: one took some deal to go home and the other one didn't, or wasn't offered it.

This is a really sad thing to look at, but we're able to compile a list of the longest incarcerations for exonerees. This just says something about sentencing practices in the United States. And, for the people at the back of the room, the question that nearly always gets asked is whether this is how long they actually served, and yes, it is. So the record holder is 45 years in prison for a crime they didn't commit, and the top ten are all over 35 years. And we keep adding new cases high on the list, pushing people with more than 30-year incarcerations off the top ten of longest incarcerations, and some of them are famous cases, like this one, Wilbert Jones, who was in Angola.

Okay. Then I just want to show you a little bit about how the registry works. The advisory board isn't that important for this; these are the staff who make the registry work, and they're very important. So: founded by Sam and Rob Warden. Then we have three editors kind of taking over the job from Sam: Barbara O'Brien and Catherine Grosso, who are at Michigan State, and myself. And, I can't really scroll very well here, but we now have not one but two former professional journalists, Maurice Possley and Ken Otterbourg, researching the cases and writing them up. Jessica Paradis, who's a lawyer, does the first edit and checks the coding, and then it goes to the editors. And then there's the person doing administration, development and communications, kind of running the whole centre. So again, it's an unusual thing for me as an academic, and in some ways it feels more like running a newspaper than an academic project. It keeps going at a very fast pace, which tends to be driven by the journalists, who are used to that.
Let's not restart it; we've got it. All right, I'm going to close this, mercifully.

[A Windows update prompt appears, asking whether to restart now or tonight; there is a brief interruption while it is dismissed and the slides are brought back up.]

Okay, all right. So, as promised, the talk has been sort of intellectual- and research-free so far. Now I want to talk a little bit about a project on forensic science, because that's my academic interest, and since I came on, they said, well, you can work on the forensic science part.

A little after I started, Gerry LaPorte, who is the head forensic science person at the National Institute of Justice in the United States, published this article on wrongful convictions and DNA exonerations. And he's got this little passage, which you probably can't read at the back. He says: well, the Innocence Project says that in 157 of its 342 DNA exonerations, forensic science contributed to the wrongful conviction; that's 46%. So they're saying bad things about forensic science. But, he says, I looked at the National Registry of Exonerations, and they're inconsistent: for 24 of those 157 cases, the registry says forensic science did not contribute to the conviction; it was mistaken eyewitness identification or false confessions and so on.

So I talked to the Innocence Project, who we work with closely, and we said: this isn't good. We need to be consistent about this, or at least figure out why we have this discrepancy and what the issue is that we disagree about. We started by looking at our definitions. We call it "false or misleading forensic evidence", and they call it "misapplication of forensic science". And it certainly didn't seem like the root of the problem was the definitions; they are different, but there's no real fundamental disagreement there. So the first thing to do was to see if we could come up with an agreed-upon definition, and both of us were open to that, and so we did. The new one we're going to call a "forensic or expert evidence problem", and here's the very simple definition of it. This is, I think, the first time I've really presented this definition to any audience outside our internal work product.
311 00:36:26,730 --> 00:36:31,049 So I'm interested, you know, and it's not final. It hasn't gone up on any website yet. 312 00:36:31,050 --> 00:36:37,890 So I'm interested in feedback and and thoughts about it, but, but we are pretty happy with it. 313 00:36:37,900 --> 00:36:48,660 So not too much feedback. But what I'm going to try to do now is sort of explain to you some of the thinking that went into this definition, 314 00:36:48,660 --> 00:36:52,110 which actually for such a simple thing, there was a fair amount. 315 00:36:52,350 --> 00:36:57,059 So it raised a couple of questions. So one is kind of the science question. 316 00:36:57,060 --> 00:37:02,790 We did notice that we were just that some of us on both sides were not coding 317 00:37:02,790 --> 00:37:09,299 cases as forensic because the forensic science seemed like not forensic science. 318 00:37:09,300 --> 00:37:13,700 So and here's some examples on the left, hypnosis, slide detection, dog said. 319 00:37:13,710 --> 00:37:17,580 So somebody was just saying, well, that isn't forensic science, so it doesn't count. 320 00:37:19,350 --> 00:37:28,229 And the solution to this and this is why I talked about the science studies framework at the beginning, 321 00:37:28,230 --> 00:37:38,700 is because here's where I think science studies came into it, is that I'm so defining science is famously difficult in philosophy of science. 322 00:37:39,330 --> 00:37:48,090 So I wanted to avoid doing it. And in the end I kind of said, I don't care if if it's science or not, 323 00:37:48,390 --> 00:37:55,740 if the court is admitting this person as some kind of an expert, then that's fair game. 324 00:37:56,070 --> 00:38:03,180 And that's why the title of it, we have the word expert in it if you want to say it's not forensic science for whatever reason. 325 00:38:03,180 --> 00:38:12,570 But to me, forensic science is anything any kind of expertise that gets admitted into court is in some sense forensic science. 326 00:38:13,320 --> 00:38:22,440 So auto mechanics, psychologists, gang experts, cultural experts, hypnotists, if the courts are letting them in, I think it's forensic science. 327 00:38:22,440 --> 00:38:28,650 If you don't think it's forensic science, you should maybe talk to the courts about not letting hypnosis hypnotists testify. 328 00:38:29,750 --> 00:38:40,490 In court. And another way we have of saying it down here is that, you know what science is the question you're asking, not the answer you're giving. 329 00:38:41,000 --> 00:38:49,280 So if you're asking, do people with tattoos, are they more often in a gang than not in a gang? 330 00:38:49,970 --> 00:38:52,160 That's an empirical, scientific question. 331 00:38:52,190 --> 00:38:58,550 Now, whether a cultural anthropologist has empirical scientific ways of answering that question, I don't know. 332 00:38:58,790 --> 00:39:03,680 But if you want to claim to have knowledge about that question, then that's a scientific question. 333 00:39:06,650 --> 00:39:11,210 So let me give two examples of this from actual cases. 334 00:39:11,220 --> 00:39:17,440 So one is the Raymond Jennings case, and this case will be interesting to the criminology audience. 335 00:39:17,450 --> 00:39:21,950 So this this was a murder in a parking lot. 336 00:39:23,090 --> 00:39:27,310 And Jennings was the security guard on duty. 337 00:39:27,320 --> 00:39:32,570 And so eventually, for lack of other suspects, they decided he was the only one there. 
So it must have been him. And they got this expert witness, an FBI agent, who said that he was an expert in victimology. Now, this was interesting to me as a criminologist, because victimology is a thing in criminology, but that's not his victimology; his is the study of the victim to determine who might have killed her. The classic example, I think, is that if the victim is a prostitute, that might lead you to some theory of how they died. In this case, where the victim was someone who had come back from attending a rock video filming, he had a theory that it must have been a crime of opportunity by the only male nearby, which was Raymond Jennings, the security guard. However, later in the case, when the prosecution found evidence that she had been killed by gang members during a robbery, he then said: well, now that I've seen that evidence, I no longer think this was a sort of sex crime of opportunity, and I wouldn't say that anymore. So to me: this man was admitted as an expert witness in victimology, and it certainly contributed to the conviction. Whether victimology is science or not, there's no reason to debate it; it's obviously a weak science if it is a science, but it contributed to the conviction, and it was expert evidence.

Here's another one that's useful just to show the variety of cases we have in the registry, especially by not confining ourselves to DNA exonerations. They're not all murder cases, although lower-level crimes are still very rare and underrepresented in the registry. Here's one that's insurance fraud by a farmer, so a very unusual case. The prosecution got an expert to say: well, the other farms were doing very well, so this man's crop failure, for which he filed an allegedly false claim, was just bad farming and it's his fault, and so he's not entitled to the claim. This is a very weird kind of expert; we hadn't really been prepared for crop experts. But, you know, they're an expert witness. If we found that there were a lot of crop experts, that would be interesting. There aren't. But why rule it out? Why not collect data on it?

The second difficult question was which actor we focus on. Where does forensic science begin? What if the bad actor is not a forensic scientist but a police officer or a prosecutor, and yet the problem involved forensic science? We thought it was bad not to capture that.
And so we decided that what we were really trying to capture is a problem with the delivery of forensic science, which doesn't necessarily require a forensic scientist; it doesn't really matter who is responsible if it's a problem in the delivery of forensic science. I'll give you a case to illustrate this, the Kenny Waters case. It's actually a pretty well known case, because Kenny Waters's sister couldn't get a lawyer to exonerate him, so she ended up putting herself through law school in order to litigate her brother's case and get him exonerated. And there they are; they're played in the movie by Hilary Swank and Sam Rockwell, so it's a fairly well-known case because of the movie, which is called Conviction.

But here, the police had fingerprints that excluded Kenny Waters from the crime scene. The police officer who had them retired, moved away, and stored the prints, and the report saying Kenny Waters was not the source of those prints in this incriminating location, in a storage unit. And they were not found for years, until his sister had gone through law school and managed to obtain a court order to search that storage unit. So the fingerprint examiners didn't do anything wrong; they excluded him as the source, and the police officer put their work in a storage unit.

So I think this is something we need to know about. Imagine if we had 100 cases like this. That would be something forensic scientists should be interested in, if our data showed that actually a lot of the problems with forensic science are not in the laboratory, with the forensic scientists themselves, but arise because the police are taking the evidence and locking it in storage units. They should know that and maybe fix that, and the forensic scientists didn't do anything wrong. So in cases like this (we only have one case like this, but what if we had 100?) that would tell us something that might lead to a policy response or an intervention.

Another problem we dealt with, probably the trickiest one, is "consistent with" evidence. What do we do with evidence that's literally true but seems to be highly misleading to the jury? The way we got to this was hair evidence, where they say it's consistent with the suspect. That's sort of literally true; it is consistent with the suspect. But hair evidence is extremely unreliable, and it turns out it wasn't the suspect. And it turns out we have a lot of this kind of "consistent with" evidence in a lot of different areas.
I'll show you an example in a minute. Our answer to this is, again, to include it: we think we want to count these cases, but we no longer think that saying forensic science contributed to the wrongful conviction carries an element of blameworthiness. There's a sense in which the analyst was telling the truth; they're just using a really lousy technique, whether that's microscopic hair comparison or presumptive field drug tests, which I'll talk about in a second. And so we became uncomfortable, especially with the registry's definition, with the words "false or misleading". I didn't want to use the words false or misleading for this "consistent with" testimony; I didn't think it was either false or even misleading.

So here's an example, another famous case you may know about, the Steven Avery case, which got really famous because of the documentary series Making a Murderer. In that original conviction there was hair evidence, and the serologist was Sherry Culhane, who interestingly comes off kind of badly in the documentary, as being kind of biased. She gave the most conservative hair testimony that I ever saw. She said it was similar to and consistent with the victim's hair. Then she said that the hairs of many people are consistent with each other, and that she can't give a probability that they're from the same source. Finally, when pressed, she says: all I'm saying is that it's not impossible that the hairs are from the victim.

Well, that's a true statement; that's a very philosophically cautious statement. So do I really want to fault her for saying that? Not really. But on the other hand, we know from the registry and from the Innocence Project list that there are tons and tons of exoneration cases where hair evidence contributed to the wrongful conviction. So this kind of "consistent with" testimony, while true, leads to a lot of wrongful convictions. Why is that? Because, I think, they're using a really weak, undiscriminating technique. So by coding the hair cases we are finding something out: that microscopic hair comparison is a really lousy forensic technique. But I don't think we're finding that people gave false testimony. Still, we need to capture this, and we need to know about it.

And here's another example, from the other end of the spectrum. This case also got a lot of publicity; it was written up by ProPublica. Amy Albritton is from Louisiana, and she's driving into Texas when the police pull her over.
They see a white crumb on the floor mat of the car, not a baggie of white powder, a white crumb. And they do one of these field tests for drugs that the police carry in the back of their police cars, where they just put it in a little baggie, shake it up and do a little colour test. It turns a certain colour if it might be a controlled substance, and it has an error rate of 25 or 30%, something like that, a very high error rate. But that's okay, because this is just for police use out in the field, and then the sample will go back to the laboratory, where they'll do the better test with the proper chemical equipment. Except that in these drug possession cases, while they're waiting for the lab, the suspect is going to plead guilty, because they want to go home, they're facing a very low-level drug charge anyway, and they don't want to lose their job and get evicted, which is what happened to Amy Albritton. And so, lo and behold, they test it in the laboratory and it's a cookie crumb, not a controlled substance.

And because the D.A. of Houston was concerned about these cases too, she found out they were happening and decided to track all these people down and make sure they were exonerated and their criminal records were cleared. We think this problem is happening all over the country, with people pleading guilty, except that the D.A. isn't following up and making sure to clear their records, so they just have a misdemeanour on their record with all the consequences that go with that. But in Houston they cleared it up, so we have 150 exonerations from Houston involving these presumptive field drug tests.

But again, I don't think the testimony, well, it's not that there was testimony, right, there's a report of a presumptive field drug test, I don't think it was false or misleading. It said: we did a presumptive test and it was positive; you need to do a proper test to confirm this. It wasn't false, but it produces a lot of wrongful convictions, so that's something we need to know about.

So these "consistent with" statements are a big issue in forensic science; they're very common, and I think if they contribute to wrongful convictions, we need to know about them. It's a little bit tricky for me, because in my work on forensic science I have said that forensic scientists need to quantify the uncertainty of their tests, and I've said that if they don't, they need to tone down the certainty with which they testify.
And specifically in the context of fingerprint identification, I've said: I don't think you should say "match", I don't think you should say "it's his fingerprint", I don't think you should say "identification" if you can't come up with any statistics, which is what I'd really like to see you do; just say it's "consistent with". So I've told fingerprint examiners they should use this phrase, "consistent with". But we're also coding these "consistent with" cases. And the way I look at it is this: if a discipline can't come up with a statistical way of quantifying its uncertainty and is going to rely on "consistent with" statements, you can do that, but you're going to have to own it. If your technique is good, you probably won't produce a lot of wrongful convictions, like, say, fingerprint evidence. If your technique is really weak, like microscopic hair comparison or presumptive drug tests, you're going to produce a lot of wrongful convictions by giving this testimony. That's setting aside what the jury might hear from somebody who says it's consistent with the defendant; they're not going to take the probative value of that to be as low as it actually is, if you think about what the statement really claims.

Okay, so I'm close to finishing. I think I've covered this: we're not requiring that the evidence be false or misleading. We came up with this word "problem": there's a problem with the forensic evidence, but it doesn't mean anybody did something wrong, it doesn't necessarily mean anybody said something false or misleading, but something is going wrong that causes wrongful convictions.

So, in sum, we made these choices: we broadened the definition of forensic evidence a little bit, we included more actors, and we focus on disembodied problems rather than blameworthiness. We're trying to avoid defining ontological concepts like science. In some sense, I think we're trying to let the data speak to us about where the problems lie, so that it can function as an early detection system for stakeholders. The new definition will probably capture more cases than the old definitions did, but the trade-off is that, under the old definitions, those were all cases where we were saying a forensic scientist did something wrong, and you can't assume that about these cases.

So here's a summary of what we did. We started with Gerry LaPorte, as I said, identifying 24 cases where we were discrepant, and then we found a bunch more that he didn't notice.
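As an aside on the mechanics, the reconciliation itself is essentially a comparison of two codings of the same cases. Here is a minimal sketch of that kind of comparison; the case identifiers and the two lists are invented for illustration, not the actual registry or Innocence Project codings.

```python
# Illustrative sketch: finding discrepancies between two codings of the same
# exoneration cases. The case IDs below are invented for the example.

# Cases each project coded as involving a forensic/expert evidence problem.
innocence_project_yes = {"Case_001", "Case_002", "Case_003", "Case_005"}
registry_yes = {"Case_001", "Case_003", "Case_004"}

# Cases one project counted and the other did not, in either direction.
discrepant = innocence_project_yes ^ registry_yes  # symmetric difference

print(len(discrepant), "discrepant cases to re-examine under the agreed definition:")
for case in sorted(discrepant):
    print("  ", case)
```

Each discrepant case then gets re-read and recoded against the agreed-upon definition, which is what the counts that follow describe.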
So we ended up looking at 47 cases, and this shows where we said yes and they said no, and where they said yes and we said no. We ended up deciding that 41 of them had a forensic or expert evidence problem and six did not. So if you look at all the DNA exonerations at the time we started the project, two years ago, that comes to 174 of the 351 DNA exonerations that existed at that time: about half of them ended up having a forensic or expert evidence problem. That's actually slightly higher than LaPorte saw, but again, we have this slightly expanded definition. And to date, while I've been here on sabbatical, the number is pretty similar, but we're up to 367 DNA exonerations, so it's about 49%.

So why were there these discrepancies? Overwhelmingly, it's because somebody lacked information about a case, and once we traded information with the Innocence Project there was no difficulty coding it; we just didn't know about it. Most of those cases were serology cases where there was an issue with the interpretation of the serology evidence. Twelve cases were this "consistent with" issue, most of them hair, but some of them pathology. Then there were these sort of odd disciplines that may or may not be forensic, and then this broadened definition of "problem".

So, in sum, the proportion of DNA exonerations to which forensic evidence contributed is still about half. That's partly due to the expanded definition, but that's only about five cases, so it would only knock about 2% off. And this was true when he wrote and it's still true: the proportion among all exonerations, the ones in the National Registry of Exonerations, is much lower, only about 24%, though we haven't applied the new definition across all those cases, so it might go up a little bit. That's not really surprising, given the nature of the DNA exoneration cases: these are cases where the person was exonerated through the use of forensic science, so it's not that surprising that they were originally convicted through the use of forensic science.

And there's a lot more to do now that we've done this project. I'm working on looking at the different forensic disciplines and which discipline contributed in which cases, at temporal trends, at how much this co-occurs with the other contributing factors like mistaken witness identification, at other patterns, and at subtypes of these problems in terms of what went wrong.

So I hope you're fans of the National Registry of Exonerations now. Follow us on social media, and we have stickers if you're interested.
And I'm happy to take questions or discussion.