1 00:00:00,270 --> 00:00:05,310 I remembered the line from the Hindu scripture of the Bhagavad Gita. 2 00:00:08,880 --> 00:00:14,830 Vishnu is trying to persuade the prince that 3 00:00:16,650 --> 00:00:30,540 he should do his duty and, to impress him, takes on his multi-armed form and says, Now I am become Death, the destroyer of worlds. 4 00:00:33,630 --> 00:00:43,800 I suppose we all thought that one way or another. That was a quote from the atomic physicist Robert Oppenheimer, 5 00:00:44,280 --> 00:00:50,430 nicknamed the father of the atomic bomb for the role he played in helping the United States develop the first atomic bomb. 6 00:00:51,480 --> 00:00:57,120 The clip reveals something of Oppenheimer's well-known moral struggle to come to terms with the consequences of his research. 7 00:00:57,870 --> 00:01:04,510 It's an interesting contrast to another of his famous quotes: There must be no barriers to freedom of inquiry. 8 00:01:05,110 --> 00:01:08,620 The scientist is free, and must be free, to ask any question. 9 00:01:10,090 --> 00:01:13,960 This encapsulates a problem that's come to be known as the dual use dilemma. 10 00:01:14,740 --> 00:01:20,830 Quite often, the same research that can be used to do good can also be used for much more malevolent purposes. 11 00:01:22,060 --> 00:01:25,930 Whereas in the 20th century atomic physics was at the heart of this dilemma, 12 00:01:26,440 --> 00:01:30,250 today much of the debate is over developments in the biological sciences. 13 00:01:32,260 --> 00:01:36,280 My name is Chandrika Nath and I'm from the Parliamentary Office of Science and Technology. 14 00:01:37,120 --> 00:01:43,510 In this podcast, we'll be talking to scientists and scientific publishers, who both have a role to play in tackling this.
15 00:01:44,630 --> 00:01:49,760 Later, I talked to Phil Willis, MP and chair of the Science and Technology Committee in the House of Commons, 16 00:01:50,120 --> 00:01:56,329 to hear his views on where politicians fit into all of this. And to help me with this is Tom Douglas, 17 00:01:56,330 --> 00:01:59,870 who is a Ph.D. student in bioethics at Oxford University. 18 00:02:02,900 --> 00:02:03,500 Okay. Well, Tom, 19 00:02:03,500 --> 00:02:13,190 can you start by giving me some examples from history of why this dual use dilemma has cropped up? Well, a classic example from history comes from, 20 00:02:13,190 --> 00:02:18,829 as you mentioned, 20th century nuclear physics. So a good example would be the Manhattan Project, 21 00:02:18,830 --> 00:02:28,010 where the US government wanted to develop nuclear fission technology both for energy generation purposes and for weapons development. 22 00:02:28,370 --> 00:02:35,600 And scientists who were involved in this project were quite aware that it had potentially devastating applications as well as good ones. 23 00:02:36,470 --> 00:02:40,670 Why is this issue of dual use of particular interest now? 24 00:02:41,540 --> 00:02:45,410 I think one reason why it's come to our attention recently is because of some developments, 25 00:02:45,680 --> 00:02:52,940 mostly in the life sciences, where it's been particularly obvious that there are potential applications of this research to develop, 26 00:02:52,940 --> 00:02:59,239 for example, biological weapons. This coincided with the September 11 attacks in the US, 27 00:02:59,240 --> 00:03:05,990 which obviously raised awareness of the threat from terrorism, and also with the anthrax letter attacks in the US.
28 00:03:06,800 --> 00:03:14,060 As part of my research, I spoke to a number of scientific journal editors, because there have been some suggestions that journal editors 29 00:03:14,060 --> 00:03:21,020 may be in a position to perhaps screen the papers they receive based on possible concerns about misuse. 30 00:03:21,020 --> 00:03:29,120 And some have suggested that journal editors should censor their journals in some way to ensure that they're 31 00:03:29,120 --> 00:03:34,309 not publishing what could be interpreted as effectively instructions on how to build a biological weapon, 32 00:03:34,310 --> 00:03:40,350 for example. One of the scientific journal editors that I spoke to was Professor Charles Penn from the University of Birmingham, 33 00:03:40,370 --> 00:03:45,330 who's a professor of microbiology and also the editor of the Journal of Medical Microbiology. 34 00:03:45,350 --> 00:03:53,680 A good example of that is the smallpox virus, which is fully understood, and we know all about its DNA sequence. 35 00:03:53,690 --> 00:03:59,690 We know that it's a serious pathogen, but it's no longer available, if you like, on the open market. 36 00:03:59,930 --> 00:04:03,470 So that could be recreated using DNA synthesis. 37 00:04:04,370 --> 00:04:08,450 However, that's the theoretical answer. In practice, 38 00:04:08,480 --> 00:04:14,870 you also have to package that DNA to make a viable pathogen that will actually be infectious. 39 00:04:14,880 --> 00:04:22,040 And with that and other technical problems, this is not nearly as easy a task as it might seem. 40 00:04:22,490 --> 00:04:29,130 So I don't think we're yet really close to doing that on a kind of bioterrorism scale. 41 00:04:29,150 --> 00:04:31,520 It would still be a massive and difficult task.
42 00:04:32,390 --> 00:04:38,510 The other angle on this is that one could, in theory, create a new pathogen, a kind of designer pathogen, 43 00:04:38,510 --> 00:04:48,020 and put together a bunch of genes which, in theory, from what we know already, would create a very serious novel pathogenic organism. 44 00:04:48,680 --> 00:04:54,380 But again, I think we really lack the understanding of how nature has done that job. 45 00:04:54,980 --> 00:05:03,650 My feeling is we're a very long way, really, from being able to design from scratch the ideal pathogen and create it in the laboratory. 46 00:05:06,440 --> 00:05:11,190 Michael Selgelid is an ethicist based in Canberra at the Australian National University. 47 00:05:11,210 --> 00:05:17,030 He's really the first philosopher or ethicist to start looking at the dual use dilemma. 48 00:05:18,580 --> 00:05:29,050 Now, it's clear that recent developments in genetics make possible the development of biological weapons, partly illustrated by a CIA quote 49 00:05:29,380 --> 00:05:32,530 saying that advances in biotechnology, 50 00:05:32,530 --> 00:05:42,490 like genetic engineering, now make it possible to create new diseases worse than any diseases previously known to humankind. 51 00:05:43,150 --> 00:05:51,730 So that's one example, DNA synthesis. Did you explore any other areas of science which raise similar issues over the potential for harm? 52 00:05:52,360 --> 00:05:58,870 So one other area that's increasingly being discussed is neuroscience, which is currently a very hot scientific field. 53 00:05:58,870 --> 00:06:08,460 It's advancing rapidly, and some of the technologies that neuroscientists are developing or attempting to develop could be used in unethical ways. 54 00:06:08,470 --> 00:06:11,620 So one example would be new imaging technology.
55 00:06:11,890 --> 00:06:18,370 It seems possible that in the future this technology could be developed to the point that people interpreting 56 00:06:18,370 --> 00:06:23,439 these brain scans can draw inferences about what the subjects were thinking or feeling at the time, 57 00:06:23,440 --> 00:06:27,070 and there are concerns that this could be used in ways that invade people's privacy. 58 00:06:27,970 --> 00:06:30,730 So do you think scientists think enough about these issues? 59 00:06:31,810 --> 00:06:37,540 I think scientists have been very good at starting to think about a lot of ethical issues that have been raised by their work in recent years. 60 00:06:37,570 --> 00:06:41,140 This is particularly true in the life sciences, where there's been a kind of bioethics movement. 61 00:06:41,500 --> 00:06:46,240 People working in stem cell research, for example, have started to think quite a lot about the issues. 62 00:06:46,480 --> 00:06:54,580 But one issue that hasn't been thought about very much by either scientists or bioethicists is the issue of how scientific knowledge is used. 63 00:06:55,080 --> 00:06:58,899 Questions have more often been asked about whether scientific knowledge is produced in ethical ways, 64 00:06:58,900 --> 00:07:03,970 but not about whether it's actually ethical to produce certain kinds of knowledge that could be misused. 65 00:07:07,910 --> 00:07:14,360 So Tom certainly thinks that scientists themselves have some responsibility for thinking about the consequences of their research. 66 00:07:15,230 --> 00:07:19,340 But can this always be predicted? Should scientific research be policed? 67 00:07:20,030 --> 00:07:21,620 And if so, who should do it? 68 00:07:21,950 --> 00:07:30,980 And at what stage: before any resulting technologies are available on the market, before the research is published, or before it's even undertaken? 69 00:07:31,730 --> 00:07:38,700 Here's what Michael Selgelid thinks.
Well, everyone's responsible, including ordinary voters. 70 00:07:39,030 --> 00:07:48,360 Among those more directly involved, scientists should be taking dual use dangers into consideration when they're deciding what research to do, 71 00:07:48,630 --> 00:07:55,080 and when they're deciding whether or not to disseminate potentially dangerous research findings. 72 00:07:55,770 --> 00:08:03,840 There's a role for research institutions. They should be providing more in the way of ethics education to scientists. 73 00:08:04,260 --> 00:08:10,409 And they need to be implementing more in the way of oversight mechanisms, i.e., you know, 74 00:08:10,410 --> 00:08:16,470 looking at what research is taking place and looking at the dual use dangers associated with that research. 75 00:08:16,530 --> 00:08:24,270 There's a role for journals to be screening submitted articles for dual use. 76 00:08:25,260 --> 00:08:30,240 So Selgelid thinks that there are cases where it would be justifiable to censor research, 77 00:08:30,690 --> 00:08:37,140 although he accepts it would be difficult, and there are many other possible approaches, like regulating trade, for example. 78 00:08:37,830 --> 00:08:41,550 But Charles Penn is sceptical about whether censorship would do any good. 79 00:08:42,000 --> 00:08:51,090 Well, I think at the moment it's very difficult to envisage practical interventions that would fulfil the need to 80 00:08:51,090 --> 00:08:57,930 be cautious about the possibilities while at the same time not inhibiting legitimate biomedical research. 81 00:08:58,630 --> 00:09:05,490 I can't really see ways at the moment that we could sensibly restrict access to the available technologies 82 00:09:05,490 --> 00:09:11,400 without really tying the hands of people who are following legitimate lines of medical research. 83 00:09:12,690 --> 00:09:19,110 And would you think that scientific articles, or parts of them, should ever be held back from publication because of concerns about misuse?
84 00:09:20,430 --> 00:09:24,720 I think one can imagine rare examples where that might be advisable. 85 00:09:24,960 --> 00:09:31,530 For example, in determining complete genome sequences of some of the more severe or serious pathogens, 86 00:09:31,980 --> 00:09:38,070 there have been questions asked by scientists, in fact, as to whether it was responsible to publish these. 87 00:09:38,790 --> 00:09:41,609 An example of that would be the plague organism, 88 00:09:41,610 --> 00:09:47,040 where the complete genome sequence was determined just within the last five or six years. 89 00:09:47,910 --> 00:09:53,160 And scientists involved in that have told me that, you know, they did wonder. 90 00:09:53,160 --> 00:09:54,930 They gave it some consideration. 91 00:09:55,800 --> 00:10:05,790 I think one of the problems, though, is that publication is actually the final stage in quite a long process of scientific activity. 92 00:10:05,790 --> 00:10:12,179 Science, as we know it in the biomedical community, is a communal activity. 93 00:10:12,180 --> 00:10:15,750 It's a community activity. We are all interdependent. 94 00:10:15,760 --> 00:10:19,680 There's an awful lot known in the community about what we're each doing. 95 00:10:20,490 --> 00:10:24,899 We talk about it with collaborators and colleagues long before it's published. 96 00:10:24,900 --> 00:10:35,010 So by the time you put a restriction at the level of publication, in many ways, you might say, the genie is already out of the bottle. 97 00:10:37,990 --> 00:10:40,780 So what role do policymakers play in all of this? 98 00:10:41,140 --> 00:10:48,880 Phil Willis, the chairman of the Innovation, Universities, Science and Skills Select Committee, soon to become the Science and Technology Committee, 99 00:10:49,750 --> 00:10:52,840 thinks they have the hardest job of all. 100 00:10:53,530 --> 00:10:55,810 I went to talk to him in his Westminster office.
101 00:10:56,320 --> 00:11:04,330 What role do you think policymakers have, if any, in ensuring that science is not used for unethical purposes? 102 00:11:04,990 --> 00:11:09,820 I think for scientists it's relatively easy, because they simply follow the scientific research, 103 00:11:10,090 --> 00:11:16,750 and the whole ethics of being a scientist is, in fact, to try to draw your science to its logical end point. 104 00:11:17,200 --> 00:11:23,290 I think for ethicists the issue is also easy because, although the subject might be complex, 105 00:11:23,500 --> 00:11:28,900 their role is to weigh up the ethics for and against elements of science. 106 00:11:29,320 --> 00:11:32,590 For policymakers, it's dreadfully difficult. 107 00:11:32,950 --> 00:11:38,410 And we saw that, of course, in the United States with George Bush and the stem cell agenda, 108 00:11:39,250 --> 00:11:44,080 and we're seeing it in the UK, but particularly in Europe, over the GM agenda. 109 00:11:44,470 --> 00:11:56,200 So my simple view is that policymakers have a duty to the public to actually ensure that there is a body of research which has been 110 00:11:56,200 --> 00:12:05,860 properly peer tested and which can be put openly and without comment before the public as the basis for forming policy judgements. 111 00:12:06,250 --> 00:12:11,530 And where that doesn't happen, you will get bad policy and, I think, a distortion of science. 112 00:12:12,670 --> 00:12:20,110 So you're saying really that you don't think policymakers have any role where the research is actually done and the results disseminated? 113 00:12:20,530 --> 00:12:23,740 Yes, policymakers have a role to play further down the line. 114 00:12:23,890 --> 00:12:27,670 Increasingly far down the line, I think they have a role to play. 115 00:12:28,180 --> 00:12:36,970 But when we're talking about pure science, I think the less that scientists are affected by policy, the better.
116 00:12:37,420 --> 00:12:43,900 I think the boundaries are best set by committees outside the House rather than by politicians. 117 00:12:44,560 --> 00:12:50,080 One of the issues that there's a lot of debate around at the moment is the idea of DNA synthesis and 118 00:12:50,080 --> 00:12:55,510 the possibility that terrorists might eventually be able to recreate pathogens in the laboratory. 119 00:12:56,110 --> 00:13:03,610 So that's an issue. But there's a lot of debate about whether we should restrict either what research is done or restrict what's published. 120 00:13:03,940 --> 00:13:08,920 Do you think that perhaps in instances like that, there's ever a case for scientific censorship? 121 00:13:09,880 --> 00:13:17,800 I don't. I think that once you actually censor science, you're going down a very slippery slope in the opposite direction. 122 00:13:18,280 --> 00:13:31,210 And the fact that we saw, pre-2008 in the United States, deliberate attempts by the Bush administration to censor science and individual scientists, 123 00:13:31,510 --> 00:13:39,010 and indeed to remove grants from them if they were overstepping the mark, is something that I think we should deliberately avoid in this country. 124 00:13:39,580 --> 00:13:49,480 If you go back to the whole issue of DNA sequences and the way in which you can actually build, for instance, dangerous pathogens: 125 00:13:49,930 --> 00:13:58,120 most of that science is already out there. It's already available, you know, on the Internet. The idea that you can put a lid on it and say, 126 00:13:58,120 --> 00:14:02,140 oh, well, we'll all obey the rules now, is just absolute madness. 127 00:14:02,590 --> 00:14:09,489 Obama recently said that he wanted America to be the first in as many areas of 128 00:14:09,490 --> 00:14:14,470 science as possible and, where they weren't first, to be the country of choice. 129 00:14:14,860 --> 00:14:18,370 And that must be the UK's position as well.
130 00:14:18,700 --> 00:14:25,420 And once you start saying we are going to restrict certain elements of science for political reasons, 131 00:14:25,750 --> 00:14:29,260 then I genuinely think you will undermine science itself. 132 00:14:29,530 --> 00:14:35,440 Because if you start with a belief that science is inherently evil and cannot be controlled, 133 00:14:35,650 --> 00:14:41,050 well, you really go back to, you know, the horror movies rather than a sensible debate. 134 00:14:43,240 --> 00:14:51,070 I think that what I've learned from this podcast is that there is no simple solution to tackling the dual use nature of scientific research. 135 00:14:51,520 --> 00:14:56,229 We're all responsible in our own way for understanding the potential consequences of 136 00:14:56,230 --> 00:15:01,750 scientific research and taking appropriate action depending on what our role is in society. 137 00:15:03,990 --> 00:15:09,690 Although most of the people we've talked to have different opinions on what we should be doing about the dual use dilemma, 138 00:15:10,290 --> 00:15:18,240 they do all agree that there is a real risk that if we try and hold back scientific research, we could lose out on all the benefits that it can bring. 139 00:15:19,500 --> 00:15:24,030 And I'll end with an interesting quote from the French physician and psychologist Gustave Le Bon: 140 00:15:25,950 --> 00:15:32,680 Science has promised us truth. It has never promised us either peace or happiness.