And I want to introduce Jamie Kwong, who's a Marshall Scholar in War Studies at King's College London, and amongst other things, she'll be talking about information and misinformation, which is a topic that we have also been examining in another context, because it can be quite harrowing as we start to think through the power not just of a lack of information, but also of misinformation, and how that shapes not just the dialogue but actually has potentially very dangerous consequences. So welcome, Jamie, and thank you.

Thank you so much. And it's actually great to end on Jeremy's point, just touching on some of these public attitudes, because my talk will focus on that. But before I begin, I just wanted to thank the organisers for having me, as well as all of my fellow speakers for their fascinating insights. I hope I can add some additional insights to that.

So just a little bit about me. I'm a research assistant and PhD candidate at King's College London, where my PhD focuses on US public opinion of North Korean nuclear proliferation. So I hope to zoom out a bit and talk about public opinion of nuclear weapons issues more broadly here. I'm going to spend a lot of time covering three key areas. First, thinking about interpreting public opinion polls: what can polls tell us, and importantly, what can't they tell us? Second, looking at the state of play: what do we know about public opinion of nuclear issues? And then, to Isabel's point, I thirdly want to talk about contextualising these opinions, so talking about these non-public actors, as I'll be calling them: the role of NGOs, advocacy organisations, the media, policymakers and the like in reporting and shaping public opinion. My bottom line up front here is that the public holds complex views about nuclear weapons issues, and so we as experts and scholars need to be very thoughtful about how we analyse public opinion polls and should be wary about how various actors influence and report public opinion.

So, to begin with interpreting public opinion polls: what can and can't they tell us? There are three things I want to consider here. First, thinking about question types. Ultimately there are different attitudes that we can measure through public opinion polling. First we have knowledge questions. This again gets back to Jeremy's presentation: asking the public objective questions with a right and a wrong answer. So something that comes up in my research, for example, is what country does Kim Jong Il lead? And these questions, unfortunately, are few and far between.
But they're really helpful when they do pop up to gauge the public's actual understanding of the issues they're asked to express opinions about. Second, we have awareness questions. These help gauge the public salience of topics, how much the public is paying attention to a particular issue, which again hopefully builds on Jeremy's point of where are they getting this information, and how much does it relate to media coverage? And third, you can also think about support for different policies. In the literature this often comes up as support for the use of force, but there are other formulations: for example, do you think your country should join the Treaty on the Prohibition of Nuclear Weapons, or the TPNW, as Nick Ritchie addressed in his talk earlier? And I think these kinds of attitude questions are the most commonly thought-of metrics when you talk about public opinion polling. Right, it's called public opinion polling for a reason, but it's important to remember that there are different metrics we can gauge, and so it's critical not to overstate what each specific question is measuring.

The second consideration to think about when looking at public opinion polls is longitudinal data. So I'm just going to quickly pull up a slide here. Oops. There we are, hopefully that's popping up now. It's really helpful for analysis purposes when polling companies ask the same question repeatedly, because it allows for a fairly robust assessment of changes in public opinion over time. Having this trend data helps us pinpoint fluctuations and allows us to explore why there might be particular spikes or dips in particular attitude metrics. So, for example, again, this is one that comes up in my research, but you'll see that this looks at public threat perception of North Korea over time. The box in red highlights a particular spike when the public found North Korea to be a very serious threat. And if you're aware of the North Korean nuclear timeline, in 2017 North Korea tested an intercontinental ballistic missile in July, and in September it tested a nuclear weapon. So you do see a correlation there, and that's something we can analyse to see these trends over time. Ultimately, this longitudinal data allows us to draw more reliable conclusions about public attitudes and gives a fuller and more comprehensive picture of public opinion. It's a much stronger and better practice than relying on one discrete public opinion poll and asserting that that poll accurately reflects public attitudes.
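As a minimal sketch of the trend analysis described above, assuming Python with pandas, an analyst might flag spikes in a polling time series along these lines; the dates and percentages below are hypothetical placeholders, not figures from the talk or from any actual poll.

    import pandas as pd

    # Hypothetical share of respondents calling North Korea a "very serious"
    # threat; real values would come from repeated poll questions over time.
    polls = pd.DataFrame({
        "date": pd.to_datetime(["2015-06-01", "2016-03-01", "2016-09-01",
                                "2017-03-01", "2017-08-01", "2017-10-01",
                                "2018-04-01", "2019-02-01"]),
        "very_serious_pct": [48, 50, 52, 55, 71, 75, 58, 51],
    }).set_index("date")

    # Flag readings that sit well above the series' running average,
    # one simple way to surface the kind of 2017 spike discussed above.
    baseline = polls["very_serious_pct"].expanding().mean()
    spikes = polls[polls["very_serious_pct"] > baseline + 10]
    print(spikes)

A flagged date is only a starting point; as the talk notes, the analyst still has to look at surrounding events, such as missile or nuclear tests, to explain why the spike occurred.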
Which leads me to my final point, and perhaps most importantly: question wording matters. So actually consider what the words in the question say. For example, here we're just seeing, do you think North Korea presents a serious or not serious threat to the US? That's distinct from asking, do you think North Korea's nuclear weapons programme poses a serious threat to US interests? You generally find that when the word nuclear is mentioned, threat perception increases, which makes intuitive sense. But the point is that phrasing can prime respondents to answer in specific and often biased ways. And second, we should think about what assumptions we are making when we ask these questions. It's critical, again, to Jeremy's point, that we don't assume the public has a fundamental knowledge or understanding of the context and even just the factual notions around nuclear weapons issues. So, for example, asking the public if they support the TPNW is less helpful than providing a description of the TPNW and then asking whether or not the public supports it. However, it's critical to provide objective context, to truly inform the public and gauge their unbiased responses to the information presented, which again gets back to the phrasing and potential priming effects. The large takeaway here is that when we're analysing public opinion polls, we should be very wary of any polls that don't provide objective context or don't provide any context at all. And it's especially problematic if we rely on single data points or polls that continually make these assumptions without either measuring public knowledge or providing this additional contextual information.

OK, so stepping back from the scholarly question of how we think about these polls: what do we actually know about public opinion of nuclear issues? Importantly, it's first worth noting that the public rarely considers nuclear issues the most important problem facing their country; they're much more likely to give attention to domestic issues. You might interpret this as meaning that nuclear weapons are unlikely to be a central voting issue. But nevertheless, we care about public opinion and understanding of nuclear issues for many of the reasons discussed, and I'm happy to go into them further in the Q&A. But what exactly do we know about these opinions? We can divide public opinion on these issues into a few big buckets. These aren't all-encompassing, but they give a sense of the key areas where contemporary polling tends to focus. And, no surprise, I'll look at three of them. So first, nuclear use.
Second, nuclear capabilities; and third, disarmament and the TPNW. So, to the first: a lot of the experimental survey work in the literature, and by this I mean researchers conducting their own surveys to test for very clear causal relationships between public attitudes and specific factors, looks at public attitudes around the use of nuclear weapons. The key scholars here are Scott Sagan and Ben Valentino, who have conducted a series of studies showing that, despite the tradition of nuclear non-use, the US public in particular is willing to support the use of nuclear weapons to preserve American lives and win a war quickly. This attitude is tempered less by the nuclear taboo, another key concept in the nuclear literature, and more so by fear of retribution and of setting a negative precedent. Additional work by Koch and Wells has gone further to show that this attitude is also significantly tempered when respondents are given vivid information about the physical effects of a nuclear attack and the possible risks of nuclear retaliation. So again, this point of educating the public helps them form more informed attitudes and decisions.

Second, thinking about public attitudes around nuclear capabilities. This is probably the biggest bucket and can take many forms, but often it relates to particularly policy-relevant topics. So, for example, US public attitudes on the replacement ICBM programme, which is called the Ground Based Strategic Deterrent, or GBSD for short. This is an increasingly contested topic amongst the expert community and within Congress, as to whether or not the United States should continue to pursue the programme given its costs and the potentially incomplete consideration of other alternatives, but I won't get into that here. The Federation of American Scientists has polled the public on what they think about this, and they found that most support alternatives to the programme and go so far as to support phasing out ICBMs altogether. There's not much polling yet for comparison to this poll, but it suggests that this can be an area for future research. And to think about capabilities from a non-US perspective, we can look at South Korean public attitudes on a national nuclear arsenal. We tend to see relatively high levels of support amongst the South Korean public for developing its own nuclear arsenal.
But again, once you provide additional context, that support is significantly tempered when questions include discussion of the economic and security implications of doing so.

And finally, thinking about public attitudes around disarmament in general and the TPNW in particular, I want to address this in a few different ways, because I know it's a topic that has come up already this morning. One way to think about this is through the polling work of advocacy organisations. So, for example, the International Campaign to Abolish Nuclear Weapons, ICAN, does some original polling on this topic. Their polls show that majorities in NATO states support joining the TPNW, and most even support doing so unilaterally and in spite of US pressure not to. Then we can think about experimental survey work on the TPNW. A lot of this work is done by Rebecca Gibbons and Stephen Herzog, and they show that in the US, elite cues (arguments by people in the government against the treaty) and social cues (you know, my peer is also against the treaty) cause US public support for the TPNW to decrease. However, interestingly, in Japan their work shows that public support for the TPNW remains fairly high and constant when respondents are introduced to arguments by the government against the treaty, as I'm pretty sure Professor Northmoor will touch on in his remarks. And a final way to think about public attitudes on disarmament is analysis of existing public opinion polling data. One of my colleagues at King's College London, Amelia Morgan, has done this in a recent piece where she shows that public support for the abstract concept of disarmament is much higher than support for the TPNW itself.

Thinking about these different ways of interpreting public opinion of disarmament reminds us, to the points outlined above, that it's important not to take polls at face value. In particular, you need to consider the non-public actors involved in shaping, collecting information about, and reporting public opinion. But just before I conclude and quickly discuss those actors, I want to underscore what the findings of these different attitude buckets, as I've called them, collectively tell us, and that's that public opinion of nuclear weapons issues is complex. It's something I discuss often with my colleague Amelia, and something that her piece demonstrates really excellently. It's unfair for us as nuclear weapons experts to expect the public to have clearer and more concise views of nuclear issues than we do. Right.
We've spent all morning hearing contesting perspectives on this. So why should we expect the public to have figured it out any better than we have?

So finally, just to conclude, I want to touch on these non-public actors, kind of a cheeky little term I came up with, but those that can influence and shape public opinion. So I'm thinking of any non-public actors: NGOs, advocacy organisations, the media, subject matter experts, policymakers. Again, let me touch on three considerations here. A lot of academic work focuses on what are called elite cues. Because foreign policy issues aren't proximate to the public, the public instead relies on cues from elites about how to think about these issues. There are exceptions for when this is effective and not effective, but generally speaking, we tend to see that elite cues influence public opinion. Again, this makes intuitive sense, right? If you don't know much about a particular foreign policy issue, you're likely to consider it in similar terms to an elite that you respect, whether that's a New York Times columnist who you turn to or an elected official.

Second, thinking about the media and a potential feedback loop. We actually see that a lot of the regular polling that's done, especially in the US, is by media organisations. On the Roper Center, for instance, which houses a huge collection of public opinion polls, you'll see ABC and The Washington Post, or CBS and The New York Times, collecting these polls. In doing so, there's an inherent feedback loop where questions reflect particularly salient news topics. And it's the same case for your classic polling organisations like Gallup and Pew. So we're seeing that these polling organisations are interested in knowing what the public thinks about the hot topics of the day. And that's partly, I think, why we don't see consistent polling on nuclear weapons issues; it's as episodic as the news coverage of these issues, which again is problematic for us as analysts when we want to track these attitudes and better understand what the public thinks of nuclear issues in non-salient times or in normal circumstances.

And then finally, just to conclude, I think we need to think about these non-public actors' interests and how they might be influencing not only public opinion, but the reporting of public opinion in particular. A lot of these non-public actors are purposely advancing particular perspectives. So, for example, advocacy organisations: that's the whole point, right? They're advocating a particular position. And in promoting those interests, this is typically where we see
the most priming from question phrasing: a question is skewed towards a particular position, and that, for example, could mean a question doesn't provide an objective account of a debate on a particular issue and instead leads respondents in a particular direction, so that the organisation conducting the poll gets the results that they want. This often results in biased reporting, and it becomes a particular issue when these groups are particularly skilled at public messaging, so that reporting often gets misrepresented as a comprehensive review of public opinion on a particular issue. But I'd argue that injecting non-public actors' interests into public polling is a disservice not only to the public but also to policymakers. We should be striving to objectively consider what the public thinks about nuclear issues and to better understand the complexity of those issues, to ensure that our nuclear policies are as representative as possible. And so I'll just go ahead and conclude there and turn it back over to Isabella. Thanks so much again.

Jamie, thank you so much for that, and also for how you helped us think through the nature of polling and its limitations; then, of course, the content, or what is revealed through polling; and then the cautions that, even if we have those poll results, we also have to be very careful and do our due diligence in terms of who's doing the polling and what other kinds of agendas might be influencing what is then often perceived as an empirical result, or a more powerful result than is actually the case if you scratch below the surface and go through the methodology the way that you did. So thank you so much for that.