I am someone who tries to work on ethics from a variety of interdisciplinary perspectives, drawing on a range of different disciplines. In recent years I have focused in particular on the ethics of war and peace and on military ethics. And I'm going to talk tonight about some work in neuroscience and psychology that I think is suggestive. Certainly I'm not prepared to say it's definitive, but I think it may suggest some interesting lines of inquiry, both for scholars and perhaps also for the military, in thinking about ethics, training and so on.

Let me begin with an incident in November 2005 in Haditha, in Anbar province, Iraq, where a convoy of four U.S. Marine vehicles containing Marines and friendly Iraqi soldiers was returning to base after resupplying a checkpoint. The leader, the non-commissioned officer in charge of the group, was Sergeant Frank Wuterich, a 24-year-old Marine who had been an infantry instructor back home. This was his first deployment to Iraq and, in fact, his first deployment in a combat theatre.

Haditha is located in Anbar province, which was a stronghold of Sunni resistance at the time. Marines had suffered heavy casualties in the months before Wuterich's arrival, including six Marines who had been ambushed, tortured and killed, all of which was uploaded to the Internet.

Shortly after 7 a.m., as Wuterich later recalled, an explosion "louder than anything I have ever heard" rocked the entire convoy. An improvised explosive device, an IED, in the road had detonated beneath a vehicle about 35 metres behind Wuterich. The vehicle was blown into the air, killing one Marine and seriously wounding another. Wuterich immediately pulled his vehicle over to the side of the road, and the first thing he saw, about 30 seconds or so later, were five young men standing about ten metres away by the side of a car. They were unarmed. They made no move to advance toward him, nor did they exhibit any hostile behaviour.

Wuterich later described what happened next: "I took a knee in the road and fired. Engaging was the only choice. The threat had to be neutralised and this was the result."

The five men whom Wuterich killed were four college students and a driver they had hired to take them to classes. No weapons were found in the car. Less than a minute had elapsed between the IED explosion and the moment Wuterich opened fire.

Now, the rules of engagement in the theatre required "reasonable certainty that the proposed target is a legitimate military target" in order to use lethal force. That meant that anyone who exhibited hostile action or hostile intent
could be engaged with lethal force. So the question, naturally, is how could Wuterich have regarded these men as meeting this standard?

One natural account of what happened, with which I certainly have some sympathy, is that Wuterich's response reflected a failure of reason to temper powerful emotions of anger, fear and a desire for revenge. Ideally, reason would have slowed him down, would have led him to take more time, to look more closely at the men to make sure they were in fact hostile. But on this account he was overcome by emotion. And I certainly think this account captures some, but not necessarily all, of what happened. I want to suggest in my remarks that recent research from neuroscience and psychology suggests that we can also describe Wuterich's actions as a failure to respond with the right emotions.

Now, to appreciate this, contrast Wuterich's reaction with the reaction of Hugh Thompson, the helicopter pilot who intervened to stop the slaughter at My Lai and to rescue innocent civilians. Thompson responded immediately to what he saw with anger. In Thompson's own words: "I said, Damn it, it ain't going to happen. They ain't going to die. I was hot, I'll tell you that. I was hot." Those who received his radio message that day reported that his voice was choked with emotion. He first swore obscenities and then pleaded with the air crew to come down and help rescue the civilians. Thompson ordered his helicopter crew to open fire on any U.S. soldiers who tried to stop the rescue.

Now, I think we can say Thompson exhibited the right emotion under the circumstances. And research suggests that the right emotion can not only signal the presence of a morally salient feature of a situation; it can also move us to act, that is, to take the appropriate steps in response. In this sense it can be a source of moral motivation. Hugh Thompson's emotional response led him to stop the massacre at considerable personal risk. It guided his moral perception of what was going on and motivated him to take morally praiseworthy action.

For Wuterich, we might say, responding with the right emotion would mean that he responded not only with anger and fear, which were certainly virtually unavoidable, but also perhaps with a visceral, emotional aversion to the prospect of killing innocent civilians. This could have led him to take just another minute or less to seek additional information about the men. Looking more closely, in other words, could have saved their lives.
Responding with the right emotion could also have countered the other emotions that prompted him to perceive the men as a threat and to open fire on them so quickly. In fact, it could have slowed his response just enough to allow reason to begin to gain more influence. Indeed, without this immediate, non-conscious emotion, there might well have been little prospect that reason could gain any influence with Wuterich.

So saying that Wuterich failed to respond with the right emotion is consistent with research suggesting that what we traditionally call emotional processing can play a role in some types of moral perception and judgement. This work is based on a substantial body of research that offers an account, more generally, of emotion and cognition in which neural structures associated with emotion can play a crucial role in the activation of sensibilities that we tend, at least in common parlance, to think of as moral. It builds on research indicating that our ability, in general, to make good decisions in a practical sense, decisions that further our well-being, is dependent in many ways on activity in portions of the brain associated with emotional processing.

Now, I should say I don't like this next slide very much, but I'm going to use it because it does provide some simplification. A neural structure that seems to play a key role in this process is the ventromedial prefrontal cortex, and in particular its orbitofrontal portion. I don't like this slide for many reasons, but one is that it suggests two discrete regions as opposed to neural networks. Still, just to orient you: the dorsolateral prefrontal cortex is typically associated with more deliberate, abstract reasoning. Research suggests that individuals with damage to the ventromedial prefrontal cortex generally retain abstract reasoning capabilities, but they lack the ability to gauge the prospective, practical consequences of different courses of action. They make really bad decisions.

Antonio Damasio suggests that this reflects the fact that normal decision making is guided by what he calls emotional or somatic responses that signal the affective valence of potential future states. These operate, he argues, to bias an individual toward or away from potential responses to a situation based on the anticipated emotional consequences of choosing one course of action or another. As he puts it, this occurs "even before the subject becomes aware of the goodness or badness of the choice she is about to make."
So for a patient with impaired emotional processing, as he and a co-author put it, knowledge without emotional signalling leads to a dissociation between what one knows and how one decides to act.

Now, the emotional signal involved in this process, Damasio and others suggest, is the product of a complex process in which, described very generally and certainly in oversimplified terms, neural structures in different areas send inputs to the ventromedial prefrontal cortex that express positive and negative affective values associated with emotionally salient features of the situations an individual confronts. One structure, for instance, processes information about others' anticipated emotional states; another ascribes beliefs and intentions to others. These affective values reflect perception of the anticipated presence of rewards or penalties, if you will, from different courses of action. In other words, the anticipated emotional response, researchers argue, is what guides behaviour in many instances.

This reflects an ongoing process of learning from experience in which positive and negative affective valences become associated with certain objects or situations or activities. At the same time, these valences are only probabilistic, since the presence of these rewards or penalties is only predicted by what we associate with different people, different kinds of situations, different objects. These probabilities are adjusted on an ongoing basis as we gain more experience. So there is a comparison of the anticipated and the actual emotional consequences of certain kinds of behaviours or choices, and then there is, in essence, feedback: a learning process goes on. There is also adjustment of the intensity of the emotional states associated with different outcomes, again comparing anticipated with actual emotional experiences. And research suggests that the ventromedial prefrontal cortex computes these values and converts them into a signal that represents a sort of all-things-considered positive or negative affective valence associated with choosing different courses of action.

All of this occurs without what we think of as deliberate, logical reasoning; that is, it occurs non-consciously, and it occurs very rapidly. That's not to say that neural structures associated with deliberate reasoning play no part in the process. Structures such as the dorsolateral prefrontal cortex, for instance, appear to modulate the stimuli received by the ventromedial prefrontal cortex and thus the signal that it generates.
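To make the shape of that anticipate, act, compare and adjust loop a little more concrete, here is a minimal toy sketch in Python. It is only my own illustration of the cycle just described, not a model proposed by Damasio or the other researchers, and the option names, numbers and learning rate are invented for the example.

```python
# Toy sketch of the loop described above (my own illustration, not Damasio's model).
# An agent keeps an anticipated affective valence (a "somatic marker") for each
# course of action, lets it bias choice before any deliberate reasoning, and then
# nudges the anticipation toward the emotion actually experienced.

ALPHA = 0.3  # learning rate: how strongly the actual outcome revises the anticipation

# Anticipated valence of each option: -1 is strongly aversive, +1 strongly attractive.
anticipated = {"approach": 0.2, "avoid": 0.0}

def choose(options):
    """Bias choice toward the option with the highest anticipated valence."""
    return max(options, key=lambda option: anticipated[option])

def update(option, experienced_valence):
    """Compare the anticipated with the actual emotional consequence and adjust.

    The prediction error (actual minus anticipated) drives the adjustment, which
    is why the stored anticipations stay probabilistic and keep being revised.
    """
    error = experienced_valence - anticipated[option]
    anticipated[option] += ALPHA * error
    return error

# One round of experience: approaching turned out badly, so the anticipated
# valence of "approach" is pulled downward for next time.
picked = choose(["approach", "avoid"])
error = update(picked, experienced_valence=-0.8)
print(picked, round(anticipated[picked], 3), round(error, 3))
```

After one bad experience the anticipated valence of "approach" drops from 0.2 to -0.1, so the bias tilts toward "avoid" the next time; the point of the sketch is only that the anticipation, not deliberate reasoning, does the steering in this loop.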
But the research at least suggests that those modulating inputs don't so much override emotion as modulate emotional processing.

The philosopher Jim Woodward suggests that good, practical decision making is the product of what we traditionally think of as both emotion and logical deliberation, but he also suggests that this distinction itself may be misleading. What he suggests is that if by "emotional" we mean structures that are involved in evaluation, that is, in appraisal of what ought to be the case, and in motivation, then structures like the ventromedial prefrontal cortex are indeed involved in emotional processing. But this does not mean that they are non-representational, impervious to influence from learning or from cognitive structures, or passive in the sense of not being involved in the modulation and control of other neural structures. Processing in these structures, he argues, seems to involve an intimate mixture of elements we associate with information processing and elements we associate with affect. And so we think of this as emotional processing. Woodward suggests that this greatly enhances behavioural flexibility and the modification of behaviour as a result of learning, and that it enables humans to move beyond fixed input-output patterns of response.

So emotional states, research suggests, are not simply brute, instinctive, unmediated sentiments, but are the product of neural processing, processing that actually features a type of representation, calculation and learning regarded as typical of more deliberate cognitive processes.

Now, what might this have to do with moral decision making? Here, I think, things are more speculative, but I think it is worth pursuing the possibility that this general sort of emotional processing may play at least some role in some kinds of decision making that we think of as moral. I'm certainly not positing that there is a single system within the brain engaged in what we think of as moral processing; there may well be a multiplicity of different types of moral judgement. Walter Sinnott-Armstrong, who was here, argues that the whole notion of morality is not really a unified concept, that there is a series of disaggregated and maybe more localised assessments, and I tend to agree with that. But in any event, one important aspect of morality, using the term in a general and perhaps colloquial way, is that it is concerned with guiding our conduct in our interactions with others.
That doesn't exhaust all the morally salient aspects of the world, but it is certainly an important dimension of it. These interactions can be an especially significant source of reward or pain, and they often present us with situations that involve a considerable amount of uncertainty. Navigating this complex and fluid social terrain requires probabilistic judgements about others' intentions, their values, their emotional states, their likely responses to our actions, and other considerations that are relevant to determining how to act in a given situation. This generally requires quite rapid recognition and assessment of subtle social cues, for which non-conscious emotional processing is generally more suitable than deliberative cognitive processing. It enables us very quickly to read others, and we've all had this experience, to anticipate the likely positive or negative emotional or affective consequences of responding in different ways in a social situation. And at least some research suggests that the ventromedial prefrontal cortex plays an important role in processing these complex social emotions, such as guilt, embarrassment, empathy and resentment, and more generally in recognising emotions in others.

Research also suggests that at least some moral learning relies on this process of reflecting on the emotional states we experience as a result of our behaviour toward others. Perception that is attuned to others' experience, guided by an appreciation of their mental states, can lead us to identify pain or distress, while empathy can generate negative emotional signals that lead us to want to relieve it. Ideally, this process of moral learning begins early in life. A child may hit another child, whereupon an adult may say, "How would you like it if he did that to you?" This ideally induces an empathic response in the first child, an aversion to what she has done, and moral learning over time consists of the association of that aversive emotion with the transgressive behaviour that elicited it. That association is communicated in the future through an affective signal from the ventromedial prefrontal cortex that biases the individual away from such behaviour.
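To picture that association in the same toy terms as before, here is another brief sketch, again only my own illustration with invented numbers rather than anything drawn from the research itself: an aversive emotion felt after a transgression is blended into a stored valence for that kind of behaviour, and the stored valence later acts like the biasing signal just described.

```python
# Toy continuation of the earlier sketch (my own illustration, invented numbers):
# the aversive emotion elicited by a transgression becomes associated with that
# kind of behaviour, and the stored association later biases against it.

learned_valence = {}  # affective valence associated with each kind of behaviour

def associate(behaviour, felt_valence, rate=0.5):
    """Blend the emotion actually felt into the stored association for this behaviour."""
    prior = learned_valence.get(behaviour, 0.0)
    learned_valence[behaviour] = prior + rate * (felt_valence - prior)

def biased_away_from(behaviour, threshold=-0.2):
    """Report whether the stored valence would bias the individual away from the behaviour."""
    return learned_valence.get(behaviour, 0.0) < threshold

associate("hitting", felt_valence=-0.9)  # empathic distress after hitting another child
associate("hitting", felt_valence=-0.7)  # repeated experience strengthens the association
print(round(learned_valence["hitting"], 3), biased_away_from("hitting"))
```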
People also learn from moral instruction, often in the form of stories and illustrations, in which they learn to have adverse or favourable reactions to various sorts of behaviour without actually engaging in that behaviour, through a process of imaginative experiencing.

Support for at least some relationship between social and moral learning comes from the fact that the neural areas activated when subjects have at least certain kinds of moral intuitions, in response to complex, multifaceted moral dilemmas, seem to be just the areas active in aspects of social cognition involving automatic, affect-laden processing. And considerable research suggests that when these areas are damaged, there is an adverse impact on the ability to engage effectively in what we think of as moral reasoning, at least of the type presented in those scenarios.

So one implication of this, and Jim Woodward among others has suggested it, is that there may well not be neural structures that are dedicated, if you will, to moral cognition, if you want to call it that. Woodward argues, for instance, that we engage in certain kinds of neural processing in situations that involve other people, and that some types of judgement that we think of as moral may be a subset of this more basic sort of neural processing, the processing involved in mapping, reading and responding to the social world. That's at least one possible thesis.

In any event, this potential role of emotional processing also suggests that enduring, strong affective valences associated with morally salient features of situations may be especially likely to bridge the gap between moral decision making and moral behaviour. That is, it's well known that there is a difference between engaging in deliberation and coming to a conclusion about what is morally appropriate, on the one hand, and actually acting on that conclusion, on the other. And some suggest that emotion may enhance our ability to behave in ways that we have concluded are moral, if those conclusions are the product of aversion to, or attraction to, particular courses of action. One source of aversion or attraction, for instance, may be the role of emotional processing in simulating the mental and emotional states of others. Since, as one scholar suggests, the simulation works by our actually undergoing aspects of the processing underlying the mental state we are detecting, this affective process may help provide the motivation to act on the moral conclusions that we reach.
So there's a bit of a hint of Hume here, although there are certainly some significant differences with Hume.

In any event, research suggests that individuals, in at least some cases, may engage in rapid and non-conscious decision making about situations that have some dimensions we think of as moral in some way, and that neural structures associated with emotional processing may play a role both in what I would call moral perception and in the moral assessment of different courses of action, and indeed in moving us to act in accordance with that assessment.

So what I want to do now is go back to Sergeant Wuterich at Haditha and at least think about how we might reconstruct what he did in the terms that I've just described. My hope here, obviously, is not to provide anything like a definitive reconstruction, but at least to see whether it is plausible to think of what he did in these terms, and what the implications of that might be. So I want to think about his perceptions and his implicit deliberation one minute before the IED went off, and one minute after the IED went off.

Work on non-conscious decision making suggests that in any situation our perception is shaped very much by our concerns, our goals, our aims in that setting. This narrows our focus to cues in the environment that we regard as relevant to those interests, which we then in turn organise into larger patterns that define, if you will, the situation for us. These patterns are infused with emotional significance, which is transmitted, as I mentioned earlier, through these various neural systems, among other places, to the ventromedial prefrontal cortex, which engages in this emotional processing. Accept that as a very broad and very general sort of description.

So one minute before the explosion, the salient goal in Wuterich's mind was to travel safely back to home base while avoiding the most likely threats, which were IEDs and ambushes. His main orientation, we could say, was vigilance, focused on the cues in the environment most likely to indicate possible danger. From an emotional standpoint, this orientation of vigilance was prompted naturally by strong anticipated negative emotions associated with the prospect of being attacked. Obviously, we can see these emotions as reflecting a basic desire for self-preservation.

What would he have regarded as morally salient features of the situation as he saw it?
Well, at least one important element of the perception that a situation has morally salient features is, as James Rest has put it, the awareness that something one might do or is doing can affect the welfare of someone else, either directly or indirectly. So if we focus on that aspect of morality and think about what might have been morally salient in Wuterich's universe, if you will, a minute before the explosion, one group of people who would obviously be affected by what he did were the men in his squad. As leader of the squad, he had a moral responsibility to bring them back safely. The prospect of failing to do so would have elicited anticipated emotional distress. In this respect, his vigilance in identifying possible threats had a moral dimension; it wasn't purely self-preservation. I'll leave the relation between self-preservation and morality aside for the moment.

Another group of people who would be affected by Wuterich's actions was, of course, the local population. He had a moral responsibility not to harm those who posed no danger to his squad. And in distinguishing those who could and could not be harmed, he was guided by the rules of engagement, which said that someone had to exhibit hostile action or hostile intent in order to be harmed. Now, ideally, Wuterich would have taken care to distinguish innocent from hostile persons because of the anticipated emotional distress at the prospect of killing an innocent person. Aversion to this distress would be based on, I think, normal moral sensibilities, and there is no indication that he was atypical in any sense, and on military training that reinforced those sensibilities.

So in this posture of vigilance, the prospect of opening fire on five unarmed men standing by a car likely would have triggered anticipated emotional distress. As he was travelling back to home base, not being under attack, simply scanning the environment, had he seen five men standing by a car without weapons and with no overt exhibition of hostility, it's unlikely he would have done what he later did. He probably would have taken more time to look more closely for any indication of whether they had hostile intent.

So Wuterich's vigilance that morning prompted him to engage in tactical manoeuvres to minimise danger. He changed the normal route back to the base, and as the convoy turned onto the road where the explosion later occurred, Wuterich also moved his vehicle to the left side of the road. And then the attack occurred. The IED blew up. The vehicle behind him was thrown into the air.
One person was killed, and the men in the squad organised these situational cues into a pattern that defined the situation as one in which they were under attack. Marines have standard procedures to follow after an IED attack. As one said, "We call those remedial actions: what would happen if there is a close ambush, a far ambush, an IED. We run through these every single briefing so it becomes memorised in your head, so that if it does happen, you can react without thinking." For an IED, it's always: get out of the kill zone, cordon off the area, look for a trigger man. And Wuterich said later, "At this point, I realised my mission had changed." So prior to the explosion he is trying to get back to the base safely, in this orientation of vigilance; now he is under attack. "We had practised this scenario before on whiteboards, in classrooms, in front of superiors, subordinates, peers. My training would take over from here."

So Wuterich now is in a new situation whose most relevant features are that one of his men has been killed and that his squad is under attack. Emotionally, the first feature of the situation likely triggers anger and perhaps a desire for revenge. The second largely triggers fear, an emotional aversion to the prospect of not bringing his men back safely to the base, not to mention himself. So he focuses on locating the IED trigger man and identifying whoever is attacking. There were some reports of small arms fire after the IED, which was a common tactic at the time. And the cues he is scanning for, cues about who may be attacking, have moral salience for the reason I suggested earlier: someone out there has killed one of his men and may be trying to kill him and his squad.

But at the same time, the responsibility not to harm innocent civilians remains a morally salient feature of the situation. He also should be experiencing some emotional aversion to the prospect of violating this responsibility. This should lead him to engage in accurate, positive identification of hostile action or hostile intent.

So right after the IED goes off, Wuterich pulls to the side. He sees the five Iraqi men. He said, "The first thing I noticed outside my vehicle was a white four-door sedan to the southwest." He said, "So my immediate thought is, okay, maybe this was a car bomb. Okay, maybe these guys had something to do with the IED." So Wuterich regards the presence of the men itself as a possible cue of hostile action and intent. The IED and the men themselves are associated very closely in his mind.
He then says there were "military age males" inside that car; the only thing that was Iraqi was them. Notice the term "military age males." He didn't say there were five young men, or five people of college age; these are "military age males." He regards this as an additional cue that inclines him to interpret them as hostile. Then one of his colleagues, Sergeant Dela Cruz, was yelling at them in broken Arabic to get down on the ground. Wuterich said they were not complying; in fact, they were starting to run in the opposite direction, away from where Corporal Dela Cruz was approaching them. Wuterich regards this as the final cue necessary to positively identify the men as hostile. He then opens fire and kills all of them, and this, again, in the span of probably less than a minute.

So I think it's plausible to suggest that Wuterich's sense of what was morally salient in this situation was dominated by his desire to find who had triggered the IED and to protect himself and his men. In this process of what we might call interpretation, his reading of the cues was not sufficiently influenced by anticipated emotional distress at the prospect of killing innocent civilians. He did not, in other words, experience an emotion strong enough to slow the rapid information processing he was undergoing and the direction in which it led him, which was governed by emotions of anger and fear.

What could have interrupted this? Well, you might say, had he just composed himself and taken more time to deliberate rationally, this would have led him to realise that what he was seeing couldn't, on its face, suffice to justify firing on the men; he would need to look more closely. But he was unlikely, under these circumstances, to draw at least fully on this faculty of rational deliberation. I would argue that what was more likely to interrupt his rapid information processing and interpretation was an emotional response: again, an anticipated emotional aversion to killing innocent civilians that would have tempered the other emotions that were leading him in a certain direction. Had that anticipated response involved a greater emotional aversion, it could have led him to take just another minute or less to seek additional information about the men. Had anyone seen them before the explosion? In fact, someone in the convoy had seen them and had waved them over to the side of the road; that's how they ended up where they were.
Instead of relying on his colleague's broken Arabic, he could have turned to the Iraqi soldiers in his unit to communicate with the men. He could have taken a few seconds to consider that anyone who had triggered an IED would likely be moving away from the convoy, not approaching it. So responding with the right emotion, based again on this anticipated emotional distress, would have slowed his response enough to allow reason to gain more influence. Indeed, I think without this immediate, non-conscious emotion, there might be little prospect that reason actually could have much influence. So what was necessary was for him somehow to retain a strong enough visceral, emotional aversion to the prospect of killing innocent civilians when the situation changed from one of vigilance to one of being under attack.

So let me conclude with what implications there might be for military ethics training. Certainly Wuterich was familiar with the Geneva Conventions, and he had been briefed on the rules of engagement in the theatre. What could have strengthened the visceral, emotional aversion that I describe, the aversion that might have led him not to open fire on the men?

Research suggests that at least two mechanisms might trigger a morally appropriate emotional aversion. One is what's called outcome aversion, which is contrasted with action aversion; action aversion takes the perspective of the perpetrator. Let me give you an example. Researchers illustrate this by using the example of one person hearing about another person slicing someone's throat. The outcome aversion model maintains that the emotional response to hearing about this scenario is grounded in empathy for the victim and the distress caused by imagining the victim's suffering. Naturally, empathy plays a significant role in that process. In this way, as David Pizarro has suggested, empathic arousal operates as a first alert signalling moral relevance.

So thinking back to Wuterich, what could military training have done, other than make sure he knew the Geneva Conventions and the rules of engagement? Well, it's not uncommon for service members in situations such as Iraq to feel indifference or even resentment toward the local population: the sense that they're not cooperating, the sense that maybe they are in fact assisting the enemy. And I've heard people talk about the importance of squad leaders and unit leaders at different levels educating
their forces about the lives of the people who are caught up in these conflicts. These people don't represent the standard sort of fully autonomous chooser that we tend to think of ourselves as being; they are enmeshed in a very complex web of relationships that may subject them to danger from a variety of different sources. To the extent, I have heard at least, that one appreciates this, it may trigger at least some degree of empathy, and therefore a greater emotional aversion to killing someone who is in fact innocent.

Action aversion involves an emotional response that puts oneself in the shoes of the perpetrator rather than the victim. So in the example, how would it feel to slice someone's throat? Research suggests that the roots of the resulting distress may be empathic, but that over time it may be internalised, so that emotional distress occurs even with the knowledge that no one is being hurt. So someone hears about a throat being slashed with a rubber knife, or faces the prospect of firing a toy gun at someone, and it can elicit emotional distress even knowing that no one is going to be harmed, because one has internalised the question: am I the sort of person who does this? That's what's known as action aversion. And for action aversion, one way of harnessing what I would call the appropriate emotional response could be to foster a greater sense of professional duty that focuses on protecting innocent people: a sense of professional identity in which there is distress at the prospect of violating this professional duty to protect the innocent. That is, this is just the sort of thing we don't do; we're here to protect these people. You could say that killing the innocent could undermine the mission, and that's a more pragmatic sort of argument, but the deeper point is: that's not who we are, that's not what we do, and we take steps to avoid it.

So, as I say, this is all, to me, certainly quite suggestive. You'll see at the bottom there a reference to Aristotle, and that is even more speculative and even more suggestive. But it does seem to me that to the extent we focus on trying to inculcate the propensity to respond with the right sort of emotion, there is at least some resemblance to Aristotle's notion of habituation. I can't go very far with this, because I certainly haven't thought deeply about it.
But it does seem to me that cultivating the right kinds of emotion bears some resemblance to what he was talking about. So, again, I'm really more or less at the beginning of this research project. I think there are some interesting avenues of inquiry, and I think there are also probably some challenges. But thank you for your attention, and I welcome your comments and questions.