So I'm very, very delighted to welcome Professor Catherine Pope here, who's going to give this talk that's really hotly anticipated. And we are very excited to learn that she is joining our Department of Primary Care Health Sciences as of Monday, to do great things in medical sociology. So no pressure.

Thank you. Okay. So the disclaimers I have are that I'm from South London, so I talk quite fast, and sometimes I get animated and I often walk up and down. I might even walk that way, which can be a bit intimidating for people. So the main thing is, if you really don't understand something or I'm going too fast, you can put your hand up and stop me, and I'll try to slow down and be a bit less South London. And then the other thing is that I am aware this is being recorded, and I can get a bit passionate and sweary sometimes, but I'm mindful that I'm being recorded, and mindful of my new fanbase at the back, so I'm going to try very hard to just contain that a bit. I might have to have you watch me and signal if I suddenly launch into bad language. So it's all going to be fine then? Okay.

So I often start these talks with just a bit about who I am, mainly because, and I don't know if she's still there, but there used to be an opera singer called Catherine Pope and also a famous skateboarder called Catherine Pope, and I just don't want to disappoint you if you thought you were coming to see either of them. Okay. So I am the Catherine Pope who was involved in writing various things in the BMJ about qualitative research some time ago, and a few little books about that, and I'll be drawing on some of the knowledge from the work that went into those papers tonight.

And can I just see a show of hands here from the students on the qualitative introduction course, just so that I know where you are. Right. Okay. You've very cleverly spread yourselves around. Okay.

So one of the things that I said I would like to do in this lecture was to talk about what it's really like, which is why there was that slightly sexy title of the secret diary, because you're being taught all of the proper stuff by the proper people about how it's meant to be done and what you do. And I'm going to tell you some of the inside stories of what it's been like for me doing the kinds of qualitative ethnography and case study work that I've done over the last few years. And I want to be open and honest about that so that you can see it, warts and all. And I hope that will be useful to you in terms of thinking about how you might use qualitative research.
And I can see that there are some other very experienced qualitative researchers in the audience who will also no doubt have their own stories to tell about some of these things. And you never know, we may disagree about some views, which is always good for an evening lecture, I think. You know, a bit of controversy anyway.

So I want to start with the famous Erving Goffman, because his work inspired a lot of what I have done, in terms of both reading and doing ethnography. And he talked, in this lecture that subsequently got written up, about ethnography being a way of doing observation in settings where you subject yourself, your own body and your own personality and your own social situation, to the set of contingencies that play upon a set of individuals, so that you can physically and ecologically penetrate their circle of response to their social situation, or their work situation, or their ethnic situation, or whatever. So you are close to them while they are responding to what life does to them. So that's the high point: this is what you should be doing when you're doing ethnographic observation and qualitative observation of a setting.

And then another guru in the field, Martyn Hammersley, who's written quite a lot around thinking about ethnography as a method, points out that actually a lot of us do not live up to Erving's behest. We do not actually live with the people we study. We don't reside in the same place, spending time with them most of the day, most days of the week, month in, month out. Instead, many sociological ethnographers, and I would add health services ethnographers, focus on what happens in a particular work locale or social institution when it is in operation, so that in this sense their participant observation is part time. And I think that's a really important criticism of what a lot of us do, and it's something that we need to think about if we're going to take our work forward.

So one of my fascinations, partly because I drink too much coffee, is work at night in health care settings. And I'm fascinated by the fact that so much research goes on during the day, particularly between nine and five. And obviously there's lots that goes on between nine and five in a hospital or in a clinic. But there's a lot of fascinating stuff that also goes on at 2:00 in the morning, and sometimes we don't bother to go and look at those sorts of things. So I think these things are a reminder that we might need to think a little bit more about how we engage with these methods.
So what I want to do is share with you some observations about the kinds of things you see when you start to look. For those of you that are sitting there thinking, "I don't even know what ethnography is": basically it's hanging around, observing and talking to people and finding out what's going on, and it reveals things that you won't get if you simply send people a survey. Or actually, even if you just ask them things, you won't get as much as you will if you observe and start looking at the situations.

So one of the projects I did was an ambulance handover project, and when we set out, I think we had a very naive view that there was this thing called handover, which was, and I'm illustrating now for my friends at the back: if you imagine I'm an ambulance, the ambulance whooshes up to the door of the hospital (that's me being an ambulance), and it hands over a patient; you take them out of the back of the ambulance and you hand them over to some doctors and nurses, and then they go into the hospital. And we thought that this was going to be really interesting and we would study it. And we found out lots of other things about ambulance handovers, and I'll tell you some more of those insights that we got from doing observation. But one of the observations was handover before you've even left the compound and the parking space.

So this is an ambulance crew member who's checking the rucksacks the ambulance crews carry, and inside the rucksack are lots of little sealed bags with little tags on, so that you know they haven't been opened, because then you know they've got the right amount of whatever it is that you need in there. And the person that I'm working with is very experienced, and seems to know what should be in each of the bags and what the contents are. And some of them have been signed by a packer, and he talks to me about this, that he trusts these bags have got the right contents. And then there's another crew member there who says that the things that are required are there. And the crew member goes to the cupboard and collects some of the things that are missing, to replenish the bag. And it occurred to me that there were lots of different levels of trust in this very simple operation of picking up the rucksack and deciding whether to take it. And when we started looking at this bit of the handover chain of events, what we noticed was that there were some crews who trusted that the rucksack would be okay if so-and-so said, "Yeah, I didn't use anything, don't worry, that's fine."
Others would meticulously go through it and check everything, because they didn't trust the person that handed the rucksack over to them. So handover happened before we even got to a patient: there was this handover of the vehicle and the rucksack and all of the equipment. So that was an interesting thing that I hadn't even thought about when we started the project.

And the other thing that the handover project showed us was the level of complexity in ambulance handovers. As I just described to you, we thought handover was this thing that happened over here at the door of the hospital: the crew say, "Here's your patient, please take them away and save their lives." And actually, what happens is a whole series of complex bits of handover. And many of those are predicated on bits of technology that do or do not work, which is a thing I've thrown in just for Trish.

So one fascinating thing was that there are various digital communication systems on the ambulance. And when they don't work, everybody reaches in their pocket and pulls out their mobile phone, which they're not meant to have with them at work. But they do, because they know they need it for when the digital stuff fails, and bits of information may be handed over on the mobile phone. There's also some fascinating equipment that's meant to give heart traces, so that you can do fantastic stuff about finding out whether someone's having a heart attack and relay that information to the hospital ahead of your arrival. And very often the battery doesn't work, so that bit of handover doesn't work. But there are really interesting pathways of landline, mobile telephone, radio and computer data communication, and many of these things happen way before we get to the door of the hospital.

So hanging around on the ambulances, you see all of that. If you go and talk to the managers of the ambulance service or the hospital, they may only talk about the bit at the door of the hospital, because for them that's the important bit. But these bits may be important if you're thinking about information decay, or some of the issues about team working and communication and patient care, you know, how people care compassionately for patients and families.

Another useful thing that I've taken to doing when I'm doing observation, and obviously you have to get ethical permission for this, particularly if it includes pictures of people, is to understand the context in which interventions are taking place. A lot of the work that we do in healthcare is about introducing changes, new practices, and quality improvement into health care settings.
And one of the things that fascinated me, on a project that failed as an intervention, was how crowded the intervention setting was with numerous different things that staff were expected to engage with as processes of change. So these are just some of the posters that I noticed when I was in a setting where we were trying to do our intervention. It was very difficult for the staff to see our intervention amongst all of the other things that were going on. So there were things there about changes in how you were going to do the recycling, in amongst things that were about patient safety. And if you went around the hospital that this particular study happened in, all the stairwells had multiple posters up about different things: studies that were going on, research, changes in policies, and various things. And basically everybody was just walking past everything, because there was too much noise and too much trying to grab their attention.

And this was the study that failed, which was an attempt to use patient health checklists to drive up the quality of communication in the emergency department. The idea was, as Atul Gawande says, that checklists are a great idea because they help people systematically make sure they do all the right things. Have any of you seen that fantastic film about the pilot that lands the plane on the Hudson? He does the engine checklist despite the fact that he knows the engines have failed. Both engines have failed, but he does the checklist. And later, apparently, he said that it was because it was part of his process of landing the plane. You do it as a routine, and it's so inculcated that you do it even when everything is telling you that the engines aren't working. You still go through the checklist which asks, are all your engines working? It's like, no, I know that one already. But he did the checklist.

So we had this idea to have this fantastic checklist. And as part of the study, rather than just introduce the checklist and then merely interview the staff and patients about how it was going, I hung around to see what they did. And one of the things that they did was they piled up stuff on top of the little box with the checklists in, which kind of deters you from handing them out to the patients. And that was what I thought was the low point of that study, and it's a really good example of why it might have failed: because if you keep covering the box up with blood pressure cuffs, you're never going to hand any out.
But the sequel was even more amazing, because everything got tidied away so that no one could find the box with the stuff in; it was so tidy we couldn't find it. And I spent a lot of time like a headless chicken, running around the emergency department trying to find the box with the checklists in.

The other thing is, I think that is one of the pen boxes there. If you ever do a study in the emergency department, and I'm going to take a guess it may apply in other clinical areas, if you need people to do things that involve pens, I suggest you buy ten times as many pens as you think you will need. Because everybody that comes along doesn't take the checklist, but they will take the free pen, and you're just constantly watching people going off with pens. So one of my plans, when I'm rich and famous, is to just keep supplying pens for the NHS, because clearly you can never have enough pens in the paperless world of the NHS. Anyway, so that's one of the stories about hidden interventions.

Another study we did was about safety rules in operating theatres, where people are meant to have a systemised checklist, again, for running surgery. And one of the things that you do when you're operating is that you should make sure that everybody is aware of who the patient is and what the procedure is, and some crucial bits of information, and some of that is ideally meant to be presented on a whiteboard in the operating theatre. Seems like a sensible idea, except that when I was standing in the operating theatre (I'm five foot two, so I am a bit challenged sometimes when it comes to reaching high things in supermarkets), I noticed that this particular theatre had a whiteboard that was so high up that even the really tall surgeon couldn't possibly have reached to fill in the bit that said the patient's name and some other details. So they'd got the whiteboard, but they'd made it impossible for people to use, which then helps you understand why certain bits of the checklist may not be being followed. But again, if I hadn't seen it, I wouldn't have understood that that was what was happening in that environment.

And then another study we did was of NHS 111 and 999 call centres. When you phone 999 for an ambulance, or you phone NHS 111 for urgent care, your call goes through to a non-clinically-trained call handler, who has a piece of computer software, an algorithm, and they work through that software to decide what kind of care you need.
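For readers who want a concrete picture of what "working through an algorithm" means here: triage software of this kind can be thought of as a scripted decision tree that the call handler walks through question by question until the software returns a "disposition". What follows is only a minimal sketch of that pattern; the questions, answers and dispositions are invented for the example, and are not taken from any real NHS system.

```python
from dataclasses import dataclass
from typing import Iterable, Union

Disposition = str  # a final outcome, e.g. "dispatch ambulance"

@dataclass
class Question:
    text: str
    on_yes: Union["Question", Disposition]  # next step if the caller answers yes
    on_no: Union["Question", Disposition]   # next step if the caller answers no

def run_triage(root: Question, answers: Iterable[bool]) -> Disposition:
    """Walk the decision tree using the caller's yes/no answers in order."""
    node: Union[Question, Disposition] = root
    for answer in answers:
        node = node.on_yes if answer else node.on_no
        if isinstance(node, str):  # reached a disposition, so stop asking
            return node
    # Script exhausted without a clear outcome: escalate to a clinician,
    # mirroring the pattern described in the talk.
    return "speak to the doctor"

# Invented example script (not real triage content):
script = Question(
    "Is the caller reporting chest pain right now?",
    on_yes="dispatch ambulance",
    on_no=Question(
        "Has the caller fainted in the last hour?",
        on_yes="speak to the doctor",
        on_no="self-care advice",
    ),
)

print(run_triage(script, [False, True]))  # prints: speak to the doctor
```

The point of the structure is that every decision is encoded in the script, which is what makes it usable, at least in principle, by a non-clinical call handler.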
And when I first started studying this software, I was told that what they wanted was to make it possible for a very low-skilled person to use, and to eliminate the need for a clinician. And over the ten years that I've been studying it, more and more clinicians have been brought in to help make this software run and make this process run, which is often a fascinating aspect of how we introduce technologies that are meant to substitute, and it turns out they don't substitute. And one of the things that I managed to notice, while I was watching how this call handling software was used, was the way that the clinical staff kind of hover in the room, and they're just listening out, or they're looking to see if there's a call handler over there who's looking a bit agitated or confused. Sometimes the call handlers put their hands up and want the clinicians to come over. And this quote here is from one of the clinicians who was actually standing beside the call handler, listening in, and offering assistance about the call to get it to a disposition that was "speak to the doctor". So it's almost like she's trying to listen to one side of a telephone conversation and then chip in little bits of advice to help finesse the processing of that call. So these are the kinds of things that you get when you start looking at organisational settings and groups of people and the sorts of things that they do.

One of the other things that you get when you hang around is what sociologists and anthropologists call atrocity stories. These are stories that you sometimes get when you interview people. But I find that there's something about a formal interview, when you're sat in an office recording an interview, where people feel that they have to tell you the publicly acceptable versions of what goes on in their profession. What you get when people are actually engaged in their practice, and you're watching them practising, is that they often talk about the terrible things that happened, or things they have heard about, as examples of why they are behaving in particular sorts of ways.

So one of the anaesthetists, on a study that we did of anaesthetic practice, talked about a particular case where an anaesthetist gave the anaesthetic without warning the patient, and the patient panicked. And this anaesthetist said: "I felt uneasy then because the patient sat bolt upright and started grabbing hold of her throat. And I felt it, because I hadn't warned the patient; I thought the anaesthetist was going to do it."
"The patient was scared stiff. And if that was me, I would have quite a phobia about coming into theatres now."

And that alerted us to a whole set of practices which was seen as completely inconsequential by anaesthetists, and which we wrote about subsequently as induction routines: little snippets of talk that anaesthetists do with patients as they're inducing anaesthesia. And I'm sure that some of you in this room may have had anaesthesia, but basically anaesthetists tell you lies as they're anaesthetising you. They say things like "this will feel cold", "this will feel like gin and tonic", "you'll feel a bit like you've had a few drinks", and they have all these little turns of phrase that they use. And what they are designed to do is stop you doing what that patient did there. Because, as I understand it, when anaesthetic agents are introduced, what you often feel is pain, in the sense that something is being introduced into your arm, and it feels a sensation that your natural instinct is to get away from: ideally to sit up, pull away the lines that are introducing the agent that is causing pain, and get away. And if you think about it, many people having surgery will be nervous anyway, and perhaps a bit agitated before they arrived, notwithstanding having had premeds. So all of this talk is designed to calm people down and allow them to normalise the process, and it's not taught formally; you learn it by osmosis. So the apprenticeship system of training anaesthetists and surgeons allows them to learn that there are things that they need to say, and then everybody picks up their own little patter that they use for doing anaesthetic induction. And again, I don't think any of the anaesthetists would have told us about that behaviour, because it wasn't important to them, but we picked it up because of an atrocity story.

And similarly, there was another one in the safety rules project. There's a lot of work around the safety rules and the checklists around the introduction of particular drugs and agents as part of various processes. And there are particular things that have huge risks associated with them, and one of them is the use of heparin, mentioned here. And this cardiologist said: I have to make sure this is given; it gets written on the board, but it must not be written until it is actually given. He says he likes to hear a 20 minute warning from another team member, and then it needs to be given and to have this confirmed at the time.
And then he goes on to tell me a story of a colleague who experienced a fatality of a patient because the heparin was not administered. So this is checklist, rule-following behaviour, where somebody in an operating theatre said "we're going to give this now" and it got written on a board, so everybody started behaving as if the drug had been given. And it hadn't actually been given, and it caused a fatality. And those stories are hugely important to clinicians, because they are the things that inculcate safety practices. You know, it drills it into their bones.

So these atrocity stories often come out when you're hanging around and you've got to know people. They might just say them by way of explanation for something that they're doing. And again, they may not tell you that in an interview, although there are some examples of these kinds of stories occurring in interviews, but they're often really helpful in alerting you to important things that you might not otherwise notice.

So one of the issues in doing this kind of research, which I think you might have covered, for those of you that are on the qualitative course, is this idea that in qualitative research, but particularly observation and ethnography, the researcher is the research instrument. So in survey research you have your questionnaire and you give your questionnaires to people. But in this kind of research, it's all about the person doing the research. And there are a number of articles talking about how this is really great and enables you to get fantastic data, but it also can be disabling or limiting in terms of what you're able to do. So I've just pulled out some examples of that from my own experience.

So: I'm not clinically trained. I have worked in ethnographic teams with people who are clinically trained. And I've spent 16 years in Southampton, and I've got to know some of the Southampton clinical settings in particular quite well. So negotiating access is something that can be easy or difficult depending on some of those characteristics. If I say that I want to go and observe the emergency department in Southampton General Hospital, I can get in there. If I say I want to do that here in Oxford, that might be a bit more tricky. If one of you says you want to do it, and you don't have that magic professor thing at the front of your name, maybe you'll find that more or less difficult. So negotiating access can be problematic depending on who you are, where you're coming from, and who your contacts are. And if you're an insider in the organisation, that may facilitate you getting in.
Or it may be that people don't want to let you in, because they think you know all of their dirty secrets.

The problem of establishing rapport in a situation and in a setting is one of those things that is written about endlessly in the textbooks, as if you can just switch it on and switch it off. And what I've found over the years is that it's very dependent on the team that you're studying, and the setting that you're in, and the kind of work that they're doing, but also silly things like how you're feeling. If you're feeling nervous, or trepidation about what you're going to observe, then actually you carry that with you into establishing rapport with the group. Or if you feel intimidated by the group, or members of the group, that can be an issue. And again, that might vary depending on whether you're an insider or an outsider.

There's some lovely stuff about the tacit and taken-for-granted knowledge in a particular setting. If you're a clinician, you've already got hardwired information about a setting that you think you know. That can be great, because it can fast-track you to seeing really interesting things, but it can also blind you to things that you need to see, because you make assumptions about the behaviours that you're watching.

In the anaesthetic study that we did, Dawn Goodwin, who was an anaesthetic nurse by background, did the bulk of the ethnographic observation; myself and Maggie Mort, who are medical sociologists, did the other observation. And sometimes we did the observation together, and we did that partly to answer the question: did it matter who was doing the observation? And one of the gratifying things was that the only real difference between what I wrote in my notes and what Dawn, as a clinically trained person, wrote in her notes was that she knew all the fancy names for the things inside the syringes, and I wrote down things like "the big white syringe" and then afterwards asked, "What does the big white syringe have in it?", because I didn't know what it was. But we actually managed to have notes that were comparable. Nonetheless, you can also build up knowledge when you're in a setting over time, where you start to take things for granted.

A lovely example of the taken-for-grantedness of the researcher was on a surgical study I did, towards the end of my time working in the operating theatres. One of the surgeons was so used to me hanging around in scrubs that he said, "Oh, do you want to scrub in on this?" And there was a brief moment where I thought, I know how to do this operation now.
And then I thought: no, actually, I'm a sociologist, I probably shouldn't.

But you do need to think through what it is that you may or may not be seeing as you get to know people in a setting. They may say things to you that are confidential. Sometimes they helpfully say, "This is off the record", and you're like, okay, it's off the record, but I'm still hearing it. It's going in my head. I'm going to be thinking about it all the way home. So at least you've got some signal that maybe you won't write it down in capital letters in your notes, because they've told you it's off the record. Other times, people will just have highly confidential conversations in front of you, because you're part of the scenery and they've got used to you being there, particularly if you're in a surgical setting and you're wearing scrubs and you look like everybody else. So there are some issues there, and Dawn Goodwin led on a paper from the anaesthetic project that talks about situational ethics and how we might deal with some of those issues.

And finally, there's an issue around your responsibility as a researcher, because the textbooks often make great play of this idea that we're not meant to mess with the field in any way. We're meant to be there as observers, and we really shouldn't be getting too involved in it. And if you do clinical studies and you're not a clinician, you're constantly told: don't touch stuff, because you're not a clinician, and for goodness' sake don't look like you're doing anything clinical, because that's really not allowed. But then there are issues about what you do if something comes up that is problematic. The classic one is the safeguarding issue: what do you do if you feel that there is a safeguarding issue for somebody in the setting? One of my researchers was observing ambulance crews and went to a house where she then had concerns about the children that were in that house. And the plan was that the ambulance crew were going to leave, because they felt that there was nothing that they could do. But this researcher was concerned about the children, so she rang social services and waited, because she felt that was the appropriate thing to do.

There are those sorts of issues. And, notwithstanding the injunction not to touch anything clinical, when I was doing the ambulance study I became a drip stand on Southampton Common. It's one of my finest moments, standing there in the dark going: I'm a stand. I'm holding this drip for you because you haven't got enough hands.
We were in one of those ambulance cars. So I held the drip while we waited for the ambulance to come and collect the patient, because the paramedic needed to do lots of other things. So I stood there, and I kind of knew that I was not meant to be holding anything clinical, but what do you do in that situation? So there are judgement calls that you make. The textbooks will tell you don't touch, and the health and safety people will tell you don't touch, and then there are some times when you have to think, well, what should I do? And my response to that has always been: what would a good citizen do? I'm not clinical, so I don't have clinical responsibility; despite having watched the television and film things where they put biros through people's windpipes, you know, I'm not going to be doing that. But I will hold a drip stand. And I have held the hands of patients in stressful situations, because I'm a person in the room who can do that while everybody else is saving lives and doing other, much more important stuff. If they say they want somebody to hold their hand, or they're feeling frightened, I would say: would you like me to hold your hand, would that help, or would you like me to talk to you, and we could talk about something other than what might distress you? And I feel that that is okay. But there are textbooks that will tell you that's a real no-no.

There's something else about this embodied research, this idea of the researcher as the research instrument. One of the things that we did on the ambulance study, in order to get ethics approval, was to have a load of procedures for what you do if you're on the ambulance and the patient or the relative says, "We don't want you observing this." So we wrote this thing in the ethics application where we drew a diagram of the ambulance, and we explained that Cathy will be sat in one of these seats in the back of the cab unless the patient says no, in which case she will move to the front and sit with the driver. And I will tell you this now, and I do realise this is going to be a podcast, so I'm now breaking a secret: in the ambulance there's a door between the cab and the back that's meant to shut, with the idea that when the driver is driving the ambulance, they're not being distracted by anything that's going on in the back.
Nearly every ambulance that I went in had a very carefully constructed something, a box, a plastic cup, a stray bit of medical equipment, that was used to prop those doors open, so that the drivers could talk to the people working on patients in the back. Which meant that the real confidentiality of me sitting in the front was broken. But I decided to let that go, in the interests of research about ambulances.

And there's one of those things about doing research on ambulances: I'd done the thing of talking myself through what I would do if we went to a road traffic accident and there were children that had been killed, you know, I'd done all of the head talk about what I was doing. And in fact I set up a mentoring buddy system with somebody else that was doing ambulance and emergency department observations, so that I would have somebody to debrief with. And I also had a member of the clinical team who said they would take supervisory responsibility for me. So I'd done all of that, and I'd done all this stuff about what do I do if I get blood on me, what do I do if somebody is violent, what do I do if somebody's sick on me. And I'd done the how am I going to take notes going at speed: I'd practised in the back of the car, I got my partner to drive just as fast as they could within the speed limit, and I'm just going to see what I can write. So I was like, yeah, I can do this. And then one of my colleagues said to me: do you get carsick at all? Because you're going to be in a box going at 90 miles an hour with no windows, and lots of people get carsick. And I thought, I don't know. I don't know if I'm going to get carsick. Thankfully, I didn't. But there were things that I hadn't even thought about as an embodied researcher going into that setting.

And that was just the thing about ambulance work: you feel that embodiedness, and I think there are many other research settings in healthcare where you feel that. So there was this lovely occasion when it was a cold and dark night and the road was icy, it was winter, and the ambulance was really going at one point, and the drive was great, but it was a bit scary as we were skating on the ice. And then we turned off, we went under a bridge, went across a field, we went along something that on Google Maps was a road but didn't feel like a road as far as I was concerned. And then we arrived at a couple of houses in the middle of the field, and we didn't have any mobile phone reception.
That's a little bit scary and a little bit unnerving, and I'm glad it was me doing it, you know, at an age close to what I am now, rather than maybe the 20 year old me starting out. So I think those sorts of things are just interesting, and often when you read about embodied research, it's written about in a way that maybe doesn't bring some of that to life.

And there's also the stuff about safety. There's often a lot in ethics about how, when you're doing qualitative research, you're not going to damage anybody else. But sometimes that attention isn't paid to how you might look after yourself as the researcher. The very first study I did was 30 years ago, and it wasn't until I went to work in the US that I had proper occupational health and safety briefings about what to do with blood contamination and body fluids and all of those things. So I'd been merrily going into operating theatres in the UK, often being introduced as "This is Cathy, she's a social worker", and I'm like, no, no: sociologist, slightly different. And sometimes I wasn't introduced at all, which I feel quite disturbed about now. At the time I was grateful, because I was doing it for my PhD; I was just so grateful to get into the setting. I look back on that now and think: this was completely unethical. And now I make sure that patients, even if they're going to be anaesthetised when I'm in the room, know that I'm going to be in the room and what I'm doing, even if I'm not actually collecting data about them, because often I'm watching what the surgeons or the anaesthetists are doing.

And so, again, this is one of those examples of things that you learn in settings about safety. One of the things in the interventional radiology settings is wearing lead gowns to protect you from radiation. And there was this lovely conversation with one of the clinicians, who said: you know, you mustn't drop these lead gowns on the floor, they're really expensive and you need to look after them, and you mustn't drop them on the floor, they have to go on the special hangers. And they're really heavy. So when the radiation is not being used, everybody takes their gowns off, and everybody puts them on the back of a chair or on the table, and sometimes they slide off onto the floor. So you notice what's going on with the safety equipment. And that was interesting.

This, by the way, is a picture of me that Glenn Robert owns and frightens me with every now and again. That's me doing the treatment centre project, when we went to visit a treatment centre that was being built. That's why I'm in a hard hat for that one.
I don't always dress like that when I'm doing ethnographic work.

And then another example from that same study: one of the clinicians talked to me about these little clip-on dosimeters that tell you how much radiation you've been exposed to. They're designed to fit on glasses, except that they can often only fit on the bridge of the glasses, which means you can't actually see, because they kind of obscure your vision. So some people put them on their headbands, or they put them on a bit of their cap; and she wears contact lenses with clear protective glasses over the top. And she's describing all of that, and then, as we're talking, she goes: oh, and I've lost it again. And she has to go and find another one. And there's a bit of me thinking: I wonder, if she hadn't been talking to me, whether she would even have known. And these things are meant to record the dosage that you receive over a month or something. So you learn things about the safety wear, both as the embodied researcher having to wear it and by talking to the people in the setting.

And I thought I'd be honest about an unnerving situation. I have had lots of unnerving situations, including deciding that I would get into a fast red sports car with a surgeon after having watched him operating, because I thought I might get really good data. And I did get really good data, but I was very scared, and I would say that's a step too far: never get in a sports car with a surgeon, because they just don't seem to know what the speed limits are. Or maybe that's changed.

But this was a more recent situation, on one of the studies that I did. Anaesthetists are lovely people, and I think they do amazing things, but there are bits of downtime in operations. One of the jokes is that that's when the anaesthetist has a snooze. And this particular anaesthetist was looking at his mobile phone, in the bit of the operation where he didn't need to be paying quite so much attention to what was going on. And they were talking about holidays and things like that, and reading tweets out. And then he started looking me up on Twitter, and he went through as many Catherine Popes as he could find on Twitter, reading out their bios and guessing whether each one was me. And I was trying to do my "I'm a professional researcher" thing: I'm not going to rise to this, I'm not going to challenge it, I'm simply going to observe it, I'm going to write some notes about it.
And I'm feeling a little bit nervous, and I'm thinking: oh, what did I tweet about last night? Have I put anything on Twitter that I would feel embarrassed about in this situation? I do have a rule about not tweeting when I'm drunk, but nonetheless, I'm probably as sweary on Twitter as I can be in lecture theatres. So I was starting to feel a little bit nervous about this, because it felt like my life was being invaded. Although that's a public space, it wasn't a space that I had necessarily chosen to share with that surgical team. And they didn't know about my political activism and my role in the trade union. They didn't know any of that, and I didn't feel they needed to know about that. But it was all on Twitter.

And later, as I was reading the notes, it made me think that actually I kind of understood what that pushback was, because I was in their space, and this was partly this person saying: you're in my space, watching me; I'm feeling a bit uncomfortable; I can do that back to you. And it made me think differently about the data and about my role. So sometimes I think we have these unnerving situations, or things happen in our work and we feel a bit uncomfortable. And I think we need to get curious about that and question what's going on, because I think you can do some further learning from those kinds of situations.

And there's a load of stuff about status passage in the anthropological literature and in the sociological literature, which is basically just this idea that we move through different kinds of status as we're in a work situation, or during the life course. And I just want to reflect on my own status passages, because I think when I started doing observational research I was 21 and I'd never been in a clinical environment. I'd done a sociology degree and I'd worked in Sainsbury's and supermarkets and those kinds of things, and I'd never been in a clinical situation. And I think I was really naive, and I was fresh faced, and I was female and petite, and I was studying urological surgeons; at the time I think there was one female urological surgeon. I hope that's changed now; I haven't looked it up recently. But I think that I was very naive. And if I did that study again now, I know, because of all the other studies I've done, that I would have some cumulative wisdom about the settings and about the people and about the interactions. And I think just acknowledging that might be helpful.
There's a load of stuff in the literature about this thing called going native, which is the idea that you over-identify with the people that you're studying, to the point where you can't actually see what's going on because you just feel like them. And I think there are ways that you can counter that, by making really good field notes and being reflective about what you're seeing. But an example of that is that I don't think I feel the same way going into surgical environments as I did when I was 20, 21. When I first went into a surgical environment, I wrote notes about the smell, the temperature and the bizarreness of it. I wrote pages about how weird it was that I got to see the insides of people's bodies when they never got to see the insides of their own bodies. And how weird it was that every surgeon opens people's abdomens and goes, "Oh, I wasn't expecting that." And I wrote stuff about that because I was fascinated by it and horrified by it and discomforted by it. And now I walk in and I'm like: oh, what's that kit they've got there? And I might have read about the surgical procedure or seen it before. And I think that's changed how I do things and see things, so I need to reflect on that. And if I'm working in a team with more and less experienced people, we need to use that to counter the going native and over-identifying with the field.

And then, linked to that, I think I was a mascot in the first studies that I did. I think there was something, particularly amongst the urological surgeons, who were mainly men, as I said, which was: we've got this young woman following us around, it's quite fun, because that doesn't normally happen in our working life, and we can kind of go, oh, I've got the young woman following me today. And that gives you access to things, but it also closes things off, you know. So I think I was told things that maybe they thought I wouldn't understand, and I'm sure there were things that I just didn't clue in to and didn't understand. And now, if I go in, I've got the baggage of being a professor; I'm 53, so I'm older; there's all sorts of things about who I am now and what people expect. And I wonder how that is also influencing what's going on. And I just hold that up as being aware of those passages, because, again, I think when the textbooks, and when we ourselves, write about these methods, we tend not to think about those things. But it does matter that I'm white and older and from South London and not clinical.
Those things do matter, in short, because they mean that I see and hear particular things and I feel particular things, and that matters.

And I thought, just to end on a bit of controversy, and I don't know how many of you saw this, because I started with Erving Goffman: his daughter Alice wrote a fascinating book based on a very long term ethnography of what she calls fugitive life in an American city, basically living in a very deprived area and watching the gang and crime activity of a small group of people in an urban deprived setting. I still think this is a fantastic ethnography. I enjoyed reading it, and I would recommend it to any of you that haven't read it. But there was a huge furore about this book. There have been some, I think, important criticisms of its focus on criminality as its key theme and set of messages, and of its portrayal of black Americans as being involved in criminal activity. There are what Lubet calls uncertain vignettes, in that there are some episodes in this ethnography where Alice Goffman describes things which make it appear that she probably participated in criminal activities. And there's some debate about what her role was and whether she should or could be prosecuted. So there's a bunch of ethical issues and dilemmas.

And I think that part of the reason for the furore around this particular study is that some of these things are present in a lot of the ethnographic work that we do, but at a lower level. So some of the things that I've described to you, they're not up there with committing a felony. But the fact that I've mentioned that an anaesthetist was playing with their mobile phone while they were doing an operation, for example: should I be telling you that? It's a piece of my data; I've anonymised it. There are issues there. And then there are issues about how these things get reported, and how we protect people, and how we allow people to see what it is that we've done.

And as a final word, having started with those injunctions from Erving Goffman and Martyn Hammersley: Mitchell Duneier, who has also written some stunning ethnography, which I recommend to you, says that we can improve our methods by engaging in practices that reassure our readers that they can trust that they know how they have been convinced. It is a lack of transparency that results in a sense that the wool is being pulled over a reader's eyes. Our goal should be to institutionalise methods that make it normative for us to be as upfront as possible about how we have achieved our facts.
So part of what I've wanted to do tonight is be open about some of the stuff that maybe sometimes gets hidden behind the formal reporting of ethnography and the kinds of observational research that we do in health care settings. Because I think that will get us to this place, and we will start to have ethnographies that we can really engage with, and that we can then use to make a difference to health care and health care practices.

So there is a slide with references on, if you want them; some of the references were embedded, and there may be some people here who are desperate for the references, so you can ask me for them. And then I have to acknowledge various things about the studies being funded by great and wonderful organisations that give you money, but then make you say that everything you say has to be attributed to you, not to them. So that's what that slide is. And I particularly want, at the bottom of that slide, to acknowledge that most of the research that I've talked about has been done in teams with amazing people, and all the best bits of it are obviously their work, not mine. And that's me, and I can take questions. Thank you.