The final talk that we have on experiments day today is by Professor Sonja Vogt, who is an associate professor of sustainable social development at the University of Bern. So we're very excited to be able to bring her all the way from Switzerland to the UK to give a talk to us. She was previously at the Centre for Experimental Social Sciences here at Nuffield College, and before that she was at the University of Zurich in the Department of Economics. She does very interesting work using experimental methodologies, especially field experiments, to study things that are tricky to measure and topics that are culturally sensitive, such as female genital cutting and prenatal sex selection. And she runs field experiments in many different parts of the world, including Sudan, Armenia and other places. Today she's going to talk to us about eliciting data in challenging intercultural settings by using mobile labs.

OK, thank you. Well, thanks for having me. It's nice, and pretty challenging, to be the one speaking on a Saturday at four o'clock. I'll try my best to be on time, and of course you can ask questions in between. I might be a little more rigorous than Ray was, maybe, in trying to get us all out on time. So, as was already said, I will talk basically about how we can use mobile labs. What's the advantage of using mobile labs?
With respect to development research, I guess you are familiar with the idea that, long ago by now, a gold standard was established: you have to run an RCT if you want reasonable impact data collection and impact measurement. And there was a lot of focus on how we do that, right? How can we avoid attrition? How do you really randomise in the field? How do you negotiate with your partner in the field whether it's a phase-in design or not, and all of this. So there was lots and lots of research and lots of discussion about how we really do that. And that shifted in the last couple of years, as behavioural science got stronger and stronger with respect to development research. What I mean is that there was really this trend where people, first of all, argued that we have to understand the behaviour, and see it, better. So it's no longer just the standard story that we have to give the right incentives, or we have to give people information; it was really moving towards: we have to understand, is this behaviour biased? So behavioural science was used as a diagnostic tool to inform the programmes. That means the programmes also changed. It's no longer just 'I give you money for sending your child to school' or 'I explain that it's good to use fertiliser and then you do it.' Now it's all about role models.
It's about how we change social norms. And that, and this is what I'm talking about, led to pressure on the side of impact data collection. If a lot of the interventions are now about telling you what to do, I cannot just ask you afterwards what you are doing, because it's very obvious what you should be saying. And so there is really a shift towards trying to improve measurement in the field, as a consequence of using a lot of behaviourally informed programme work by now. And this is a little bit of a problem, as you see here. The challenge is that as soon as there's an intervention, and a lot of interventions these days are media interventions, say there is a big telenovela running that's repeatedly telling you that it's not the right thing to circumcise the girls, then you can come to a situation where people are completely aware of what the socially desirable behaviour is, and you need to be able, as a scientist, to disentangle merely increased awareness from behavioural or attitudinal change. And this is basically what I have to do in my research quite a lot, and this is what I'm talking about today. It's probably true in all kinds of topics in development. When you study poverty or something that's really more incentive-based it's less of an issue, but it's definitely a challenge when you have these kinds of socially and culturally sensitive behaviours.
Like you said already, female genital cutting is such a topic. In almost any country you go to, everybody has learnt that when we are coming, say, when the aid agencies are coming, we do not want to hear that you support female genital cutting. So the first ladies of countries often, you know, talk about it, but it's never really... I mean, you're not really punished if you keep on doing it, even though it's illegal in most countries. It's similar with prenatal sex selection in Armenia and in the southern Caucasus: as everybody knows by now, UNFPA made one big campaign that tells everybody that this is the wrong thing to do. And it's the same for topics like corruption in higher education or the exclusion of people with disabilities. These are all topics where it's really challenging to disentangle awareness and compliant answers from real attitude change. We also had, with regard to measuring these tricky things, a workshop together with the behavioural team of the World Bank, where we basically asked this question: how do we actually know if we measure what we think we are measuring? How do we figure that out? And I want to talk about the first three points. So there are different ways to reduce the social desirability bias, where people basically tell you what you want to hear, where people try to anticipate what's good for them.
Maybe they think: if I say I stopped circumcising, I'll have some advantages in another domain, because the same programme is also providing other services. And I will talk mainly about how privacy can help, how indirect measurement methods can help and how giving incentives can help. All of this is done very easily when you use mobile labs, because they give you all three of them. I'm not talking a lot about statistical measurement methods; I have just one slide on that. Since I'm coming from the lab, I only talk about the first three. But here is one example, the list experiment. This is what was often done before the richer data collection I talk about today, where you would otherwise basically ask a direct question. This is how the DHS has asked the question about circumcising the girl all along: very directly, and often in face-to-face interviews. It's always the same argument: people can't really read and write, so what shall we do? We can't just send surveys out like we do in Europe and hope many people respond and send them back. So you basically sit in front of a person, interview the person, and probably two thirds of the extended family are standing around and listening to the answers. And then they developed methods like this list experiment, where you ask, let's say, five questions that are not related to the sensitive topic. And this is the control group.
You only ask people to report how many answers are yes. And then you compare that to a treatment group where you add the sensitive question, and that way you can measure the bias. This is not very useful if you're really interested in randomised controlled trials, because it only works at the aggregate level, and that's also why I'm not talking more about these methods. And then there was another pretty interesting study, by Chauchard, who ran an experiment where he basically wanted to figure out what privacy actually means: why privacy really reduces the social desirability biases. He had a study in India, where he asked questions about the caste system, where he knew which answers would be desirable and how many people would answer truthfully. So he did face-to-face interviews, and then he did the same interviews with an MP3 player, where people were just listening to the questions and then had a very simple sheet where they would mark whether they agree or disagree. And what was nice is that he then varied the MP3 setup: you would have a situation where you really don't give the results back to the interviewer, to test an interviewer effect; he changed the formulation, whether it was a more direct or an indirect question; and he also used earphones versus a speaker, so either the people around you would all listen to the question, or you would listen to it only on your own.
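As an aside, the arithmetic behind the list experiment is very simple, and a quick simulation shows why it only works at the aggregate level. This is a generic sketch, not code from the talk; the item probabilities and the 30 percent prevalence are made-up numbers.

```python
import random

random.seed(0)

def reported_count(holds_sensitive: bool, treatment: bool) -> int:
    """Return the number of 'yes' items a respondent reports."""
    # Four neutral items, each a 'yes' with probability 0.4 (made up).
    count = sum(random.random() < 0.4 for _ in range(4))
    # The treatment group has the sensitive item added to the list.
    if treatment and holds_sensitive:
        count += 1
    return count

true_prevalence = 0.30  # assumed true rate of the sensitive attitude
n = 20_000
control = [reported_count(random.random() < true_prevalence, False) for _ in range(n)]
treated = [reported_count(random.random() < true_prevalence, True) for _ in range(n)]

# The estimator is just the difference in mean reported counts; it recovers
# the prevalence without any individual ever revealing their own answer.
estimate = sum(treated) / n - sum(control) / n
print(round(estimate, 2))
```

Because no single respondent's answer is identified, the method protects privacy, but for the same reason it cannot give person-level outcomes for a trial.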
So this is testing the bystander effect. And what was interesting is that this was the most important effect: when everybody could hear the questions, the number of socially desirable answers would really go up. And I found that interesting, because this is very often the situation in data collection: we don't observe people really on their own. It's not just the interviewer; you ask women about domestic violence, and maybe the husband is just next door and can hear what you say. And we know this has also led to serious backlash in some studies. So I'm talking basically about how privacy can help when you use computerised lab-in-the-field experiments. I don't know if I need to say a lot about lab-in-the-field experiments; I guess you have all heard a little bit about them today already. It's basically the same as a lab experiment, but you take it into the field. That means you probably have a theoretically informed subject pool. Maybe you're interested in, say, bankers: there's a famous study showing that bankers are actually not self-selecting into risky behaviour, but that they have a culture that makes them take risky choices. You wouldn't get the bankers in Zurich into our lab, so we had to take the lab to the bank, so as not to use so much of their time. So this is definitely an advantage, but it's still very highly experimentally controlled: you know who was doing what. And this is what I'll talk about.
It's nice for interdisciplinary research, and it's nice for providing this critical privacy that, I personally think, we very often just take away from certain people. I mean, I've heard many stories of good papers, but often people say: well, they couldn't read and write, what should I do? So here are just a few studies I did. We wanted to see how social preferences develop in children. There was a study done by Fehr and colleagues many years ago where they basically had an experimenter who would trade gummy bears and other sweets with the children, and we tried to replicate it, but in a way that was anonymous, where the kids were really behind the computer. They were four to eight years old, but we programmed it in a really nice way and tried to see how that works. And this is the study I'll talk about most of the time: a study done in Sudan with people who, most of them, had not used computers before, who were small-scale farmers, and for whom we also really needed audio recordings. And so, yeah, you see, we built up a mobile lab, and people were in the end able to do pretty complicated tasks on their own behind the computer. And that's probably already my main take-home message: this is actually possible. It's a bit more work on our side, but I would encourage everybody to do it. So, the first study. Let me give you a little background here.
You see here where most of the cutting takes place in the world. We work in Sudan. There are different types of cutting; it's always a removal of female genital organs for non-medical reasons, but in Sudan they use Type III, which means they remove the organs and the wound is then stitched together. So this is the type for which health issues have been documented best, and for many other reasons too we chose to work in Sudan. Just a little background to the study. Policy basically assumes that female genital cutting is a coordination game, in the sense that people are forced to coordinate with one another: everybody cuts or nobody cuts. I'll spare you all the details, but there is a pretty well-established theoretical discussion about whether this really is the case or not. And we wanted to test it, and that means we wanted to test it in a way that says: OK, if this is true and people coordinate on one or the other equilibrium, you should see, at the aggregate level, these specific path-dependent dynamics where everybody cuts or nobody cuts. For our first measurement we could not yet use this computerised method. We basically wanted estimates of cutting rates in villages in Sudan, to see if we find this distribution with a discontinuity between communities that cut and communities that do not cut.
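To make the coordination logic concrete: under a pure coordination account, a simple best-response dynamic, where each family cuts only if enough others in the community cut, drives every community towards one of the two corners. That is the path dependence, and the missing middle, that the aggregate data should show. The sketch below is a toy model for illustration only, not the theoretical model discussed in the talk; all parameters are invented.

```python
import random

random.seed(1)

def run_community(n: int = 100, steps: int = 200) -> float:
    """Best-response dynamics in a toy coordination model: a family cuts
    if and only if the share of families cutting exceeds its threshold.
    Returns the community's final cutting rate."""
    s0 = random.random()  # random initial cutting rate
    cut = [random.random() < s0 for _ in range(n)]
    # Heterogeneous thresholds clustered around one half (assumed).
    thresholds = [random.gauss(0.5, 0.1) for _ in range(n)]
    for _ in range(steps):
        share = sum(cut) / n
        cut = [share > t for t in thresholds]
    return sum(cut) / n

rates = [run_community() for _ in range(50)]
# Under pure coordination the communities pile up near 0% and 100%,
# with almost nothing in between -- the discontinuity the theory predicts.
interior = sum(0.2 < r < 0.8 for r in rates)
print(interior)
```

The empirical point in the talk is precisely that the observed community rates did not look like this bimodal pattern.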
And again, I won't go into the details, but we figured out that there is actually a visible signal. Because in Sudan, and in many other countries too, you wouldn't get medical exams through the ethics committee, or whatever you want to call the government body that plays that role there. But we found out that all the girls are circumcised in the same week, roughly two weeks before they go to school for the first time, and that they get henna painted on their feet on the day of the circumcision. And the henna does not wash out of the toenails; it has to grow out, so we had a window. It was far more complicated than this, but in a nutshell, we just counted how many first-grade girls had henna on their feet and estimated the cutting rates from that. What you see here are the cutting rates of all the communities, and you see there is no discontinuity: we have cutting rates everywhere between 20 and 80 percent. That probably does not surprise you much, because I didn't give you a whole lot of background, but this actually led to a lot of arguing with UNICEF and partners, because it really goes against all the programme work that has been built on the idea that it's a coordination game and you just move people from one equilibrium to the other. So we didn't find the discontinuity. But now I'm coming back to the topic of the presentation: we had this very unusual measurement.
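Estimating a village cutting rate from the henna counts is just a binomial proportion, and a standard normal-approximation interval gives a sense of how precise each village estimate is. The counts below are hypothetical, not from the study.

```python
import math

def cutting_rate_ci(henna_girls: int, total_girls: int, z: float = 1.96):
    """Point estimate and normal-approximation 95% CI for a village
    cutting rate, clipped to the [0, 1] range."""
    p = henna_girls / total_girls
    se = math.sqrt(p * (1 - p) / total_girls)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# Hypothetical village: 18 of 40 first-grade girls had henna on their feet.
p, lo, hi = cutting_rate_ci(18, 40)
print(f"{p:.2f} ({lo:.2f}-{hi:.2f})")  # 0.45 (0.30-0.60)
```

With the small cohorts a single school year provides, the intervals per village are fairly wide, which is one more reason a scalable, repeatable measure was needed.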
Now, the henna signal only works one time a year, for just two weeks, and it's very hard to use that for impact data collection. You have to time it really well, and you don't know how long before that window an intervention should have happened. So we were thinking about how we could have a measurement that you can scale up better and that hopefully correlates with the henna data. And then we thought: OK, we'll try an implicit association test. Those of you who are psychologists know it; it's a pretty established test in psychology, also critically discussed there, but lately used a lot by development economists. And the idea is that you basically work with reaction times. Instead of asking people directly, you present neutral stimuli and valenced stimuli to people, you let them categorise these, and you measure the reaction time. I'll give you an example in a second. So it's, in a way, an indirect measure. It's rather complicated to do, and the challenge was: how can we do it with a subject pool like ours, where people have really never worked with a computer before and can't read and write? We decided to try it nevertheless, and this is how it goes. This is basically the idea: in an IAT, you first get introduced to the neutral stimuli. Here, this is a circumcised girl and this is a girl that is not circumcised, and it's very clear for Sudanese people that the girl with the piece of cotton is the circumcised one.
There is a lot of qualitative preparatory work that went into this project, establishing how circumcised daughters are depicted, that I'm not talking about today. But anyway, you introduce the two girls, and as you see here, the girls should be neutral. And then we added valenced stimuli, namely negative and positive stimuli. And then you categorise them: here, the cut girl is combined with the negative stimuli, the uncut girl with the positive ones. These images of the girls pop up, or people hear a bad or a positive word, and you have only two buttons to press, a pink one and a blue one, and you have to respond as fast as possible. We measure the reaction time, and we also measure the error rate. And then you reverse the pairing and do the same again. So I guess the intuition is clear: you really slow down if the pairing is not how you feel about it; and if it is how you feel about it, if for you it's bad to be circumcised, then that task is much easier for you. You can try many of these yourself on a Harvard web page, about different religions, women and science, and all kinds of stuff. And with some of them you really feel it: it slows you down some milliseconds to sort the stimuli. So this is basically the task. Sorry, was there a question? Yes, please go ahead.
Could you repeat the intuition for why the IAT slows people down? Yeah. So the idea is basically: let's assume you think it's good to be circumcised. When the images come up paired so that the cut girl goes with the positive words, that's aligned with how you feel: it's easy for you to press the same button for a positive word and for the cut girl. The reverse pairing is harder for you, because it goes against your implicit association, and this is why you're just a little slower. There have been lots of meta-analyses, and this works best with an uninformed subject pool. It would probably be harder if I did something about racism with you guys now, because you might figure it out and try to force yourself to go against it, because you want to hide it in the task. There's lots of discussion about when this task really works; it seems to work best with people who haven't done it before and have no insight into what it is measuring. And then, just with the reaction time, you really do pick up this difference in implicit attitudes. That's basically the idea, and it's a pretty complex task. You do it first for one pairing, then you have to do some intermediate tasks to get that pairing out of people's heads, and then the other one. So, well, I wanted to present it to you because this is something where everybody told us: you can't do that, it just doesn't work.
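For reference, IAT data are usually summarised by comparing latencies between the two pairings. Below is a deliberately simplified version of the conventional D score; the standard scoring algorithm of Greenwald and colleagues additionally trims latencies and penalises errors, and the latencies here are invented.

```python
import statistics

def d_score(congruent_ms, incongruent_ms):
    """Simplified IAT D score: difference in mean latency between the two
    pairings, divided by the standard deviation of all trials pooled."""
    diff = statistics.mean(incongruent_ms) - statistics.mean(congruent_ms)
    pooled_sd = statistics.stdev(list(congruent_ms) + list(incongruent_ms))
    return diff / pooled_sd

# Invented latencies in milliseconds. This respondent is faster when the
# cut girl is paired with negative words, so the reverse pairing is the
# slow, incongruent one: an implicit attitude against cutting.
congruent = [620, 580, 640, 600, 610, 590]
incongruent = [750, 820, 700, 790, 760, 810]
print(round(d_score(congruent, incongruent), 2))
```

Dividing by the pooled standard deviation makes scores comparable across respondents who differ in overall speed, which matters with a subject pool new to computers.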
It's crazy to do that in Sudan. And in the end we went to one hundred and forty-four communities. We had a lab with 50 computers; we had 24-hours-a-day, seven-days-a-week security with us. But this is really the message I want to bring here: you can do it. It's just a bit more work on your side, but it's worth it, because first of all it's valuing people. We wouldn't just say: I can ask you about your sexual behaviour with your mama sitting next to you, and I just don't care. We wouldn't do that, and we wouldn't get away with it, in any country like Switzerland; if you tried that there, people would never let you into their house. So this is a nice way of taking privacy seriously for topics that matter to people in other places too. And here are the results. You see, it's very similar to the henna results: we don't have a discontinuity; we don't have a group that's clearly pro-cutting and a group that's clearly against cutting. The score basically goes from two to minus two: one end means you're in favour of cutting and the other means you're in favour of not cutting. And what's important is that we could show that this correlates with the henna measure. It was not the same people; it was randomly selected people from the communities. But this allowed us to then use this method behind a randomised controlled trial, because we had developed two new methods and we were reasonably confident that we could rely on them, because they pick up the same thing.
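Validating the new measure against the henna data comes down to correlating community-level summaries. A sketch with made-up numbers follows; the correlation is negative here only because of the sign convention, since a higher IAT score means more in favour of not cutting.

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (statistics.pstdev(x) * statistics.pstdev(y) * len(x))

# Hypothetical community-level data: henna-based cutting rate and mean IAT
# score (positive = in favour of not cutting) for five communities.
henna_rate = [0.2, 0.35, 0.5, 0.65, 0.8]
mean_iat = [0.8, 0.6, -0.1, -0.2, -0.7]
print(round(pearson_r(henna_rate, mean_iat), 2))
```

The point made in the talk is exactly this kind of check: a newly developed measure is only trustworthy once it is shown to track an independent behavioural benchmark.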
This was important because the henna sign, it's a nice idea and it worked very well, but it's nothing we can just use at any time, and if you're really interested in attitudes, they can change at any time. So we did this. And then, just for a bit more fun, let me show you, without going into the methods, why it was important that we had developed this as a measurement. Sorry, yes, a question? Just as a question: the implicit scale, the IAT score, what does it mean if it's 0.3? Where do you...? This is just the mean. It basically goes from two to minus two: plus two means you're really in favour of not cutting, minus two means you're fine with cutting, and zero is in between. OK, and you know that this is not just random because it correlates with the actual measure? Exactly, and that was important for us. I mean, that's the message: if you develop new methods, you hopefully can validate them somehow. And we did that not only over all 144 communities; we did it for each community individually. OK. And the choice in the implicit association test was either positive or negative; there wasn't a middle? No, you can't have that there. You basically have two valenced stimuli and two neutral ones.
And when you do that, you may well be in favour of not cutting, but the girls themselves are the neutral stimuli, a cut girl and an uncut girl; it's the pairing with the valenced words that matters. Yeah, that's the idea. OK, thanks. All right. Thank you. And now let's explore what you can do with it. So we had the fun of producing four different movies in Sudan, and the idea was that we wanted again a very mild intervention, where we basically used the results from the first paper, which showed that people who cut and people who don't cut live next door to each other. So we could produce movies where we just use their own arguments, discussing it out, without us coming in and talking about something like: this is a violation of human rights. So we produced four movies and we ran two experiments. We ran one where we were very risk-averse, where we just let people watch the movie on their own in the lab and then directly afterwards ran the implicit association test, to see if there is an immediate effect depending on whether you see a movie where cutting is discussed or a movie where we don't talk about it at all, the control movie. The movies were 75 percent identical, so it was a really, really mild intervention. We got these results, which I'll show you in a minute. And then we got more courageous and said: OK, now we can also randomly assign the movies to communities, which is of course a much bigger study. You have a hundred and forty-four communities, you do a public viewing in every community.
You randomly select 60 people per community, you measure the attitudes before you show the movie, and you measure the attitudes afterwards, and here you see the results. So this is the IAT score again, and here you have the control movie and the three experimental movies. This is the big study where we randomised at the community level. And you see, directly after the movie, the attitudes are more in favour of not cutting in comparison to the control movie, where nothing seems to have changed. And then, interestingly, in the field study we could wait one week, because we didn't have spill-over effects, since it was randomised at the community level. In the lab study, of course, if you wait even an hour, everybody talks to each other. And I do think spill-over effects are serious, and you should try to avoid them whenever you can; that's why we randomised at the community level. And then you actually see that only one of the three experimental movies still has an effect a week afterwards. Which is also nice in the sense that it's always the implicit association test: we did it again a week after they watched the movie, and then we see that actually only one of the movies has a lasting effect. Excuse me, sorry if I missed it, but the movies, are they the same and then the actors did another scene? Exactly, something like that, yeah. I was very short on this.
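In its simplest form, the analysis of the pre/post design described above reduces to comparing the pre-to-post change in mean IAT score between movie arms. The numbers below are invented, and a real analysis would of course account for clustering at the community level.

```python
import statistics

def mean_change(pre, post):
    """Average within-person change in IAT score (positive = shift
    towards favouring non-cutting)."""
    return statistics.mean(b - a for a, b in zip(pre, post))

# Hypothetical IAT scores for a control community and a treatment community,
# measured before and directly after the public movie screening.
control_pre, control_post = [0.1, -0.2, 0.3, 0.0], [0.1, -0.1, 0.2, 0.0]
treated_pre, treated_post = [0.0, -0.1, 0.2, 0.1], [0.4, 0.3, 0.5, 0.4]

# Difference-in-differences style contrast: change under treatment
# minus change under control.
effect = mean_change(treated_pre, treated_post) - mean_change(control_pre, control_post)
print(round(effect, 2))
```

Measuring the same contrast again a week later, as in the talk, is what separates a lasting attitudinal shift from an immediate reaction to the screening.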
253 00:25:39,060 --> 00:25:44,730 This is really because it's not the focus of the talk I'm trying to give you. 254 00:25:44,730 --> 00:25:52,590 But the idea is that basically we had a Sudanese writer who would write the movie, and we would experimentally constrain him a little bit. 255 00:25:52,590 --> 00:26:00,420 So the control movie was just like any good movie, about love and drama and rejection and anything. 256 00:26:00,420 --> 00:26:07,260 And then in the other stories, we had 30 percent of the movie talking in a non-judgemental way about the pros and cons of cutting. 257 00:26:07,260 --> 00:26:13,980 So it's like you would secretly listen to your neighbour, who talks about it in a very non-scientific way, 258 00:26:13,980 --> 00:26:21,060 but it was scientific in the way we manipulated the arguments. So in one movie, they would talk about their private values. 259 00:26:21,060 --> 00:26:26,430 In another movie, it was just about coordination incentives. So it was basically the elements of a coordination game. 260 00:26:26,430 --> 00:26:28,020 Preferences versus coordination. 261 00:26:28,020 --> 00:26:36,690 Incentives were tested against each other, and the winning movie is the one where both were in. So coordination incentives seem to matter, 262 00:26:36,690 --> 00:26:41,490 but private values also seem to matter. That was the idea. 263 00:26:41,490 --> 00:26:44,760 These people told me, if you would watch the movies without the sound, 264 00:26:44,760 --> 00:26:52,600 you would probably not even be able to tell the difference, because we tried to keep them as similar as possible. 265 00:26:52,600 --> 00:26:59,910 And as another example, I want to show you again how, if you really want to, 266 00:26:59,910 --> 00:27:09,420 you can actually collect very rich behavioural data under very challenging circumstances, and this is about prenatal sex selection in Armenia.
267 00:27:09,420 --> 00:27:16,190 That's actually how you and I met, because we have a shared interest in this topic. 268 00:27:16,190 --> 00:27:22,870 You just see an overview here; the data are from 2010, from a 2015 paper, 269 00:27:22,870 --> 00:27:30,400 where you see how postnatal and prenatal selection influence the story about the missing females, 270 00:27:30,400 --> 00:27:34,480 and you see that prenatal sex selection is actually going up. 271 00:27:34,480 --> 00:27:39,520 And the idea, at least in the southern Caucasus, is that several things come together. 272 00:27:39,520 --> 00:27:47,890 It's a decline in fertility. It's an improvement of methods: you can detect the sex of the child earlier and earlier. 273 00:27:47,890 --> 00:27:53,080 And in that way, there is this idea, which UNICEF has, 274 00:27:53,080 --> 00:27:58,330 that it's getting more and more in this direction: fertility goes down 275 00:27:58,330 --> 00:28:01,810 and people have a son preference, which you find in many other places, 276 00:28:01,810 --> 00:28:11,060 too, where people just try to basically affect the situation and ensure they have a son by aborting the females. 277 00:28:11,060 --> 00:28:18,460 And Armenia is a very nice example of how you can have this problem of social desirability. 278 00:28:18,460 --> 00:28:22,480 So they basically figured out that they have a problem with it somewhere 279 00:28:22,480 --> 00:28:27,670 around 2010, when a lot of data were published by the demographer Christophe Guilmoto, 280 00:28:27,670 --> 00:28:36,250 who could really show the sex ratio at birth: in some regions, more than one hundred fifty boys are born per one hundred girls. 281 00:28:36,250 --> 00:28:38,740 And so somehow they had to do something.
282 00:28:38,740 --> 00:28:45,370 And then they did a big media campaign, not with scientists, just telling everybody 283 00:28:45,370 --> 00:28:50,740 that you should not have a son preference and that you should favour girls and boys in the same way. 284 00:28:50,740 --> 00:28:54,700 And this is not really conclusive. It's not showing clear results. 285 00:28:54,700 --> 00:29:04,360 It's just out of their report. It's not a scientific report, but they basically asked families what gender their child should have. 286 00:29:04,360 --> 00:29:07,030 If they had one child, what sex should it have? 287 00:29:07,030 --> 00:29:14,980 And then they did the same in 2011 and 2016. This answer means "I don't have a preference", and you see that there was a big change. 288 00:29:14,980 --> 00:29:21,590 This is what I meant before. That doesn't mean, of course, that there is a real change; there was no change in the sex ratio at birth. 289 00:29:21,590 --> 00:29:25,550 But this is how you see, especially with what we do nowadays, 290 00:29:25,550 --> 00:29:32,000 how media campaigns and all of this, not set up from the beginning as a randomised controlled trial, 291 00:29:32,000 --> 00:29:38,150 can lead to situations like this, where it's just extremely hard for a scientist to work with it. 292 00:29:38,150 --> 00:29:40,940 And we kind of started this study somewhere in the middle. 293 00:29:40,940 --> 00:29:47,390 So this was very unfortunate for us, and it kind of stopped us, because you can't run an RCT in a situation 294 00:29:47,390 --> 00:29:53,810 like this, in a small country where everybody knows within a week what you're supposed to say. 295 00:29:53,810 --> 00:30:00,140 So we said, OK, what we do now is we don't do an RCT, but we really again try to develop measurement 296 00:30:00,140 --> 00:30:07,880 that works in a situation like this. And I do think this is a really nice example of a really challenging situation.
297 00:30:07,880 --> 00:30:11,310 Small country, big media campaign, and UNFPA was behind it. 298 00:30:11,310 --> 00:30:15,710 The European Union was behind it. How do you now really figure out what's going on? 299 00:30:15,710 --> 00:30:21,770 And it's an interesting case because women are actually pretty well educated and participate in the labour market. 300 00:30:21,770 --> 00:30:30,890 So we really wanted to try to understand why they have this outrageous imbalance in the sex ratio at birth. 301 00:30:30,890 --> 00:30:36,320 So again, we decided that we would work with mobile labs here. In Armenia 302 00:30:36,320 --> 00:30:44,690 it wasn't possible to get people into a lab for several reasons, so we basically built mobile labs in people's houses. 303 00:30:44,690 --> 00:30:51,350 And that was important because we wanted to interview the wife, the husband and the husband's mother. 304 00:30:51,350 --> 00:30:55,730 All the programmes are focussing on the husband's mother. This is the common wisdom. 305 00:30:55,730 --> 00:31:01,130 Somehow UNICEF focuses on the idea that she is the driving force behind prenatal selection. 306 00:31:01,130 --> 00:31:06,950 So we said we want to run a little study with all three of them in parallel, to make sure that, again, 307 00:31:06,950 --> 00:31:10,670 no one stands behind the others when they give an answer. 308 00:31:10,670 --> 00:31:15,020 That was also the reason we couldn't get people into the lab: someone never leaves the house, or all three 309 00:31:15,020 --> 00:31:18,620 can't come at the same time, somebody has to be at home, what do we do with the kids?
310 00:31:18,620 --> 00:31:23,960 So we had really mini labs of three computers that we built up in nine hundred 311 00:31:23,960 --> 00:31:28,190 different households in Armenia and collected data where, on the one hand, 312 00:31:28,190 --> 00:31:34,760 we wanted to have explicit measures and detailed preferences on the sex distribution of the children as a start, 313 00:31:34,760 --> 00:31:38,900 and then we wanted to have an implicit measure again. So an implicit association test. 314 00:31:38,900 --> 00:31:43,460 And we also worked with incentivised vignettes, as I will tell you in a minute, 315 00:31:43,460 --> 00:31:49,420 because incentives can also, of course, help to really get at the true attitudes or the true behaviour. 316 00:31:49,420 --> 00:31:54,710 And I want to talk very shortly about the IAT again, because you cannot just 317 00:31:54,710 --> 00:31:59,240 test with it whether there is a positive or negative attitude toward something. 318 00:31:59,240 --> 00:32:07,230 You can also test stereotypes. And that's a nice way to do development research, because stereotypes are coming from somewhere, 319 00:32:07,230 --> 00:32:14,420 and if you do a campaign, it's maybe a more subtle way to see if you actually did change something. 320 00:32:14,420 --> 00:32:23,060 The stereotype IAT we ran in Armenia is based on this saying that the boys are ours and the girls belong to the others, 321 00:32:23,060 --> 00:32:27,440 and this is the idea that the boys stay with you. But if you have a girl, you marry her off. 322 00:32:27,440 --> 00:32:33,020 And that's it, basically. So she's not really very useful at that point anymore. 323 00:32:33,020 --> 00:32:39,050 And you probably see now that if you don't think that way, this should be a really weird IAT for you. 324 00:32:39,050 --> 00:32:44,720 So instead of positive and negative terms, we have terms that, on the one hand, show flourishing.
325 00:32:44,720 --> 00:32:52,340 That means you extend your family, you're still somehow there. And then we have lots of words, and I found it a weird idea, to be honest, 326 00:32:52,340 --> 00:32:59,620 I didn't even know there were words for that, where it's about abruptly ending the family line. 327 00:32:59,620 --> 00:33:07,010 And then we did the same stuff again, where we had a family with only boys and a family with only girls. 328 00:33:07,010 --> 00:33:11,540 And then you have the words, whether you continue the family or you don't. 329 00:33:11,540 --> 00:33:15,740 And again, you have images, or you hear the different words. 330 00:33:15,740 --> 00:33:25,370 Then we reverse the coding again. It's the same story as before, and we measure again the score. Now to talk about the results. 331 00:33:25,370 --> 00:33:30,980 It's again from minus two to two. It's the same, it's basically a programme you buy, and it's always working in the same way, 332 00:33:30,980 --> 00:33:35,720 which is nice because it's nice to compare when other people use the tool. 333 00:33:35,720 --> 00:33:45,140 And we did it for valence, where one pairing was just positive and negative, and for stereotypes, as I just explained to you. And again, 334 00:33:45,140 --> 00:33:51,510 don't over-read this data, it just came in two weeks ago. And I don't want to go into the details. 335 00:33:51,510 --> 00:33:53,000 I really want to talk about the methods. 336 00:33:53,000 --> 00:34:01,910 But what's interesting is that in our data, very systematically across all the different regions, both for valence and for stereotypes, 337 00:34:01,910 --> 00:34:08,360 it looks like the husbands are, so to say, the problem. The husbands have a strong preference for a boy, and the husbands seem 338 00:34:08,360 --> 00:34:13,580 to be very much leaning into the stereotype, and not the mother-in-law so much. 339 00:34:13,580 --> 00:34:25,450 And now we feel like we're kind of ready again.
Ready, after we understood all of this data, to maybe do an RCT now. 340 00:34:25,450 --> 00:34:32,090 Because now we are convinced that we can disentangle this awareness from the actual attitudes. 341 00:34:32,090 --> 00:34:36,680 But it's more complicated in Armenia, because we know there was this big campaign that reached everybody. 342 00:34:36,680 --> 00:34:45,170 We also did a lot of other measurements of son preferences, again trying to validate one against the other and not just, 343 00:34:45,170 --> 00:34:49,340 you know, developing an IAT and running it afterwards. And also, 344 00:34:49,340 --> 00:34:57,770 it was very useful that we worked with computerised methods, because typically the DHS is measuring son preferences in a way where it first 345 00:34:57,770 --> 00:35:04,880 asks you: if you could go back to the time you did not have children and choose exactly the number of children to have in your whole life, 346 00:35:04,880 --> 00:35:11,270 how many would that be? And then they ask you, based on this answer, what the sex composition should be. 347 00:35:11,270 --> 00:35:18,830 And of course, it's a very problematic question. It's conditioning, first of all, on the family size. If you say two, 348 00:35:18,830 --> 00:35:23,740 you can't really reveal an unbalanced preference: once you say two, unless you have 349 00:35:23,740 --> 00:35:28,310 a preference that is very extreme and say son, son, you might say son, 350 00:35:28,310 --> 00:35:33,620 daughter. And of course, the family size varies across different regions, 351 00:35:33,620 --> 00:35:37,970 and the preference varies with the family size, so you also have lots of noise there. 352 00:35:37,970 --> 00:35:44,240 And again, all of this was done in privacy. We managed to do all of this in a way that people could do it on their own. 353 00:35:44,240 --> 00:35:53,450 Of course, this was not as hard as in Sudan.
I mean, there they could not read or write, although here too we audio recorded everything at the same time. 354 00:35:53,450 --> 00:35:57,940 And then what we basically did here was use an indirect measurement. 355 00:35:57,940 --> 00:36:02,720 So first of all, we asked them about the family composition for the youngest child. 356 00:36:02,720 --> 00:36:09,920 So you don't have this problem that you basically have to rationalise: you already have three children, 357 00:36:09,920 --> 00:36:11,690 but you really think two would be best, 358 00:36:11,690 --> 00:36:20,780 but you don't really want to say that, because then it seems kind of, you know, unfair or mean, or like you're saying something really bad about one of your children. 359 00:36:20,780 --> 00:36:29,450 So we worked with a hypothetical case where we ask them basically what would be the right size and composition for the youngest child. 360 00:36:29,450 --> 00:36:38,420 And we could do this because it was computerised: on paper, you could never do all of this in a way that a person would handle it on their own. 361 00:36:38,420 --> 00:36:42,140 So we could programme it in a way that we ask: what's the perfect family size? 362 00:36:42,140 --> 00:36:45,710 And we had lots of questions in between, so people would forget about this question. 363 00:36:45,710 --> 00:36:49,580 And then, in a random order, we would ask different people: if you have one child, 364 00:36:49,580 --> 00:36:56,270 what's the perfect sex composition? If you have two, if you have three, if you have four? 365 00:36:56,270 --> 00:37:04,100 So again, this was extremely easy in a way. If you think about it hard enough, you can programme it in a way that these people could really do it 366 00:37:04,100 --> 00:37:09,470 without needing help for it, because the programme would do everything for them.
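A computerised questionnaire like the one described, a fixed opening question about ideal family size, filler items, then the conditional sex-composition questions in random order, could be sketched like this. The question wording and function name are my own illustration, not the study's actual instrument:

```python
import random

def build_order(filler, max_children=4, seed=None):
    """Fixed opener, then filler items, then the conditional
    sex-composition questions shuffled, so a fixed question order
    can't anchor respondents' later answers."""
    rng = random.Random(seed)
    conditional = [
        f"If you had {n} children, what would be the perfect sex composition?"
        for n in range(1, max_children + 1)
    ]
    rng.shuffle(conditional)
    return ["What is the perfect family size?"] + list(filler) + conditional
```

Because the branching and shuffling live in the programme, each respondent can click through alone, which is exactly what removes the interviewer from the loop.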
367 00:37:09,470 --> 00:37:18,880 And in that way, it was also nice again that there was no bystander effect and no interviewer effect. Just to give you a short anecdote: 368 00:37:18,880 --> 00:37:24,910 we worked with the National Statistical Service of Armenia, who do a lot of the data collection there, 369 00:37:24,910 --> 00:37:28,810 and they were telling us that we would get the best interviewers. 370 00:37:28,810 --> 00:37:34,300 Maybe I shouldn't say this, but anyway, we got the best interviewers they have. 371 00:37:34,300 --> 00:37:40,030 And then, when I trained them, there were a couple of them that came to me and said, hey, you know, you just tell me the answer 372 00:37:40,030 --> 00:37:44,710 you want to get, and I will get you the answers you want in your questionnaire. 373 00:37:44,710 --> 00:37:51,580 So we saw yet another reason to have computerised methods and leave the interviewer out as much as possible. 374 00:37:51,580 --> 00:37:57,490 And I mean, he really had the best intentions, right? I don't even mean that as a criticism; 375 00:37:57,490 --> 00:38:04,330 people have other ideas about how they want to put data together, I guess. Just to answer very shortly: 376 00:38:04,330 --> 00:38:10,870 you see, of course, this is all done on their own and you get a lot more insights, because one of the key questions also 377 00:38:10,870 --> 00:38:14,650 is whether it is a preference for a son, or a preference for "the more sons, the better". 378 00:38:14,650 --> 00:38:19,660 And so, for instance, you see: if it's just one child, everybody obviously wants a son. 379 00:38:19,660 --> 00:38:27,130 This is across the different regions and the three respondents: husband, wife, mother-in-law. 380 00:38:27,130 --> 00:38:30,850 But then if you ask them about two children, they all want a son and a daughter. 381 00:38:30,850 --> 00:38:37,210 So it's not just "the more sons, the better".
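The kind of summary shown on this slide, shares of stated ideal sex compositions by family size, is straightforward to compute from the raw answers. A minimal sketch with hypothetical data (the tuples and counts below are invented for illustration, not the study's results):

```python
from collections import Counter

def composition_shares(responses):
    """Group stated ideal compositions by family size and return each
    composition's share of the answers for that size."""
    by_size = {}
    for comp in responses:
        by_size.setdefault(len(comp), Counter())[comp] += 1
    return {
        size: {comp: n / sum(counts.values()) for comp, n in counts.items()}
        for size, counts in by_size.items()
    }

# Hypothetical answers: four one-child ideals and three two-child ideals.
shares = composition_shares([
    ("son",), ("son",), ("son",), ("daughter",),
    ("son", "daughter"), ("son", "daughter"), ("daughter", "son"),
])
```

Keeping the composition as an ordered tuple preserves birth-order preferences, so "son then daughter" and "daughter then son" stay distinct, which matters for the point about the first child below.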
And then, I personally think, it gets really interesting when you have three children, 382 00:38:37,210 --> 00:38:43,360 or even four. If you ask for four, there are many combinations, and people seem to be relatively clear. 383 00:38:43,360 --> 00:38:49,870 This region is a little different: the sex ratio is much lower, much more even, than in these two. 384 00:38:49,870 --> 00:38:55,150 But what's interesting is that everybody seems to be sure that the first one should be a son, 385 00:38:55,150 --> 00:39:00,190 and nobody wants something like daughter, son, daughter, daughter or something like this. 386 00:39:00,190 --> 00:39:06,710 So I think, again, if you programme all of this, it's much richer data, and 387 00:39:06,710 --> 00:39:11,980 it's treating the people with more respect, instead of just asking them all of this directly. 388 00:39:11,980 --> 00:39:20,200 And the last thing I want to talk about in the remaining two minutes is vignette studies where you use incentives, because if you use incentives, 389 00:39:20,200 --> 00:39:26,980 of course, you can also really go for the true answer in a way that you can't if you don't give incentives. 390 00:39:26,980 --> 00:39:31,420 And I guess most of you know how vignettes work, so you have a little background story. 391 00:39:31,420 --> 00:39:37,990 In our case, people would listen to a story about a couple that has two daughters and is planning to have, let's say, one last child. 392 00:39:37,990 --> 00:39:43,540 We put the age in a way where it was clear to people that the chances would not keep increasing anymore. 393 00:39:43,540 --> 00:39:48,220 And then they find out that they are pregnant, and it is a girl. 394 00:39:48,220 --> 00:39:49,180 And then in a vignette, 395 00:39:49,180 --> 00:39:59,290 you vary different experimental treatments, so to say. You can vary the financial security, whether they think the daughter would be useful in the future.
396 00:39:59,290 --> 00:40:03,190 Whether she marries then, and different things like that. 397 00:40:03,190 --> 00:40:07,180 And then you also vary whether the people abort or don't abort. 398 00:40:07,180 --> 00:40:15,190 And then you ask people basically their private opinion, and you can also ask them their beliefs about what other people would answer. 399 00:40:15,190 --> 00:40:21,850 And they do this in privacy. So we have everything: it's an indirect measure through the wording, 400 00:40:21,850 --> 00:40:24,190 we provide an incentive, as you see now, and again 401 00:40:24,190 --> 00:40:30,190 they could do it completely on their own, without audio recordings, without anybody listening or watching them, 402 00:40:30,190 --> 00:40:35,800 so to say. What's nice about this is: you ask them, what do you think about the choice of the couple? 403 00:40:35,800 --> 00:40:43,120 Let's assume they believe their daughter would be financially secure, and they decided to abort. 404 00:40:43,120 --> 00:40:47,830 Then you ask them: are you happy, sad, very happy, very sad for the family? 405 00:40:47,830 --> 00:40:55,090 And this is just the private opinion. And what you do in incentivised vignettes is basically let them play a coordination game, in the way that you say: 406 00:40:55,090 --> 00:41:00,720 OK, now we are interested in what you think a specific group answered. 407 00:41:00,720 --> 00:41:06,460 So it's like you're trying to measure perceived social norms; people use these as a focal point. 408 00:41:06,460 --> 00:41:11,800 And in that way, we basically tell them: we did the same study, the same question, with other women. 409 00:41:11,800 --> 00:41:16,630 If you correctly guess what another woman answered to that, 410 00:41:16,630 --> 00:41:23,080 you get, I don't know, 10 U.S. dollars. And like this, you disentangle this kind of stuff: they're not trying to be consistent.
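The incentivised belief question described here, guess what a reference group answered and earn a reward for a correct guess, can be scored in a few lines. A sketch under my own assumptions: I score each guess against the modal answer of the reference group, one common variant of the design the talk describes (matching a randomly drawn other respondent is another):

```python
from collections import Counter

def score_guesses(guesses, reference_answers, reward=10):
    """Pay `reward` to each respondent whose guess matches the modal
    answer of the reference group (the focal point of the perceived norm)."""
    modal = Counter(reference_answers).most_common(1)[0][0]
    return {pid: reward if guess == modal else 0
            for pid, guess in guesses.items()}

# Hypothetical respondents guessing what "the other women" answered.
payoffs = score_guesses(
    {"respondent_1": "sad", "respondent_2": "happy"},
    ["sad", "sad", "sad", "happy"],
)
```

Because the payoff depends on others' answers rather than the respondent's own view, a correct guess reveals the perceived social norm even when it differs from the respondent's private opinion.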
411 00:41:23,080 --> 00:41:27,670 They don't try to rationalise, to just say what they think everybody expects. 412 00:41:27,670 --> 00:41:32,620 The incentives are pretty high, and they know that they immediately get paid. 413 00:41:32,620 --> 00:41:39,250 We made that clear, we showed it before the beginning of the study, and they trusted us at this point. 414 00:41:39,250 --> 00:41:49,790 And again, this is, for a development context, a pretty elaborate way of collecting data, of making a lot of effort 415 00:41:49,790 --> 00:41:56,600 to try to disentangle this stuff: awareness versus real behaviour and attitudinal change. 416 00:41:56,600 --> 00:42:04,460 And this is, I think, where I'm really going. We don't have final results on this one, which is why I don't show results. 417 00:42:04,460 --> 00:42:09,040 It's really just to summarise it, I think. 418 00:42:09,040 --> 00:42:16,780 I think there's really this big trend that we are really convinced now that it's just not enough to tell people what's right to do. 419 00:42:16,780 --> 00:42:22,990 Information alone clearly does not lead to change, at least not with those topics. 420 00:42:22,990 --> 00:42:28,240 We also know that just giving incentives is not always enough; it's just not enough to change 421 00:42:28,240 --> 00:42:33,670 the incentive structure, to do conditional cash transfers. 422 00:42:33,670 --> 00:42:40,930 The idea is: if the programmes are relying so heavily on normative language, somehow telling you what's best to do, 423 00:42:40,930 --> 00:42:48,130 establishing role models, and all of these things working with mental model shifts, then you do need to get better measurement. 424 00:42:48,130 --> 00:42:53,080 And I think this is a trend: that you also use mobile labs and that you work with incentives.
425 00:42:53,080 --> 00:43:01,420 You have this privacy, you have kind of interdependent situations, and you are not saying: 426 00:43:01,420 --> 00:43:08,350 we can't do that because people have never seen a computer, people can't read and write, this is a really hard situation. 427 00:43:08,350 --> 00:43:13,420 So I would encourage everybody to always think that it is your job to sort that out, 428 00:43:13,420 --> 00:43:24,250 and not the job of the participants. With that, I have to say thank you. 429 00:43:24,250 --> 00:43:33,410 If you have any questions, I can take a last few. 430 00:43:33,410 --> 00:43:41,720 Thank you ever so much, I really loved the talk, but I wanted to ask: how did you run pilot studies on these things? 431 00:43:41,720 --> 00:43:47,510 Like, how long has it taken you to set up this final body of work, for example? 432 00:43:47,510 --> 00:43:55,220 Because I do field experiments and they take forever. And do you also run pilots? 433 00:43:55,220 --> 00:44:02,870 And I guess linked to some of this: how much confusion did you generate amongst the participants? 434 00:44:02,870 --> 00:44:08,390 Did you run pilots to find out what they do and do not understand, 435 00:44:08,390 --> 00:44:14,510 and then go back, etcetera? Yeah. So yeah, they take long. 436 00:44:14,510 --> 00:44:18,140 I wouldn't have done the Sudan study during my PhD. 437 00:44:18,140 --> 00:44:21,650 Probably not. You also depend on other people, so you know, 438 00:44:21,650 --> 00:44:29,440 it's not just on me. But of course, there is a lot of piloting, and also what I would call material production. 439 00:44:29,440 --> 00:44:35,430 That's really important in Sudan.
The movie production took forever: 440 00:44:35,430 --> 00:44:41,760 writing the script and then arguing with a very talented writer, because I wanted to get my different treatments in. 441 00:44:41,760 --> 00:44:45,210 And all of this looks, oh, so easy once you present it, 442 00:44:45,210 --> 00:44:53,040 but it's incredible in some countries how long it takes to get from an idea to a sketch that you can use. In Armenia, 443 00:44:53,040 --> 00:44:59,970 they always looked like the Kardashians, and we kept telling them it has to look like a family with no clear socio-economic background and all of this. 444 00:44:59,970 --> 00:45:02,460 I mean, that matters, of course. 445 00:45:02,460 --> 00:45:10,950 But we were working with artists who are activists, I mean, being an artist in Sudan means being an activist, so you have a lot of pre-testing in that way. 446 00:45:10,950 --> 00:45:17,340 Of course, a girl is equally likely; it starts there. Then, what we did is, when we had our message together, 447 00:45:17,340 --> 00:45:21,690 we also really pre-tested the procedure, because exactly, we didn't want 448 00:45:21,690 --> 00:45:25,710 what you said. When we went into the final one hundred forty-four communities, 449 00:45:25,710 --> 00:45:31,470 I was never there. I had at that point sixty data collectors trained, and they were organised a bit hierarchically: 450 00:45:31,470 --> 00:45:37,710 there was a head person and it went down from there, and then four people were supervised by one, 451 00:45:37,710 --> 00:45:44,040 and all of this. Because of that, I think I can really say what people reported at the end: that it was 452 00:45:44,040 --> 00:45:48,320 fun. You know, we came in there, we had these big computers and set them up in a school. 453 00:45:48,320 --> 00:45:54,630 They also got rewarded for a full day, because they couldn't do what they normally would do.
454 00:45:54,630 --> 00:46:02,550 So I think actually, I saw some video footage the government took, and it really did look like people understood. 455 00:46:02,550 --> 00:46:08,190 And this makes a difference in how people experience this. So I think the message was clear: this might be a topic of sensitivity, 456 00:46:08,190 --> 00:46:12,660 I'm interested in your opinion, but I don't really care, I don't record what you really do. 457 00:46:12,660 --> 00:46:16,350 I will never ask you directly: do you continue to circumcise your daughters? 458 00:46:16,350 --> 00:46:23,880 I think people understood that we are interested in something and that we put a lot of effort into making them feel OK with it. 459 00:46:23,880 --> 00:46:29,700 And I think we actually managed to do that in a good way. But yeah, it was a lot before the Sudan study ran; 460 00:46:29,700 --> 00:46:38,270 I think it was over one and a half years. We got better, and I mean, once you've done it, you really get much faster. 461 00:46:38,270 --> 00:46:45,550 For instance, we do it next week in Albania. 462 00:46:45,550 --> 00:46:53,530 It's a long procedure: first the government, so we first present it to the government, and then we present it later on to the community, 463 00:46:53,530 --> 00:47:00,970 to the community leader and whoever they choose to join. 464 00:47:00,970 --> 00:47:10,690 Yeah. 465 00:47:10,690 --> 00:47:15,910 I assume that you have read the public letter in the Guardian a year ago, with the whole 466 00:47:15,910 --> 00:47:20,950 critique of RCTs: they made their name and they got a lot of money in the last ten years, 467 00:47:20,950 --> 00:47:26,140 but have they actually done anything besides being the thing to do? 468 00:47:26,140 --> 00:47:37,060 So I was wondering: in this time of increasing critique, how would you still validate that this is the way to go in development?
469 00:47:37,060 --> 00:47:43,000 I think this is exactly already a reaction to this, that we are trying new things. I mean, 470 00:47:43,000 --> 00:47:48,370 first of all, I agree, if you see how many RCTs are produced just by J-PAL, and they all pile up. 471 00:47:48,370 --> 00:47:55,060 I once had an amazing research assistant who did a review for me; I had an offer to do a project on malnutrition, 472 00:47:55,060 --> 00:47:57,610 and he got everything together that was ever done. 473 00:47:57,610 --> 00:48:03,230 We looked at all the published studies that seemed really reliable, and it was a mix of everything. 474 00:48:03,230 --> 00:48:07,060 Right? Everybody changed something: here the package was changed, there it was a different country. 475 00:48:07,060 --> 00:48:11,890 And so, I mean, this is of course a serious criticism. 476 00:48:11,890 --> 00:48:16,150 I do think this behavioural insights approach, using behavioural science, 477 00:48:16,150 --> 00:48:21,790 is a step in the right direction, because, what I didn't say before, it basically means you focus much more on mechanisms. 478 00:48:21,790 --> 00:48:27,670 So that's why we had four movies in Sudan. I didn't just want to show a best-of movie versus the control. 479 00:48:27,670 --> 00:48:34,630 We really could learn what the argument is: is it a coordination issue or is it a private values issue? 480 00:48:34,630 --> 00:48:42,280 And each requires a different programme, because in one, I'm just trying to shift a critical mass over the threshold, 481 00:48:42,280 --> 00:48:46,960 so to say, and in the other one, I can work on an individual basis. And I think that's my reply: 482 00:48:46,960 --> 00:48:54,190 first of all, that it's probably best, if you're really interested, to run an RCT, but this mechanism work should probably come first. 483 00:48:54,190 --> 00:49:01,300 And of course, that's expensive.
I mean, if I want to test mechanisms, I have four treatments and not just a control and a programme, 484 00:49:01,300 --> 00:49:05,100 but I think this is the way field experiments should head. 485 00:49:05,100 --> 00:49:12,430 I would argue, if you go back to the study in Sudan, the first one we did had just two hundred people. 486 00:49:12,430 --> 00:49:19,090 And this is something I also figured out: people think immediately about big stuff when they do RCT studies. 487 00:49:19,090 --> 00:49:24,820 But of course, you can also do it in a lab-in-the-field setting, to get started at least, and do it in a much smaller way. 488 00:49:24,820 --> 00:49:32,420 But again, with a focus on really trying to understand the mechanism behind it, and not just whether it works or doesn't work. 489 00:49:32,420 --> 00:49:42,260 But then there's still the time dimension, which is rather short. I mean, values develop over centuries; a programme just runs a couple of years. 490 00:49:42,260 --> 00:49:44,720 Well, that's another question to discuss, right? 491 00:49:44,720 --> 00:49:50,780 I mean, this depends very much on the topic, and there are examples where maybe an incentive works. 492 00:49:50,780 --> 00:49:53,270 I mean, there is no general answer I can give, is there? 493 00:49:53,270 --> 00:50:01,640 There are studies, there is one hugely funded one, where they showed a 20-minute movie and then, six months later, 494 00:50:01,640 --> 00:50:09,200 they could still measure an effect. We really were lucky we could come back after one week, and then the Sudanese government said: look, guys, this is it. 495 00:50:09,200 --> 00:50:12,710 I mean, those were really harsh conditions in Sudan. 496 00:50:12,710 --> 00:50:19,100 But of course, it would be wonderful if we could now go back and measure the attitudes again, and we are working on all of this. 497 00:50:19,100 --> 00:50:23,630 But of course, this is always the same challenge.
If you go into the field, this is just the kind of thing that happens. 498 00:50:23,630 --> 00:50:27,830 I mean, that's not surprising to you. Some programmes take longer. 499 00:50:27,830 --> 00:50:31,990 I mean, even in Armenia, it takes nine months to give birth to a child, or whatever. 500 00:50:31,990 --> 00:50:38,950 I mean, you shouldn't just wait two weeks and imagine measuring the effect afterwards, but... 501 00:50:38,950 --> 00:50:50,220 I mean, that's just how life is in development research, I would say. 502 00:50:50,220 --> 00:50:56,490 Thank you. I was wondering, could you go back to the distribution? Does it look like this? 503 00:50:56,490 --> 00:51:00,000 I mean, there's a lot of within variation in this one? Yeah. 504 00:51:00,000 --> 00:51:05,280 Mm hmm. Is that right? Like, I mean, it's kind of surprising. 505 00:51:05,280 --> 00:51:16,140 I was wondering, are they aware that there is a lot of variation between them? 506 00:51:16,140 --> 00:51:21,270 That is interesting, and I mean, I think this is really only interesting for you 507 00:51:21,270 --> 00:51:25,020 if I had provided even more background: when we started this project, 508 00:51:25,020 --> 00:51:30,780 there was only this one idea that everybody cuts and nobody knows how to do better. 509 00:51:30,780 --> 00:51:34,920 This was basically the approach that goes back to this paper by Gerry Mackie, 510 00:51:34,920 --> 00:51:42,630 where he really said this is a coordination issue, which is good for fundraising, because everybody is somehow locked into this equilibrium. 511 00:51:42,630 --> 00:51:53,070 And then we figured out, I mean, what you see here, where we did this now in many, many more communities, 512 00:51:53,070 --> 00:51:59,580 we did it with one hundred and forty-four now, and we figured out that people basically know who cuts and who doesn't.
513 00:51:59,580 --> 00:52:05,760 And in the same month, an economist published a paper using DHS data, also showing exactly this: 514 00:52:05,760 --> 00:52:10,660 the variation is at the household level, not at the community level, in several different African countries. 515 00:52:10,660 --> 00:52:14,790 So when we started, we were only asking this question. 516 00:52:14,790 --> 00:52:21,360 We then later collected data where we asked people: actually, do you have a problem with, say, a cutting family? 517 00:52:21,360 --> 00:52:26,040 Are you willing to marry your daughter into a family that doesn't cut, or your son, or whatever? 518 00:52:26,040 --> 00:52:31,210 And about 40 per cent in every community said, yeah, we don't care. 519 00:52:31,210 --> 00:52:33,150 So we were even more surprised. 520 00:52:33,150 --> 00:52:41,460 And then I just have an anecdote: we asked one of the data collectors, can you point out in your community who cuts and who doesn't? 521 00:52:41,460 --> 00:52:44,760 And he said, yeah, of course, I know exactly who is doing what. 522 00:52:44,760 --> 00:52:59,910 So it's a lot of new information, and it would now need another study to properly understand it. 523 00:52:59,910 --> 00:53:06,720 Yeah, thank you. You mentioned in passing, quite reasonably given the time constraint, that you weren't going to cover 524 00:53:06,720 --> 00:53:12,810 all of the qualitative work that went into being able to think about things like the patterns on the children's clothes. 525 00:53:12,810 --> 00:53:18,600 But these are obviously very large projects that have to be coordinated. 526 00:53:18,600 --> 00:53:23,580 I would be interested in hearing: are they coordinated between multiple groups, or 527 00:53:23,580 --> 00:53:28,050 between scholars in the same working group? How do you kind of bring together this team? 528 00:53:28,050 --> 00:53:34,630 That's obviously.
Different expertise, both within and across nations. 529 00:53:34,630 --> 00:53:37,120 Or: what is the team? Yeah, yeah. Yeah, I should say. 530 00:53:37,120 --> 00:53:45,490 Anyway, I should have mentioned that all of this work is done within a small team, with a biologist and, in my case, a behavioural economist. 531 00:53:45,490 --> 00:53:51,310 So, yeah, we were not a development department, and it was a real collaboration, of course. 532 00:53:51,310 --> 00:53:56,500 I mean, it was really UNICEF who was our implementing partner in the field. 533 00:53:56,500 --> 00:54:03,940 Of course, they have a Communication for Development office. We had an amazing officer there, and she had lots of insights into all of this. 534 00:54:03,940 --> 00:54:09,490 We did trust the Sudanese talent to write the script, and we really first let him go free, 535 00:54:09,490 --> 00:54:15,320 because sometimes we first need the perfect story and then we need to make it scientific. 536 00:54:15,320 --> 00:54:23,220 And female genital cutting is a topic where a lot of great work was done before, so we didn't need to do a lot. 537 00:54:23,220 --> 00:54:27,190 There is an anthropologist who did a lot of work on this: 538 00:54:27,190 --> 00:54:35,090 trying to find out what a campaign on medicalisation does, seeing if people even care if you say this is about human rights; 539 00:54:35,090 --> 00:54:42,790 she studied backlash and all of this. So this is actually a topic where ours was the first RCT and the first serious quantitative approach, 540 00:54:42,790 --> 00:54:49,540 but it's a topic where there is a lot of really great qualitative research. And there was even a woman, 541 00:54:49,540 --> 00:54:51,490 she knows exactly where we were in Sudan, 542 00:54:51,490 --> 00:54:57,820 who published a book in the early eighties about her work as an anthropologist in Sudan, in exactly that location. 543 00:54:57,820 --> 00:55:07,550 So this was good.
So the good qualitative work, this was a foundation that you already had; it wasn't done by members of your team. 544 00:55:07,550 --> 00:55:13,510 Which might be different in a situation where there wasn't a lot of it. 545 00:55:13,510 --> 00:55:18,930 Yeah, I think so. I'm reading grant proposals now on similar topics, so... 546 00:55:18,930 --> 00:55:23,230 So, yeah, qualitative research is important. That's right. 547 00:55:23,230 --> 00:55:30,170 And I would also now, to be honest, I would say there was 548 00:55:30,170 --> 00:55:34,190 research done by UNICEF, a bit more, that went into it. 549 00:55:34,190 --> 00:55:40,340 I did now read several grant proposals where there really clearly was a qualitative person, where they really have an anthropologist, 550 00:55:40,340 --> 00:55:43,890 somebody on board from the beginning. I don't know. 551 00:55:43,890 --> 00:55:46,220 Maybe here we just... I don't know. 552 00:55:46,220 --> 00:56:00,650 We maybe had that covered by UNICEF at that time, but of course, it would be nice to have somebody on board and do that beforehand. 553 00:56:00,650 --> 00:56:12,970 Any last questions? We don't have any more questions, and I want to thank Sonja again for a really insightful, very interesting talk 554 00:56:12,970 --> 00:56:19,150 to close Experiments Day, and also for coming all the way from Switzerland for it. 555 00:56:19,150 --> 00:56:23,850 Yeah, so let's give her a round of applause.