Good evening, everyone. My name is Walter Armbrust. I'm a fellow at the Middle East Centre and I'm hosting tonight's webinar. The title of tonight's session is "Counter-Revolution versus Counter-Marginalisation Movements: Revisiting the Online Tug of War a Decade after the Arab Spring". And we're fortunate to have two excellent speakers tonight. I'll introduce them.

The first is Dr Sahar Khamis, who is associate professor in the Department of Communication and also an affiliate professor of women's studies at the University of Maryland. She's an expert on Arab and Muslim media and has co-authored with Mohammed el-Nawawy two books: Islam Dot Com: Contemporary Islamic Discourse in Cyberspace, in 2009, and Egyptian Revolution 2.0: Political Blogging, Civic Engagement and Citizen Journalism, in 2013, both published by Palgrave Macmillan. She's also co-editor, with Amel Mili, of Arab Women's Activism and Socio-Political Transformation: Unfinished Gendered Revolutions, in 2018, again by Palgrave Macmillan. She has authored and co-authored numerous book chapters, journal articles and conference papers in both English and Arabic. And she also works a great deal as a media commentator and analyst, as a public speaker, as a human rights commissioner in the Human Rights Commission in Montgomery County, Maryland, and as a radio host who presents a monthly radio show on U.S.
Arab Radio, which is the first Arab American radio station broadcasting in the United States and Canada.

Our second speaker is Marc Owen Jones, who did his PhD in 2016 at Durham University, where he wrote an interdisciplinary thesis on the history of political repression in Bahrain, which won the 2016 Association for Gulf and Arabian Peninsula Studies prize. Dr Jones spent much of his childhood in Bahrain and has also lived in various parts of the Middle East, including Saudi Arabia, Sudan and Syria. He now teaches at Hamad Bin Khalifa University in Qatar, and before that taught at Tübingen University's Institute for Political Science and also as a lecturer in the history of the Gulf and Arabian Peninsula at Exeter University's Institute of Arab and Islamic Studies. He has edited two books and has an upcoming monograph on political repression in Bahrain with Cambridge University Press. I don't know what the title is, but perhaps you will tell us during the course of the lecture. In addition to his academic work, Marc enjoys communicating his research to broader audiences and has bylines in the Washington Post, the New Statesman, CNN, the Independent, PEN International and several others. And he's also appeared frequently on the BBC, Channel 4 News and Al Jazeera.

So we'll now move straight to our speakers. We'll have Sahar first and then Marc; each will speak for 10 to 15 minutes and then we'll have Q&A. So take it away, Sahar.
Thank you so much for this kind introduction and for the kind invitation, Walter. It's really an honour and a pleasure to join you all today. It's an honour to join this college in Oxford and the Middle East Centre, this great intellectual and academic institution.

My talk today is going to focus on what has been changing and shifting ten years after the eruption of the so-called Arab Spring uprisings in the region: what has been happening in terms of the shift in the role of social media, why these shifts have been happening, some of the paradoxes that emerged out of these shifts, and how we can understand the dynamics of the relationship between different political and social players in the Arab political and media landscapes in light of all of these changes and paradoxes.

Let me first start by saying that when the Arab Spring uprisings erupted 10 years ago, there was this moment of euphoria. I am an Egyptian American, so obviously I was very euphoric about the eruption of the uprisings. But as communication scholars, we also had this moment of, quote unquote, techno-euphoria. We were delighted at what social media can do: the role of social media as amplifiers and catalysts and mobilisers and coordinators, how they can bring people together and link them to actions on the ground, helping them to mobilise through Facebook pages like "We Are All Khaled Said", which could draw hundreds of thousands of followers, on Twitter, you name it.
It was this moment of techno-euphoria. Ten years later, we are looking at a completely different landscape, both politically and in terms of mediated communication in the region as well. What we are seeing instead is a rise in what we call digital authoritarianism: this phenomenon whereby regimes are jumping on the social media and digital media bandwagon. They are polishing their own skills technically and digitally. They're creating their own cyber armies, electronic armies, going after their opponents many times: hacking, sabotaging, blocking, and sometimes even with more dire consequences, like finding somebody's IP address, going after this person, arresting him or her, putting them in jail or, in some dire circumstances, even ending their lives altogether.

So this is a completely different picture now in terms of this rise of what we call digital authoritarianism, which replaced this earlier moment of techno-euphoria. Why did that happen? How did it happen? Well, it is really showing us some of the limitations of social media: that they can play a dual role, a double role. They can be instruments or tools for mobilisation and liberation, but they can also be tools for repression. This dual role of social media as both tools of liberation and repression is very important.
And we should, as communication scholars, sociology scholars and political science scholars, really understand that, and investigate the reasons and the dynamics behind this kind of duality in the role of social media.

There are also other dualities that we should pay attention to. For example, when you have a moment of collective solidarity: people are all in Tahrir Square, they have a common goal, they are all calling for Mubarak to go, "You have to go, you have to go." People, regardless of gender, regardless of age, regardless of religion, regardless of their own personal leanings or orientations, were all united in this wonderful, historic moment of solidarity and unity. If you have this kind of solidarity and unity, which is very rare, but when it happens, social media become a very important tool for amplifying the voices of dissent and the voices of opposition, bringing people together and uniting them behind a common cause. But once you start to have political divisions, fragmentation and polarisation, which is pretty much what has been witnessed in a country like Egypt and many of the other Arab Spring countries that saw reversals and backlashes in their journeys to democratisation and reform, with the exception of Tunisia, then this moment of unity and uniformity and solidarity gets replaced with divisions and fragmentation.
And once that happens, the more polarisation and fragmentation you have, the more social media are going to be used as weapons to increase the distance between the self and the other, to widen the gaps and the distances between different groups. This is the weaponization of social media, meaning they are going to be used as tools and weapons to attack your opponents and to come after them.

So that is the picture we are witnessing now, unfortunately, in the so-called Arab Spring countries, as I said, with the unique exception of Tunisia, which was able to have a much smoother path towards democratisation and reform. The other countries, unfortunately, have had very dire outcomes. I need not go into details, because I'm sure everybody on this call knows about it. In this particular case, social media were weaponized in two very dangerous directions. In the hands of the regimes, they become tools of repression and authoritarianism, as we explained. And in the hands of opposing parties that are going after each other, they become weaponized to go after your opponents and to try to attack them, smear them and stigmatise them online. So that is not a pretty picture, unfortunately.

Why did this picture happen? What really caused all of this? I guess that we have over-credited social media too much. We have seen labels like the Facebook revolution in Egypt, the Twitter uprising in Tunisia, the YouTube uprising in Syria.
And I think we were really carried away, pulled into this moment of euphoria, over-crediting social media as if they are magical tools that can bring about reform and change and democratisation all by themselves. Ten years forward, we are looking at a completely different picture.

So what are some of the limitations of the role of social media? I talked about some of them already, but there are other things we have to take note of as well. They cannot fill the gap or the void created by the vacuum of not having an active civil society. In many of these countries, there was no active civil society, no real action in terms of having NGOs and oppositional groups and activism on the ground. And when you have this kind of vacuum, you cannot really rely on social media to come in as a magical tool and fill it. They are not going to be able to do so. You need to have a coordinated effort at every level of society, politically, socially, economically, and in terms of media and communication, to be able to really move forward towards a successful journey of democratisation. And what we are witnessing now, these failures or these reversals in democratisation, are telling us: hey, wait a minute, there is only so much that social media can do. After all, they're not magical tools. They're not going to bring about change and reform and democratisation all by themselves.
You need to have an active, vibrant, organised civil society to fill some of these vacuums and to create this kind of momentum on the ground first, and then social media can come in as a supplementary and complementary factor, or as a catalyst that can speed up this process of reform and transformation. That, in my opinion, is a very important lesson and a very important takeaway from these reversals that we are witnessing ten years after the Arab Spring.

Having said all of that, I want to also focus on the theme that we're talking about, which is counter-revolution and countering marginalisation. When the Arab Spring uprisings erupted, people were chanting "'aish, hurriya, 'adala igtima'iyya": bread, freedom and social justice, which is what people were asking for. Did they ask for a change of regime and all of that? Well, that is true. And it's also true that certain groups felt more marginalised than others, based on gender, based on socio-economic status, based on their own religious or political ideologies. They felt like they didn't have a place at the table. And they felt that this would be a golden moment for them to really amplify their voices, to make their voices heard in another way, and to get their demands and interests taken into account and taken into consideration. For example, look at the women's movements, the feminist movements, in the Arab world.
After the initial success of the Egyptian revolution, leading to the ousting of Mubarak from office, women's movements said: that's a golden moment, awesome, let's try to capitalise on that, let's try to build on it. So on International Women's Day, which is, by the way, March 8th, so just coming up in a few days, they said: OK, let's have a million-women march in the heart of Tahrir Square, trying to mimic the milyoniyya, the million-person marches in the square calling for reform and democratisation and political change. Let's do this kind of million-women march in the square. Unfortunately, only five hundred women showed up, and they were shouted at with very bad words, basically putting them down and asking them to go to the hair salon and just use nail polish and look pretty, because that's their space.

Well, that was a very disappointing moment, because obviously women played a very important and central role in the midst of the Arab Spring uprisings. Many of them sacrificed: they were harassed, they were arrested, they were beaten. Some of them even paid with their own lives as a price, and lost their lives as a result of this kind of resistance and being at the front lines. What is the takeaway here? What can we learn from this sad moment? I think we can learn the lesson about democratisation that ousting a dictator from office is easy,
but figuring out what to do next is not. The same thing could be said about the groups of women working in the area of feminist activism, for example, in this particular case. You can also say organising a campaign online is easy, calling for a march is easy, but changing societal attitudes and changing society's mindset is not. So unless and until the society itself is prepared for this type of socio-political or socio-economic change, we cannot expect much success just based on tweeting or posting or blogging about it: let's have a great million-women march in the midst of Tahrir Square to capitalise on the revolution. It's not going to happen, and that shows the kind of work that needs to be done ahead of us.

The same thing is applicable to religious groups, Islamist groups. Some of them also tried to create some kind of buzz about themselves and their activities, and tried to amplify their voices. We know, of course, the story of the Muslim Brotherhood in Egypt: they had this very brief golden moment during President Morsi's time, and they were banned, harassed and persecuted before and after. So some of them are still trying to use social media platforms to amplify their voices or to make their messages heard, but it is definitely not creating the kind of change or the kind of visibility that they are really looking for. And sometimes visibility can even backfire against them because of their anti-government position.
It can be a double-edged sword as well, again raising a flag that you can only do so much just using social media. You need more structural and organised change if you really want actual change on the ground.

So let me now move to another part of my talk, which is the paradoxes. What kind of paradoxes are we really witnessing in this post-Arab Spring space, 10 years after the eruption of the uprisings? I talked about some of them already, but I want to highlight others, which are equally important. One of them, for example, is this whole paradox of anonymity versus visibility. If you are anonymous, in some cases it can protect your identity. You can be safer: the government is not going to go after you, it is not going to harass you, because you are anonymous. So some of the activists, especially those belonging to certain, quote unquote, marginalised groups, such as certain opponents of the regime, whether from one particular political ideology or another, or a certain religious ideology, have resorted to this curtain of anonymity to protect themselves from regime harassment. Also, for women in particular, in traditional and conservative societies, anonymity becomes more important because they are fighting a double battle: there is a political battle against the regime, and there is also a social battle.
Again, certain societal considerations and constraints and stigmas and traditions pretty much hamper their mobility, advancement and visibility. So anonymity could be helpful. But, just like everything else, it also has its downsides. And the downsides are that it can decrease their visibility and their overall credibility as players in the socio-political field. So some of the activists I interviewed have been saying: yes, I can protect myself by being anonymous, but guess what? CNN is never going to try to interview me and find out where I am and have a great interview with me, because nobody knows who I am anyway. So that's really one of the paradoxes.

Another paradox is the double-edged sword of resistance in the diaspora, in the midst of this very repressive environment we talked about. There is no question that a lot of opponents of the regimes feel much safer voicing their concerns and opposition at a distance. Many of them are in self-imposed exile, living abroad in many different countries all over the world, trying to stay away from the regimes' oppression and from the regimes going after them. Again, that's a double-edged sword in a way. It can give you a safe space where you can voice your opposition to the regime without being put behind bars, arrested or killed.
But on the other hand, there's only so much you can do in the diaspora, when you are away from your own home country, because sometimes you will not have enough impact or enough effect on the dynamics of the socio-political field inside your own country. If you are just airing this diaspora opposition online, and you are not actually having real interaction with people at home, it can also limit your ability to have some kind of impact or effect on your home country. So that's another paradox.

A third paradox, which is very interesting, is the shift from what we call state feminism, or top-down feminism, sometimes called the "First Lady syndrome", when the government tries to give you the impression, or the image, that it is really creating reform or change, but in reality nothing is happening in terms of affecting women's lives on the ground and making changes in their lives in a meaningful way. And, of course, the shift to what we call grassroots activism, activism at the grassroots level, which means women themselves starting to take matters into their own hands, organising their own movements and mobilising on the ground, as we saw, for example, in the midst of the Arab Spring movements. Unfortunately, a lot of the so-called, quote unquote, top-down forms of feminism have had a very limited trickle-down effect. They did not really impact women's lives in meaningful ways.
Whether you're talking about the appointment of many women ministers in Egypt, or whether you're talking about non-Arab Spring countries like Saudi Arabia trying to give an image of reform and an image of change, in reality women's situation is still very constrained and still very much a negative reality on the ground. So that's another interesting paradox that we have to be aware of.

And I'll end with one other paradox, which is very important, which is the fact that, unfortunately, not all of the quote unquote marginalised groups that we're talking about, which are trying to use social media to counter their own marginalisation, have necessarily been supportive of the revolutions. Some of them have actually turned against the revolutionary movements themselves, which is a very interesting but very sad paradox to talk about. For example, we have seen some groups of women and other marginalised minorities that have been part of the quote unquote "Asfeen ya Rayes", or "Sorry, Mr. President", movement that was supporting Mubarak, and some of them are also supporting the military dictatorship under Sisi, and in other parts of the Arab world as well. We've seen some Islamist groups, like the Salafis in Egypt, for example, taking positions that are pretty much in support of the military coup in Egypt, supporting it instead of taking a position against it.
So these are, quote unquote, marginalised groups that, instead of being revolutionaries, have taken the position of counter-revolutionaries. I think the explanation could be an attempt on their part to counter their own marginalisation by appeasing the regime, by making peace with the status quo rather than going against it.

And finally, we cannot turn a blind eye to the moment that we are all in right now, which is the pandemic moment, which brought about a lot of constraints on activism in general, and social media activism in particular, with the enacting of a lot of cybercrime laws. These put a lot of constraints on activists and journalists and people who try to get the word out about anything that does not go along with the government's policies or the government's plans. In many cases, these people find themselves forcibly silenced. Their websites are blocked. They can be put behind bars. In some extreme cases, they can even be arrested and killed. These are very sad stories, but very real ones. And the future prediction is that this whole wave of digital authoritarianism is going to escalate.
It is going to go from bad to worse as both parties, the regimes on one hand and their opponents on the other, try to master their own skills of cyber activism, using them as weapons to attack the other and to make their own stories heard while silencing others. So that is the prediction moving forward in the post-pandemic era, 10 years after the Arab Spring. Thank you so much.

Thank you, Sahar. We'll now move to Marc.

Thank you very much, Walter, for your introduction. I should really update my bio, because the book is out: Political Repression in Bahrain. It's no longer upcoming. And thanks so much, Sahar. I think you've laid out the ground perfectly for this idea of digital euphoria, or digital utopianism, versus digital dystopia. As those who know me will know, I'm incredibly cynical, so I definitely situate myself on the digital dystopian side of things. I generally see things as going from bad to worse, at a general level. But today I want to give a few empirical examples of digital authoritarianism. What I want to do first, though, is actually think about digital authoritarianism not as a state-centric enterprise. And what I mean by that is that it's increasingly important to look at the various nodes within digital authoritarianism that enable it to happen.
There's a tendency, mostly within political science I think, to focus on the role of the state, the state's specific role in authoritarianism. But as we know, authoritarianism is a phenomenon whose resilience is abetted by a number of forces, including outside powers. So I'll start with a brief anecdote, though it's not really my anecdote, I'm not going to lie. Firstly, we have to accept that disinformation, fake news, propaganda, etc., are an aspect of digital authoritarianism. The anecdote concerns the role of a London PR firm that helped the party in power in Britain while also doing work promoting Mohammed bin Salman, the Crown Prince of Saudi Arabia. Why is that the case? Well, there's a market for whitewashing the reputations of dictators, and this has always been the case, for example, in the Gulf, given Britain's particular relationship with the Gulf. And it raises these questions of who profits from digital authoritarianism, and why digital authoritarianism is happening. Because the problem, when we talk about tools like Twitter and social media uncritically, is that we have almost an element of technological determinism, where we assume these tools just exist. But actually these tools exist for specific reasons.
249 00:23:33,450 --> 00:23:41,130 If we look at why Twitter has proliferated around the world, or why Facebook has proliferated around the world, we actually have to ask ourselves, 250 00:23:41,130 --> 00:23:51,150 why has a product such as Facebook or Twitter been allowed to be distributed around the world without any due diligence, 251 00:23:51,150 --> 00:23:55,100 for example, on the potential negative consequences of that product? 252 00:23:55,100 --> 00:24:01,700 Now, I think that in these times, if you sold a car or a washing machine, and that washing machine or car was faulty, 253 00:24:01,700 --> 00:24:05,550 it would be withdrawn; the manufacturer would withdraw that product because there's a safety problem. 254 00:24:05,550 --> 00:24:10,170 However, there's this normative assumption with technology. 255 00:24:10,170 --> 00:24:15,150 And this ties in with this notion of digital euphoria and techno-utopianism. 256 00:24:15,150 --> 00:24:22,410 This itself is an ontology, right? You know, those of you who have seen The Social Dilemma, I think it's on Netflix, 257 00:24:22,410 --> 00:24:25,770 will see that there's this notion of these tech firms in Silicon Valley who 258 00:24:25,770 --> 00:24:30,090 firmly believe that the technology they are creating is doing a social good. 259 00:24:30,090 --> 00:24:36,180 There's no particular malice there. However, there's this embedded ideology that technology solves problems. 260 00:24:36,180 --> 00:24:38,720 And this idea that technology solves problems, 261 00:24:38,720 --> 00:24:43,500 and specifically, in the case of social media, issues to do with authoritarianism, and that it enables free speech 262 00:24:43,500 --> 00:24:49,290 and democracy, is embedded within this kind of North America-centric notion that these technologies are going to do that.
263 00:24:49,290 --> 00:24:59,370 And that's embedded within, and created within, a society or a country where there are specific debates about, you know, civic space. 264 00:24:59,370 --> 00:25:04,110 Right. So this is the background for why these technologies are created. 265 00:25:04,110 --> 00:25:10,770 And then they are distributed around the world without any human rights due diligence, for example. 266 00:25:10,770 --> 00:25:14,070 If you are a technology company or a social media company or a start-up, 267 00:25:14,070 --> 00:25:19,230 surely you need to be asking yourself: what impact will this product have on the market 268 00:25:19,230 --> 00:25:21,190 I'm going to send it to? 269 00:25:21,190 --> 00:25:28,580 And we've seen countless examples of how technology enables violence and authoritarianism; one of the most egregious examples is Facebook, 270 00:25:28,580 --> 00:25:36,230 which even acknowledged that the role of its technology in Myanmar, for example, facilitated ethnic cleansing. 271 00:25:36,230 --> 00:25:39,560 And so we know that social media is a commercial product, 272 00:25:39,560 --> 00:25:47,390 but very little seems to be done about exploring the relationship between the political economy of these companies and the way they benefit. 273 00:25:47,390 --> 00:25:52,700 And I think it is increasingly important to study digital technologies just like we might study the arms trade. 274 00:25:52,700 --> 00:25:59,660 We need to actually explore these avenues of connections rather than seeing technologies as isolated, separate forces. 275 00:25:59,660 --> 00:26:11,210 We need to understand technology as a product that also has a great weight of incentives behind it, pushing it to market. 276 00:26:11,210 --> 00:26:17,420 And I think this is particularly important, 277 00:26:17,420 --> 00:26:21,410 and this is key, when it comes to companies like Twitter and Facebook.
278 00:26:21,410 --> 00:26:26,930 Why is it so difficult, for example, as Sahar has discussed, 279 00:26:26,930 --> 00:26:32,150 when there is a consensus, with academics and policymakers and NGOs arguing that the role of 280 00:26:32,150 --> 00:26:38,540 technology is increasingly detrimental in places like the Gulf or certain countries? 281 00:26:38,540 --> 00:26:46,820 It's being utilised as a tool of oppression, a tool of surveillance, and in many cases a tool of intimidation to troll and harass activists. 282 00:26:46,820 --> 00:26:54,350 So why is it allowed to go on? Why does it still exist if its use is simply to facilitate state repression? 283 00:26:54,350 --> 00:27:00,450 Well, this is a big question. And again, it raises issues of revenue. 284 00:27:00,450 --> 00:27:03,560 Is the Middle East market, the Arabic-speaking market, 285 00:27:03,560 --> 00:27:10,760 so great that these companies are benefiting no longer from a product that facilitates free speech, 286 00:27:10,760 --> 00:27:15,120 but actually from a product that is being used simply as an extension of the security apparatus? 287 00:27:15,120 --> 00:27:20,570 And is there an ethical issue that we need to raise here? Because when we talk about technology, again, 288 00:27:20,570 --> 00:27:25,870 there's a tendency to forget about policy implications and those kinds of things that can change this. 289 00:27:25,870 --> 00:27:30,500 And, you know, we know, for example, that the Gulf is wealthy. 290 00:27:30,500 --> 00:27:38,770 We know that Saudi Arabia is the largest population centre in the Gulf region, with high technology penetration rates and high social media use. 291 00:27:38,770 --> 00:27:50,540 And, you know, this enables one specific state with an authoritarian framework to dominate the Arabic-speaking online social media community.
292 00:27:50,540 --> 00:27:57,260 So what that means is that if you look at content on Twitter, and this is another important thing to bear in mind, 293 00:27:57,260 --> 00:27:59,750 if you're looking for a discussion about foreign policy, 294 00:27:59,750 --> 00:28:07,880 say the war in Yemen, on Twitter, and you look at the results when you search in Arabic, you're not going to be presented with, 295 00:28:07,880 --> 00:28:13,340 you know, equal sides of the debate, where every country in the region has the ability to express 296 00:28:13,340 --> 00:28:18,180 the same amount of content about a specific topic. On the contrary, the country 297 00:28:18,180 --> 00:28:23,840 with the most resources and the largest population will be able to dominate that discussion. 298 00:28:23,840 --> 00:28:25,820 So now, if you look at the topics around Yemen, 299 00:28:25,820 --> 00:28:31,100 you will see that those conversations are entirely dominated by accounts representing the foreign policy positions, 300 00:28:31,100 --> 00:28:35,210 for example, of the UAE and Saudi Arabia. That's in Arabic, right? 301 00:28:35,210 --> 00:28:40,730 So what you have is a situation where a country can use its demographic power and its relatively 302 00:28:40,730 --> 00:28:46,940 advanced technological infrastructure to shape the social media sphere in the Arab world. 303 00:28:46,940 --> 00:28:54,890 And so we're seeing this kind of imbalance of voices within the region, based on wealth and also on an authoritarian tendency, and 304 00:28:54,890 --> 00:29:03,920 that authoritarian tendency, the perceived need of authoritarian regimes to dominate the information space, 305 00:29:03,920 --> 00:29:10,550 is a driver.
And, you know, this is hugely problematic if we're talking about online civil society, 306 00:29:10,550 --> 00:29:13,940 because when we talk about online civil society and online space, more often, 307 00:29:13,940 --> 00:29:15,710 especially in the Arabic-speaking world, 308 00:29:15,710 --> 00:29:21,950 we're talking about an Arabic-speaking space, as opposed to a space bound by national boundaries and that kind of thing. 309 00:29:21,950 --> 00:29:27,680 And it is, I think, particularly pertinent to see which countries are benefiting from this and who 310 00:29:27,680 --> 00:29:33,680 then has a disproportionate amount of what I call digital hegemony or digital power: 311 00:29:33,680 --> 00:29:40,250 who has the ability, or the digital capital, to dominate the digital space discursively and in other ways. 312 00:29:40,250 --> 00:29:47,030 And this is a very important thing we have to see, because this power is not necessarily new. 313 00:29:47,030 --> 00:29:51,770 If you look at the post-Gulf War structure, I mean the 1990 Gulf War, 314 00:29:51,770 --> 00:29:52,940 we'll see, for example, you know, 315 00:29:52,940 --> 00:29:59,750 the classic anecdote that people in the Gulf found out about the Iraqi invasion of Kuwait a few days after it happened by watching CNN. 316 00:29:59,750 --> 00:30:05,420 And at that point, we saw increasing investment, in particular from Saudi Arabia, in satellite TV channels, 317 00:30:05,420 --> 00:30:07,370 in the infrastructure of satellite television. 318 00:30:07,370 --> 00:30:13,940 The whole notion being that the role of information was seen as important, and you don't just try to, you know, 319 00:30:13,940 --> 00:30:19,690 dominate your own national press; you try to influence the informational infrastructure of whole countries. 320 00:30:19,690 --> 00:30:27,010 And we see this now with Twitter in the same way.
But what we're seeing is an increasingly aggressive attempt to do this. 321 00:30:27,010 --> 00:30:32,170 So whilst you can buy out a satellite TV channel, you can't necessarily buy out Twitter. 322 00:30:32,170 --> 00:30:41,200 Well, can you? There's actually been increasing worry about, for example, Saudi Arabia's Public Investment Fund, via SoftBank, investing in Silicon Valley. 323 00:30:41,200 --> 00:30:46,420 We know that they are investing in a lot of start-ups. We know, for example, that 324 00:30:46,420 --> 00:30:52,060 two Saudi citizens are facing a criminal case from the FBI in the US, because high-level 325 00:30:52,060 --> 00:30:58,270 Saudis connected to a figure who has been named as 'Royal One' infiltrated Twitter headquarters in 2016, 326 00:30:58,270 --> 00:31:01,660 stole lots of personal data and then took it back to Saudi Arabia. 327 00:31:01,660 --> 00:31:07,210 And we know that there are people now, Saudi activists, 328 00:31:07,210 --> 00:31:11,040 who say that that data leak is directly responsible for the arrest of Saudi citizens. 329 00:31:11,040 --> 00:31:18,970 Right. So what we're seeing here is evidence of the perceived importance of the information ecology for the government, 330 00:31:18,970 --> 00:31:21,220 in this case, the Saudi regime. 331 00:31:21,220 --> 00:31:27,560 And this is key, because we're seeing attempts to drive and dominate the discourse, and in reality, we're seeing it happen. 332 00:31:27,560 --> 00:31:31,480 You know, I've been looking at the digital space for the past 10 years, 333 00:31:31,480 --> 00:31:39,720 and there is this trajectory of decreasing critical conversation and increasing dominance by pro-government voices, 334 00:31:39,720 --> 00:31:44,730 mostly from the Gulf. And there are a couple of other aspects that I think are interesting in terms of where this is going.
335 00:31:44,730 --> 00:31:53,770 But firstly, I want to mention: what do you do if you can't summon enough resources to create the discussion you want? 336 00:31:53,770 --> 00:31:56,320 Well, we're seeing an increased amount of automation here. 337 00:31:56,320 --> 00:32:03,370 So a lot of my work has been on bots, which are automated accounts that can replicate content in huge amounts and huge volumes. 338 00:32:03,370 --> 00:32:07,780 And the efficacy, or at least the scale, of these bots is quite astounding. 339 00:32:07,780 --> 00:32:15,490 I did a study a couple of years ago looking at specific sectarian hate speech, anti-Sunni and anti-Shia hate speech, and found that, you know, 340 00:32:15,490 --> 00:32:21,460 a large amount of the specific sectarian slurs that we saw online were actually 341 00:32:21,460 --> 00:32:27,820 being generated by bots that appeared to belong to a satellite TV channel in Saudi Arabia. 342 00:32:27,820 --> 00:32:35,110 So a lot of the sectarian discourse that we saw online was actually created, essentially, by automatons. 343 00:32:35,110 --> 00:32:40,390 Right. So it wasn't necessarily reflecting the real views of individual people, but rather these kinds of fake accounts. 344 00:32:40,390 --> 00:32:46,540 And the other thing is, as AI, artificial intelligence, becomes better, 345 00:32:46,540 --> 00:32:53,230 the whole notion of the adversaries and the people we're going to see online might change, because theoretically it will soon be trivial 346 00:32:53,230 --> 00:33:00,280 to have an online account, a Twitter account or Facebook account, that is so advanced in terms of artificial intelligence 347 00:33:00,280 --> 00:33:07,780 that it can perfectly replicate government propaganda and appear real, fooling someone else that it is real. 348 00:33:07,780 --> 00:33:09,610 And this might sound far-fetched, but I don't think it is.
349 00:33:09,610 --> 00:33:17,830 I mean, to use, you know, another brief story: Saudi Arabia famously gave citizenship to Sophia, this robot that was created, 350 00:33:17,830 --> 00:33:21,950 I can't remember the company who made it, but they gave this robot citizenship. 351 00:33:21,950 --> 00:33:27,070 And there was something kind of symbolic about giving a robot citizenship, because this robot is a robot: 352 00:33:27,070 --> 00:33:29,170 it's fundamentally someone you can programme, 353 00:33:29,170 --> 00:33:35,560 someone who you could ideally shape in such a way that they will never engage in dissent and will always praise the regime. 354 00:33:35,560 --> 00:33:39,260 So by giving a robot citizenship, you are sort of saying, well, this is the ideal citizen. 355 00:33:39,260 --> 00:33:41,680 And the fact that it was a woman was even more striking. 356 00:33:41,680 --> 00:33:50,830 And so the idea that we might have an online civil society dominated by bots and robots spreading propaganda already exists, 357 00:33:50,830 --> 00:33:55,960 but in a very crude way. So I wonder where this is taking us. 358 00:33:55,960 --> 00:33:59,980 And, you know, artificial intelligence is now one of the next avenues worth exploring in terms of 359 00:33:59,980 --> 00:34:05,630 the technological dystopia and how it will be used by authoritarian regimes. 360 00:34:05,630 --> 00:34:11,090 But just to go back to my original point, to end on: this isn't just the preserve of authoritarian regimes. 361 00:34:11,090 --> 00:34:15,590 Technology is often described as a tool, and how you use a tool depends on its purpose. 362 00:34:15,590 --> 00:34:20,190 I could have a video camera and make a wonderful movie, or I could use a video camera to spy on someone. 363 00:34:20,190 --> 00:34:24,200 Yep.
But we also need to be looking at who is making these tools and who is selling these 364 00:34:24,200 --> 00:34:27,680 tools and who is benefiting financially from these tools, and for what purpose. 365 00:34:27,680 --> 00:34:33,590 And also, what is the ideology, the ontology, embedded within these tools themselves, you know? 366 00:34:33,590 --> 00:34:39,830 And, you know, in terms of policy, this matters for so many reasons, some of which I've just mentioned. 367 00:34:39,830 --> 00:34:43,360 But if we take it again to this notion of digital orientalism, 368 00:34:43,360 --> 00:34:49,670 where what happens in the Middle East, in the Arabic-speaking world, is seen as less important: 369 00:34:49,670 --> 00:34:56,750 if we go back to the Myanmar incident, ethnic cleansing was facilitated because of disinformation spread on Facebook. 370 00:34:56,750 --> 00:35:00,170 And one of the reasons that happened is because Facebook had invested so little in 371 00:35:00,170 --> 00:35:05,450 having the appropriate language specialists who could moderate content in that region. 372 00:35:05,450 --> 00:35:10,700 And the same is true, I would say even more true, in the Arabic-speaking world. 373 00:35:10,700 --> 00:35:14,500 The same kinds of infractions of Twitter policies or Facebook policies 374 00:35:14,500 --> 00:35:18,530 that occur here are not dealt with in the same way as they are dealt with in, 375 00:35:18,530 --> 00:35:23,870 say, North America or Europe. You know, it's seen as a bit of a backwater in the way it's dealt with. 376 00:35:23,870 --> 00:35:31,430 So there are these elements of a sort of digital imperialism: the sense that the market share here, 377 00:35:31,430 --> 00:35:36,410 the advertising revenues generated from people using the products here, are acceptable. 378 00:35:36,410 --> 00:35:41,330 However, policing the use of that product, so that it's not used in authoritarian ways?
379 00:35:41,330 --> 00:35:44,750 It's not incentivised, because that would actually affect the bottom line. 380 00:35:44,750 --> 00:35:52,220 And that's something we also need to consider when we're looking at technology and policy, because digital technology doesn't exist in a vacuum. 381 00:35:52,220 --> 00:35:56,840 It comes from somewhere, and some people profit off it. 382 00:35:56,840 --> 00:36:02,150 And that is part of the disinformation and digital authoritarianism network, supply chain and structure. 383 00:36:02,150 --> 00:36:09,620 So it's not just a state-centric enterprise, but a collection of different actors who, for various reasons, benefit in specific ways. 384 00:36:09,620 --> 00:36:14,310 And I think that's an important way to conceptualise digital authoritarianism. 385 00:36:14,310 --> 00:36:18,260 And I'm going to end there, to leave time for questions. Thank you. 386 00:36:18,260 --> 00:36:24,140 Thank you both for excellent and thought-provoking presentations. 387 00:36:24,140 --> 00:36:33,990 I'll begin by asking you both a couple of questions, which will give the audience time to formulate their own questions. For Sahar: 388 00:36:33,990 --> 00:36:41,010 I think the one word that didn't occur in your talk was surveillance. 389 00:36:41,010 --> 00:36:49,080 You're talking about social media in terms of their capacity to join people together. 390 00:36:49,080 --> 00:36:55,470 And you acknowledge, like the rest of us have, that it didn't necessarily turn out the way so many people thought it would 391 00:36:55,470 --> 00:37:02,250 during that initial moment of euphoria during the Arab revolutions. 392 00:37:02,250 --> 00:37:06,210 But I'd like to hear your thoughts about 393 00:37:06,210 --> 00:37:14,050 not just social media, actually, but all information technology as a means of surveillance by the state. 394 00:37:14,050 --> 00:37:15,910 And of course, it goes beyond social media.
395 00:37:15,910 --> 00:37:25,570 Every time we go onto the Internet and start clicking on messages or websites, we can potentially be watched. 396 00:37:25,570 --> 00:37:34,930 And we actually are being watched. And so, in a way, I'm asking you to give your take on Mark's dystopianism. 397 00:37:34,930 --> 00:37:44,290 But I was struck by not having actually heard the word surveillance in your whole talk. 398 00:37:44,290 --> 00:37:49,580 And for Mark, I want to ask you whether 399 00:37:49,580 --> 00:37:56,420 you are running a risk of just recreating the technological determinism that 400 00:37:56,420 --> 00:38:02,370 we all found so dissatisfying in the initial days of the Arab revolutions. 401 00:38:02,370 --> 00:38:10,790 And, you know, many people had this sort of naive thought that social media had the capacity to put the truth out there. 402 00:38:10,790 --> 00:38:16,130 And if the truth were out there, then eventually, of course, it would prevail over falsehood. 403 00:38:16,130 --> 00:38:24,620 And now we know better. And social media seems to actually be having the opposite effect. 404 00:38:24,620 --> 00:38:29,600 And so you're saying that it's inseparable: 405 00:38:29,600 --> 00:38:36,440 the technology is inseparable from the malign forces that are creating it and getting us all to use it. 406 00:38:36,440 --> 00:38:42,990 But it's not going to go away. We're going to continue to live in a media-saturated world. 407 00:38:42,990 --> 00:38:56,390 So is your dystopianism so deep that the only thing we can do is stop using media and find ways to live our lives outside of it? 408 00:38:56,390 --> 00:39:00,650 You want me to go first, Walter? Yes. With regard to your question about surveillance: 409 00:39:00,650 --> 00:39:07,190 indeed, this is a very important point. Of course, we can never turn a blind eye to the amount of surveillance happening in cyberspace.
410 00:39:07,190 --> 00:39:09,110 And I'll just add this: you are right. 411 00:39:09,110 --> 00:39:15,200 As you mentioned, even social media platforms themselves have their own built-in mechanisms of surveillance, which is very scary. 412 00:39:15,200 --> 00:39:19,480 You know, issues related to personal security, issues related to national security, 413 00:39:19,480 --> 00:39:24,530 invasion of privacy, how much information you could or should share on social media: 414 00:39:24,530 --> 00:39:31,970 all of this is becoming a really scary reality that we as consumers and users of social media have to grapple with day in and day out. 415 00:39:31,970 --> 00:39:35,180 Think about how many times you use the Internet to buy something, 416 00:39:35,180 --> 00:39:40,790 and then you were flooded with all of these e-mails asking you if you're also interested in buying X, Y, Z. 417 00:39:40,790 --> 00:39:43,760 Or you watch a certain movie and you get all of these, you know, 418 00:39:43,760 --> 00:39:47,960 e-mails or notifications from different social media platforms suggesting 419 00:39:47,960 --> 00:39:50,840 that you'd also be interested, maybe, in watching this, this and that. 420 00:39:50,840 --> 00:39:58,670 What that tells you is that you are really under close monitoring from so many different groups and so many different agencies all at once. 421 00:39:58,670 --> 00:40:04,940 The regimes are part of it. But there are also other entities, including, you know, these social media platforms themselves. 422 00:40:04,940 --> 00:40:11,950 Not to mention, of course, when you have major scandals like what happened with Facebook and the leaking of personal information of, 423 00:40:11,950 --> 00:40:18,350 you know, hundreds of thousands or even sometimes millions of users of these different social media platforms.
424 00:40:18,350 --> 00:40:22,610 But to link this back to our discussion of the image now prevailing 425 00:40:22,610 --> 00:40:28,190 in the Arab world 10 years after the eruption of the uprisings: 426 00:40:28,190 --> 00:40:33,410 I think what I ended with, in terms of the moment we are living in now, which is the pandemic moment, 427 00:40:33,410 --> 00:40:40,370 brings in a much darker and even more scary image in terms of these surveillance techniques and mechanisms, 428 00:40:40,370 --> 00:40:46,550 simply because a lot of regimes have now started to implement certain apps and certain tools that exist on social media 429 00:40:46,550 --> 00:40:54,400 whereby they can actually do what is called contact tracing. They can trace who you have been talking with, who you have been visiting, 430 00:40:54,400 --> 00:41:00,140 who you have been seeing, or places you've been going to. Your own location, that very sensitive personal 431 00:41:00,140 --> 00:41:06,470 information, is now under the closer surveillance and close watch of the regimes, 432 00:41:06,470 --> 00:41:11,930 under, of course, the pretext of protecting your own personal safety and your health and the health of your family. 433 00:41:11,930 --> 00:41:17,870 So we are entering into this new phase, if you will, a very dangerous degree of surveillance 434 00:41:17,870 --> 00:41:23,060 which is much more intrusive into everyone's life in this pandemic moment. 435 00:41:23,060 --> 00:41:28,550 And my prediction moving forward, with the regimes polishing their own skills and their own 436 00:41:28,550 --> 00:41:34,430 tools and their own learning curve in terms of mastering technological and technical skills, 437 00:41:34,430 --> 00:41:39,710 is that we're going to see an even greater degree of surveillance and invasion of citizens' privacy, and 438 00:41:39,710 --> 00:41:45,020 going after opponents and opposition movements using many of these modern tools and techniques.
439 00:41:45,020 --> 00:41:50,990 As I mentioned, the pandemic is bringing all of that to the surface. And this is only the tip of the iceberg. 440 00:41:50,990 --> 00:41:53,630 I think from here it's going to go from bad to worse. 441 00:41:53,630 --> 00:41:59,630 So we as consumers of social media have to be really much more alert and aware of all of these mechanisms. 442 00:41:59,630 --> 00:42:02,870 Just very quickly, two things I want to point to; 443 00:42:02,870 --> 00:42:09,120 I forgot to mention them in my talk. One of them is the chicken-and-egg question in terms of what comes first: 444 00:42:09,120 --> 00:42:13,110 democratisation, or having an active civil society. We talked about, you know, 445 00:42:13,110 --> 00:42:19,400 the fact that you have a vacuum in terms of civil society and that social media cannot fill this vacuum. 446 00:42:19,400 --> 00:42:27,260 The question then becomes: what comes first? Should you have a stable democracy in order to have an active and vibrant civil society? 447 00:42:27,260 --> 00:42:34,730 Or do you need to have an active civil society, filling these vacuums first, in order to have a proper transition to democratisation? 448 00:42:34,730 --> 00:42:39,320 Unfortunately, up till this moment in time, I cannot promise that I have an answer to this question. 449 00:42:39,320 --> 00:42:46,280 It appears to be another paradox or another dilemma that we have to grapple with as scholars and as professionals 450 00:42:46,280 --> 00:42:51,590 and as people working in the field of social media and cyber activism, and also in political science. 451 00:42:51,590 --> 00:42:55,790 What comes first is a chicken-and-egg question, or dilemma, that we have to grapple with.
452 00:42:55,790 --> 00:43:00,650 And finally, this whole notion of cyber euphoria that we talked about, or cyber utopia: 453 00:43:00,650 --> 00:43:08,840 we need to reach a middle ground, what we can call cyber realism, whereby we are neither over-crediting nor discrediting social media. 454 00:43:08,840 --> 00:43:14,180 We don't want this moment of euphoria where we call it the Facebook revolution, the Twitter uprising and all of that. 455 00:43:14,180 --> 00:43:22,010 But at the same time, we don't want to turn a blind eye to the importance and the vitality of social media and the different roles they can play. 456 00:43:22,010 --> 00:43:27,830 So we need to arrive at some kind of middle ground, some kind of balance, that we can call cyber realism, 457 00:43:27,830 --> 00:43:38,180 trying to weigh the pros and cons, the strengths and the limitations. I mean, I think that what Sahar said leads on nicely to, 458 00:43:38,180 --> 00:43:48,380 Walter, your question, because I think the notion of cyber realism, for me, is not in the same value category as either utopianism 459 00:43:48,380 --> 00:43:53,750 or dystopianism. Utopianism implies something good, positive; dystopianism, something negative. 460 00:43:53,750 --> 00:44:00,920 Realism is reality, but reality can also be good or bad, and it can go in either of those directions. 461 00:44:00,920 --> 00:44:05,200 Right. So the question for me is, where are we now? 462 00:44:05,200 --> 00:44:08,020 And I think, yes, I am certainly cynical. 463 00:44:08,020 --> 00:44:14,720 But I think there is a risk of having a kind of teleology: it's all going to be bad and it's going to end terribly. 464 00:44:14,720 --> 00:44:22,410 Right. And that's where we're going. And that could happen. It might not. But in the moment we're in, we have to 465 00:44:22,410 --> 00:44:29,040 look at the reality. I don't think anyone's questioning
that 10 years ago there was a moment of euphoria, and the point we're at now, 466 00:44:29,040 --> 00:44:33,900 certainly in the Arab world, is one of less hope in the role of technology. 467 00:44:33,900 --> 00:44:40,620 And I think what's interesting about this, in terms of connecting back again to where this technology originates: 468 00:44:40,620 --> 00:44:45,180 we've seen the rise of fake news and disinformation. Recently, we've seen the rise of post-truth politics, 469 00:44:45,180 --> 00:44:51,740 which is the destruction of faith and trust in institutions, the denigration of news media organisations, 470 00:44:51,740 --> 00:44:56,040 the essential gaslighting of institutions to such an extent that people reduce 471 00:44:56,040 --> 00:45:00,180 their trust in institutions and redirect it to figures such as Donald Trump. 472 00:45:00,180 --> 00:45:06,480 And if we look at what happened on the 6th of January in the US, at Capitol Hill, you know, this isn't just a unique event. 473 00:45:06,480 --> 00:45:13,410 This has been building for some time. The role of technology, the role of fake news, 474 00:45:13,410 --> 00:45:16,650 the policy ecology that exists in the US that has allowed technology companies to 475 00:45:16,650 --> 00:45:21,360 grow this big and gain this dominance without regulation, has also allowed this. 476 00:45:21,360 --> 00:45:25,110 So I think the trajectory we're on, that we've seen in the region and that we've seen 477 00:45:25,110 --> 00:45:30,990 in the US recently, reflects a tendency towards going to an authoritarian place. 478 00:45:30,990 --> 00:45:34,890 And some people might dismiss the word authoritarianism and just use the term illiberal practices, right? 479 00:45:34,890 --> 00:45:40,140 Well, I think those illiberal practices are being enabled by the current digital information 480 00:45:40,140 --> 00:45:44,370 ecology and social media, whether that is surveillance or disinformation.
481 00:45:44,370 --> 00:45:50,550 I think they're well connected. So, no, I don't dismiss the possibility of an alternative. 482 00:45:50,550 --> 00:45:54,660 I still think that there's a real struggle, just as there is for a citizen 483 00:45:54,660 --> 00:46:00,330 facing regime structures that are very real and very authoritarian. 484 00:46:00,330 --> 00:46:03,390 And in order to overcome those, certain conditions must be met. 485 00:46:03,390 --> 00:46:11,140 And I think, whilst there is resistance against the direction that digital technology and digital governance are heading, 486 00:46:11,140 --> 00:46:16,060 we're still at a point where we are resisting increasing illiberal practices within that realm. 487 00:46:16,060 --> 00:46:24,720 So, yes, disconnect everything is what I'm saying. Except for Zoom, obviously, because, you know, this is where we do our learning. 488 00:46:24,720 --> 00:46:33,380 That is a good use of technology, right? The questions from the audience will be curated by my colleague Eugene. 489 00:46:33,380 --> 00:46:38,960 Thank you, Walter. It's a pleasure to see so many people getting their questions in. 490 00:46:38,960 --> 00:46:43,840 If you'd like to ask a question: to the two attendees whose hands I see raised, 491 00:46:43,840 --> 00:46:47,540 that raise-hand function doesn't work on this platform, 492 00:46:47,540 --> 00:46:52,100 so please go to the Q&A bar at the bottom and type your question in. And to the attendee 493 00:46:52,100 --> 00:46:54,410 I've asked to clarify your question: 494 00:46:54,410 --> 00:47:00,630 have a look at my message and see if you can refine your question a little bit so that I can put it to our speakers. 495 00:47:00,630 --> 00:47:06,440 But I'm going to start with one of our students, who is posing a question to Sahar. 496 00:47:06,440 --> 00:47:12,650 He says that you say that ousting a dictator is easy, but figuring out what to do next is hard.
497 00:47:12,650 --> 00:47:15,600 And then you said that it depends on society, 498 00:47:15,600 --> 00:47:22,890 if society is ready. And I think he wants to challenge you: you're putting too much emphasis on the role of society, Sahar. 499 00:47:22,890 --> 00:47:28,490 He goes on and says, is it really in the hands of society and not the military or the economic elites or 500 00:47:28,490 --> 00:47:33,170 international backers like the US or the Saudis or the UAE in the case of Egypt, 501 00:47:33,170 --> 00:47:36,920 who have a greater capacity to put pressure on the powers that be? 502 00:47:36,920 --> 00:47:45,490 So have you put too much emphasis on society? Speakers, I need brief answers from you because we've got a lot of questions here. Over to you, Sahar. 503 00:47:45,490 --> 00:47:51,400 Well, let me first start by saying this is an excellent and very thoughtful question, and I don't think we are really in disagreement. 504 00:47:51,400 --> 00:47:55,090 It's the chicken-and-egg dilemma that I posed at the end, the paradox of what comes first. 505 00:47:55,090 --> 00:48:00,220 Right. Do you need to have these kinds of structural changes in society on different levels, socially, 506 00:48:00,220 --> 00:48:04,720 politically, economically, before you can really have a successful transition to democratisation, 507 00:48:04,720 --> 00:48:08,380 or do you need to have some kind of stable democratisation first in order to have 508 00:48:08,380 --> 00:48:12,010 all of these structural changes and fill some of these vacuums and gaps? 509 00:48:12,010 --> 00:48:17,770 Definitely, there are a lot of good points here: that you cannot just rely on civil society alone, 510 00:48:17,770 --> 00:48:20,350 especially when you have all of these constraints. 511 00:48:20,350 --> 00:48:27,130 You're not talking about some kind of, you know, Western democracy where you have freedom of expression, freedom of the press and freedom of speech.
512 00:48:27,130 --> 00:48:32,080 And you can just go about and criticise the highest office in the land and be safe and secure. 513 00:48:32,080 --> 00:48:35,170 We're talking about very repressive regimes, very authoritarian regimes. 514 00:48:35,170 --> 00:48:41,260 You're talking about consequences in terms of activism, so all of these things have to be taken into consideration. 515 00:48:41,260 --> 00:48:44,710 Of course, all the other points mentioned in the question as well, in terms of foreign 516 00:48:44,710 --> 00:48:47,980 interference and international influence and all of these other constraints, 517 00:48:47,980 --> 00:48:52,480 the role of the military, all of these factors have to be taken into consideration. 518 00:48:52,480 --> 00:48:58,600 And my whole point was there was this moment of euphoria in 2011 when we thought, yes, 'ash-sha'b yurid isqat an-nizam', 519 00:48:58,600 --> 00:49:03,600 the people want to overthrow the regime. And we thought that with Mubarak just leaving office or stepping down, 520 00:49:03,600 --> 00:49:09,940 we had accomplished this mission successfully. Ten years forward, we discovered that this was not the case. 521 00:49:09,940 --> 00:49:11,560 And you need really, 522 00:49:11,560 --> 00:49:17,860 really much more work and much more investment in building this active civil society and building these kinds of grassroots activism 523 00:49:17,860 --> 00:49:22,510 mechanisms, which is very, very difficult to do when you have a repressive regime in place. 524 00:49:22,510 --> 00:49:25,960 So this would be my answer to this question. Thank you so much, Sahar. 525 00:49:25,960 --> 00:49:32,680 I've got a question for Mark next from Tom Perez, who, clearly picking up on your work on bots, 526 00:49:32,680 --> 00:49:41,560 wants to know about researching state-sponsored exploitation of major Silicon Valley platforms such as Twitter, Facebook, Google and Apple.
527 00:49:41,560 --> 00:49:47,920 What would such research look like? Well, I mean, I do engage in this work, and 528 00:49:47,920 --> 00:49:58,760 a lot of what it involves is monitoring, identifying and finding these kinds of problems, such as bot networks, and exposing them. 529 00:49:58,760 --> 00:50:02,710 And one of the reasons here is that companies like Facebook or Twitter are generally 530 00:50:02,710 --> 00:50:06,310 not transparent in terms of the information operations they document. 531 00:50:06,310 --> 00:50:11,960 So without civil society activists pushing back, then there is, you know, 532 00:50:11,960 --> 00:50:16,150 very little chance that there's going to be much change in this realm. 533 00:50:16,150 --> 00:50:21,310 And I think this taps in briefly to another question, which is the power of these social media companies. 534 00:50:21,310 --> 00:50:25,750 One of you asked what I think about Trump being banned. And I'll just answer this quickly because it's relevant. 535 00:50:25,750 --> 00:50:30,100 I think it's a good thing Trump was banned, because he was generating hate speech. 536 00:50:30,100 --> 00:50:34,300 And, you know, I think in terms of society that was doing no good. 537 00:50:34,300 --> 00:50:40,900 The problem with him being banned is that it reflected the power of social media companies in intervening in discourse. 538 00:50:40,900 --> 00:50:49,510 However, it also highlighted how there was a lack of transparency, public debate and public input in those decisions about who should and who should not be banned. 539 00:50:49,510 --> 00:50:56,050 And it really exposes that these companies, which can have an impact on digital discussion and civil society, 540 00:50:56,050 --> 00:51:03,100 are not subject to the kind of democratic oversight that we should probably have over them. 541 00:51:03,100 --> 00:51:08,320 Thank you very much, Mark.
I've got a question for Sahar from one of our attendees, 542 00:51:08,320 --> 00:51:15,670 who would like to challenge the assertion you made that Tunisia is an exception to the way the Arab Spring affected other countries. 543 00:51:15,670 --> 00:51:23,520 He wants to know what internal and external factors in Tunisia rendered the country exceptional in this regard. 544 00:51:23,520 --> 00:51:27,910 I know that some of my Tunisian friends would even grapple with this term and sometimes even criticise it; 545 00:51:27,910 --> 00:51:31,450 they would say, oh, you talk about Tunisian exceptionalism too much. 546 00:51:31,450 --> 00:51:36,310 We still have our own challenges, we have our internal problems and we have economic challenges. 547 00:51:36,310 --> 00:51:43,140 I mean, like, yeah, you do. That's for sure. But at the same time, if you look at the stark comparison with the other so-called, quote, 548 00:51:43,140 --> 00:51:49,240 'good' Arab Spring countries that really had their own revolutions go completely in the opposite direction: 549 00:51:49,240 --> 00:51:52,240 you know, the civil war in Syria, the war in Yemen, the sectarianism 550 00:51:52,240 --> 00:51:58,630 and, you know, the strife and violence in Libya, the much harsher military dictatorship that it turns out Egypt has, 551 00:51:58,630 --> 00:52:05,050 all of these very, very unfortunate outcomes, and the invisible, crushed revolution in Bahrain. 552 00:52:05,050 --> 00:52:12,160 You look at this very bleak and very gloomy picture, and I think Tunisia stands out as a very successful exception for a number of reasons. 553 00:52:12,160 --> 00:52:16,460 It's a much smaller country. It has a very high literacy rate of over 90 percent. 554 00:52:16,460 --> 00:52:20,440 The women are very literate and educated and are very much part of the public sphere.
555 00:52:20,440 --> 00:52:24,460 And I will say women are half of society and they give birth to the other half. 556 00:52:24,460 --> 00:52:29,420 So it's very important for them to be at the table, and it's very important for them to have their voices heard 557 00:52:29,420 --> 00:52:36,580 and to take part in decision-making. There was also this very important success in terms of building alliances. 558 00:52:36,580 --> 00:52:39,310 Groups were able to get together and build successful alliances. 559 00:52:39,310 --> 00:52:46,720 Some of them won the Nobel Prize for this great work, which unfortunately we have not witnessed in the other so-called Arab Spring countries. 560 00:52:46,720 --> 00:52:51,040 There we have seen the fragmentation, polarisation and division which I mentioned before. 561 00:52:51,040 --> 00:52:55,210 We have not seen a successful experiment of alliance building and coalition 562 00:52:55,210 --> 00:53:00,040 building that helps to really pave the way for a transition to democratisation, 563 00:53:00,040 --> 00:53:07,780 which Tunisia stands out as an example of. And the lack of international interference in the case of Tunisia was also another helpful factor. 564 00:53:07,780 --> 00:53:15,010 So I think all of these factors coming together could lead to this. I still stand by my term, Tunisian exceptionalism, until this moment. 565 00:53:15,010 --> 00:53:21,010 Thank you. Thank you very much, Sahar. I've got a couple of questions for you, Mark, about the Gulf region. 566 00:53:21,010 --> 00:53:25,270 The first one, which is coming from one of our attendees, reads: 567 00:53:25,270 --> 00:53:34,960 I know there's been quite a bit about Saudi bot activity detected in the UK surrounding the aborted purchase of the soccer club Newcastle United. 568 00:53:34,960 --> 00:53:40,960 I wonder if we're seeing similar campaigns by Gulf actors online targeting spaces outside the MENA region?
569 00:53:40,960 --> 00:53:46,510 And if not, why not? And before you jump to that, one more, from another attendee: 570 00:53:46,510 --> 00:53:54,460 Concerning Bahrain, we witnessed a situation where the Internet and social media were used as a tool of political repression. 571 00:53:54,460 --> 00:53:58,210 Dr Jones spoke about it in his latest book. So we're plugging your book here; 572 00:53:58,210 --> 00:54:04,870 Mark, take note. Do you think this situation will change with the Biden administration? 573 00:54:04,870 --> 00:54:09,040 So, the Gulf and its near abroad, or whatever we'd like to call it. 574 00:54:09,040 --> 00:54:14,440 I think the question about whether we see regimes targeting people abroad is really important, because, yes, they are. 575 00:54:14,440 --> 00:54:19,120 And this shows the increasing boldness of certain actors in the Gulf region, particularly the UAE and Saudi Arabia. 576 00:54:19,120 --> 00:54:22,090 We see this with surveillance and, in fact, with spyware. 577 00:54:22,090 --> 00:54:26,500 But, well, you know, there has been an increasing attempt, especially under the Trump administration, 578 00:54:26,500 --> 00:54:32,680 from these regimes to try to influence public opinion abroad. And to give one quick example: 579 00:54:32,680 --> 00:54:42,520 last year, my colleague and I detected a network of at least 25 fake journalists that had used artificial-intelligence- 580 00:54:42,520 --> 00:54:54,310 generated profile images and submitted op-eds and opinion articles to over forty-five different international news outlets and had them published. 581 00:54:54,310 --> 00:54:59,360 This included The Washington Times, the South China Morning Post, 582 00:54:59,360 --> 00:55:06,400 you know, Asia Times, Spiked in the UK. These articles, written by people who weren't who they were claiming 583 00:55:06,400 --> 00:55:12,160 to be, were basically articulating Emirati foreign policy in these international news outlets.
584 00:55:12,160 --> 00:55:17,740 And they successfully placed about 95 op-eds. Right. So this is phenomenal. 585 00:55:17,740 --> 00:55:26,050 And this kind of activity we see all the time. So, yes, these regimes are targeting abroad and they are trying to influence public opinion abroad. 586 00:55:26,050 --> 00:55:29,410 And I think one of the reasons we saw this, particularly in the past four or five years, 587 00:55:29,410 --> 00:55:34,450 is that Trump was seen as, you know, a very strong candidate to be supported 588 00:55:34,450 --> 00:55:39,280 by the Gulf states. So they very much wanted him to be re-elected, 589 00:55:39,280 --> 00:55:43,000 and as such, put resources into that. 590 00:55:43,000 --> 00:55:50,130 With regards to Biden and Bahrain: who the president of the US is matters in terms of state repression in Bahrain. 591 00:55:50,130 --> 00:55:56,560 As soon as Trump was elected, we saw more executions in Bahrain than we had seen since before 2011. 592 00:55:56,560 --> 00:56:01,660 Biden will make a difference in terms of the plight of political prisoners in Bahrain, 593 00:56:01,660 --> 00:56:05,650 but I don't think it will be a significant difference in terms of the role 594 00:56:05,650 --> 00:56:10,560 of digital tools, the way digital tools are being used to aid state repression, 595 00:56:10,560 --> 00:56:16,190 unfortunately. Thank you very much, Mark. I've got a question now from Abir Najjar. 596 00:56:16,190 --> 00:56:18,160 Abir, glad you found your way to the Q&A bar. 597 00:56:18,160 --> 00:56:27,760 This one's for Sahar, and Abir is asking: what examples can you cite of women finding bargaining power with liberal activists in Egypt, 598 00:56:27,760 --> 00:56:34,660 Tunisia, Morocco and other countries, where you might see a crack in current restrictions on the status and isolation of women,
599 00:56:34,660 --> 00:56:40,100 or just in the addressing of women's issues, presumably with social media technology? 600 00:56:40,100 --> 00:56:42,650 Yeah. That's an excellent question. 601 00:56:42,650 --> 00:56:50,060 It's very important to really understand why women's movements have not fully succeeded in many parts of the Arab world up to this moment, 602 00:56:50,060 --> 00:56:54,920 even those that have resorted to using cyber activism, or cyber feminism 603 00:56:54,920 --> 00:56:59,150 in this case, to promote and advocate for women's issues and women's causes. 604 00:56:59,150 --> 00:57:05,190 Because, as we said before, social media are not magical tools, and they're not going to bring about change and reform or, in this case, 605 00:57:05,190 --> 00:57:10,400 the social advancement of women all by themselves. You need to change the mindset inside society, 606 00:57:10,400 --> 00:57:15,470 and you need to have work on the ground, which is being done in collaboration between different groups. 607 00:57:15,470 --> 00:57:22,580 And I'll go back again to the Tunisian example as a strikingly positive one in this sense. In this regard, 608 00:57:22,580 --> 00:57:28,340 we have seen groups of women being able to bring about more public visibility for themselves, 609 00:57:28,340 --> 00:57:31,490 working together across the board and bridging their differences: 610 00:57:31,490 --> 00:57:38,510 groups that are Islamist-leaning, liberal-leaning, secular, right, left, you name it, all across the board, 611 00:57:38,510 --> 00:57:43,220 being able to come together and to pass wonderful legislation and new laws, 612 00:57:43,220 --> 00:57:47,930 like the comprehensive law banning all forms of gender violence against women. 613 00:57:47,930 --> 00:57:50,750 And they did not just define violence as domestic violence.
614 00:57:50,750 --> 00:57:56,870 They defined it as sexual, physical, emotional, economic and political violence of all different sorts. 615 00:57:56,870 --> 00:58:01,460 It was a huge success and a huge breakthrough that would not have happened had 616 00:58:01,460 --> 00:58:06,350 it not been for this kind of alliance and coalition building on the ground, not just online. 617 00:58:06,350 --> 00:58:10,340 So that is really a lesson and a takeaway that I think other women's groups across 618 00:58:10,340 --> 00:58:13,790 the region should learn from: that they have to build alliances across the board, 619 00:58:13,790 --> 00:58:15,700 come together despite their differences, 620 00:58:15,700 --> 00:58:23,160 and use both online and offline, on-the-ground activism and coordination, following this great example from Tunisia. 621 00:58:23,160 --> 00:58:26,330 Thank you so much, Sahar. I know we have time for one last question, 622 00:58:26,330 --> 00:58:30,710 and I'm going to combine two questions here that are pushing back a little bit, Mark, 623 00:58:30,710 --> 00:58:37,690 on the cyber dystopia that you present us with and are pushing you to think of a positive way forward. 624 00:58:37,690 --> 00:58:47,320 Sareen was curious to know if we can render social media a more democratic site where people can express themselves more freely. 625 00:58:47,320 --> 00:58:50,760 And our colleague Osama Azami asks you, Mark, 626 00:58:50,760 --> 00:58:59,760 what sort of regulatory frameworks could policymakers develop to bring greater transparency and democracy into the running of social media companies? 627 00:58:59,760 --> 00:59:05,430 So if you could give us hope for the future, Mark, before we bring this session to a close. 628 00:59:05,430 --> 00:59:12,060 All right. Well, I'm going to do a bait and switch, I think. But yeah, I think in terms of hope. 629 00:59:12,060 --> 00:59:18,060 Yes.
Let's talk about Clubhouse, because I think someone wanted to ask about this. Clubhouse is a new social media platform that's taking the Middle East by storm. 630 00:59:18,060 --> 00:59:22,440 And it's provoked, again, the same kind of debates we had about 10 years ago, where people are being very optimistic. 631 00:59:22,440 --> 00:59:28,260 People are saying, wow, we're having these discussions in the region about controversial issues that we never had before. 632 00:59:28,260 --> 00:59:33,030 And I think this is a positive thing. But there's a honeymoon period. 633 00:59:33,030 --> 00:59:35,220 People are in that honeymoon period now, 634 00:59:35,220 --> 00:59:43,290 and they approach it with this kind of optimism that enables them to have these very constructive and useful and important discussions. 635 00:59:43,290 --> 00:59:49,770 My fear is that it's already been co-opted. I wrote an op-ed in Middle East Eye saying how we're already seeing trolling occurring in those spaces. 636 00:59:49,770 --> 00:59:56,490 But hope can occur, I think, in terms of, you know, again, I'm cynical, but hope can stand, in terms of regulation. 637 00:59:56,490 --> 01:00:05,220 We need to look at where regulation is being made, right? Because a lot of these companies exist in the US or North America and Europe, 638 01:00:05,220 --> 01:00:13,650 so that is where the regulatory framework that would allow them to be democratic spaces, including issues of moderation, content control, et cetera, would come from. 639 01:00:13,650 --> 01:00:17,440 So the kind of pressure that we need to see has to be applied there. 640 01:00:17,440 --> 01:00:21,330 And, you know, we need civil society groups to put pressure on these companies. 641 01:00:21,330 --> 01:00:28,380 We need people within those countries in which the companies exist to put pressure on them to create terms and conditions,
642 01:00:28,380 --> 01:00:36,170 rules and regulations within the platforms themselves that actually facilitate non-toxic behaviours and facilitate democratic interactions. 643 01:00:36,170 --> 01:00:41,060 As a quick example: Twitter is inundated with bots and fake accounts. 644 01:00:41,060 --> 01:00:45,150 Bots and fake accounts are frequently exploited by these regimes to intimidate and harass. 645 01:00:45,150 --> 01:00:50,550 They should have no place on the platform, but they do. So that's an example of how regulation could try to get rid of them. 646 01:00:50,550 --> 01:00:54,450 That's not necessarily so optimistic. But yeah, that's the best I can do. 647 01:00:54,450 --> 01:00:59,510 I'm sorry, Eugene. Well, thank you. Those were great questions, 648 01:00:59,510 --> 01:01:05,420 and I regret we're out of time. 649 01:01:05,420 --> 01:01:11,090 So I want to wrap up with several things. First of all, on behalf of Eugene, who was curating the questions, 650 01:01:11,090 --> 01:01:16,850 I want to apologise to those of you who asked questions that we couldn't get to. 651 01:01:16,850 --> 01:01:20,390 There were always too many questions for us to get through in the allotted time. 652 01:01:20,390 --> 01:01:24,410 Secondly, I want to thank our speakers, Sahar and Mark, for excellent presentations. 653 01:01:24,410 --> 01:01:28,520 It's been a great session. We've all enjoyed it very much. 654 01:01:28,520 --> 01:01:38,420 And finally, I want to put in a plug for next week's lecture, which is the Elahé Omidyar Mir-Djalali annual lecture, 655 01:01:38,420 --> 01:01:43,970 sponsored by the founder and chair of the Roshan Cultural Heritage Institute. 656 01:01:43,970 --> 01:01:51,340 And the title of that event will be Iran and the Arab Uprisings: Opportunity Grasped or Squandered.
657 01:01:51,340 --> 01:02:01,030 The speaker for that will be Anoush Ehteshami from Durham University, and it will be chaired by our colleague Stephanie Cronin. 658 01:02:01,030 --> 01:02:05,450 That takes place next week, Friday, March 12th, at the same time, five o'clock. 659 01:02:05,450 --> 01:02:25,959 So on behalf of the Middle East Centre, and again with thanks to our guests, I want to say goodbye to all.