1 00:00:08,170 --> 00:00:14,470 Good evening, everyone. My name is Walter Armbrust. I'm a fellow at the Middle East Centre and I'm hosting tonight's webinar. 2 00:00:14,470 --> 00:00:20,530 The title of tonight's session is Revolutions versus Counter-Marginalisation 3 00:00:20,530 --> 00:00:26,710 Movements: Revisiting the Online Tug of War a Decade After the Arab Spring. 4 00:00:26,710 --> 00:00:30,850 And we're fortunate to have two excellent speakers tonight. 5 00:00:30,850 --> 00:00:35,140 I'll introduce them. The first is Dr Sahar Khamis, 6 00:00:35,140 --> 00:00:42,850 who is associate professor in the Department of Communication and also an affiliate professor of women's studies at the University of Maryland. 7 00:00:42,850 --> 00:00:47,470 She's an expert on Arab and Muslim media and has co-authored, with Mohammed 8 00:00:47,470 --> 00:00:53,530 el-Nawawy, two books: Islam Dot Com: Contemporary Islamic Discourses in Cyberspace. 9 00:00:53,530 --> 00:01:01,540 That was in 2009. And Egyptian Revolution 2.0: Political Blogging, Civic Engagement and Citizen Journalism. 10 00:01:01,540 --> 00:01:05,620 That was in 2013, both published by Palgrave Macmillan. 11 00:01:05,620 --> 00:01:14,460 And she's also co-editor, with Amel Mili, of Arab Women's Activism and Socio-Political Transformation: Unfinished Gendered Revolutions. 12 00:01:14,460 --> 00:01:19,230 That was in 2018, again by Palgrave Macmillan. 13 00:01:19,230 --> 00:01:26,500 She's authored and co-authored numerous book chapters and journal articles and conference papers in both English and Arabic. 14 00:01:26,500 --> 00:01:31,720 And she also works a great deal as a media commentator and analyst, as a public speaker, 15 00:01:31,720 --> 00:01:35,500 a human rights commissioner in the Human Rights Commission in Montgomery County, 16 00:01:35,500 --> 00:01:41,710 Maryland, and a radio host who presents a monthly radio show on U.S.
Arab Radio, 17 00:01:41,710 --> 00:01:48,190 which is the first Arab American radio station broadcasting in the United States and Canada. 18 00:01:48,190 --> 00:01:54,700 Our second speaker is Marc Owen Jones, who did his PhD in 2016 at Durham University, 19 00:01:54,700 --> 00:01:59,890 where he wrote an interdisciplinary thesis on the history of political repression in Bahrain, 20 00:01:59,890 --> 00:02:06,550 which won the 2016 Association for Gulf and Arabian Peninsula Studies prize. 21 00:02:06,550 --> 00:02:15,580 Dr Jones spent much of his childhood in Bahrain and has also lived in various parts of the Middle East, including Saudi Arabia, the Sudan and Syria. 22 00:02:15,580 --> 00:02:20,080 And he now teaches at Hamad Bin Khalifa University in Qatar. 23 00:02:20,080 --> 00:02:26,740 And before that, taught at Tübingen University's Institute for Political Science and also as a lecturer in the history 24 00:02:26,740 --> 00:02:32,860 of the Gulf and Arabian Peninsula at Exeter University's Institute of Arab and Islamic Studies. 25 00:02:32,860 --> 00:02:40,060 He's edited two books and has an upcoming monograph on political repression in Bahrain with Cambridge University Press. 26 00:02:40,060 --> 00:02:44,140 I don't know what the title is, but perhaps you will tell us during the course of the lecture. 27 00:02:44,140 --> 00:02:51,460 In addition to his academic work, Marc enjoys communicating his research to broader audiences and has bylines in The Washington Post, 28 00:02:51,460 --> 00:02:55,870 New Statesman, CNN, the Independent, PEN International and several others. 29 00:02:55,870 --> 00:03:01,480 And he's also appeared frequently on BBC, Channel 4 News and Al Jazeera. 30 00:03:01,480 --> 00:03:10,660 So we'll now move straight to our speakers. We'll have Sahar first and then Marc; each will speak for ten to fifteen minutes and then we will have Q&A. 31 00:03:10,660 --> 00:03:17,210 So take it away, Sahar.
Thank you so much for this kind introduction and for the kind invitation, Walter. 32 00:03:17,210 --> 00:03:21,650 It's really an honour and a pleasure to join you all today. It's an honour to join this college in Oxford, 33 00:03:21,650 --> 00:03:26,360 the Middle East Centre, this great intellectual, academic institution. 34 00:03:26,360 --> 00:03:30,440 My talk today is going to focus on really what has been changing and shifting 35 00:03:30,440 --> 00:03:35,030 ten years after the eruption of the so-called Arab Spring uprisings in the region: 36 00:03:35,030 --> 00:03:39,500 what has been happening in terms of the shift in the role of social media, 37 00:03:39,500 --> 00:03:45,650 why these shifts have been happening, and also some of the paradoxes that emerged out of these shifts. 38 00:03:45,650 --> 00:03:50,540 And how can we understand the dynamics of the relationship between different political and social 39 00:03:50,540 --> 00:03:57,680 players in the Arab political and media landscapes in light of all of these changes and paradoxes? 40 00:03:57,680 --> 00:04:04,310 Let me first start by saying that when the Arab Spring uprisings erupted 10 years ago, there was this moment of euphoria. 41 00:04:04,310 --> 00:04:08,990 I am an Egyptian American, so obviously I was very euphoric about the eruption of the uprisings. 42 00:04:08,990 --> 00:04:13,850 But as communication scholars, we also had these moments of, quote unquote, techno-euphoria. 43 00:04:13,850 --> 00:04:22,280 We were delighted at what social media can do: the role of social media as amplifiers and catalysts and mobilisers and coordinators, 44 00:04:22,280 --> 00:04:26,210 how they can bring people together, coordinate their actions on the ground, 45 00:04:26,210 --> 00:04:29,210 help them to mobilise, create Facebook pages like, you know, 46 00:04:29,210 --> 00:04:35,360 We Are All Khaled Said, that can draw hundreds of thousands of people, you know, followers on Twitter, you name it.
47 00:04:35,360 --> 00:04:37,400 It was this moment of techno-euphoria. 48 00:04:37,400 --> 00:04:47,070 Ten years later, we are looking at a completely different landscape, both politically and in terms of mediated communication in the region as well. 49 00:04:47,070 --> 00:04:52,090 What we are seeing instead is a rise in what we call digital authoritarianism, 50 00:04:52,090 --> 00:04:58,950 this phenomenon whereby regimes are really jumping on the social media and digital media bandwagon. 51 00:04:58,950 --> 00:05:03,290 They are polishing their own skills, technically and digitally. 52 00:05:03,290 --> 00:05:11,830 They're creating their own cyber armies, electronic armies, going after their opponents many times, hacking, sabotaging, blocking, 53 00:05:11,830 --> 00:05:18,430 you know, and really sometimes even having more dire consequences, like finding somebody's IP address, going after this person, 54 00:05:18,430 --> 00:05:26,560 arresting him or her or putting them in jail or, you know, in some dire circumstances, even ending their lives altogether. 55 00:05:26,560 --> 00:05:31,540 So this is a completely different picture now in terms of this rise of what we call digital authoritarianism, 56 00:05:31,540 --> 00:05:36,300 which replaced this earlier moment of euphoria. 57 00:05:36,300 --> 00:05:39,420 Why did that happen? How did it happen? 58 00:05:39,420 --> 00:05:48,790 Well, it is really showing us some of the limitations of social media: that they can really play a dual role, a double role. 59 00:05:48,790 --> 00:05:57,490 They can be instruments or tools for mobilisation and tools for liberation, but they can also be tools for repression. 60 00:05:57,490 --> 00:06:02,620 This dual role of social media as both tools of liberation and repression is very important.
61 00:06:02,620 --> 00:06:05,560 And we should, as communication scholars, sociology scholars, 62 00:06:05,560 --> 00:06:15,010 political science scholars, really understand and investigate the reasons and the dynamics behind this kind of duality in the role of social media. 63 00:06:15,010 --> 00:06:19,300 There are also other dualities that we should pay attention to. For example, 64 00:06:19,300 --> 00:06:24,520 when you have a moment of collective solidarity, people are all in Tahrir Square. 65 00:06:24,520 --> 00:06:29,330 They have a common goal. They're all calling for Mubarak to go: 66 00:06:29,330 --> 00:06:32,450 Husni, you have to go. You have to go. 67 00:06:32,450 --> 00:06:40,480 People, regardless of gender, regardless of age, regardless of religion, regardless of their own personal leanings or orientations, 68 00:06:40,480 --> 00:06:47,110 were all united in this wonderful, you know, historic moment of solidarity and unity. 69 00:06:47,110 --> 00:06:51,310 When you have this kind of solidarity and unity, which is very rare, but when it happens, 70 00:06:51,310 --> 00:06:56,800 social media become a very important tool for amplifying the voices of dissent 71 00:06:56,800 --> 00:07:03,640 and the voices of opposition and bringing people together and uniting them behind a common cause. 72 00:07:03,640 --> 00:07:09,010 But once you start to have political divisions, fragmentation and polarisation, 73 00:07:09,010 --> 00:07:14,470 which is pretty much what has been witnessed in a country like Egypt and many of the other Arab Spring countries that 74 00:07:14,470 --> 00:07:23,470 witnessed the reversals and backlashes in terms of the journeys to democratisation and reform, with the exception of Tunisia, 75 00:07:23,470 --> 00:07:31,670 then this moment of unity and uniformity and solidarity is going to be replaced with divisions and fragmentation.
76 00:07:31,670 --> 00:07:36,560 And once that happens, the more polarisation and fragmentation you have, 77 00:07:36,560 --> 00:07:42,320 the more social media are going to be used as weapons to increase the distance between the self and the other, 78 00:07:42,320 --> 00:07:49,280 to widen the gaps and widen the distances between different groups, and even the weaponization of social media, 79 00:07:49,280 --> 00:07:56,830 meaning they're going to be used as tools and weapons to attack your own opponents and to come after them. 80 00:07:56,830 --> 00:08:01,760 So that is the picture that we are witnessing now, unfortunately, in the so-called Arab Spring countries. 81 00:08:01,760 --> 00:08:10,670 As I said, with the unique exception of Tunisia, which was able to have a much more smooth path towards democratisation and reform. 82 00:08:10,670 --> 00:08:13,860 And the other countries, unfortunately, have had very dire outcomes. 83 00:08:13,860 --> 00:08:17,910 No need to go into details, because I'm sure everybody on this call knows about it. 84 00:08:17,910 --> 00:08:24,370 In this particular case, social media are weaponized in two very dangerous directions. 85 00:08:24,370 --> 00:08:30,400 In the hands of the regimes, they become tools of repression, and this is digital authoritarianism, as we explained. 86 00:08:30,400 --> 00:08:34,960 And also in the hands of opposing parties that are going after each other. 87 00:08:34,960 --> 00:08:43,250 They become weaponized to go after your own opponents and to try to attack them and smear them and stigmatise them online. 88 00:08:43,250 --> 00:08:47,880 So that is not a pretty picture, unfortunately. Right. Why did this picture happen? 89 00:08:47,880 --> 00:08:53,820 What really caused all of this? I would say that we have overcredited social media too much. 90 00:08:53,820 --> 00:09:03,720 We have seen labels like the Facebook revolution in Egypt, the Twitter uprising in Tunisia, the YouTube uprising in Syria.
91 00:09:03,720 --> 00:09:09,990 And I think we were really kind of carried away, really pulled in, by this moment of euphoria, overcrediting social 92 00:09:09,990 --> 00:09:17,770 media as if they are magical tools that can bring about reform and change and democratisation all by themselves. 93 00:09:17,770 --> 00:09:21,430 Ten years forward now, we're looking at a completely different picture. 94 00:09:21,430 --> 00:09:24,490 So what are some of the limitations of the role of social media? 95 00:09:24,490 --> 00:09:28,510 I talked about some of them already, but there are other things we have to take note of as well. 96 00:09:28,510 --> 00:09:36,460 They cannot fill the gap or the void which is created by the vacuum of not having an active civil society. 97 00:09:36,460 --> 00:09:39,700 In many of these countries, there was no active civil society, 98 00:09:39,700 --> 00:09:46,060 no real action in terms of having NGOs and oppositional groups and activism on the ground. 99 00:09:46,060 --> 00:09:54,820 And when you have this kind of vacuum, you cannot really rely on social media to come in as a magical tool and to fill this kind of vacuum. 100 00:09:54,820 --> 00:10:04,060 It's not going to be able to do so. You need to have a coordinated effort at every level of society: politically, socially, economically, 101 00:10:04,060 --> 00:10:11,230 in terms of media communication, to be able to really move forward towards a successful journey of democratisation and reform. 102 00:10:11,230 --> 00:10:16,990 And what we are witnessing now, these failures or these reversals in democratisation, are telling us, 103 00:10:16,990 --> 00:10:21,790 hey, wait a minute, there is only so much that social media can do. 104 00:10:21,790 --> 00:10:28,270 After all, they're not magical tools. They're not going to bring about change and reform and democratisation all by themselves.
105 00:10:28,270 --> 00:10:37,810 You need to have an active, vibrant, organised civil society to fill some of these vacuums and to create this kind of momentum on the ground first. 106 00:10:37,810 --> 00:10:43,750 And then social media can come in as a supplementary and complementary factor, or as a 107 00:10:43,750 --> 00:10:50,290 catalyst that can speed up this process of reform and this process of transformation. 108 00:10:50,290 --> 00:10:56,090 So that, in my opinion, is a very important lesson and a very important takeaway from these reversals that we are witnessing 109 00:10:56,090 --> 00:10:59,650 ten years after the Arab Spring. 110 00:10:59,650 --> 00:11:04,900 Having said all of that, I want to also focus on the theme that we're talking about, which is counter-revolution, right, 111 00:11:04,900 --> 00:11:11,500 and countering marginalisation. When the Arab Spring uprisings erupted, people were chanting, you know, 'Eish, hurriyya, 112 00:11:11,500 --> 00:11:12,540 'adala igtima'iyya': 113 00:11:12,540 --> 00:11:20,290 bread and freedom and social justice, which is what people were asking for when they asked for a change of regime and all of that. 114 00:11:20,290 --> 00:11:27,730 Well, that is true. And it's also true that certain groups felt more marginalised than others: based on gender, 115 00:11:27,730 --> 00:11:33,220 based on socio-economic status, based on their own religious or political ideologies, 116 00:11:33,220 --> 00:11:41,750 they felt like they didn't have a place at the table. And they felt that this would be a golden moment for them to really amplify their own 117 00:11:41,750 --> 00:11:46,520 voices, or to make their voices heard in another way, and to have their demands and interests 118 00:11:46,520 --> 00:11:51,730 taken into account and taken into consideration. For example, if you look at women's movements, right, 119 00:11:51,730 --> 00:12:01,360 the feminist movements in the Arab world? OK.
After the initial success of the Egyptian revolution, leading to the ousting of Mubarak from office, 120 00:12:01,360 --> 00:12:06,040 women's movements thought, that's a golden moment. Awesome. Let's try to capitalise on that. 121 00:12:06,040 --> 00:12:11,330 Let's try to build on it. So on International Women's Day, which is, by the way, March 8th, 122 00:12:11,330 --> 00:12:21,370 so just coming up in a few days, right, March 8th, they said, okay, let's have a milyoniyya, a million-woman march in the heart of Tahrir Square, 123 00:12:21,370 --> 00:12:25,720 trying to mimic this whole thing of having a milyoniyya, a million-person march 124 00:12:25,720 --> 00:12:30,370 in Tahrir Square calling for reform and democratisation and political change. 125 00:12:30,370 --> 00:12:34,930 Let's do this kind of million-woman march in Tahrir Square. Unfortunately, 126 00:12:34,930 --> 00:12:41,500 only five hundred women showed up, and they were shouted at with very bad words, basically 127 00:12:41,500 --> 00:12:46,600 putting them down and asking them to go to the hair salon and just use nail polish, and that that's their space. 128 00:12:46,600 --> 00:12:52,550 Well, that was a very disappointing moment, because obviously women played a very, 129 00:12:52,550 --> 00:12:56,930 very important and central role in the midst of the Arab Spring uprisings. 130 00:12:56,930 --> 00:13:02,570 Many of them sacrificed. They were harassed. They were arrested. They were beaten. 131 00:13:02,570 --> 00:13:08,420 Some of them even paid with their own lives as a price, and they lost their lives as a result of this kind of resistance 132 00:13:08,420 --> 00:13:12,010 and being at the front line. What is the takeaway here? 133 00:13:12,010 --> 00:13:14,650 What can we learn from this sad moment? 134 00:13:14,650 --> 00:13:23,900 I think we can learn, in the domain of democracy, that ousting a dictator from office is easy, but figuring out what to do next is not.
135 00:13:23,900 --> 00:13:31,460 The same thing could be said about the groups of women working in the area of feminist activism, for example, in this particular case. 136 00:13:31,460 --> 00:13:36,170 You can also say organising a campaign online is easy, 137 00:13:36,170 --> 00:13:45,910 calling for a march is easy, but changing societal attitudes and changing society's mindset is not. So unless and until 138 00:13:45,910 --> 00:13:52,730 the society itself is prepared for this type of socio-political or socio-economic change, 139 00:13:52,730 --> 00:13:58,040 we cannot expect much success just based on tweeting or posting or blogging about it. 140 00:13:58,040 --> 00:14:03,770 Let's have a great million-woman march in the midst of Tahrir Square to capitalise on the revolution? 141 00:14:03,770 --> 00:14:09,620 It's not going to happen. And that shows the kind of work that needs to be done ahead of us. 142 00:14:09,620 --> 00:14:12,710 The same thing is applicable to religious groups, Islamic groups. 143 00:14:12,710 --> 00:14:20,030 Some of them also tried to make some kind of a little buzz about themselves and their activity and tried to amplify their voices, as we know. 144 00:14:20,030 --> 00:14:27,560 Of course, the story of the Muslim Brotherhood in Egypt: they just had this very brief golden moment during President Morsi's time. 145 00:14:27,560 --> 00:14:32,540 Then they were banned, and they were harassed and persecuted before and after. 146 00:14:32,540 --> 00:14:39,740 Right. So some of them are still trying to use some social media platforms to amplify their voices or to make their messages heard. 147 00:14:39,740 --> 00:14:46,970 But it is definitely not creating the kind of change or the kind of visibility that they really are looking for. 148 00:14:46,970 --> 00:14:52,450 And sometimes visibility can even backfire against them because of their anti-government position. 149 00:14:52,450 --> 00:14:59,870 It can be a double-edged sword as well.
Again, this raises the flag that you can only do so much using social media. 150 00:14:59,870 --> 00:15:06,320 You need to have more structural and organised change if you really want to have actual change on the ground. 151 00:15:06,320 --> 00:15:11,100 So let me now move to another part of my talk, which is the paradoxes. 152 00:15:11,100 --> 00:15:19,320 What kind of paradoxes are we really witnessing and seeing in this post-Arab Spring space, 10 years after the eruption of the uprisings? 153 00:15:19,320 --> 00:15:25,030 I talked about some of them already, but I want to highlight others, which are equally very important. 154 00:15:25,030 --> 00:15:30,940 One of them, for example, is this whole paradox of anonymity versus visibility. 155 00:15:30,940 --> 00:15:35,970 Right. If you are anonymous, in some cases it can protect your own identity. 156 00:15:35,970 --> 00:15:40,810 You can be more safe. The government is not going to go after you, it is not going to harass you, because you are anonymous. 157 00:15:40,810 --> 00:15:45,580 Right. So some of the activists, especially those belonging to certain, quote unquote, 158 00:15:45,580 --> 00:15:50,230 marginalised groups, such as, for example, certain opponents of the regime, 159 00:15:50,230 --> 00:15:56,110 whether they are from this particular political ideology or another, or a certain religious ideology, 160 00:15:56,110 --> 00:16:03,080 some of them have resorted to this whole curtain of anonymity to protect themselves from regime harassment. 161 00:16:03,080 --> 00:16:11,760 Also, for women in particular in traditional and conservative societies, anonymity becomes more important, because they are also fighting a double battle. 162 00:16:11,760 --> 00:16:15,710 There is a political battle against the regimes, and there is also a social battle 163 00:16:15,710 --> 00:16:22,580 against certain, you know, societal considerations and constraints and stigmas and traditions, which are pretty strong.
164 00:16:22,580 --> 00:16:27,760 They put constraints on mobility and advancement and visibility. So anonymity could be helpful. 165 00:16:27,760 --> 00:16:31,430 But, just like everything else, it also has its own downsides. 166 00:16:31,430 --> 00:16:38,750 And the downside is that it can decrease their credibility, and that impacts their visibility and their overall, 167 00:16:38,750 --> 00:16:43,530 you know, standing as players in the socio-political field. 168 00:16:43,530 --> 00:16:48,270 So some of the activists I interviewed have been saying: yeah, I can protect myself by being anonymous. 169 00:16:48,270 --> 00:16:48,940 But guess what? 170 00:16:48,940 --> 00:16:56,670 CNN is never going to try to interview me or find out where I am and try to have a great interview with me, because nobody knows who I am anyway. 171 00:16:56,670 --> 00:17:04,440 So that's really one of the paradoxes. Another paradox is the double-edged sword of resistance in the diaspora. 172 00:17:04,440 --> 00:17:11,880 In the midst of this very repressive environment we talked about, there's no question that a lot of opponents of the regimes feel much safer, 173 00:17:11,880 --> 00:17:18,400 you know, voicing their own concerns and opposition at a distance. Many of them are in self-imposed exile. 174 00:17:18,400 --> 00:17:21,810 They're living abroad in many different countries all over the world to try to 175 00:17:21,810 --> 00:17:25,980 stay away from the regimes' oppression and from the regimes going after them. 176 00:17:25,980 --> 00:17:33,580 Again, that's a double-edged sword. In a way, it can give you a safe space where you try to voice your own, you know, 177 00:17:33,580 --> 00:17:38,780 opposition to the regime without being followed and put behind bars or arrested or killed.
178 00:17:38,780 --> 00:17:45,500 But on the other hand, there's only so much you can do in the diaspora, when you are away from your own home country, 179 00:17:45,500 --> 00:17:53,870 because sometimes you would not have enough impact or enough effect on the dynamics of the socio-political field inside your own country. 180 00:17:53,870 --> 00:18:00,740 If you are just voicing opposition online and you are not actually having real interaction with people at home, 181 00:18:00,740 --> 00:18:06,740 it can also limit your own ability to have some kind of impact or effect on your home country. 182 00:18:06,740 --> 00:18:12,180 So that's another paradox. A third paradox, which is very interesting, 183 00:18:12,180 --> 00:18:20,510 is this shift from what we call state feminism, or top-down feminism, or what has been called the first lady syndrome: 184 00:18:20,510 --> 00:18:28,450 when the government tries to really kind of give you the impression or the image or the picture that it is really creating reform or change. 185 00:18:28,450 --> 00:18:29,260 But in reality, 186 00:18:29,260 --> 00:18:37,390 nothing is really happening in terms of affecting women's lives on the ground and really making changes in their own lives in a meaningful way. 187 00:18:37,390 --> 00:18:42,190 And of course, the shift to what we call grassroots activism, activism at the grassroots level, 188 00:18:42,190 --> 00:18:49,810 which means when women themselves start to take matters into their own hands and organise their own movements and mobilise on the ground, 189 00:18:49,810 --> 00:18:54,290 as we have seen, for example, in the midst of the Arab Spring movements. 190 00:18:54,290 --> 00:19:01,580 Unfortunately, a lot of the so-called, quote unquote, top-down forms of feminism have had very limited trickle-down effect. 191 00:19:01,580 --> 00:19:09,020 They did not really impact women's lives in meaningful ways.
Whether you're talking about the appointing of many women ministers in Egypt, 192 00:19:09,020 --> 00:19:16,940 or whether you're talking about non-Arab Spring countries like Saudi Arabia, trying to give an image of reform and an image of change, whereas in reality 193 00:19:16,940 --> 00:19:23,690 women's situation is still very constrained and still very much a negative reality on the ground. 194 00:19:23,690 --> 00:19:26,750 So that's another interesting paradox that we have to be aware of. 195 00:19:26,750 --> 00:19:33,620 And I end with one other paradox, which is very important, which is the fact that, unfortunately, some of the, quote unquote, 196 00:19:33,620 --> 00:19:40,220 marginalised groups that we're talking about are trying to use social media to counter their own marginalisation. 197 00:19:40,220 --> 00:19:43,880 Not all of them have necessarily been supportive of revolutions. 198 00:19:43,880 --> 00:19:52,310 Some of them have actually turned into counter movements themselves, which is a very interesting but very sad paradox to talk about. 199 00:19:52,310 --> 00:19:59,300 For example, you have seen some groups of women and other marginalised minorities that have been part of the, quote unquote, 'Asfeen ya Rayes,' 200 00:19:59,300 --> 00:20:05,930 or 'Sorry, Mr President,' movement that was supporting Mubarak. And some of them are also supporting the military dictatorship under Sisi, 201 00:20:05,930 --> 00:20:12,950 and in other parts of the Arab world as well. We've seen some Islamist groups, like the Salafis in Egypt, for example, 202 00:20:12,950 --> 00:20:18,290 taking some positions that are pretty much in support of the military coup in 203 00:20:18,290 --> 00:20:23,200 Egypt and supporting it instead of trying to take a position against it.
204 00:20:23,200 --> 00:20:27,530 So these are, quote unquote, marginalised groups that, instead of being the, you know, 205 00:20:27,530 --> 00:20:32,360 revolutionaries, have taken the position of being counter-revolutionaries. 206 00:20:32,360 --> 00:20:38,900 I think the explanation could be an attempt on their part to try to counter their own marginalisation 207 00:20:38,900 --> 00:20:45,960 by trying to appease the regime or trying to make peace with the status quo rather than to go against it. 208 00:20:45,960 --> 00:20:51,750 And finally, we cannot turn a blind eye to the moment that we are all in right now, which is the pandemic moment, 209 00:20:51,750 --> 00:20:59,250 which brought about a lot of constraints in terms of activism in general and social media activism in particular, 210 00:20:59,250 --> 00:21:06,910 with the enacting of a lot of cybercrime laws that really put a lot of constraints on activists and journalists and people 211 00:21:06,910 --> 00:21:13,230 who try to get the word out about anything that does not go along with the government's policy or the government's plans. 212 00:21:13,230 --> 00:21:18,870 In many cases, these people find themselves completely silenced. Their websites are blocked. 213 00:21:18,870 --> 00:21:23,610 They can be put behind bars. In some extreme cases, they can be arrested and even killed. 214 00:21:23,610 --> 00:21:26,850 These are very sad stories, but they're very real ones. 215 00:21:26,850 --> 00:21:37,390 And the future prediction is that this whole wave of digital authoritarianism is going to escalate and go from bad to worse, as both parties, 216 00:21:37,390 --> 00:21:44,220 the regimes on one hand and their opponents on the other hand, try to master their own skills of cyber activism, 217 00:21:44,220 --> 00:21:51,270 use them as weapons to attack the other and use them as weapons to just make their own stories heard while silencing others.
218 00:21:51,270 --> 00:21:57,120 So that is the prediction moving forward in the post-pandemic era, and 10 years after the Arab Spring. 219 00:21:57,120 --> 00:22:02,500 Thank you so much. Thank you, Sahar. We'll now move to Marc. 220 00:22:02,500 --> 00:22:10,400 Thank you very much, Walter, for your introduction. I should really update my bio, because the book on Bahrain is out: Political Repression in Bahrain. 221 00:22:10,400 --> 00:22:13,820 Now it's out in the world. Thanks so much, Sahar. 222 00:22:13,820 --> 00:22:24,700 I think you've laid out the ground perfectly with this idea of digital euphoria, or digital utopianism, versus digital dystopia. 223 00:22:24,700 --> 00:22:32,080 As those who know me will know, I'm incredibly cynical, so I definitely situate myself on the digital dystopian side of things. 224 00:22:32,080 --> 00:22:36,770 So I generally see things as going from bad to worse, at a general level. 225 00:22:36,770 --> 00:22:43,510 But today I want to give a few empirical examples of digital authoritarianism. 226 00:22:43,510 --> 00:22:49,690 What I want to do first, though, is actually think about digital authoritarianism not as a state-centric enterprise. 227 00:22:49,690 --> 00:22:57,070 And what I mean by that is it's increasingly important to look at the various nodes that occur within digital authoritarianism that enable it to happen. 228 00:22:57,070 --> 00:23:00,280 There's a tendency, mostly within political science, I think, 229 00:23:00,280 --> 00:23:05,860 and IR too, to focus on the role of the state and the state's specific role in authoritarianism. 230 00:23:05,860 --> 00:23:08,890 But as we know, obviously, authoritarianism is a broader phenomenon. 231 00:23:08,890 --> 00:23:14,260 And the resilience of authoritarianism is abetted by a number of forces, including those of outside powers. 232 00:23:14,260 --> 00:23:22,870 So I'll start with a brief anecdote.
And firstly, we have to accept that part of digital authoritarianism is this notion of disinformation and fake news, 233 00:23:22,870 --> 00:23:29,260 propaganda, etc., as an aspect of digital authoritarianism. And the anecdote is not actually my anecdote, I'm not gonna lie. 234 00:23:29,260 --> 00:23:37,300 If we think about the role of a London PR firm that also helped the Tory party into power 235 00:23:37,300 --> 00:23:42,620 doing work promoting Mohammed bin Salman, the crown prince of Saudi Arabia: 236 00:23:42,620 --> 00:23:49,480 why is that the case? Well, there's a market for whitewashing the reputation of dictators. 237 00:23:49,480 --> 00:23:55,350 And this has always been the case, for example, in the Gulf, in Britain's relations and Britain's particular relationship with the Gulf. 238 00:23:55,350 --> 00:23:59,650 And it raises these questions of who profits from digital authoritarianism. 239 00:23:59,650 --> 00:24:06,580 Why is digital authoritarianism happening? Because the problem, when we talk about tools like Twitter and social media uncritically, 240 00:24:06,580 --> 00:24:12,970 is that we have almost an element of technological determinism, where we assume these tools just exist. 241 00:24:12,970 --> 00:24:18,280 And why do they just exist? Well, actually, these tools exist for a specific reason. 242 00:24:18,280 --> 00:24:25,930 If we look at why Twitter has proliferated around the world, or Facebook has proliferated around the world, we actually have to ask ourselves: 243 00:24:25,930 --> 00:24:34,720 why have products such as Facebook or Twitter been allowed to be distributed around the world without any due diligence, 244 00:24:34,720 --> 00:24:38,670 for example, on the potential negative consequences of that product? 245 00:24:38,670 --> 00:24:45,270 Now, I think that in other instances, if you sold a car or a washing machine, and that washing machine or car was faulty,
246 00:24:45,270 --> 00:24:49,120 it would be withdrawn; the manufacturer would withdraw that product because there's a safety problem. 247 00:24:49,120 --> 00:24:53,740 However, there's this normative assumption with technology, 248 00:24:53,740 --> 00:24:58,720 and this ties in with this notion of digital euphoria and techno-utopianism. 249 00:24:58,720 --> 00:25:05,980 This itself is an ontology, right? Those of you who have seen The Social Dilemma, I think on Netflix, 250 00:25:05,980 --> 00:25:09,340 will see that there's this notion of these tech firms in Silicon Valley who 251 00:25:09,340 --> 00:25:13,660 firmly believe that the technology that they are creating is doing a social good. 252 00:25:13,660 --> 00:25:19,750 There's no particular malice there. However, there's this embedded ideology that technology solves problems. 253 00:25:19,750 --> 00:25:22,280 And this idea that technology solves problems, 254 00:25:22,280 --> 00:25:27,070 and specifically, in the case of social media and issues to do with authoritarianism, that these tools enable free speech 255 00:25:27,070 --> 00:25:32,860 and democracy, is embedded within this kind of North America-centric notion of what these technologies are meant to do. 256 00:25:32,860 --> 00:25:42,510 And that's embedded within and created within a society or a country where there are specific debates about civil and civic space. 257 00:25:42,510 --> 00:25:49,780 Right. So this is the background for why these technologies are created, and then they are distributed around the world without any, 258 00:25:49,780 --> 00:25:52,210 for example, due diligence on human rights. 259 00:25:52,210 --> 00:25:57,670 If you are a technology company or a social media company or a start-up, surely you need to be asking yourself: 260 00:25:57,670 --> 00:26:02,340 what impact will this product have on the market that I'm going to send it to?
261 00:26:02,340 --> 00:26:09,730 And we've seen countless examples of how technology has enabled violence and authoritarianism. One of the most egregious examples is Facebook, 262 00:26:09,730 --> 00:26:16,420 which even acknowledged that the role of its technology in Myanmar, for example, facilitated ethnic cleansing. 263 00:26:16,420 --> 00:26:19,750 And so we know that social media companies are cognisant of this, 264 00:26:19,750 --> 00:26:27,430 but very little seems to be done about exploring the relationship between the political economy of these companies and the way they benefit. 265 00:26:27,430 --> 00:26:32,710 And I think it is increasingly important to study digital authoritarianism just like we might study the arms trade. 266 00:26:32,710 --> 00:26:39,430 We need to actually explore these avenues of connections rather than seeing technologies as isolated, separate forms. 267 00:26:39,430 --> 00:26:49,450 We need to understand technology as a product that also has a great weight of incentives behind it, pushing it to market. 268 00:26:49,450 --> 00:26:51,680 And I think this is particularly important 269 00:26:51,680 --> 00:26:58,470 when we're looking at companies, I think, like Twitter and Facebook. 270 00:26:58,470 --> 00:27:06,690 Why is it so difficult, for example? As Sahar has discussed, we're constantly, as academics and policy makers and NGOs, arguing 271 00:27:06,690 --> 00:27:12,510 that the role of technology is increasingly detrimental in places like the Gulf. 272 00:27:12,510 --> 00:27:16,950 In certain countries, it's being utilised as a tool of oppression, a tool of surveillance, 273 00:27:16,950 --> 00:27:20,510 and in many cases a tool of intimidation to troll and harass activists. 274 00:27:20,510 --> 00:27:29,040 OK. So why is it allowed to go on? Why does it still exist, if its use is simply to facilitate state repression? 275 00:27:29,040 --> 00:27:35,130 Well, this is a big question.
And again, it raises issues of revenue. 276 00:27:35,130 --> 00:27:37,980 Is the Middle East market, the Arabic-speaking market, 277 00:27:37,980 --> 00:27:44,580 so great that these companies are benefiting no longer from a product that facilitates free speech, 278 00:27:44,580 --> 00:27:48,880 but actually from a product that is being used simply as an extension of the security apparatus? 279 00:27:48,880 --> 00:27:53,910 And is there an ethical issue that we need to raise there? Because when we talk about technology, again, 280 00:27:53,910 --> 00:27:59,250 there's a tendency to forget about policy implications and those kinds of things that can change this. 281 00:27:59,250 --> 00:28:02,790 And we know, for example, that the Gulf is wealthy. 282 00:28:02,790 --> 00:28:09,660 We know that Saudi Arabia is the largest population centre in the Gulf region, with high technology penetration rates and high social media use. 283 00:28:09,660 --> 00:28:20,520 And this enables one specific state with an authoritarian framework to dominate the Arabic-speaking online social media community. 284 00:28:20,520 --> 00:28:27,210 So what that means is that if you look at content on Twitter, and this is another important thing to bear in mind, 285 00:28:27,210 --> 00:28:35,100 if you're looking for a discussion about foreign policy, say the war in Yemen, on Twitter, and you look at the results when you search in Arabic, 286 00:28:35,100 --> 00:28:42,840 you're not going to be presented with equal sides of the debate, where every country and region has the ability to express 287 00:28:42,840 --> 00:28:47,460 the same amount of content about a specific topic. On the contrary, the country with 288 00:28:47,460 --> 00:28:52,710 the most resources and the largest population will be able to dominate that discussion.
289 00:28:52,710 --> 00:28:54,690 So now, if you look at the topics around Yemen, 290 00:28:54,690 --> 00:28:59,940 you will see that those conversations are entirely dominated by accounts representing the foreign policy positions, 291 00:28:59,940 --> 00:29:04,050 for example, of the UAE and Saudi Arabia. That's in Arabic. Right. 292 00:29:04,050 --> 00:29:08,970 So what you have is a situation where a country can use its demographic power and its relatively 293 00:29:08,970 --> 00:29:15,150 advanced technological infrastructure to shape the social media sphere in the Arab world. 294 00:29:15,150 --> 00:29:22,250 And so we're seeing this kind of imbalance of voices within the region based on wealth, and also on an authoritarian tendency. 295 00:29:22,250 --> 00:29:30,900 And that authoritarian tendency is the perceived need of authoritarian regimes to actually dominate the information space, 296 00:29:30,900 --> 00:29:37,530 which is a driver. And this is hugely problematic if we're talking about online civil society, 297 00:29:37,530 --> 00:29:40,920 because when we talk about online civil society and online space, more often, 298 00:29:40,920 --> 00:29:42,690 especially in the Arabic-speaking world, 299 00:29:42,690 --> 00:29:48,900 we're talking about an Arabic-speaking space as opposed to a space bound by national boundaries and that kind of thing. 300 00:29:48,900 --> 00:29:54,390 And it is, I think, particularly pertinent to see which countries are benefiting from this and who then 301 00:29:54,390 --> 00:30:00,150 has a disproportionate amount of what I call digital hegemony or digital power: 302 00:30:00,150 --> 00:30:06,720 who has the ability, or the digital capital, to dominate the digital space discursively and in other ways. 303 00:30:06,720 --> 00:30:12,690 And this is a very important thing we have to see, because this power is not necessarily new.
304 00:30:12,690 --> 00:30:16,440 If you look at the post-Cold War structure, I mean the 1990 Gulf War, we'll see, 305 00:30:16,440 --> 00:30:17,300 for example, 306 00:30:17,300 --> 00:30:24,120 this classic anecdote that people in the Gulf found out about the Iraqi invasion of Kuwait a few days after it happened by watching CNN. 307 00:30:24,120 --> 00:30:29,820 And at that point, we saw increasing investment, in particular from Saudi Arabia, in satellite TV channels, 308 00:30:29,820 --> 00:30:31,740 in the infrastructure of satellite television. 309 00:30:31,740 --> 00:30:38,880 The whole notion being that the role of information was seen as important, and you don't just try to dominate your own national press, 310 00:30:38,880 --> 00:30:46,750 but you try to influence the informational infrastructure of whole countries. And we see this now with Twitter in the same way. 311 00:30:46,750 --> 00:30:50,440 But what we're seeing is an increasingly aggressive attempt to do this. 312 00:30:50,440 --> 00:30:55,570 So whilst you can buy a satellite TV channel, you can't necessarily buy out Twitter. 313 00:30:55,570 --> 00:31:03,610 Well, can you? There's actually been increasing worry about, for example, Saudi Arabia's Public Investment Fund, via SoftBank, investing in Silicon Valley. 314 00:31:03,610 --> 00:31:06,490 We know that the fund invests in a lot of start-ups. We know, for example, 315 00:31:06,490 --> 00:31:13,810 that two or three Saudi citizens are now facing a criminal case from the FBI in the USA, because 316 00:31:13,810 --> 00:31:20,440 high-level Saudis, connected to someone who has been named only as Royal Family Member 1, infiltrated Twitter headquarters in 2016, 317 00:31:20,440 --> 00:31:23,830 stole lots of personal data and then took this back to Saudi Arabia.
318 00:31:23,830 --> 00:31:28,750 And we know that there are Saudi activists 319 00:31:28,750 --> 00:31:32,590 who say that that data leak is directly responsible for the arrest of Saudi citizens. 320 00:31:32,590 --> 00:31:40,510 Right. So what we're seeing here is evidence of the perceived importance of the information ecology for the regime, 321 00:31:40,510 --> 00:31:46,180 in this case, the Saudi regime. And this is key, because we're seeing attempts to drive and dominate the discourse. 322 00:31:46,180 --> 00:31:52,630 And in reality, we're seeing it happen. I've been looking at the digital space for the past 10 years, 323 00:31:52,630 --> 00:32:00,560 and there is this trajectory of decreasing critical conversation and increasing dominance by pro-government voices, 324 00:32:00,560 --> 00:32:05,570 mostly from the Gulf. And there are a couple of other aspects that I think are interesting in terms of where this is going 325 00:32:05,570 --> 00:32:14,120 that I want to mention. Firstly, what do you do if you can't summon enough resources to create the discussion you want? 326 00:32:14,120 --> 00:32:16,720 Well, we're seeing an increased amount of automation here. 327 00:32:16,720 --> 00:32:23,770 So a lot of my work has been on bots, which are automated accounts that can replicate content in huge amounts, in huge volumes. 328 00:32:23,770 --> 00:32:28,180 And the efficacy, or at least the scale, of all of these bots is quite astounding. 329 00:32:28,180 --> 00:32:35,860 I did a study a couple of years ago looking at specific sectarian hate speech, anti-Sunni and anti-Shia hate speech, and found that 330 00:32:35,860 --> 00:32:41,860 a large amount of the specific sectarian slurs that we saw online was actually 331 00:32:41,860 --> 00:32:47,710 being generated by bots which belonged to a satellite TV channel in Saudi Arabia.
332 00:32:47,710 --> 00:32:54,450 So a lot of the sectarian discourse that we saw online was actually created essentially by automatons. 333 00:32:54,450 --> 00:32:59,680 Right. So it wasn't reflecting the real world of individual people, but these kinds of fake accounts. 334 00:32:59,680 --> 00:33:05,830 And the other thing is, as AI, artificial intelligence, becomes better, 335 00:33:05,830 --> 00:33:12,520 the whole notion of the adversaries and the people we're going to see online might change, because theoretically it will soon be trivial 336 00:33:12,520 --> 00:33:19,600 to have an online account, a Twitter account or a Facebook account, that is so advanced in terms of artificial intelligence 337 00:33:19,600 --> 00:33:26,200 that it can perfectly replicate government propaganda and appear real, fooling someone else into thinking it is real. 338 00:33:26,200 --> 00:33:28,030 And this might sound farfetched. I don't think it is. 339 00:33:28,030 --> 00:33:36,250 To use another brief story: Saudi Arabia famously gave citizenship to Sophia, this robot that was created, 340 00:33:36,250 --> 00:33:40,410 I can't remember the company who made it, but they gave this robot citizenship. 341 00:33:40,410 --> 00:33:45,490 And there was something kind of symbolic about giving a robot citizenship, because a robot is a robot. 342 00:33:45,490 --> 00:33:47,590 It's fundamentally someone you can programme, 343 00:33:47,590 --> 00:33:53,980 someone who you could ideally shape in such a way that they will never engage in dissent and will always praise the regime. 344 00:33:53,980 --> 00:33:57,710 So by giving a robot citizenship, you are sort of saying, well, this is the ideal citizen. 345 00:33:57,710 --> 00:34:00,100 And in fact, that it was a woman was even more striking.
346 00:34:00,100 --> 00:34:09,250 And so the idea that we might have an autonomous online civil society dominated by bots and robots spreading propaganda already exists, 347 00:34:09,250 --> 00:34:14,400 but in a very crude way. So I wonder where this is taking us. 348 00:34:14,400 --> 00:34:18,420 And artificial intelligence is now one of the next avenues worth exploring in terms of 349 00:34:18,420 --> 00:34:24,060 the technological dystopia and how that will be used by authoritarian regimes. 350 00:34:24,060 --> 00:34:29,220 But just to go back to my original point to end on: this isn't just the remit of authoritarian regimes. 351 00:34:29,220 --> 00:34:33,720 Technology, as I mentioned, is a tool, and how you use a tool depends on its purpose. 352 00:34:33,720 --> 00:34:37,910 I could have a video camera and make a wonderful movie, or I could use that video camera to spy on someone. 353 00:34:37,910 --> 00:34:41,970 But we need to be also looking at who is making these tools, who is selling these 354 00:34:41,970 --> 00:34:45,450 tools, and who is benefiting financially from these tools and for what purpose. 355 00:34:45,450 --> 00:34:50,520 And also, what's the ideology, the ontology, embedded within these tools themselves? 356 00:34:50,520 --> 00:34:56,700 And in terms of policy, this matters for so many reasons, some of which I've just mentioned. 357 00:34:56,700 --> 00:35:00,240 But if we take it again to this notion of digital orientalism: 358 00:35:00,240 --> 00:35:06,540 what happens in the Middle East, in the Arabic-speaking world, is seen as less important, right? 359 00:35:06,540 --> 00:35:13,620 If we go back to the Myanmar incident, ethnic cleansing was facilitated because of disinformation spread on Facebook.
360 00:35:13,620 --> 00:35:17,040 And one of the reasons that happened is because Facebook had invested so little in 361 00:35:17,040 --> 00:35:22,290 having the appropriate language specialists who could moderate content in that region. 362 00:35:22,290 --> 00:35:27,570 And the same is true, I would say even more true, in the Arabic-speaking world. 363 00:35:27,570 --> 00:35:31,370 The same kinds of infractions of whatever Twitter policies or Facebook policies 364 00:35:31,370 --> 00:35:35,400 that occur here are not dealt with in the same way as they are dealt with in, 365 00:35:35,400 --> 00:35:40,740 say, North America or Europe. It's seen as a bit of a backwater in the way it's dealt with. 366 00:35:40,740 --> 00:35:48,300 So there's this kind of element of digital imperialism, in the sense that the market share here, 367 00:35:48,300 --> 00:35:53,250 the advertising revenues generated from people using the products here, are acceptable. 368 00:35:53,250 --> 00:35:58,170 However, policing the use of that product, so it's not used in authoritarian ways, 369 00:35:58,170 --> 00:36:01,590 is not incentivised, because that would actually affect the bottom line. 370 00:36:01,590 --> 00:36:09,090 And that's something we also need to consider when we're looking at technology and policy, because digital technology doesn't exist in a vacuum. 371 00:36:09,090 --> 00:36:13,710 It comes from somewhere, and some people profit off it. 372 00:36:13,710 --> 00:36:19,020 And that is part of the disinformation and digital authoritarianism network, supply chain and structure. 373 00:36:19,020 --> 00:36:25,890 So it's not just a state-centric enterprise, but a collection of different actors who, for various reasons, benefit in specific ways. 374 00:36:25,890 --> 00:36:30,580 And I think that's an important way to conceptualise digital authoritarianism. 375 00:36:30,580 --> 00:36:34,440 And I'm going to end there.
So, time for questions. Thank you. 376 00:36:34,440 --> 00:36:39,300 Thank you both for excellent and thought-provoking presentations. 377 00:36:39,300 --> 00:36:47,610 I'll begin by asking you both a couple of questions, which will give the audience time to formulate their own questions. For Sahar: 378 00:36:47,610 --> 00:36:52,140 the one word that didn't occur in your talk was surveillance. 379 00:36:52,140 --> 00:36:57,480 You talked about social media in terms of their capacity to join people together. 380 00:36:57,480 --> 00:36:59,220 And you acknowledged, like the rest of us have, 381 00:36:59,220 --> 00:37:08,610 that it didn't necessarily turn out the way so many people thought it would during that initial moment of euphoria during the Arab revolutions. 382 00:37:08,610 --> 00:37:13,800 But I'd like to hear your thoughts about, well, not just social media, 383 00:37:13,800 --> 00:37:18,900 but all information technology as a means of surveillance by the state. 384 00:37:18,900 --> 00:37:27,450 And, of course, it goes beyond social media. Every time we go onto the Internet and start clicking on messages or websites, 385 00:37:27,450 --> 00:37:31,650 we can potentially be watched, and we actually are being watched. 386 00:37:31,650 --> 00:37:38,280 And so I'd like to hear, I mean, in a way, I'm asking you to give your take on Mark's dystopianism, 387 00:37:38,280 --> 00:37:46,230 but I was struck by not having actually heard the word surveillance in your whole talk. 388 00:37:46,230 --> 00:37:57,650 And Mark, I want to ask you whether you are running a risk of just recreating the technological determinism that 389 00:37:57,650 --> 00:38:03,110 we all found so dissatisfying in the initial days of the Arab revolutions. 390 00:38:03,110 --> 00:38:11,540 And, you know, many people had this sort of naive thought that social media had the capacity to put the truth out there.
391 00:38:11,540 --> 00:38:16,430 And if the truth were out there, then eventually, of course, it would prevail over falsehood. 392 00:38:16,430 --> 00:38:23,540 And now we know better, and social media seems to actually be having the opposite effect. 393 00:38:23,540 --> 00:38:27,870 And so you're saying that it's inseparable: 394 00:38:27,870 --> 00:38:34,730 the technology is inseparable from the malign forces that are creating it and getting us all to use it. 395 00:38:34,730 --> 00:38:40,810 But it's not going to go away. We're going to continue to live in a media-saturated world. 396 00:38:40,810 --> 00:38:52,690 And is your dystopianism so deep that the only thing we can do is stop using media and find ways to live our lives outside of it? 397 00:38:52,690 --> 00:38:56,670 Do you want me to go first, Walter? Yes, you go first. All right. 398 00:38:56,670 --> 00:39:00,060 So, to your question about surveillance. Indeed, this is a very important point. 399 00:39:00,060 --> 00:39:04,860 Of course, we can never turn a blind eye to the amount of surveillance happening in cyberspace. 400 00:39:04,860 --> 00:39:06,780 And I'll just say this: as you rightly mentioned, 401 00:39:06,780 --> 00:39:12,850 even social media platforms themselves have their own built-in mechanisms of surveillance, which is very scary. 402 00:39:12,850 --> 00:39:17,130 You know, issues related to personal security, issues related to national security, 403 00:39:17,130 --> 00:39:22,460 invasion of privacy, how much information you could or should share on social media. 404 00:39:22,460 --> 00:39:30,210 All of this is becoming a really scary reality that we as consumers and users of social media have to grapple with day in and day out. 405 00:39:30,210 --> 00:39:33,690 Think about how many times you use the Internet to buy something.
406 00:39:33,690 --> 00:39:39,560 And then you were flooded with all of these e-mails asking if you're also interested in buying X, Y, Z. 407 00:39:39,560 --> 00:39:42,540 Or you watch a certain movie and you get all of these 408 00:39:42,540 --> 00:39:46,690 e-mails or notifications from different social media platforms suggesting 409 00:39:46,690 --> 00:39:50,660 that you'll also be interested, maybe, in watching this, this and that. What that tells you 410 00:39:50,660 --> 00:39:57,990 is that you are really under close monitoring from so many different groups and so many different agencies all at once. 411 00:39:57,990 --> 00:40:04,560 The regimes are part of it, but there are also other entities, including these social media platforms themselves. 412 00:40:04,560 --> 00:40:11,480 Not to mention, of course, when you have major scandals like what happened on Facebook and the leaking of personal information of 413 00:40:11,480 --> 00:40:18,240 thousands and thousands, hundreds of thousands or even sometimes millions of users of these different social media platforms. 414 00:40:18,240 --> 00:40:22,500 But if you want to link this back to our discussion related to the picture or the image 415 00:40:22,500 --> 00:40:28,350 we now have in the Arab world 10 years after the eruption of the uprisings, 416 00:40:28,350 --> 00:40:32,190 I think what I ended with, in terms of the moment we are living in now, 417 00:40:32,190 --> 00:40:40,800 which is the pandemic moment, brings in a much darker and even scarier image in terms of these surveillance techniques and mechanisms, 418 00:40:40,800 --> 00:40:45,500 simply because a lot of regimes have now started to implement certain apps and certain 419 00:40:45,500 --> 00:40:50,550 tools on social media whereby they can actually do what is called contact tracing.
420 00:40:50,550 --> 00:40:55,680 They can trace who you have been talking with, who you have been visiting, 421 00:40:55,680 --> 00:41:01,710 who you have been seeing, the places you've been going to, your own location. That very sensitive personal 422 00:41:01,710 --> 00:41:08,460 information is now under the close surveillance and the close watch of the regimes, under, of course, 423 00:41:08,460 --> 00:41:13,760 the mantle of protecting your own personal safety and your health and the health of your family. 424 00:41:13,760 --> 00:41:19,710 So we are entering into this new phase, if you will, of a very dangerous degree of surveillance, 425 00:41:19,710 --> 00:41:25,170 which is much more intrusive into everyone's life in this pandemic moment. 426 00:41:25,170 --> 00:41:30,660 And my prediction is that, moving forward, with the regimes polishing their own skills and their own 427 00:41:30,660 --> 00:41:36,540 tools and their own learning curve in terms of mastering technological and technical skills, 428 00:41:36,540 --> 00:41:41,820 we're going to see an even greater degree of surveillance and invasion of citizens' privacy, and 429 00:41:41,820 --> 00:41:47,160 going after opponents and opposition movements using many of these modern tools and techniques. 430 00:41:47,160 --> 00:41:53,670 As I mentioned, the pandemic is bringing all of that now to the surface, and this is only the tip of the iceberg. 431 00:41:53,670 --> 00:41:56,310 I think from here it's going to go from bad to worse. 432 00:41:56,310 --> 00:42:03,090 So we as consumers of social media have to be really much more alert and aware of all of these mechanisms. 433 00:42:03,090 --> 00:42:07,680 Just very quickly, two things I want to point to; I forgot to mention them in my talk. 434 00:42:07,680 --> 00:42:12,590 One of them is the chicken-and-egg question in terms of what comes first: 435 00:42:12,590 --> 00:42:16,220 democratisation, or having an active civil society.
436 00:42:16,220 --> 00:42:23,100 And I talked about the fact that you have a vacuum in terms of civil society, and social media cannot fill this vacuum. 437 00:42:23,100 --> 00:42:30,990 The question then becomes, what comes first? Should you have a stable democracy in order to have an active and vibrant civil society? 438 00:42:30,990 --> 00:42:39,090 Or do you need to have an active civil society and fill these vacuums first in order to have a proper transition to democratisation? 439 00:42:39,090 --> 00:42:43,650 Unfortunately, up till this moment in time, I cannot promise that I have an answer for this question. 440 00:42:43,650 --> 00:42:48,390 It appears to be another paradox or another dilemma that we have to grapple with as 441 00:42:48,390 --> 00:42:53,330 scholars and as professionals and as people working in the field of social media and cyber 442 00:42:53,330 --> 00:43:00,750 activism, and also in political science. What comes first is a chicken-and-egg question or dilemma that we have to grapple with. 443 00:43:00,750 --> 00:43:07,830 And finally, on this whole notion of cyber euphoria that we talked about, or cyber utopia: we need to reach a middle ground, 444 00:43:07,830 --> 00:43:14,430 what we can call cyber realism, whereby we are neither over-crediting nor discrediting social media. 445 00:43:14,430 --> 00:43:20,790 We don't have this moment of euphoria, calling it the Facebook revolution, the Twitter uprising and all of that. 446 00:43:20,790 --> 00:43:28,890 But at the same time, we don't want to turn a blind eye to the importance and the vitality of social media and the different roles they can play. 447 00:43:28,890 --> 00:43:34,740 So we need to arrive at some kind of middle ground or some kind of balance that we can call cyber realism, 448 00:43:34,740 --> 00:43:40,550 trying to weigh the pros and cons and the strengths and the limitations, actually. 449 00:43:40,550 --> 00:43:46,140 Yes, Mark.
I mean, I think what I said leads on nicely to your question, 450 00:43:46,140 --> 00:43:55,400 because I think the notion of cyber realism, for me, is not in the same value category as either utopianism 451 00:43:55,400 --> 00:44:00,800 or dystopianism. Utopianism implies something good and positive; dystopianism, something negative. 452 00:44:00,800 --> 00:44:07,940 Realism is reality, but reality can also be good or bad, and it can go in one of those directions. 453 00:44:07,940 --> 00:44:14,060 Right. So the question for me is, where are we now? Yes, I am certainly cynical. 454 00:44:14,060 --> 00:44:20,530 But I think there's a risk of kind of having a teleology: it's all going to be bad and it's going to end terribly. 455 00:44:20,530 --> 00:44:28,630 Right. And that's where we're going. And that could happen. It might not. But I think in the moment we're in, we have to look at the reality. 456 00:44:28,630 --> 00:44:32,850 I don't think anyone's questioning that 10 years ago there was a moment of euphoria, 457 00:44:32,850 --> 00:44:39,140 and the point we're at now, certainly in the Arab world, is one of less hope in the role of technology. 458 00:44:39,140 --> 00:44:43,700 And I think what's interesting about this, in terms of connecting it again 459 00:44:43,700 --> 00:44:48,560 back to where this technology originates: we've seen the rise of fake news and disinformation recently. 460 00:44:48,560 --> 00:44:53,150 We've seen the rise of post-truth politics, which is the destruction of faith and trust in institutions, 461 00:44:53,150 --> 00:45:01,010 the denigration of news media organisations, the essential gaslighting of institutions to such an extent that people reduce 462 00:45:01,010 --> 00:45:04,400 their trust in institutions and direct it instead to figures such as Donald Trump.
463 00:45:04,400 --> 00:45:10,700 And if we look at what happened on the 6th of January in the US, at Capitol Hill, this isn't just a unique event. 464 00:45:10,700 --> 00:45:15,430 This has been building for some time. The role of technology, the role of fake news, 465 00:45:15,430 --> 00:45:21,410 the policy ecology that exists in the US, which has allowed technology companies to grow this big and gain this dominance without regulation, 466 00:45:21,410 --> 00:45:27,140 has also allowed this. So I think the trajectory we're on, that we've seen in the region and we've seen 467 00:45:27,140 --> 00:45:33,050 in the US recently, reflects a tendency towards going to an authoritarian place. 468 00:45:33,050 --> 00:45:36,960 And some people might dismiss the word authoritarianism and just use the term illiberal practises. Right. 469 00:45:36,960 --> 00:45:42,200 Well, I think the illiberal practises are being enabled by the current digital information 470 00:45:42,200 --> 00:45:46,430 ecology and social media, whether that is surveillance or disinformation. 471 00:45:46,430 --> 00:45:52,610 I think they're well connected. So no, I don't dismiss the possibility of an alternative. 472 00:45:52,610 --> 00:45:56,690 I still think that there's a real struggle, just as citizens face 473 00:45:56,690 --> 00:46:01,340 regime structures that are very real and very authoritarian, 474 00:46:01,340 --> 00:46:04,400 and in order to overcome those, certain conditions must be met. 475 00:46:04,400 --> 00:46:12,140 And I think there is a resistance against the direction that digital technology and digital governance is heading; 476 00:46:12,140 --> 00:46:17,060 we're still at a point where we are resisting increasing illiberal practises within that realm. 477 00:46:17,060 --> 00:46:24,680 So, yes, disconnect everything is what I'm saying, except for Zoom, obviously, because, you know, this is where we learn, apparently.
478 00:46:24,680 --> 00:46:32,780 I think it's a good use of technology. Yeah, right. The questions from the audience will be curated by my colleague Kuji. 479 00:46:32,780 --> 00:46:38,390 Thank you, Walter. It's a pleasure to see so many people getting their questions in. 480 00:46:38,390 --> 00:46:42,980 If you'd like to ask a question, and this is to Glassboro and Abeer Nazar: 481 00:46:42,980 --> 00:46:46,700 I see you've raised hands, but that function doesn't work on this platform, 482 00:46:46,700 --> 00:46:51,230 so please go to the Q&A bar at the bottom and type your question in. And to the questioner I've messaged: 483 00:46:51,230 --> 00:46:56,810 I've asked you to clarify your question; have a look at my message and see if you can refine your question a little bit so 484 00:46:56,810 --> 00:47:00,920 that I can put it to our speakers. I'm going to start with one of our students, 485 00:47:00,920 --> 00:47:11,270 Pyotr Ashoka's, who's posing a question to Sahar. He says that you say that ousting a dictator is easy, but figuring out what to do next is hard. 486 00:47:11,270 --> 00:47:14,180 And then you said that it depends on society, 487 00:47:14,180 --> 00:47:21,500 if society is ready. And I think he wants to challenge you on putting too much emphasis on the role of society, Sahar. 488 00:47:21,500 --> 00:47:27,080 He goes on and asks, is it really in the hands of society and not the military or the economic elites or 489 00:47:27,080 --> 00:47:31,790 international backers like the US or the Saudis or the UAE, in the case of Egypt, 490 00:47:31,790 --> 00:47:36,900 who have a greater capacity to put pressure on the powers that be? Speakers, 491 00:47:36,900 --> 00:47:42,350 I need brief answers from you because we've got a lot of questions here. Over to you, Sahar. 492 00:47:42,350 --> 00:47:48,260 Well, let me first start by saying this is an excellent and very thoughtful question, and I don't think we are really in disagreement.
493 00:47:48,260 --> 00:47:51,950 It's the chicken and egg dilemma that I posed at the end, the paradox of what comes first. 494 00:47:51,950 --> 00:47:57,080 Right. Do you need to have these kinds of structural changes in society on different levels, socially, 495 00:47:57,080 --> 00:48:01,580 politically, economically, before you can really have a successful transition to democratisation? 496 00:48:01,580 --> 00:48:05,480 Or do you need to have some kind of stable democratisation first in order to have 497 00:48:05,480 --> 00:48:09,110 all of these structural changes and fill some of these vacuums and gaps? 498 00:48:09,110 --> 00:48:15,170 There are definitely a lot of good points here, that you cannot just rely on society alone, 499 00:48:15,170 --> 00:48:17,690 especially when you have all of these constraints. 500 00:48:17,690 --> 00:48:24,470 You're not talking about some kind of, you know, Western democracy where you have freedom of expression, freedom of the press and freedom of speech, 501 00:48:24,470 --> 00:48:29,690 and you can just go about and criticise the highest office in the land and be safe and secure. 502 00:48:29,690 --> 00:48:32,780 We're talking about very repressive regimes, very authoritarian regimes. 503 00:48:32,780 --> 00:48:38,840 You're talking about real consequences for activism, and all of these things have to be taken into consideration. 504 00:48:38,840 --> 00:48:42,290 Of course, all the other points mentioned in the question as well, in terms of foreign 505 00:48:42,290 --> 00:48:45,560 interference and international influence and all of these other constraints, 506 00:48:45,560 --> 00:48:50,220 the role of the military, all of these factors have to be taken into consideration. 507 00:48:50,220 --> 00:48:56,210 My whole point was that there was this moment of euphoria in 2011 when we thought, yes: "Ash-sha'b yurid isqat an-nizam." 
508 00:48:56,210 --> 00:49:01,200 The people want to overthrow the regime. And we thought that, with Mubarak just leaving office or stepping down, 509 00:49:01,200 --> 00:49:07,520 we had accomplished this mission successfully. Ten years forward, we discovered that this was not the case. 510 00:49:07,520 --> 00:49:09,170 And you need to have really, 511 00:49:09,170 --> 00:49:15,470 really much more work and much more investment in building this active civil society and building these kinds of grassroots activism 512 00:49:15,470 --> 00:49:20,120 mechanisms, which is very, very difficult to do when you have a repressive regime in place. 513 00:49:20,120 --> 00:49:23,540 So this would be my answer to this question. Thank you so much, Sahar. 514 00:49:23,540 --> 00:49:29,930 I've got a question for Mark next from Tom Perez, who, clearly picking up on your work on bots, 515 00:49:29,930 --> 00:49:38,810 wants to know about researching state-sponsored exploitation of major Silicon Valley platforms such as Twitter, Facebook, Google and Apple. 516 00:49:38,810 --> 00:49:44,340 What would such research look like? Well, I mean, I do engage in this work. 517 00:49:44,340 --> 00:49:55,140 And a lot of it involves monitoring, identifying and finding these kinds of problems, such as bot networks, and exposing them. 518 00:49:55,140 --> 00:49:58,560 And one of the reasons here is that companies like Facebook or Twitter generally 519 00:49:58,560 --> 00:50:05,220 are not transparent in terms of the information operations they document. So without civil society actors pushing back, 520 00:50:05,220 --> 00:50:08,910 you know, there's very little chance that there's going to be much change in this realm. 521 00:50:08,910 --> 00:50:14,070 And I think this taps in briefly to another question, which is the power of these social media companies. 522 00:50:14,070 --> 00:50:18,510 One of you asked what I think about Trump being banned. 
And I'll just answer this quickly because it's relevant. 523 00:50:18,510 --> 00:50:22,390 I think it's a good thing Trump was banned, because he was generating hate speech 524 00:50:22,390 --> 00:50:25,890 and, you know, I think it was doing society no good. 525 00:50:25,890 --> 00:50:32,490 The problem with him being banned is that it reflected the power of social media companies in intervening in discourse. 526 00:50:32,490 --> 00:50:39,390 However, it also highlighted how there was a lack of transparency, public debate and public input in those decisions 527 00:50:39,390 --> 00:50:47,670 about who should and who should not be banned. And it really exposes that these companies, which can impact the digital discussion and civil society, 528 00:50:47,670 --> 00:50:54,710 are not subject to the kind of democratic oversight that we should probably have over them. 529 00:50:54,710 --> 00:50:59,240 Thank you very much, Mark. I've got a question from Piotr for Sahar, 530 00:50:59,240 --> 00:51:06,560 who would like to challenge the assertion you made that Tunisia is an exception to the way the Arab Spring affected other countries. 531 00:51:06,560 --> 00:51:14,430 He wants to know what internal and external factors in Tunisia rendered the country exceptional in this regard. 532 00:51:14,430 --> 00:51:19,150 I know that some of my Tunisian friends would grapple with this term and sometimes even criticise it. They would say, 533 00:51:19,150 --> 00:51:27,220 oh, you talk about the exceptionalism too much. We still have our own challenges, we have our internal problems and we have economic challenges. 534 00:51:27,220 --> 00:51:34,050 I mean, yeah, you do. That's for sure. But at the same time, if you look at the stark comparison with other so-called, quote, 535 00:51:34,050 --> 00:51:40,150 Arab Spring countries that really had their own, you know, revolutions go completely in the opposite direction. 
536 00:51:40,150 --> 00:51:43,390 You know, the civil war in Syria, the war in Yemen, the sectarianism. 537 00:51:43,390 --> 00:51:49,990 And, you know, the strife and violence in Libya, the much harsher military dictatorship that it turned into in Egypt. 538 00:51:49,990 --> 00:51:55,870 All of these very, very unfortunate outcomes that we've gotten, and the crushed revolution in Bahrain. 539 00:51:55,870 --> 00:52:03,280 You look at this very bleak and very gloomy picture, and I think Tunisia stands out as a very successful exception for a number of reasons. 540 00:52:03,280 --> 00:52:07,860 It's a much smaller country. It has a very high literacy rate of over 90 percent. 541 00:52:07,860 --> 00:52:12,100 The women are very literate and educated and are very much part of the public sphere. 542 00:52:12,100 --> 00:52:16,120 And I will just say, women are half of society and they give birth to the other half. 543 00:52:16,120 --> 00:52:21,080 So it's very important for them to be at the table, and it's very important for them to have their voices heard 544 00:52:21,080 --> 00:52:28,750 and to take part in decision making. There was also this very important success in terms of building alliances. 545 00:52:28,750 --> 00:52:31,480 Groups were able to get together and build successful alliances. 546 00:52:31,480 --> 00:52:38,890 Some of them won the Nobel Prize for this great work, which unfortunately we have not witnessed in the other so-called Arab Spring countries. 547 00:52:38,890 --> 00:52:43,750 We have seen the fragmentation, polarisation and division, which I mentioned before. 548 00:52:43,750 --> 00:52:47,920 We have not seen a successful experiment of alliance building and coalition 549 00:52:47,920 --> 00:52:52,910 building that helps to really pave the way for a transition to democratisation. 
550 00:52:52,910 --> 00:53:01,270 That is one thing I just put out as an example. And the lack of international interference in the case of Tunisia was also another helpful factor. 551 00:53:01,270 --> 00:53:08,500 So I think all of these factors coming together created this. I still stand by my term, Tunisian exceptionalism, until this moment. 552 00:53:08,500 --> 00:53:14,110 Thank you. Thank you very much, Sahar. I've got a couple of questions for Mark about the Gulf region. 553 00:53:14,110 --> 00:53:18,340 For one, this is coming from one of our attendees, who writes: 554 00:53:18,340 --> 00:53:28,000 I know there's been quite a bit of Saudi bot activity detected in the UK surrounding the aborted purchase of the soccer club Newcastle United. 555 00:53:28,000 --> 00:53:34,000 I wonder if we're seeing similar campaigns by Gulf actors online targeting spaces outside the MENA region, 556 00:53:34,000 --> 00:53:41,740 and if not, why not? Before you jump to that, one from another attendee concerning Bahrain: 557 00:53:41,740 --> 00:53:47,500 in Bahrain we witnessed the regime using the Internet and social media as a tool of political repression. 558 00:53:47,500 --> 00:53:51,250 Dr Jones spoke about it in his latest book. So we're plugging your book here. 559 00:53:51,250 --> 00:53:57,910 Mark, take note. Do you think this would change with the Biden administration? 560 00:53:57,910 --> 00:54:01,910 So, the Gulf targeting its near abroad, or whatever we call it. 561 00:54:01,910 --> 00:54:07,450 I think the question about whether we see regimes targeting abroad is really important, because, yes, they are. 562 00:54:07,450 --> 00:54:12,160 And this shows the increasing boldness of certain actors in the Gulf region, particularly the UAE and Saudi Arabia. 
563 00:54:12,160 --> 00:54:16,750 We see this with surveillance, and in fact there is an increasing attempt, 564 00:54:16,750 --> 00:54:21,850 especially under the Trump administration, from these regimes to try and influence public opinion abroad. 565 00:54:21,850 --> 00:54:28,740 And to give one quick example: last year my colleague and I, together with The Daily Beast, 566 00:54:28,740 --> 00:54:38,470 detected a network of 25 fake journalists that had used artificial-intelligence-generated profile images and submitted op-eds, 567 00:54:38,470 --> 00:54:45,880 opinion articles, to over forty-five different international news outlets and had them published. 568 00:54:45,880 --> 00:54:53,200 This included The Washington Times, the South China Morning Post, Asia Times, Spiked in the UK. 569 00:54:53,200 --> 00:54:57,550 These were articles written by someone who wasn't who they were claiming to be, 570 00:54:57,550 --> 00:55:02,500 basically articulating Emirati foreign policy in these international news outlets. 571 00:55:02,500 --> 00:55:08,050 And they successfully placed about 95 of these. Right. So this is phenomenal. 572 00:55:08,050 --> 00:55:16,360 And this kind of activity we see all the time. So, yes, these regimes are targeting abroad and they are trying to influence public opinion abroad. 573 00:55:16,360 --> 00:55:19,750 And I think one of the reasons we saw this, particularly in the past four or five years, 574 00:55:19,750 --> 00:55:24,790 is that Trump was seen as, you know, a very strong candidate to support 575 00:55:24,790 --> 00:55:27,460 the goals of, say, Mohammed bin Salman and Mohammed bin Zayed. 576 00:55:27,460 --> 00:55:32,530 So they very much wanted him to be re-elected and, as such, put resources into that. 577 00:55:32,530 --> 00:55:38,950 With regards to Biden and Bahrain, who the president of the US is matters in terms of state repression in Bahrain. 
578 00:55:38,950 --> 00:55:44,680 As soon as Trump was elected, we saw more executions in Bahrain than we had seen since before 2011. 579 00:55:44,680 --> 00:55:49,780 Biden will make a difference in terms of the plight of political prisoners in Bahrain, 580 00:55:49,780 --> 00:55:55,680 but I don't think there will be a significant difference in terms of the way digital tools are being used to aid state repression, 581 00:55:55,680 --> 00:56:01,320 unfortunately. Thank you very much, Mark. I've got a question now from Abeer Nazar. Abeer, 582 00:56:01,320 --> 00:56:03,300 glad you found your way to the Q&A bar. 583 00:56:03,300 --> 00:56:12,900 This one's for Sahar, and Abeer is asking: what examples can you cite of women finding bargaining power with liberal activists in Egypt, 584 00:56:12,900 --> 00:56:19,800 Tunisia, Morocco and other countries, where you might see a crack in current restrictions on and isolation of women, 585 00:56:19,800 --> 00:56:25,320 or just an addressing of women's issues, presumably with social media technology? 586 00:56:25,320 --> 00:56:27,810 Yeah, that's an excellent question. 587 00:56:27,810 --> 00:56:34,440 It's very important to really understand why women's movements have struggled and not succeeded in many parts of the Arab world 588 00:56:34,440 --> 00:56:35,190 up to this moment, 589 00:56:35,190 --> 00:56:43,740 even those that have resorted to using cyber activism, or cyber feminism in this case, to promote and to advocate for women's issues and women's causes. 590 00:56:43,740 --> 00:56:49,810 Because, as we said before, social media are not magical tools, and they're not going to bring about change and reform, or in this case 591 00:56:49,810 --> 00:56:53,670 the social advancement of women, all by themselves. 592 00:56:53,670 --> 00:57:00,420 We need a change in the societal mindset, and you need to have work on the ground, which is being done in collaboration between different groups. 
593 00:57:00,420 --> 00:57:07,860 And I'll go back again to the Tunisian example as a really strikingly positive example in this sense and in this regard. 594 00:57:07,860 --> 00:57:13,620 We have seen groups of women being able to bring about more public visibility for themselves, 595 00:57:13,620 --> 00:57:16,770 working together across the board and bridging their differences. 596 00:57:16,770 --> 00:57:23,790 Groups that have Islamist leanings, liberal leanings, secular, right, left, you name it, all across the board, 597 00:57:23,790 --> 00:57:30,000 being able to come together and to pass wonderful legislation and new laws, like the comprehensive law 598 00:57:30,000 --> 00:57:36,030 banning all forms of gender-based violence against women. They did not just define violence as domestic violence; 599 00:57:36,030 --> 00:57:43,380 they defined it as sexual, physical, emotional, economic, political violence of all different sorts. 600 00:57:43,380 --> 00:57:47,970 It was a huge success and a huge breakthrough that would not have happened had 601 00:57:47,970 --> 00:57:53,070 it not been for this kind of alliance and coalition building on the ground, not just online. 602 00:57:53,070 --> 00:57:57,060 So that is really a lesson and a takeaway that I think other women's groups across 603 00:57:57,060 --> 00:58:00,780 the region should learn from: that they have to build alliances across the board, 604 00:58:00,780 --> 00:58:02,920 come together despite the differences, 605 00:58:02,920 --> 00:58:10,290 and use both online and offline, on-the-ground activism and coordination, following this great example from Tunisia. 606 00:58:10,290 --> 00:58:17,560 Sahar, thank you so much. I know we have time for one last question, and I'm going to combine two questions here that are pushing back a little bit, 607 00:58:17,560 --> 00:58:24,920 Mark, on this cyber dystopia that you presented us with and are pushing you to think of a positive way forward. 
608 00:58:24,920 --> 00:58:34,330 Sareen was curious to know if we can render social media a more democratic site where people can express themselves more freely. 609 00:58:34,330 --> 00:58:37,780 And our colleague Osama Azami asks you, Mark, 610 00:58:37,780 --> 00:58:46,780 what sort of regulatory frameworks could policymakers develop to bring greater transparency and democracy into the running of social media companies? 611 00:58:46,780 --> 00:58:52,450 So if you could give us a hope for the future, Mark, before we bring this session to an end. 612 00:58:52,450 --> 00:58:59,960 All right. Well, I'm going to do a bait and switch, I think. But yeah, I think in terms of hope. 613 00:58:59,960 --> 00:59:06,500 Yes, let's talk about Clubhouse, because I think someone wanted to ask about this. Clubhouse is a new social platform that's taking the region by storm. 614 00:59:06,500 --> 00:59:11,180 And it's provoked, again, the same kind of debates we had about 10 years ago, with people being very optimistic. 615 00:59:11,180 --> 00:59:16,970 People are saying, wow, we're having these discussions in the region about controversial issues that we never had before. 616 00:59:16,970 --> 00:59:20,510 And I think this is a positive thing. But there's a honeymoon period, 617 00:59:20,510 --> 00:59:26,180 I'd say, and people approach it with this kind of optimism that enables 618 00:59:26,180 --> 00:59:32,090 them to have these very constructive and useful and important discussions. My fear is that it's already been co-opted. 619 00:59:32,090 --> 00:59:36,860 I wrote an op-ed in Middle East Eye saying how we're already seeing trolling occurring in those spaces. 620 00:59:36,860 --> 00:59:43,130 So again, you know, I'm cynical about the outcomes of that. In terms of regulation, 621 00:59:43,130 --> 00:59:51,860 we need to look at where regulation is being made, right? 
Because a lot of these companies exist in the US, or North America and Europe, 622 00:59:51,860 --> 01:00:00,260 so the regulatory framework that allows them to be democratic spaces includes issues of moderation, content control, et cetera. 623 01:00:00,260 --> 01:00:04,040 So the kind of pressure that we need to see has to exist there. 624 01:00:04,040 --> 01:00:07,970 And, you know, we need civil society groups to put pressure on these companies. 625 01:00:07,970 --> 01:00:14,180 We need people within those countries in which the companies exist to put pressure on them to create terms, conditions, 626 01:00:14,180 --> 01:00:21,820 rules and regulations within the platforms themselves that actually facilitate non-toxic behaviours and facilitate democratic interactions. 627 01:00:21,820 --> 01:00:26,790 As a quick example, Twitter is inundated with bots and fake accounts. 628 01:00:26,790 --> 01:00:30,950 False and fake accounts are perfectly exploited by authoritarian regimes to intimidate and harass. 629 01:00:30,950 --> 01:00:36,350 They should have no place on the platform, but they do. So that's an example of how regulation could try to get rid of them. 630 01:00:36,350 --> 01:00:40,240 That's not necessarily so optimistic, but, yeah, that's the best I can do, I'm sorry. 631 01:00:40,240 --> 01:00:44,880 Well, thank you both, and thank you for the questions. 632 01:00:44,880 --> 01:00:49,960 Very great questions. I regret that we're out of time. 633 01:00:49,960 --> 01:00:52,180 So I want to wrap up with several things. 634 01:00:52,180 --> 01:00:59,710 First of all, on behalf of Eugene, who's been curating the questions, I want to apologise to those of you who asked questions that we couldn't get to. 635 01:00:59,710 --> 01:01:03,550 There were, as always, too many questions for us to get through in the allotted time. 636 01:01:03,550 --> 01:01:07,610 Secondly, I want to thank our speakers, Sahar and Mark, for excellent presentations. 
637 01:01:07,610 --> 01:01:11,140 It's been a great session. We've all enjoyed it very much. 638 01:01:11,140 --> 01:01:19,840 And finally, I want to put in a plug for next week's lecture, which is the Elahé Omidyar Mir-Djalali annual lecture, 639 01:01:19,840 --> 01:01:24,640 which is sponsored by the founder and chair of the Roshan Cultural Heritage Institute. 640 01:01:24,640 --> 01:01:31,700 And the title of that event will be Iran and the Arab Uprisings: Opportunity Grasped or Squandered? 641 01:01:31,700 --> 01:01:39,570 The speaker for that will be Anoush Ehteshami from Durham University, and it will be chaired by our colleague Stephanie Cronin. 642 01:01:39,570 --> 01:01:44,470 And that takes place next week, Friday, March 12th, at the same time, five o'clock. 643 01:01:44,470 --> 01:02:05,905 So on behalf of the Middle East Centre, and again with thanks to our guests, I want to say goodbye to all.