I started off with this question: is AI bad for democracy? I have used various entry points to answer that question and also to clarify it. What I'll do today is focus on the concept of epistemic agency; I'll talk about one way this topic can be approached.

The context of this is concerns about democracy, which also relate to a general context in which there is a tendency towards authoritarianism, even in countries that pretend to be democracies. In that context there is this new technology, and then the question is: what is it going to do? Is it going to have an impact? And if so, how? As a philosopher, I want to conceptualise that. I want to understand it by using concepts, and I want to make arguments about it. So that's what I'm doing. I started more with the ethics of AI, but at the moment I'm very interested in using political philosophy to do that. This particular talk connects to epistemology and, of course, moral philosophy. But yeah, that's also a kind of context for this talk.

The question "is AI bad for democracy?" of course has to start with the concept of what AI is, and the various forms AI takes today. The AI that is very popular usually refers to machine learning, the use of machine learning with big data. The computer helps us to find patterns in this data; it's basically a kind of statistics machine. And instead of appearing as something that only does calculations, suddenly these technologies engage with language, use language — famous examples would be ChatGPT and similar systems. But they can also be used for political purposes, for example for analysing the data of social media users. And this can be politically useful.
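To make that "statistics machine" idea concrete, here is a minimal, invented sketch — a toy bigram counter, far simpler than anything like ChatGPT — in which "predicting" language is nothing more than counting which words tend to follow which:

```python
from collections import Counter, defaultdict

# A toy corpus standing in for "big data".
corpus = "the people vote and the people discuss and the people decide".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

# "Using language" here is just picking the statistically most frequent successor.
def most_likely_next(word):
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_next("the"))  # -> 'people', purely from co-occurrence counts
```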
Democracy is the other concept, and it is at least as vague if you don't say more about it. What I do for the purpose of this paper and talk is just to make a distinction between thin versions of democracy, which are all about voting — when you have a representational system with voting, then according to those conceptions of democracy you have a democracy and you're fine — and richer conceptions.

I personally sympathise with richer conceptions of democracy. But then it gets tricky — though I find it very interesting. So, for example, there is the famous deliberative direction, which holds that deliberation and discussion are important in a democracy, not just voting: you need to actively engage citizens. And there is something that also comes from my country, Belgium: agonistic conceptions, which see democracy more as a struggle and as having a voice. These are all conceptions in which democracy is really not just about voting: there is a lot about language, and about discussion, which can be rational or less rational, and can be emotional too. But what these notions have in common is that citizens are supposed to engage. They don't only get something; they also have to do something. They have to actively engage in their political community, at least by discussing these common questions.

Another way to talk about this is to talk about political agency — about whether citizens are allowed to do that, can do that, are encouraged to do that. Being allowed to vote freely: that's political agency. But in the richer conceptions, political agency means much more. It means being allowed and being able to reason, to discuss, to defend your view — something that was normal for philosophers, for example, but it is not a given in the government and political systems we have that all citizens do these things. Being able to make your voice heard is another form of political agency. So it all depends on what you mean by democracy.

What I'm very interested in for this talk are the conditions for democracy: the question of what is needed, as conditions, in order to get all these forms of agency and democracy going. Partly that can be about principles such as freedom. If you don't have some basic freedoms — if you are living under an authoritarian, perhaps totalitarian system — then the conditions for democracy are not fulfilled. You cannot even talk about having a rational discussion, for example, because you are imprisoned or at risk of being imprisoned. In Hannah Arendt, for example, there is also an interesting link to the social: what kind of social environment do we need in a society to have democracy? She makes a connection with loneliness and its opposite, which is interesting. And then, what is relevant for this talk, there is knowledge — and therefore also education. What kind of knowledge do citizens need to exercise those forms of political, democratic agency?

So we need ways to talk about that notion. And one way to do that — it's just one way; there are many possibilities here — is to use the concept of epistemic agency, which functions in the social epistemology literature.
And yeah, I find it an interesting concept, because it allows me to make things more specific: okay, what exactly could go wrong with political agency when we look at the conditions for knowledge? And what would "better" mean here, in some sense? All the questions I ask can now be asked in a more specific way.

Maybe before I can ask these questions, I need to say something about epistemic agency. In psychology and in philosophy, the idea is that agents want some control over their beliefs and, descriptively speaking, have a certain degree of control over them. One can then have discussions about, for example, how voluntary that formation of beliefs is. In particular, in social epistemology the question is: given that we live in a social environment, how free and voluntary is the formation of beliefs? Because we know, for example from social psychology, that our beliefs can be heavily influenced by our social environment. So the question is what, in that context, epistemic agency amounts to.

What I do, being a philosopher of technology, is ask: what about the technological environment? Or, one could say, the techno-social environment. What if we bring AI in there, for example, and see how that impacts the social environment? Does it change anything? What is it going to do to epistemic agency and therefore to these conditions for democracy?

To make that connection more precise, one could say that some degree of epistemic agency is needed for political agency in a democracy, especially in the richer versions. But even in the thin version: the thin version means that I need some degree of control over the formation of my beliefs, because why would voting make sense if I cannot form my own beliefs? If everything were already settled, we could just, for example, collect data about the beliefs that people have — and there are manipulations, for example. So this is needed. But also in the richer version, where discussion is important for democracy, it is very important to have
the capacity to test my beliefs, to reflect on these beliefs and — also very important — to be willing to revise them. Because in a discussion, if everyone just says "this is what I believe" and sticks to it at the end of the day, you miss what this richer sense of democracy is about: that you also learn something, that you are willing, at least for the duration of the discussion, to suspend holding on to those beliefs at all costs and are willing to discuss them. There is also the willingness to take responsibility for one's beliefs — the willingness to voice them and stand for them — and then the willingness to have them openly discussed and revised. I like to formulate that in terms of vulnerability: do we hold our beliefs in a way that leaves them vulnerable to reflection and deliberation?

So then the question is: is the development and exercise of these forms of political-epistemic agency, and is the exercise of responsibility by citizens, possible under conditions shaped by the use of AI and other digital technologies, especially in social media environments or environments influenced by social media? That is the question. And my answer is that, since AI may influence belief formation in various ways, the exercise of the forms of epistemic agency needed for political agency in a democracy becomes at least more difficult. So this constitutes a kind of risk for democracy. There is no determinism here: technological determinism would mean that you have a technology, you have an effect, and there is nothing to be done about it. That is not how technology works, and it is not how politics and the social world work. But it is interesting to conceptualise the influence and the risk. So my conclusion will be that AI endangers democracy, especially richer conceptions of it — and this regardless of other ways in which AI may endanger democracy.

So let's go into how AI may influence the formation and revision of political beliefs, because that is what is going on. How exactly are these beliefs influenced? I distinguish between three ways. One is a direct way: the manipulation of beliefs by politicians and the people they employ. But there are also indirect and unintended effects. The second way has to do with knowledge — I'm not so sure about that one; it is something I would really like to put up for discussion. And the third one is the phenomenon of epistemic bubbles and echo chambers, which I think also impacts the epistemic agency of people and therefore their political agency.
So these are the three parts of the paper. In order to discuss this, I picked up a thought experiment from Bondi that stems from the social epistemology literature, or at least from the literature on epistemic agency. There is a person, Claire, raised in a racist community, who comes to believe that people with white skin colour are superior. The question then is: if she wants to change her belief — in this case towards a non-racist one — how can she change it? The thought experiment from Bondi is all about social influence: this person is in a social environment, was raised in a certain environment, and therefore has difficulties changing her belief. I'm interested in the technological influence: will she be able to exercise epistemic agency to change her beliefs under the influence of AI?

So first, the manipulation of beliefs: social media campaigns based on the analysis of data by means of AI and data science. This is not a thought experiment; it is very real. There was the Cambridge Analytica case — I'm happy it wasn't called the Oxford Analytica case, given that I will give this talk in Cambridge later this spring. Cambridge Analytica was the case in which Facebook users produced data through their behaviour, and these data were then harvested — in that case without their consent, so much for consent forms — and used for microtargeting people in order to influence their voting behaviour. This is very clearly intended use of AI to influence the beliefs of people. To pick up Claire's example: she is, say, targeted by Trump political advertising in order to — in that case — not have her change her beliefs, but to have her stick to her beliefs and vote for that political direction.

This case can be interpreted as a case of belief formation and revision by means of direct manipulation of beliefs. It is somewhat comparable to nudging, in the sense that no one is coercing Claire: no one is forcing her to have a certain belief or to revise it. But there is influence under the radar, influence that bypasses the capacities we have as rational, autonomous beings — which is precisely what philosophers find problematic about manipulation. And this is happening by means of this technology, by means of data science.
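A deliberately crude sketch may help to make the mechanism concrete. Everything below is invented for illustration — the behavioural features, the labels, and the "persuadability" heuristic — and it makes no claim about what Cambridge Analytica actually did; it only shows how a model fitted to harvested signals can be used to select whom to target:

```python
from sklearn.linear_model import LogisticRegression

# Invented behavioural signals per user (e.g. page likes encoded as 0/1).
# Hypothetical columns: [likes_hunting_page, likes_city_blog, likes_church_page]
X = [
    [1, 0, 1],
    [1, 0, 0],
    [0, 1, 0],
    [0, 1, 1],
    [1, 1, 0],
    [0, 0, 1],
]
# Known political leaning for a small seed group (1 = leans toward the campaign).
y = [1, 1, 0, 0, 1, 0]

# Fit a model on the seed group, then score the whole user base.
model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]

# One possible targeting heuristic: aim tailored ads at the users the model
# is least sure about, treating them as the most "persuadable".
persuadable = sorted(range(len(scores)), key=lambda i: abs(scores[i] - 0.5))[:2]
print("users selected for tailored political ads:", persuadable)
```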
And one could conceptualise that by saying that Claire's epistemic agency is reduced, as she is manipulated to change her beliefs — or not to change them — without her knowledge. The manipulation is something she doesn't know about and hasn't consented to. And this clearly violates the epistemic agency conditions for democratic citizenship.

Then there is the second worry. Here my worry is that through AI we get a lot of statistical knowledge out there — not always directly, but through its results. For example, if you use ChatGPT, it's all about statistics of the occurrence of words: if words occur together, that is counted, and that yields certain statistics, and on the basis of that the program produces its output.

Now, this kind of mediation, this bringing of statistical information into the public sphere, is problematic in various ways, I think. First, probably, because it is not clear that it is happening. If you look into the phenomenon, if you study it, if you are very interested in it, you can find out how AI works. But many people use, or will use, the technology without finding out all these things, so they don't even know that it is based on statistical knowledge. More important for this particular worry, though, is that people may not be encouraged to look at causal knowledge, which we get through the sciences.

In this case, for example, there might be statistical information about success in a certain society. It could be that, for example, in American society people of a certain skin colour — whites — are more successful. But that is statistics. It gets into the AI, and it gets amplified. My worry, in terms of the changing of beliefs, is that this again renders the exercise of epistemic agency more difficult, because the statistical knowledge is offered as a default. I'm worried that people will not be encouraged to look at causal knowledge — knowledge which would tell you that there is no causal relationship between skin colour and success in society. So this, I think, really makes it harder to change the beliefs in the example, and it is a problem in a democracy if this statistical knowledge becomes dominant in relation to the causal knowledge from the sciences. I'm not so sure about this point, but I'm looking forward to hearing what you think about it.
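Because this is the point I'm least sure about, a toy illustration of the statistical-versus-causal contrast may help. In this invented simulation, "success" is caused only by access to resources; group membership has no causal effect at all, yet the raw statistics still show a group difference:

```python
import random

random.seed(0)

# Invented simulation: each person is a tuple (group, access, success).
people = []
for _ in range(10_000):
    group = random.random() < 0.5                          # two groups, A and B
    access = random.random() < (0.7 if group else 0.3)     # confounder: unequal access
    success = random.random() < (0.8 if access else 0.2)   # caused ONLY by access
    people.append((group, access, success))

def success_rate(rows):
    return sum(p[2] for p in rows) / len(rows)

# Raw statistics make group A look "more successful"...
print("P(success | A):", success_rate([p for p in people if p[0]]))
print("P(success | B):", success_rate([p for p in people if not p[0]]))

# ...but within each level of the confounder the groups are identical.
for has_access in (True, False):
    same_access = [p for p in people if p[1] == has_access]
    print(f"access={has_access}:",
          success_rate([p for p in same_access if p[0]]),
          "vs",
          success_rate([p for p in same_access if not p[0]]))
```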
Then there is the third point, which is also often a way of influencing people without intending it. Because although some politicians might be interested in people staying in a certain bubble and keeping the same beliefs, it also happens unintended. So what is the phenomenon? The phenomenon is that when you engage through social media, there is this activity of moderating and curating — of analysing the patterns in what you do. And what happens there is that there is a danger that one always hears the same people, talks to the same people, and remains in what is called an epistemic bubble. In the literature there is also a distinction between epistemic bubbles and echo chambers; I'm not so interested in that here. For me it is important to see this as a problem of epistemic agency again. If you exclude other voices, discredit other voices, and distrust other sources, then you get this epistemic bubble, and you will be less likely, I think, to revise your beliefs and render them vulnerable to discussion, because you will always hear the same things. Claire, for example, always hears these white supremacist echoes and stays in her bubble. So I think there is a problem not only for the diversity of views as such, but also because people are not encouraged to exercise their capacity for epistemic agency, which is problematic for these democratic conditions.

So, yeah, this diminishes citizens' capacity to render their beliefs vulnerable to deliberation; rather, it encourages people to feel confident, perhaps overconfident, about their beliefs, and this reduces their political agency. Again, this is not necessarily intended — it can be, but it doesn't have to be. And AI contributes to it by moderating social media in such a way that you will always see more posts about the things that you like; you will see more opinions that you like. So in the end you're just having this cosy thing with your political friends, but you're insufficiently exposed to other views, and you don't trust them so much anymore.
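Here is a small, invented simulation of that feedback loop: an engagement-weighted feed that, round after round, serves a user more of the stance they already engage with, until the other stance has all but vanished. It is only a sketch of the dynamic, not of any real platform's ranking system:

```python
import random

random.seed(1)

# Invented feed: 50 posts for each of two political stances.
posts = [("pro", i) for i in range(50)] + [("con", i) for i in range(50)]
user_stance = "pro"                # this user tends to "like" posts matching "pro"
engagement = {"pro": 1, "con": 1}  # the platform's running engagement counts

for round_no in range(10):
    # Recommend 5 posts, weighted by how much this kind of content was engaged with.
    weights = [engagement[stance] for stance, _ in posts]
    feed = random.choices(posts, weights=weights, k=5)
    shown = [stance for stance, _ in feed]
    # The user engages mostly with agreeable content; the system learns from that.
    for stance in shown:
        if stance == user_stance:
            engagement[stance] += 1
    print(f"round {round_no}: feed = {shown}")
# After a few rounds, "con" posts have all but disappeared: an epistemic bubble.
```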
So those are three ways in which AI may influence epistemic agency — epistemic agency as relevant to democracy. And I think this was one way to ask the question concerning AI and democracy in a more specific way: namely, what kind of epistemic agency is needed for democratic political agency, and what happens to that kind of agency given that AI shapes agency? Behind this are, of course, my normative interests: a normative interest in preserving democracy, and a normative interest in the relation between AI and democracy.

The answer given here can be summarised as follows. Supposing that some degree of, and some forms of, epistemic agency are needed for political agency in a democracy — in the sense that, at least according to the ideal, in a democracy I need to be able to freely form my beliefs, to reflect on my political beliefs, to openly discuss these beliefs with others and, if necessary, revise them — then the exercise of such forms of epistemic agency is rendered more difficult, if not endangered, by AI: by AI-supported intentional manipulation of political beliefs, by the unintentional promotion of statistical knowledge as opposed to causal knowledge, and by the usually unintentional creation and maintenance of epistemic bubbles and echo chambers, which discourages rendering beliefs vulnerable to discussion and revision.

This is, of course, a kind of exploratory work, but I think it can be developed further — for example, by looking at all sorts of related issues, by reformulating what I have noted about epistemic agency using another concept, and by thinking further about how democracy can be undermined here. And perhaps empirical research could further support these claims, or defuse some of them. With epistemic bubbles, for example, there is also some research showing that next to this polarisation and these bubbles there is also a diversity of views on social media; as far as I can see this kind of research is not conclusive, but it is worth engaging with further.

If this is the case, then — to end with recommendations directed more towards the outside world — if these dangers, as analysed, are real, and if we want to strengthen democracy (there might of course be regimes and people who don't want that), then we need to regulate AI so that there is less risk of these phenomena and their influences. But I also think education is of course really important here. For democracy, we need to teach people how to exercise their political agency and to create the knowledge basis for that anyway. And that means, I think, that in this kind of social-technological environment — where there is more pressure on belief revision and where it is harder to exercise your political agency — it is extremely important to enable and encourage citizens to reflect on their beliefs and render them vulnerable in open and public forums.
And yeah, I think this is not just an individual matter of giving individuals the skills; it also requires institutional reform, because currently our educational institutions are not necessarily geared towards this. You have to be lucky or privileged to get that kind of education. So we need more of that kind of education, I think, for everyone, if we care about democracy.

In closing, I would like to quote one of my favourite political philosophers, John Dewey, who said that democracy is not just a form of government but primarily a mode of associated living. That is, of course, one of these richer ideals of democracy, right? It's not just about voting; it's also about community, in his view. Of course, that raises a lot of questions about community — the role of community in modern society, and how this can be done in practice. But I think that, for this paper, the epistemic agency I have been talking about — also in the form of the types of agency that are necessary to make this kind of communal, shared experience stronger — should be seen not only as a means to some abstract kind of democracy, but as a valuable thing in itself. These are claims that stand somewhat apart from this paper, but I wanted to end with them in order to disclose the normative direction in which I personally would like to go. I think, however, that the arguments in this paper are valid for all kinds of political agency and forms of democracy.

So, for technology, then: I think we need political technologies that enable rather than hinder such forms of living, such experiences — technology that stimulates the forms of epistemic agency I have been talking about rather than hindering them, rather than risking eroding the conditions for the formation of such knowledge and such epistemic-agency skills.

Thank you.