1 00:00:01,620 --> 00:00:09,150 Okay. So thank you for coming. Welcome to our seminar on biomedical technology and moral enhancement. 2 00:00:09,780 --> 00:00:12,360 Speaking first today is [INAUDIBLE]. 3 00:00:12,430 --> 00:00:18,510 He's going to be talking to us about the rise of post-medicine, some ethical concerns about biomedical technology. 4 00:00:18,870 --> 00:00:24,809 And we've got a lot of time, so after the talk we'll have a lot of time for questions. 5 00:00:24,810 --> 00:00:29,490 And that's a great thing to talk about. Thank you. 6 00:00:30,030 --> 00:00:33,420 Good morning, everybody, and thank you for being here this morning. 7 00:00:35,100 --> 00:00:40,080 As you can see, I'm going to talk about some ethical concerns, 8 00:00:41,670 --> 00:00:46,290 I should add, some personal ethical concerns, about biomedical technology. 9 00:00:46,770 --> 00:00:53,940 This is somewhat a work in progress, so any comments and suggestions are more than welcome. 10 00:00:55,110 --> 00:01:06,000 Now, the idea of this paper started from a conversation, a chat, that I had with a friend of mine, who at some point 11 00:01:06,390 --> 00:01:13,680 went to see a doctor, because he has three kidneys and he needs to go to the doctor on a regular basis. 12 00:01:14,280 --> 00:01:22,410 And after he came back, he told me that he thinks that doctors have no heart, that they are heartless people. 13 00:01:22,830 --> 00:01:32,819 And I was very curious, because I knew that he went to a very good doctor in Bucharest, and I asked him: 14 00:01:32,820 --> 00:01:38,790 didn't the doctor treat him well? Didn't the doctor offer him the right treatment? 15 00:01:38,790 --> 00:01:43,890 And he said, no, no, he was very good. He was, uh, he was a very good doctor. 16 00:01:44,310 --> 00:01:54,780 I went to the doctor, but I got the feeling that he didn't think of me as a person; he thought I was more a problem to be solved.
17 00:01:56,370 --> 00:01:59,699 And that got me thinking. 18 00:01:59,700 --> 00:02:05,920 And I started to look into this, and this is what I got so far. 19 00:02:05,940 --> 00:02:12,780 So I don't have a final conclusion to this paper. 20 00:02:13,930 --> 00:02:21,270 Well, as we all know, today's medicine is more and more a matter of technology: 21 00:02:21,960 --> 00:02:30,300 medical treatments and interventions are now assisted by computers, and even medical diagnostics are heavily imbued with technological parameters. 22 00:02:30,900 --> 00:02:37,979 Of course, medicine, in a way, has always relied on technology, such as scalpels, probes and other instruments. 23 00:02:37,980 --> 00:02:44,400 But by the start of the 20th century, new instruments were available to study, diagnose and treat the body. 24 00:02:44,460 --> 00:02:52,560 Today, hospitals worldwide use complex computerised machines to image the body or assist its functions. 25 00:02:54,150 --> 00:03:03,000 Now, the use of computers was one of the most important technological advancements in 20th-century medicine. 26 00:03:03,510 --> 00:03:08,490 They became central to medical care from the 1950s. 27 00:03:09,000 --> 00:03:13,690 Computerised machines in hospitals monitor patients continuously. 28 00:03:13,750 --> 00:03:20,280 And they also enabled insurers and state-run health services to track patient records on a massive scale. 29 00:03:21,450 --> 00:03:29,880 Imaging techniques such as MRI or PET were possible because faster computers could directly reconstruct images of the body. 30 00:03:30,240 --> 00:03:38,430 More diagnostic tests were developed because automated laboratory machines perform the tests quicker and more accurately. 31 00:03:39,330 --> 00:03:46,380 And this use of machines gave rise to specialist medical practitioners, as we all know. 32 00:03:46,890 --> 00:03:52,530 Technology had a major role in medicine becoming more specialised.
33 00:03:53,460 --> 00:04:01,500 Many medical technologies allowed specific parts of the body to be studied, diagnosed and treated. 34 00:04:02,280 --> 00:04:11,760 And this led, of course, to doctors who specialised in certain organs: the kidneys, liver, heart and so on. 35 00:04:12,690 --> 00:04:19,739 These include ophthalmologists, doctors specialising in eye conditions, and otolaryngologists. 36 00:04:19,740 --> 00:04:29,970 Specialist devices such as the X-ray machine introduced medical professions, such as radiologists and radiographers, 37 00:04:30,360 --> 00:04:39,870 that were unknown before. But not all new technologies were readily accepted by the medical community. 38 00:04:40,290 --> 00:04:46,020 Many were viewed with suspicion, and in many cases with severe suspicion, 39 00:04:46,350 --> 00:04:49,950 I would add. In the 1950s, for example, 40 00:04:49,950 --> 00:04:59,520 some doctors doubted an X-ray image of the chest was as reliable as a physical examination, a person-to-person examination. 41 00:04:59,940 --> 00:05:10,860 Devices were seen as threats, destined to replace the diagnostic expertise of traditional medical practitioners. 42 00:05:11,670 --> 00:05:16,560 Many doctors valued their clinical experience over machine-produced information. 43 00:05:17,940 --> 00:05:33,830 It may sound very peculiar to you, but we still have this kind of doctor in Romania today, doctors who think that machines are something strange and that 44 00:05:33,850 --> 00:05:41,970 we shouldn't rely too heavily on them. Other technologies failed because doctors or patients found them impractical. 45 00:05:42,570 --> 00:05:52,230 The ECG was only useful when it became portable and reliable enough to be used at the patient's bedside, because in the beginning the machine was very, 46 00:05:52,530 --> 00:05:58,050 very big.
But today we even have what's called telemedicine, 47 00:05:58,260 --> 00:06:03,809 covering a broad range of medical services, from telenursing to telepsychiatry, where 48 00:06:03,810 --> 00:06:09,270 doctors offer medical care, using information technology, to patients in remote places. 49 00:06:10,080 --> 00:06:19,770 Now, the use of machines, the use of technology, brought some practical and ethical problems. 50 00:06:20,550 --> 00:06:25,140 Safety concerns and costs have limited their use, and still do. 51 00:06:25,170 --> 00:06:37,139 Moreover, some historians and physicians argue that machines make doctors poorer healers by encouraging them to focus only on the sick parts of the body, 52 00:06:37,140 --> 00:06:42,930 rather than caring for the patient as a whole, for the patient as a human being, as a person. 53 00:06:43,620 --> 00:06:52,050 Many even question whether excessive use of technology in childbirth or to prolong life can be intrusive and do more harm than good. 54 00:06:54,390 --> 00:07:03,690 As a side note, it could be said that advancing technology has presented physicians and patients with many other serious ethical dilemmas. 55 00:07:04,110 --> 00:07:09,270 For example, ultrasound screens foetuses for diseases before babies are born. 56 00:07:09,810 --> 00:07:16,080 However, some parents must decide whether to terminate the pregnancy if the foetus is revealed to have a certain condition. 57 00:07:16,470 --> 00:07:23,580 So the question arises: does medical technology impose on us more than it empowers us? 58 00:07:25,410 --> 00:07:32,760 Because this gradual change brought maybe a tremendous increase in medical efficiency. 59 00:07:32,880 --> 00:07:35,490 And that's a fact. We cannot deny that. 60 00:07:35,700 --> 00:07:43,500 But most importantly, technology also brought about a subtle change in the relationship between patients and practitioners.
61 00:07:45,000 --> 00:07:50,460 I would say a certain relativistic shift towards a more fluid ontology. 62 00:07:51,630 --> 00:07:58,920 That means that whereas in traditional medicine doctors and patients were interacting with each other as human beings, 63 00:07:59,400 --> 00:08:02,910 both as persons, patients as well as specialists, 64 00:08:03,720 --> 00:08:10,440 in today's medicine this relationship became a relationship between certain technological functions. 65 00:08:10,860 --> 00:08:17,470 Patients are medical problems, as I said before, puzzles, and practitioners, 66 00:08:17,490 --> 00:08:24,600 doctors, are complex instruments which guide the solving process of those puzzles. 67 00:08:25,920 --> 00:08:30,870 So it could be said that, gradually but firmly, 68 00:08:31,320 --> 00:08:39,120 technology made us enter into what might be called, what I call, the age of post-medicine, 69 00:08:39,810 --> 00:08:48,820 which is an age in which we see patients as medical or even scientific puzzles, more or less difficult to solve. 70 00:08:48,840 --> 00:08:57,990 Or, of course, as machines whose functioning needs the intervention of other machines, especially designed for that purpose. 71 00:08:58,800 --> 00:09:08,670 The patient is no longer seen as a person, which is, to use a little philosophical jargon, mostly a substance, 72 00:09:09,030 --> 00:09:18,660 but as a conglomerate of functions and parameters, attributes which need to be studied thoroughly in order to fix their functioning. 73 00:09:19,350 --> 00:09:25,410 And this is the shift I mentioned before as an ontological shift. 74 00:09:25,860 --> 00:09:34,500 Moreover, often patients are no longer present in the physician's office, but only their charts and computerised medical data.
75 00:09:35,220 --> 00:09:39,330 The patient becomes more and more a set of numbers and measurements which need 76 00:09:39,330 --> 00:09:43,770 to be properly evaluated, rather than a sick person in flesh and blood. 77 00:09:46,320 --> 00:09:54,420 Someone might argue that this ontological shift, as I call it, in the relationship between doctor and patient 78 00:09:54,900 --> 00:10:04,470 is not something bad in itself, as long as, of course, it provides more accuracy and efficiency in medical care and medical treatments. 79 00:10:05,100 --> 00:10:11,940 And she would be right: if we consider this relationship strictly from a medical point of view, 80 00:10:12,480 --> 00:10:20,610 there is nothing wrong with it. There is nothing wrong with changing the way practitioners interact with their patients, 81 00:10:21,060 --> 00:10:26,430 which means seeing them as being more machine-like, so to speak, rather than human-like, 82 00:10:27,060 --> 00:10:32,010 if this change brings with itself measurable medical benefits. 83 00:10:32,760 --> 00:10:45,750 But the question arises again: can we really ever consider this relationship between physician and patient solely from a medical point of view? 84 00:10:46,650 --> 00:10:53,580 In other words, are physicians and, of course, patients only that, namely, physicians and patients? 85 00:10:57,540 --> 00:11:05,280 Because, you see, the Hippocratic Oath seems to imply something else, and here are two quotations from it. 86 00:11:06,180 --> 00:11:12,059 The first one says: I will remember that there is art to medicine as well as science, 87 00:11:12,060 --> 00:11:18,330 and that warmth, sympathy and understanding may outweigh the surgeon's knife or the chemist's drug. 88 00:11:18,900 --> 00:11:25,320 And the second one says: I will remember that I do not treat a fever chart, a cancerous growth, 89 00:11:25,590 --> 00:11:30,960 but a sick human being, whose illness may affect the person's family and economic stability.
90 00:11:31,260 --> 00:11:39,000 My responsibility includes these related problems, if I am to care adequately for the sick, for the patient. 91 00:11:39,330 --> 00:11:46,020 Now, from this point of view, it seems that technological progress brought with itself 92 00:11:46,290 --> 00:11:53,370 a huge improvement on the science part, but a smaller one on the art part. 93 00:11:54,770 --> 00:12:02,740 You cannot manifest warmth, sympathy and understanding towards a bunch of medical charts and clinical data. 94 00:12:02,750 --> 00:12:12,740 That's obvious. Well, to put it in stronger terms, it seems that in post-medicine, as I called it, 95 00:12:14,660 --> 00:12:22,880 patients as persons are slowly but surely disappearing, being replaced by scientific and artificial medical constructs. 96 00:12:23,330 --> 00:12:28,640 In short, patients are objectified in the name of precision and efficiency. 97 00:12:31,130 --> 00:12:38,810 And this brings us to, as I said, an intermediary conclusion, because this is not final. 98 00:12:38,810 --> 00:12:47,700 So I may change it in the following months. Whether this subtle and, 99 00:12:47,910 --> 00:12:54,030 I guess, inevitable change will end up in a dramatic alienation of the doctor-patient relationship or not 100 00:12:54,330 --> 00:13:03,180 is, of course, still something open to debate. However, I do believe that we shouldn't wait to see if it will happen or not. 101 00:13:03,870 --> 00:13:10,319 We should rather actively try to intervene, to influence the direction of 102 00:13:10,320 --> 00:13:15,960 this gradual shift towards technology by finding ways to preserve or even improve, 103 00:13:16,260 --> 00:13:23,760 if we could do that, the personal character of this very important relationship between doctor and patient, 104 00:13:24,210 --> 00:13:29,400 and the degree of human qualities involved in medical interactions in general.
105 00:13:29,640 --> 00:13:35,250 Now, someone might ask why I use the term post-medicine. 106 00:13:36,420 --> 00:13:40,110 In a way, post-medicine is like postmodernism in philosophy. 107 00:13:41,370 --> 00:13:49,439 I think a postmodern philosopher, as we all know, is no longer interested in looking for answers 108 00:13:49,440 --> 00:13:58,650 to those big questions from the philosophical tradition, questions about truth, about beauty, about justice and so on, 109 00:13:58,980 --> 00:14:04,530 because he doesn't believe we can find answers to those questions, and he doesn't believe those questions matter. 110 00:14:05,460 --> 00:14:09,780 We cannot find a definitive answer to questions about truth. 111 00:14:10,260 --> 00:14:17,820 We are always tackling such a question from a certain point of view, so it's useless. 112 00:14:18,180 --> 00:14:23,190 What really matters for such a philosopher is edification, as Richard 113 00:14:23,190 --> 00:14:33,900 Rorty puts it, through conversations and discussions, no matter what type of conversation or what kind of discourse we have. 114 00:14:35,010 --> 00:14:45,360 Well, in a similar fashion, it seems to me that in what I call post-medicine the trend is to somehow avoid empathy, 115 00:14:46,470 --> 00:14:55,110 sometimes noting that empathy towards the total stranger which is the patient is not really possible. 116 00:14:55,410 --> 00:14:59,490 You cannot have empathy if that person is a stranger to us. 117 00:14:59,610 --> 00:15:07,019 We can mimic it, but we cannot have it genuinely. And to concentrate on helping the patient with all means available, 118 00:15:07,020 --> 00:15:16,440 even if, during this process, the patient could be seen as a medical puzzle and not as a human being.
119 00:15:17,700 --> 00:15:21,870 And this is the reason I use the term post-medicine: 120 00:15:21,870 --> 00:15:35,190 because I think that there are strong similarities between post-medicine and what is usually called postmodernism, with all sorts of differences, 121 00:15:35,490 --> 00:15:38,790 of course. This is all. Thank you. 122 00:15:44,810 --> 00:15:48,250 We will get started again in a few seconds. 123 00:15:48,250 --> 00:15:50,650 Speaking next today is [INAUDIBLE]. 124 00:15:51,620 --> 00:15:57,219 And he's going to be talking about how [INAUDIBLE]. 125 00:15:57,220 --> 00:16:03,180 [INAUDIBLE] Okay. 126 00:16:05,500 --> 00:16:14,080 This talk is based on a paper co-authored with a friend of mine, who [INAUDIBLE]. 127 00:16:15,190 --> 00:16:24,700 And actually, I think he said it would be interesting to give this as a paper. 128 00:16:24,710 --> 00:16:31,540 I'm not sure if the paper got to everyone [INAUDIBLE], and you can judge; 129 00:16:31,540 --> 00:16:36,700 but if not, I would be happy to send it to anyone who wants it. 130 00:16:39,250 --> 00:16:45,500 Some things are probably spelled out in more detail in the paper. 131 00:16:45,500 --> 00:16:52,080 Right. But by and large, I will follow its argument. 132 00:16:53,410 --> 00:17:03,069 And I have to say that, while this is a work in progress, I presented it three years ago, I think, 133 00:17:03,070 --> 00:17:15,950 at the first Bucharest [INAUDIBLE] conference, and I am very grateful to you guys for taking the time to come to this discussion. 134 00:17:17,320 --> 00:17:25,270 I'm happy to see here some of the people that are quoted in the paper, and some of them [INAUDIBLE], 135 00:17:25,270 --> 00:17:35,020 because the paper happens to be more polished thanks to the much good feedback I received on a vast array of the arguments. 136 00:17:40,980 --> 00:17:46,710 So I start with this story that I like.
137 00:17:48,410 --> 00:18:01,860 James Tobin, the famous economist, a Nobel Prize winner, once told a story about a conference where he had this exchange of quips. 138 00:18:03,420 --> 00:18:07,110 He was debating a philosopher; 139 00:18:07,950 --> 00:18:11,010 they were arguing about the abortion issue. 140 00:18:11,310 --> 00:18:18,150 The philosopher said there's nothing more dangerous than a philosopher who's learned just a little bit of economics, 141 00:18:19,020 --> 00:18:29,700 to which he replied, with his well-known wit: unless it's an economist who hasn't learned anything about philosophy. 142 00:18:30,270 --> 00:18:44,320 And, uh, this story somehow epitomises my stance in this project, 143 00:18:44,340 --> 00:18:53,510 in this paper, and, by and large, how the argument is structured. 144 00:18:53,730 --> 00:19:01,230 So, uh, there will be some economic arguments here. 145 00:19:01,800 --> 00:19:14,160 We will come to this economic part and hopefully squeeze some interesting moral issues out of the issue. 146 00:19:14,880 --> 00:19:22,400 And the issue is, uh, the topic of moral bioenhancement. I would say I'm a, 147 00:19:22,450 --> 00:19:28,859 I'm a moderate on bioenhancement, in the sense that at least for some kinds of moral 148 00:19:28,860 --> 00:19:38,520 bioenhancement, let's say individual voluntary enhancement, I can't find any compelling objections. 149 00:19:39,420 --> 00:19:43,920 Uh, for other types of moral bioenhancement 150 00:19:43,920 --> 00:19:49,550 I would have reservations. But I think, I just assume, 151 00:19:49,570 --> 00:20:01,200 we just assume here, and I'll come back to this, that becoming a better person is uncontroversially a morally praiseworthy life goal. 152 00:20:01,740 --> 00:20:11,010 So I think by and large everybody would agree with that, uh, that we should strive to become better persons.
153 00:20:11,970 --> 00:20:17,970 Uh, and if tweaking biochemical processes in our bodies, uh, 154 00:20:18,390 --> 00:20:24,870 proves an effective way of achieving this, without harming ourselves or without harming other persons, 155 00:20:25,710 --> 00:20:36,660 then we think it is at least as legitimate a means as the more traditional, traditional forms of moral enhancement. 156 00:20:37,350 --> 00:20:39,890 And this is actually just to say that, uh, 157 00:20:40,260 --> 00:20:50,310 we by and large join the others who have given arguments in favour of this kind of bioenhancement. 158 00:20:51,330 --> 00:20:58,590 The second assumption, which I'm also not going to argue for here, is that the economic 159 00:20:58,590 --> 00:21:05,890 arguments regarding the consequences of artificial scarcity induced by intellectual property, 160 00:21:05,900 --> 00:21:10,260 at least by the current regime of intellectual property, are correct. 161 00:21:10,770 --> 00:21:16,350 So intellectual property gives rise to artificial scarcity. 162 00:21:17,340 --> 00:21:25,710 And there are some effects that we should expect when having this scarcity. 163 00:21:26,880 --> 00:21:35,470 Uh, however, we are not assuming that these arguments against intellectual property need to be decisive. 164 00:21:37,050 --> 00:21:47,850 Uh, although [INAUDIBLE]. 165 00:21:52,220 --> 00:21:58,010 So the question is, what are the likely consequences? 166 00:21:58,100 --> 00:22:05,570 Can we explore some consequences of the current intellectual property framework?
167 00:22:07,370 --> 00:22:16,519 And we are only talking about drug patents here, and about individuals who are willing to embark on the process 168 00:22:16,520 --> 00:22:24,800 of moral enhancement and are willing to sacrifice scarce resources for achieving this by biochemical means, 169 00:22:25,370 --> 00:22:33,229 who are willing to pay money in order to buy drugs that would help them reach this 170 00:22:33,230 --> 00:22:41,060 goal. And are some of those consequences morally interesting or relevant? 171 00:22:42,200 --> 00:22:57,530 And we are only looking, as I said, at individual and voluntary bioenhancement, [INAUDIBLE] if one thinks about intellectual property. 172 00:23:00,230 --> 00:23:10,250 And first, I do think we need to distinguish between ideal objects, on the one hand, and their material support, on the other hand. 173 00:23:11,330 --> 00:23:20,600 An intellectual property right is a property right in the ideal objects which are instantiated in the material support. 174 00:23:21,080 --> 00:23:31,160 So when you buy a book and the book is copyrighted, you only have a property right over the physical paper. 175 00:23:31,910 --> 00:23:34,760 Although you bought the book, 176 00:23:34,760 --> 00:23:46,740 you don't have any property right over the ideas, the text, that is instantiated in the material support. 177 00:23:46,760 --> 00:23:54,069 This distinction comes from a paper from the 2000s by Laura Biron. 178 00:23:54,070 --> 00:23:57,950 And she makes this distinction about intellectual property rights: 179 00:23:57,950 --> 00:24:07,850 an intellectual property right is a property right over ideal types, whereas the more common position on property rights refers to particular objects. 180 00:24:09,320 --> 00:24:18,140 The types of intellectual property rights: copyright, patent, trademark, trade secrets. 181 00:24:19,490 --> 00:24:22,780 And we are only considering here pharmaceutical patents.
182 00:24:25,950 --> 00:24:30,150 If there is interest, I'll come back to this in the discussion. 183 00:24:33,210 --> 00:24:44,160 A patent is a property right over an invention, a piece of machinery or a process that has a useful function. 184 00:24:45,150 --> 00:24:53,670 And in particular, drug patents are usually property rights either over a chemical formula or over 185 00:24:54,570 --> 00:24:58,680 industrial processes leading to the production of that particular product. 186 00:25:01,600 --> 00:25:04,770 What does a patent give? 187 00:25:05,470 --> 00:25:12,060 It actually gives the holder, and the holder can be an inventor or can be a company, 188 00:25:14,770 --> 00:25:26,290 a limited monopoly over the patented drug. In effect, the holder of this right 189 00:25:27,110 --> 00:25:36,470 has the right to exclude others from benefiting from or utilising, without the permission of the holder, the results of the inventive work. 190 00:25:37,940 --> 00:25:45,499 And there might be many justifications in favour of granting this form of property 191 00:25:45,500 --> 00:25:53,030 right. There are arguments in the natural-rights vein: it's the product of my mind, so I should be the owner of that product, etc., etc. 192 00:25:54,170 --> 00:25:58,850 But I would say the most common argument, the one most commonly cited, rests 193 00:26:00,040 --> 00:26:04,449 on a theme that is well known in economics, 194 00:26:04,450 --> 00:26:15,760 I mean, the importance of incentives. And it goes something like this: because the cost of copying an idea is so low, it's close to zero, 195 00:26:17,410 --> 00:26:22,840 uh, inventors need to be incentivised, otherwise nobody would create. 196 00:26:23,440 --> 00:26:26,470 Otherwise we wouldn't have any sort of creativity. 197 00:26:28,960 --> 00:26:37,900 And it's really this monopoly grant, granted by a patent, that is a way of incentivising creativity.
198 00:26:43,490 --> 00:26:54,590 As always with regulations and laws, there are intended, but maybe also not so much intended, consequences. 199 00:26:56,060 --> 00:27:01,390 The intended consequence is to keep incentives for people to be creative. 200 00:27:02,860 --> 00:27:10,190 Oh, and why would the pharmaceutical industry in particular need this sort of protection? 201 00:27:10,940 --> 00:27:21,650 Because, some people would argue, it meets the features of what economists call a Schumpeterian industry. 202 00:27:22,070 --> 00:27:31,280 Right? Small and constant marginal costs, and innovation as the most important competitive tool. 203 00:27:34,700 --> 00:27:39,440 But okay, let's take a step back now. 204 00:27:40,310 --> 00:27:47,780 We generally talk about property rights, about the emergence of property rights, in the context of scarcity. 205 00:27:50,840 --> 00:27:58,580 And there is the well-known story in Hume, in the Treatise of Human Nature, 206 00:27:59,150 --> 00:28:07,790 where he argues that we need rules of justice and property because resources, things, are scarce. 207 00:28:08,480 --> 00:28:15,350 And if we don't have rules of property and justice, conflicts over the use of things will occur. 208 00:28:18,090 --> 00:28:27,270 And in a more technical approach, the economics of property rights, or new institutional economics, would say 209 00:28:27,270 --> 00:28:34,680 that the function of property rights is that of internalising negative externalities. 210 00:28:36,870 --> 00:28:43,890 And yeah, that holds for laptops and glasses, chairs and tables. 211 00:28:44,850 --> 00:28:57,330 But let's think for a bit. I mean, if I have a laptop and you take my laptop, then I no longer have that laptop. 212 00:28:57,510 --> 00:29:03,030 But if I have an idea, let's say, and I give it to you, 213 00:29:04,310 --> 00:29:13,390 can you be the bad guy, please? Yeah, I will still have that idea.
214 00:29:14,210 --> 00:29:23,570 In the same sense, what happens is that, in such a world, there would be two persons enjoying that particular idea instead of just one person. 215 00:29:25,720 --> 00:29:38,310 Okay. So what's peculiar about intellectual property is that, whereas traditional property rights over material objects are meant 216 00:29:38,310 --> 00:29:48,620 to solve the problem of scarcity, intellectual property rights are meant to induce scarcity artificially. 217 00:29:52,300 --> 00:30:04,210 Okay. And there are some effects of scarcity, of creating scarcity: less availability 218 00:30:04,600 --> 00:30:10,580 and, usually, all other things being equal, higher prices. 219 00:30:11,650 --> 00:30:23,230 That's the basic way the law of demand operates, and it has a secondary consequence: 220 00:30:23,590 --> 00:30:35,620 a decrease in consumption, particularly for individuals who would incur a higher opportunity cost in purchasing that particular good. 221 00:30:37,090 --> 00:30:47,260 And you could probably now see where the argument is going. Again, I will skip over the case 222 00:30:47,260 --> 00:30:58,450 studies, well documented in the literature, about how drug patents actually lead to huge, 223 00:30:59,710 --> 00:31:08,860 steep increases in prices, like, you know, the study from 2002 in South Africa. 224 00:31:10,840 --> 00:31:23,640 There, the difference was between a patented HIV drug at $1,200 and the generic equivalent at $350. 225 00:31:24,790 --> 00:31:31,180 That's almost four times. And it happens again and again. 226 00:31:32,080 --> 00:31:36,460 And probably you can already see where this is going. 227 00:31:39,080 --> 00:31:48,190 A straightforward argument would say something like this: if drugs for moral 228 00:31:48,190 --> 00:31:58,180 bioenhancement are ordinary goods, and everything in my scenario holds, then higher prices,
229 00:31:58,720 --> 00:32:04,840 artificial scarcity, would lead to a smaller quantity demanded by consumers. 230 00:32:05,740 --> 00:32:10,600 So: fewer drugs, fewer people becoming better persons. 231 00:32:14,990 --> 00:32:19,000 [INAUDIBLE] 232 00:32:24,000 --> 00:32:36,510 Yeah, that's straightforward. And, I might add, those whose demand would decrease would be the ones facing the higher opportunity costs, 233 00:32:36,780 --> 00:32:54,810 so maybe the not-so-wealthy persons. But this might be too straightforward, and I think there is something to it, because maybe, maybe, uh, 234 00:32:56,250 --> 00:32:59,219 maybe a possible objection would say something like, okay, 235 00:32:59,220 --> 00:33:12,390 drugs for moral bioenhancement are not just any kind of good; they're, they're that kind of good that is 236 00:33:14,940 --> 00:33:24,780 unaffected in consumption by changes in price, that embodies what economists call price inelasticity. 237 00:33:26,790 --> 00:33:31,770 With regard to price, there are two kinds of goods that behave like this. 238 00:33:32,460 --> 00:33:46,740 The first are usually called Veblen goods: goods that are desired exactly because they are expensive, so you could say that moral 239 00:33:46,740 --> 00:33:54,120 bioenhancement drugs would be goods of that kind, like other examples of this sort. 240 00:33:54,360 --> 00:33:55,830 [INAUDIBLE] 241 00:33:58,380 --> 00:34:10,500 Maybe they would be the sort of positional goods like Rolls-Royce cars or fancy cosmetic products or, I don't know, things of that kind. 242 00:34:10,500 --> 00:34:13,110 That's the attraction: the actual object. 243 00:34:13,110 --> 00:34:25,290 And then we would be okay with that, because it would only show that these sorts of drugs would be desired and used mostly 244 00:34:26,170 --> 00:34:30,250 by people with enough money to afford a positional good.
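[The law-of-demand step in the talk can be put into a small numerical sketch. The two prices are the ones quoted for the 2002 South Africa example; the constant-elasticity demand curve and the elasticity values are purely hypothetical assumptions, chosen only to illustrate the direction of the effect, not estimates for any real drug market.]

```python
# A minimal sketch of the talk's law-of-demand argument, assuming a
# constant-elasticity demand curve Q = A * P**(-e). The prices are the
# ones quoted in the talk; the elasticities are made-up illustrations.

def demand_ratio(p_new: float, p_old: float, elasticity: float) -> float:
    """Ratio of quantity demanded after a price change, under Q = A * P**(-e)."""
    return (p_new / p_old) ** (-elasticity)

generic_price = 350.0    # generic equivalent (USD)
patented_price = 1200.0  # patented HIV drug (USD)

ratio = patented_price / generic_price
print(f"price ratio: {ratio:.2f}x")  # ~3.43, i.e. "almost four times"

# Unless demand is perfectly inelastic (e = 0), quantity demanded falls.
for e in (0.0, 0.3, 0.5, 1.0):  # assumed elasticities, not real estimates
    q = demand_ratio(patented_price, generic_price, e)
    print(f"elasticity {e}: demand falls to {q:.0%} of the generic-price level")
```

[If the Veblen-type objection discussed next were right, the relevant elasticity would sit near zero and the quantity demanded would barely move; the talk's point is that there is no strong reason to expect that for these drugs.]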
245 00:34:30,670 --> 00:34:36,580 I mean, if moral bioenhancement drugs are the Rolls-Royces among drugs, 246 00:34:37,060 --> 00:34:41,020 then we can wrap up our case here. 247 00:34:42,700 --> 00:34:53,920 But maybe there's something else. Maybe there's something so peculiar about morality that would make these drugs goods of that inelastic kind. 248 00:34:57,070 --> 00:35:04,180 Maybe the motivation to obtain moral enhancement, for people who volunteer, is special. 249 00:35:06,280 --> 00:35:10,690 Let me remind you, we're only talking about voluntary enhancement, you know. 250 00:35:11,080 --> 00:35:19,750 So maybe the desire is so strong, people want so much to become better persons, that they will actually disregard price increases. 251 00:35:22,380 --> 00:35:25,950 This could be a more promising line, 252 00:35:25,950 --> 00:35:30,510 but I have my doubts. 253 00:35:38,930 --> 00:35:44,059 What the economist usually does, faced with these sorts of arguments, is look at what is usually 254 00:35:44,060 --> 00:35:52,430 called the adjacent market: when a particular good becomes unavailable or hugely scarce, 255 00:35:52,790 --> 00:35:56,120 you look at the markets for goods that can 256 00:35:58,280 --> 00:36:06,600 at least partially substitute the artificially scarce one. And what is the substitute 257 00:36:06,620 --> 00:36:18,020 market for moral bioenhancement? It could be the more traditional forms of moral education: taking ethics classes, 258 00:36:18,220 --> 00:36:22,880 I don't know. I don't know. The situation is actually [INAUDIBLE]. 259 00:36:23,360 --> 00:36:30,160 But I wouldn't say that these courses are the most popular. In Romania, where we don't do that badly, 260 00:36:30,170 --> 00:36:39,889 it's not the case that students flock to professors or other sorts of available 261 00:36:39,890 --> 00:36:46,850 services for moral education. It doesn't seem that people are usually so interested.
262 00:36:49,460 --> 00:36:58,100 So we might probably, more plausibly, say that moral enhancement, 263 00:36:58,100 --> 00:37:04,610 whether traditional or biomedical, occupies different places in the preference rankings of various agents. 264 00:37:07,550 --> 00:37:19,580 And it would be quite, quite adventurous to assume that moral bioenhancement is an existential priority for most individuals. 265 00:37:20,780 --> 00:37:25,670 Of course, finally, philosophers might say there should be other essential criteria. 266 00:37:27,290 --> 00:37:31,580 Maybe it should be such a priority. But I don't think 267 00:37:35,920 --> 00:37:43,600 the kind of data and insights that we have point to it actually having such a central place. 268 00:37:48,840 --> 00:37:57,420 So who would be the likely buyer and user of relatively more expensive drugs? 269 00:37:58,380 --> 00:38:07,770 Individuals with a strong, very strong desire to become morally better who, for various reasons, 270 00:38:08,220 --> 00:38:23,880 opportunity cost reasons and efficiency reasons and other kinds of reasons, would see the biological way as the sort of thing for them, 271 00:38:25,170 --> 00:38:41,260 etc., etc. But then, how do we know that they are the right target? 272 00:38:41,740 --> 00:38:53,750 I think Steve has a phrase for this, the bad apples problem, and usually an important argument in favour of moral bioenhancement 273 00:38:54,070 --> 00:38:57,740 is that you can avoid all these bad apples. 274 00:38:58,600 --> 00:39:07,480 For many people, Julian Savulescu and some other members of the field, the point of the whole project is actually the 275 00:39:08,720 --> 00:39:16,520 so-called bad apple problem of dangerously easy access to weapons, etc. etc.
276 00:39:21,920 --> 00:39:29,120 But then the problem would be that if this economic argument holds water, 277 00:39:29,690 --> 00:39:42,480 then the people who would embark on this project of moral bioenhancement would be people that are already, well, 278 00:39:42,680 --> 00:39:51,050 people that would sacrifice money to become better persons are people who, by traditional philosophy, 279 00:39:51,320 --> 00:39:53,600 are already, in a sense, better persons, 280 00:39:53,990 --> 00:40:03,200 because they are those people who strongly desire to become better persons and are strongly committed to embark on this process. 281 00:40:04,010 --> 00:40:14,330 I mean, you can substitute any kind of morality, utilitarian, etc. In any substantive account of morality, you'll find claims like this. 282 00:40:15,050 --> 00:40:21,230 It is important to really want the moral path, and if you really want it, 283 00:40:21,680 --> 00:40:28,160 you are already higher in that sense than the person that does not want to become a better person. 284 00:40:32,780 --> 00:40:44,840 So, uh, it wouldn't be those bad apples; I mean, none of them would sacrifice money in order to get the drug. 285 00:40:45,410 --> 00:40:54,680 It would be the persons that are already projecting themselves in terms of better persons. 286 00:40:55,550 --> 00:40:55,940 So. 287 00:41:02,570 --> 00:41:15,170 But yeah, that can be one of the replies: the really, really bad apples would not go willingly to buy moral enhancement by themselves. 288 00:41:15,170 --> 00:41:26,090 So that's not a problem for them, [INAUDIBLE] the worst of the worst. And yet we think there's still an important range of consequences, 289 00:41:27,850 --> 00:41:30,970 for, we call them in the paper, the not-so-bad bad apples. 290 00:41:33,010 --> 00:41:36,550 So, to see who are the not-so-bad bad apples, 291 00:41:37,310 --> 00:41:41,770 uh, let's look at this image here. 292 00:41:42,540 --> 00:41:51,010 Here we would have potential consumers: wealthy consumers, less wealthy consumers. 293 00:41:51,730 --> 00:42:03,790 And I have to say that "affluent" is non-technical here, in terms of revenue and opportunity costs; we don't want to try to settle that definitional question. 294 00:42:03,790 --> 00:42:10,270 But you are more affluent here than you are here, right? 295 00:42:10,840 --> 00:42:24,610 Uh, and you can say that they can have a desire to improve themselves in a morally relevant fashion, 296 00:42:24,760 --> 00:42:33,010 weaker or stronger, for example, or that they are willing to consider embarking on, on this project. 297 00:42:33,370 --> 00:42:42,780 And somewhere over here would be those really bad apples who don't care about any of this, and they wouldn't do anything. 298 00:42:46,570 --> 00:42:58,240 But now imagine this scenario. Imagine this, uh, Joe, who, uh, who knows something about himself. 299 00:42:59,350 --> 00:43:05,950 Who knows that he can become quite violent when he drinks, and he's not an alcoholic. 300 00:43:06,520 --> 00:43:18,790 That happens very, very rarely. But he knows that when it happens, uh, he can become a danger for his family. 301 00:43:19,900 --> 00:43:30,050 And we chose this example because, you know, we still have huge problems with both domestic violence and, uh, 302 00:43:30,070 --> 00:43:39,160 and alcoholism, in villages mostly, as well as, even more, in other parts of Romania, right? 303 00:43:40,220 --> 00:43:48,840 Uh, imagine this Joe. He knows he has this problem, and one morning he reads in a paper 304 00:43:49,240 --> 00:44:00,160 or somewhere else that here is this drug that inhibits severe violence, and he says 305 00:44:01,480 --> 00:44:09,730 to himself: okay, that's a sort of extra precaution that I can take, beyond what I did 306 00:44:09,730 --> 00:44:14,860 before in these scenarios: avoiding as much as possible any kind of triggering context.
307 00:44:15,310 --> 00:44:21,459 So any context in which drinking would be a given: going out with the colleagues, etc., etc. 308 00:44:21,460 --> 00:44:31,270 But, he can think, because I haven't always been successful in the past in avoiding these triggering contexts, 309 00:44:32,980 --> 00:44:41,110 uh, now I might want to go and buy that product. 310 00:44:45,060 --> 00:44:49,410 One thing which I'd say is that the cheaper, the cheaper the drug, 311 00:44:50,160 --> 00:44:54,690 the more plausible it is for Joe to buy the drug, of course. 312 00:44:55,690 --> 00:44:59,070 Uh, but it's not the only thing that we can say. 313 00:44:59,790 --> 00:45:04,330 Uh, we can say that it becomes for him, 314 00:45:04,920 --> 00:45:09,000 on the one hand, a matter of risk assessment. 315 00:45:10,060 --> 00:45:16,680 Because he could argue, he could say: well, I know this is a risk, but it has almost never happened. 316 00:45:16,690 --> 00:45:23,650 And he could sit down and weigh that one risk, and instead of 317 00:45:25,890 --> 00:45:30,980 sacrificing this money in order to buy the drug, with all the opportunity costs, 318 00:45:31,410 --> 00:45:34,800 I would rather invest the money. 319 00:45:35,460 --> 00:45:40,410 I would rather spend the money to provide better food for my children. 320 00:45:41,160 --> 00:45:46,170 I would rather spend the money to get them a better education. 321 00:45:46,770 --> 00:45:56,100 Why would I spend the money on something so rare? 322 00:45:57,140 --> 00:46:17,490 So, let's say, the effects of a patent-based price increase would be largely unimportant for affluent Joes. 323 00:46:17,850 --> 00:46:30,149 I would say that even for affluent persons with a strong desire to become better persons, probably an increased price 324 00:46:30,150 --> 00:46:34,320 would not change consumption in any way they could conceive.
325 00:46:34,320 --> 00:46:47,760 Or maybe, if they're the kind of moral snobs that would buy from more upmarket producers, the market equivalent of Rolls-Royces. 326 00:46:48,540 --> 00:46:57,660 You get the idea. But our Joe can fall either here or there. 327 00:46:58,020 --> 00:47:06,000 The same goes, with important differences, but with even more importance, for the average Joes. 328 00:47:10,690 --> 00:47:25,660 They would be less inclined, under artificial scarcity, to invest, to invest their resources, rather, into the moral bioenhancement pathway. 329 00:47:26,290 --> 00:47:33,670 And this does not affect, as I said, really the worst bad apples we can imagine, 330 00:47:36,850 --> 00:47:48,370 but it would affect the incentives for moral bioenhancement for those people that are a reasonable and plausible target for it. 331 00:47:48,820 --> 00:47:58,970 I mean, average Joes who would want to become better persons, who in a certain state of the world 332 00:47:58,990 --> 00:48:03,430 are willing to spend resources in order to become better persons, 333 00:48:05,470 --> 00:48:13,510 but will still have to prioritise spending those resources over other things, and the 334 00:48:13,510 --> 00:48:25,040 way agents prioritise relates to their subjective hierarchies of goals and priorities, 335 00:48:25,110 --> 00:48:32,110 always. Well, this is the weak version of what we think. 336 00:48:33,340 --> 00:48:43,570 We have argued in the paper that drug patents provide a disincentive, at least for voluntary moral bioenhancement. 337 00:48:45,190 --> 00:48:57,850 And we think this is supported by the argument, and maybe there is a more important moral that can be squeezed out of this argument. 338 00:48:59,350 --> 00:49:09,870 Uh, we hope at some point to write a second paper arguing for a considerably stronger claim.
339 00:49:11,030 --> 00:49:17,480 Namely, that one cannot consistently be both a supporter of voluntary moral bioenhancement 340 00:49:18,470 --> 00:49:27,770 and a supporter of the current configuration of intellectual property rules. 341 00:49:28,680 --> 00:49:34,080 But we are still quite a long way from getting there. 342 00:49:34,140 --> 00:49:37,870 Maybe we won't ever get there. We would like to. 343 00:49:39,530 --> 00:49:43,130 I'm not sure we can get there. So. Yeah. 344 00:49:44,630 --> 00:49:44,930 Thank you.