Right, then, I think we ought to start. Welcome to everyone here, and welcome to everyone on the podcast. Anyone who didn't get the thing I sent out last week, about the argument that we worked through: email me at that address and I'll bounce it on to you. And I'll make sure that what I sent out to everybody here is also available online for the people following the podcast later on.

Right. This week we're going to do different types of arguments. Last week we learnt how to set out arguments logic-book style. Do you remember how that's done? What is it for an argument to be set out logic-book style? Come on, we've got to get you warmed up, haven't we? Premises, then conclusion. That's right: you identify the premise, and the second premise if there is one, and the third premise if there is one, and then the conclusion. The way I did it was just to label them premise one, premise two, conclusion; but you could just put premise one, premise two, then a line, and then the conclusion. There are different ways of doing it.

Okay, we dealt with ambiguities. Hold on one second... how much did you miss? Okay. Right, we dealt with ambiguities. Remember, there were different sorts of ambiguity. Can anyone tell me, to start off with, what an ambiguity is? A word or phrase that has two or more meanings. That's right: in the context where it occurs, it could mean two different things, which is very confusing. And there are different types of ambiguity. Can anyone remember a couple of the types? Linguistic? What do you mean by 'linguistic'? A word might have different meanings, for example. Right, okay, that's lexical. A lexical ambiguity is one word that could mean different things: so 'rum' could mean the drink, or 'this is strange', or something like that. Or we used the example of 'bank', which could be all sorts of things: it could be a verb, it could be a noun, all sorts of different things. And then there are also structural ambiguities, where you have 'every girl loves a sailor': does it mean there is one sailor such that every girl loves him, or that every girl loves some sailor or other? Are there different sailors loved by different girls, or just one? And then there's ambiguity of cross-reference, or anaphora. Does anyone remember what that is?
You might have been taught it at school as anaphoric reference: when you have a pronoun that links back to a name. So: 'she won't come to the party because she doesn't like her', do you remember that? The 'she' becomes ambiguous, and the 'her' becomes ambiguous. Okay. And then there are ambiguities of sound. So 'the picture of water', you remember, could be a picture depicting water, or a picture with the title 'Water': something that's a picture of water, or a picture called 'Water'.

We then learnt to identify conclusions and premises. We saw that although there are sometimes conclusion and premise indicators, there's only one foolproof way of identifying the conclusion of an argument. What is it? The role it plays. That's right: the role it plays within the argument, not within a sentence, because of course a conclusion will itself be a sentence. So what role is played by the conclusion? It's the sentence that's being asserted. Exactly. And the premises, which you can also identify in a foolproof way only by the role they play: they are the reasons being given for the conclusion that's being asserted.

And then we went on to eliminate irrelevancies, and that's where we got into a bit of hot water, isn't it, because you were all accusing me of throwing things out just for fun. But I hope that those of you who got the thing I sent round see that actually, what makes something relevant or irrelevant is the conclusion. Once you've identified the conclusion, you have the argument, because the premises are the reasons for that conclusion, and anything that isn't a reason for that conclusion is irrelevant, isn't it? So even if it seems very important, like God... I mean, what could be more important than God, perhaps? I'm afraid he gets left out, if whatever is said about God is not a reason for believing that conclusion.

We identified suppressed premises, and we saw that sometimes you have to put a suppressed premise in, and sometimes you don't. When do you have to put it in? When it's controversial. Good, that's right. You can't leave it out of the argument if it's controversial, even though having it in may weaken the argument, precisely because it's controversial. And when can you leave a premise out? When it's understood from the general context, when it's blindingly obvious, when all you'd be doing by putting it in is eking out the argument.
So if I say 'I'm going to take my umbrella', do I have to say 'because it's raining, and if I don't take an umbrella I'll get wet, and I don't want to get wet'? No, of course I don't. You understand immediately when I say I'm going to take my umbrella.

And finally, we learned how to make terms consistent. What did that involve? 'It's going to rain'... that's right: we had two words, or two phrases, that meant the same thing. So with 'it's raining' and 'it's pouring', we could just choose one of them and get rid of the other, because all the variation does is confuse us. And we did the same thing with 'the cat is innocent' and 'the cat hasn't sinned': I said that in the context of that argument there's no difference between them.

Okay, so that's what we did last week; a bit of revision for us. So now we know what arguments are, and we can distinguish arguments from other sets of sentences. Remember, we can distinguish them from conditional sentences, for example. We can analyse arguments and set them out logic-book style. So we're not doing too badly for lecture three.

But we haven't yet asked how to evaluate arguments. And that's important, because some of you last week were having trouble with the analysis of an argument because, as you were analysing it, you were trying to evaluate it as well. You were saying 'this premise isn't true'. Well, maybe it's not; but all we're interested in at that point is whether it's a premise or not. We're not interested in whether it's true or not at the moment, though we will be. So we're not even going to be able to start on evaluation this week. What we're going to start on this week is looking at two different types of argument.

And it's very important to be able to identify the two different types of argument. So we're going to look at, first, the normativity of critical reasoning, and of course I'll explain what 'normativity' means; we're going to look at the relation of following from; we're going to look at the two main types of argument; and we're going to learn how to distinguish these two types from each other: deductive arguments from inductive.

So: in critical reasoning we're concerned with whether or not an argument is good. Importantly, then, it isn't a purely descriptive discipline. A purely descriptive discipline is one that tries to say what the world is like.
Zoology, for example, attempts to describe the different species: it categorises living things into species and genera and so on, and then it tries to describe each species. But it doesn't really get into 'should's. In biology you might get into a situation where all you mean by 'should' is 'normal': when you say some things are normal, you're really talking statistics. So 'cats should have four legs', you might say. Well, yes, indeed they should; statistically, most cats do have four legs. But this isn't a moral 'should'. Whereas with critical reasoning we've got what's called a rational 'should': it's normative, it lays down standards for us to follow. A good argument is an argument we want, and we want it because it is good. So critical reasoning is a normative discipline.

We're interested in when a conclusion follows from a set of premises; when the conclusion follows from its premises, the argument is good. Okay? So we're interested in arguments that are good, that is, in when a conclusion follows from a set of premises. And we're also interested in which arguments are bad: you need to be able to identify bad arguments as well as good ones. So 'following from' is a very important relation between premises and conclusion.

And there are broadly two varieties of following from; that's what we're going to learn to distinguish from each other today. Deduction and induction are both varieties of following from. A conclusion may follow deductively from a set of premises, or it may follow inductively from a set of premises. And really these two types exhaust the types of argument there are. There are different arguments within each type, but an argument is either deductive or it's inductive. And there are three questions you can ask to determine whether an argument is deductive or inductive. Here are the three questions you need to ask; I'll let you read them. Of course, I'm going to look at each and explain them. Okay, anyone have any questions so far? Nope.

Okay, right. The first property that distinguishes deduction from induction is the fact that a good deductive argument is truth preserving. Bad deductive arguments aren't truth preserving, and inductive arguments are not truth preserving at all. So if an argument preserves the truth, it is a good deductive argument.
If it doesn't, it may still be a deductive argument, but it's a bad one; and no inductive argument ever preserves the truth. So what do I mean by an argument's being truth preserving? Well, an argument is truth preserving if and only if... and that 'iff' is not me misspelling 'if', by the way: whenever you see 'iff', I suppose it could be a typo, but it usually means 'if and only if'. You get the 'if' going in both directions: if p then q, and if q then p. So: an argument is truth preserving if and only if it is not logically possible for its premises to be true while its conclusion is false. Okay? It's a very strong relation. This is the relation of entailment: the premises entail the conclusion when it's not logically possible for the premises to be true and the conclusion false.
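Since this definition carries the weight of everything that follows, here is one way to make it concrete: a minimal propositional sketch in Python, added for this transcript rather than taken from the lecture. The function name entails, and the encoding of premises as Python functions over truth-value assignments, are illustrative choices, not anything the lecturer specifies.

```python
from itertools import product

def entails(premises, conclusion, atoms):
    """Truth-table test of truth preservation: the argument is valid iff
    NO assignment of truth values makes every premise true and the
    conclusion false."""
    for values in product([True, False], repeat=len(atoms)):
        row = dict(zip(atoms, values))
        if all(p(row) for p in premises) and not conclusion(row):
            return False  # counterexample row: truth is not preserved
    return True

implies = lambda a, b: (not a) or b

# Modus ponens: p; if p then q; therefore q -- truth preserving.
print(entails([lambda v: v["p"], lambda v: implies(v["p"], v["q"])],
              lambda v: v["q"],
              ["p", "q"]))  # True
```

Checking every assignment of truth values is just the 'not logically possible' clause spelled out: the test fails only if some row of the truth table makes all the premises true and the conclusion false.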
Just before we look more closely at that, you've got to be able to distinguish logical impossibility from physical impossibility. This is actually crucial for a lot of philosophy, but it's particularly crucial for critical reasoning. Am I going too fast? No? Okay.

So let me go back to this: an argument is truth preserving if and only if it's not logically possible for its premises to be true and its conclusion false. This is why you've got to know the difference between logical impossibility and physical impossibility: the definition turns on what's logically possible. So we've got to be able to distinguish logical impossibility from physical impossibility, and by the same token, logical from physical possibility.

Something is physically impossible if it's inconsistent with the laws of nature, and it's logically impossible if it's inconsistent with the laws of logic. Two things are inconsistent if they can't both be true together. Okay? So, the laws of nature being as they are, something is physically impossible if it can't be the case consistently with the way the laws of nature are; with logical impossibility, it's the laws of logic that matter.

So it's physically impossible for a human being to swim three miles underwater without breathing apparatus. Do you see that? If somebody told you that they'd done that, you would know they were not speaking the truth, because it simply can't be done. It's also physically impossible for a cat to talk intelligently about Kant. Even Oedipus: I've talked to him about Kant, yes, and he still doesn't talk back intelligently about it, or even unintelligently, come to that.

But both these things are logically possible. Logic has nothing to say about whether my cat discusses Kant, and it has nothing to say about whether you can swim underwater for three miles without breathing apparatus. It's logically impossible, on the other hand, for a square to be a circle. So all those people, like my father, who talk about squaring the circle: it's nonsense. You cannot square a circle, because if something's a square it cannot be a circle, and you can tell that just from the very ideas involved. And a man can't be a married bachelor, because if he's a bachelor he's not married, and if he's married he's not a bachelor; just as, if something's a square it's not a circle, and if it's a circle it's not a square. We know that. And we know that a person can't have exactly three and exactly four children. These things are ruled out by logic. In other words, it's in the very nature of the concepts square and circle, married and bachelor, and so on, that we don't need to leave our own chair to know that there are no square circles. This is why it's so comfortable to be a philosopher.

So here's an exercise: give me two examples of states of affairs that are logically possible but not physically possible. Can you do that? Remember, one example was that my cat can't talk intelligently about Kant, and another was that a human being can't swim for three miles underwater without breathing apparatus. Another example like that? 'The sun was so hot you froze to death.' Hmm, I'm not sure that isn't a logical impossibility: you're working on the words 'freeze' and 'hot', aren't you, and they logically exclude each other, don't they? 'Pigs can fly': that's a very good one. As far as logic is concerned, flying pigs are fine; but physically, yes, there are no pigs that can fly. Okay, can anyone think of another example of something that's physically impossible but logically possible? 'I can jump to the moon.' Good: it's logically possible for you to jump to the moon.
But it's physically impossible, isn't it? Yes, good. Okay, I think you're getting the idea. Anyone else want to have a go? 'I saw through the wall.' You saw through the wall? Good, yes: that's physically impossible, but logically it's fine.

So there are lots of things like this that are physically impossible, ruled out by the laws of nature, but not ruled out by the laws of logic. And of course there may be worlds out there that have different laws of nature; and when we want to know what's going on in those worlds, it's to logic that we look, not to the laws of nature, because our laws of nature are not going to tell us what those worlds are like.

Now, can I get two examples of things that are logically impossible? We said that being frozen by a hot sun might be a logical impossibility. Can anyone think of another? 'It's raining cats and dogs.' Funnily enough, that's a physical impossibility, not a logical impossibility; actually it does rain frogs occasionally, I understand. 'Burnt by ice.' Being burnt by ice... it depends what you mean by 'burnt', doesn't it? Actually, if you put your tongue on the ice in the Antarctic, you are in trouble, and it's actually called a burn, isn't it? 'A female boy.' Yes, that's a good example of a logical impossibility. Okay, anything else? It's quite difficult just thinking of examples, but I hope you're getting the idea of the difference between physical and logical impossibility. A logical impossibility is inconsistent with the laws of logic: it's ruled out by the very concepts involved. A physical impossibility is ruled out by the laws of nature. So whereas logical impossibility can be determined a priori, in other words, we don't need experience of the world to determine what's logically possible, all we need are the concepts; with physical impossibility, you need to know what the world is like. Science tells us what's physically possible and impossible.

So what about these? 'By means of genetic manipulation, we can produce pigs that are able to fly.' Is that physically impossible, logically impossible, or are we unable to tell? Physically impossible? That's right. Though actually, maybe by genetic manipulation we will one day produce pigs that can fly, if we decide we want some. 'Joe has exactly twice as many siblings as Janet.'
'He has Susan and the twins.' Do you see how that's logically impossible? If somebody told you that, you'd think for a minute, but then you'd think: no, that can't be right. And the reason it can't be right is that if he's got three siblings, he cannot have exactly twice as many as Janet, however many Janet has.

'Muon neutrinos can travel faster than the speed of light in a vacuum.' Physically impossible, logically impossible, or can we not tell? Physically, we just have to wait: it may turn out to be both logically possible and physically possible, or it may be that some piece of the measuring equipment was shaking, or something.

Okay: 'physicists succeeded in building a time machine.' Is that physically impossible, logically impossible, or such that we can't tell? We can't tell on that one, because it looks as if time travel is logically impossible: if you could travel back and kill one of your parents before you were born, then... do you see the logical inconsistency that we get there? Well, if it's logically inconsistent, it shouldn't be physically possible. But there are physicists who argue that it's not only not logically impossible, it's physically possible. And actually, if you look on YouTube, if you put in 'time travel physically possible', you'll find a physicist who believes that it is physically possible. So I'm not going to pronounce on it; I'm inclined to think it's logically impossible.

Isn't there a point about all of this, in the sense that you almost have to add a caveat that says 'given what we know today'? With physical impossibility, that's exactly what we have to say, yes. Because we may be able to work with the laws of nature to produce a pig that can fly; obviously we can't work against them, but as things currently stand, flying pigs are against nature. With logical impossibility, though, no: logical impossibility seems more fixed, because the laws of logic seem more fixed. Although actually quantum mechanics, for example, has called into question a couple of the laws of classical logic. So there's a distinction between the laws of nature, i.e. the natural uniformities that govern our world, and the 'laws of nature' that we put together, i.e. our descriptions of those natural uniformities. Do you see the difference? There are the laws, if you like, and there are the natural uniformities that we describe; and in the same way with the laws of logic, there's our description of the laws and there are the laws themselves. We may have the wrong description and need to tighten it up at some point. So we can't be sure we've got the right understanding of either type of law; but it's certainly true that what counts as physically possible is malleable according to our current physical knowledge.
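The a priori character of logical impossibility can be pictured computationally: a claim is logically impossible when no assignment of truth values satisfies it, and that can be checked without looking at the world at all. The sketch below is an editorial illustration in the same propositional spirit as before; encoding the meaning of 'bachelor' as the premise 'bachelor implies not married' is an assumption standing in for the concept.

```python
from itertools import product

def satisfiable(claims, atoms):
    """True iff some assignment of truth values makes every claim true."""
    return any(all(c(dict(zip(atoms, vs))) for c in claims)
               for vs in product([True, False], repeat=len(atoms)))

# "Married bachelor": the concept of a bachelor supplies the premise
# bachelor -> not married, so no assignment satisfies the combination.
meaning = lambda v: (not v["bachelor"]) or (not v["married"])
print(satisfiable([meaning, lambda v: v["bachelor"] and v["married"]],
                  ["bachelor", "married"]))  # False: logically impossible

# "A pig that flies" breaks no law of logic: a satisfying assignment exists.
print(satisfiable([lambda v: v["pig"] and v["flies"]],
                  ["pig", "flies"]))  # True: only the laws of nature object
```

The flying pig is left alone by logic, which is the whole point of the distinction: only the laws of nature, checked empirically, rule it out.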
So, let me just ask: are there any other questions about physical and logical impossibility before I move on? Because it is quite an important distinction. Didn't Quine produce an argument against logical certainty? Well, what he challenges is the existence of analytic truths: he doesn't think there are truths that are not revisable. But we can't go into that here. Okay.

So: an argument is truth preserving if and only if it's not logically possible for its premises to be true and its conclusion false. That's the definition we had up there before. And it's important to notice that the actual truth value of the premises is irrelevant to determining whether an argument preserves truth. Have a look at this argument: all heavenly bodies revolve around the earth; the sun is a heavenly body; therefore, the sun revolves around the earth. Do you see that if those premises are true, the conclusion must be true? Yes? But the conclusion is false, as is at least one of the premises. Do you see that? So we have an argument that preserves truth but doesn't generate truth.

And that's the importance of the distinction: you've got to distinguish between an argument's preserving truth and its generating truth. If it preserves truth, what you're saying is: if there is truth in the premises, there will be truth in the conclusion. And that suits this argument very well. If these premises were true, the conclusion would have to be true, wouldn't it? It couldn't possibly be false. It's logically impossible, in fact, for that conclusion to be false if those premises are true, and that's what it is for an argument to be truth preserving. If an argument has that property, it's called 'valid', as we'll discover next week, and the premises entail the conclusion. This is a good deductive argument, and that's so even though the conclusion is false.
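In the truth-table sketch from earlier, the point shows up as follows. The encoding is again an editorial assumption: the universal premise is flattened to its instance for the sun, with h for 'the sun is a heavenly body' and e for 'the sun revolves around the earth'.

```python
from itertools import product

def entails(premises, conclusion, atoms):
    # Valid iff no assignment makes all premises true and the conclusion false.
    rows = [dict(zip(atoms, vs)) for vs in product([True, False], repeat=len(atoms))]
    return not any(all(p(r) for p in premises) and not conclusion(r) for r in rows)

implies = lambda a, b: (not a) or b

# h: "the sun is a heavenly body"   e: "the sun revolves around the earth"
# The universal premise is flattened to its instance for the sun: h -> e.
print(entails([lambda v: implies(v["h"], v["e"]), lambda v: v["h"]],
              lambda v: v["e"],
              ["h", "e"]))  # True: the form preserves truth, even though
                            # in the actual world e (and a premise) is false
```

The checker quantifies over all possible assignments and never asks which one the actual world happens to be; that is why the actual falsity of the conclusion is no bar to validity.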
We'll look at this a bit more next week, but can anyone tell me why the property of being truth preserving is useful, even though you can end up with a false conclusion? Why should we care that our arguments be truth preserving, even though we could end up with an argument with a false conclusion? Because you can trace back to find the false premise. Good. If an argument is truth preserving, you know either that its conclusion is true, if its premises are true, or that if its conclusion is false, at least one of the premises must be false; and then of course you can look back to find which premise is false. Take 'all swans are white': if this bird is a swan, it will be white; but this one, on the Swan River in Australia, is black. Then you know that one of your premises is false. So it can be empirically very useful for an argument to have the property of preserving truth.

So: we can be certain, of an argument that preserves the truth, that it's a good deductive argument. Not just deductive; it's also a good deductive argument if it preserves the truth. And we can be just as certain, if an argument doesn't preserve the truth, that it's either a bad deductive argument or an inductive argument. It could be one or the other, but it's certainly not a good deductive argument.

So let's do this exercise: which of the following arguments is truth preserving, in other words, a good deductive argument? Have a look at them yourselves, and put your hands up when you think you've got the answer to one. One: Tom is a banker; all bankers are rich; therefore Tom is rich. Okay: it's a good deductive argument. Does everyone agree? Yep. You're absolutely right: if Tom is a banker, and if all bankers are rich, then Tom must be rich, mustn't he? There's no question about it. So that's a truth-preserving argument, and it's a good deductive argument, therefore.

What about: Sue and Tom lead similar lives, but Sue smokes and Tom doesn't; therefore, Sue is more likely to die from heart disease than Tom. Put your hand up when you think you know whether that's truth preserving or not... I'll let a few more hands go up first. Okay, so tell us: do you think it's truth preserving? No, because it's missing something. Okay, stop there: it is not truth preserving.
And that's because there's a possibility that the conclusion is false even if the premises are true. Why is that? Because it doesn't make any link between smoking and heart disease; it needs an extra premise, right? And that premise could be one we'd just want to leave out, or it could be a controversial premise... well, actually, no, don't worry about which. You're quite right: there is something left out, and those premises could be true while that conclusion is false. So we know that argument doesn't preserve the truth: there could be truth in the premises but no truth in the conclusion. So it's either a bad deductive argument or it's an inductive argument, and we'll see later which it is.

What about: all dogs are mortal; Lucy is mortal; therefore Lucy is a dog? Hands up when you think you know. Okay: it's not truth preserving. That's right. It's actually a bad deductive argument, this one, isn't it? We'll see why later on. All dogs are mortal; Lucy is mortal; therefore Lucy is a dog: those two premises could be true and yet that conclusion false, quite easily, couldn't they? Lucy might be human, and mortal.

What about number four? Have a look at that, and put your hands up when you think you know whether it's truth preserving. It is? Does everyone else agree? I thought you were going to be wrong, but actually I think you're right: killing is wrong, and therapeutic cloning pretty clearly involves killing. Yeah, exactly. So that's a good deductive argument: if it's true that killing is wrong, and it's true that therapeutic cloning involves killing, then it must be true that therapeutic cloning is wrong. Okay? So if you want to deny the conclusion, you'd have to go back and deny one of the premises, wouldn't you? You'd have to say one of those premises is false.

Okay, what about number five? Hands up when you think you've got it... I'm giving this one away. All you need at the moment is: it's not a good deductive argument; it's not truth preserving. Actually, it's not a deductive argument at all, as we'll see later on. Every person with Huntington's disease who's been examined has had the HD gene on chromosome four; therefore, everyone with HD has the HD gene on chromosome four.
You see, that premise could be true and that conclusion false. It's unlikely to be false, but it could be, couldn't it? It could be that we just haven't examined any of the people with HD whose HD gene is on chromosome five, perhaps, or wherever; are there any biologists in the room who can correct me? What about this one: if the liquid is acidic, it will turn the litmus paper blue; this liquid does not turn the litmus paper blue; therefore, this liquid is not acidic. Hands up... yes, it is a good deductive argument. Everyone agree? Good.

Okay. So we've got a feel for when an argument is truth preserving or not, and we know that if it is, it's a good deductive argument, and if it's not, it's either a bad deductive argument or an inductive argument. So that's the first question we ask to distinguish deductive arguments from inductive ones.

But it would be reasonable to ask at this point: why is deduction useful? If we believe the premises of a good deductive argument, and it's logically impossible for the premises to be true and the conclusion false, then to believe the premises is to believe the conclusion, isn't it? Do you see that in a good deductive argument the conclusion doesn't go any further than the premises? What is contained in the premises is contained in the conclusion. So in a way you learn nothing from a good deductive argument: it doesn't tell us anything we didn't already know; it just gives us the logical consequences of what we already knew. If human beings were perfectly rational and believed all the logical consequences of all their beliefs, deduction wouldn't be useful. The reason it is useful is that we're not perfectly rational: we often need the consequences of our beliefs spelled out to us, and that's what deduction does. (I borrowed that sentence, by the way; I slightly changed it to make it even better.)

Okay. So: deductive arguments. If the premises are true, then the conclusion must be true; there is no logically possible situation in which the premises are true and the conclusion false. That's a good deductive argument.
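For the record, here is how two of the exercise arguments above come out under the little truth-table test introduced earlier, restated so the snippet runs on its own. The propositional flattenings ('dog' for 'Lucy is a dog', and so on) are editorial simplifications of the quantified originals.

```python
from itertools import product

def entails(premises, conclusion, atoms):
    # Valid iff no assignment makes all premises true and the conclusion false.
    rows = [dict(zip(atoms, vs)) for vs in product([True, False], repeat=len(atoms))]
    return not any(all(p(r) for p in premises) and not conclusion(r) for r in rows)

implies = lambda a, b: (not a) or b

# Litmus (modus tollens): if acidic then blue; not blue; so not acidic.
print(entails([lambda v: implies(v["acid"], v["blue"]), lambda v: not v["blue"]],
              lambda v: not v["acid"],
              ["acid", "blue"]))  # True: truth preserving

# Lucy (affirming the consequent): if a dog then mortal; mortal; so a dog.
print(entails([lambda v: implies(v["dog"], v["mortal"]), lambda v: v["mortal"]],
              lambda v: v["dog"],
              ["dog", "mortal"]))  # False: Lucy may be a mortal human
```

The litmus argument has the classic valid form modus tollens; the Lucy argument has the classic invalid form known as affirming the consequent, which reappears below with the Monday-and-jeans example.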
So, just to go back a little bit, on the need for the conclusion: doesn't the conclusion link the two premises in a way that wouldn't have happened without it? Yes: the conclusion is the logical consequence of the two premises taken together, and that links them, because you've got to put them together in order to get the conclusion. I mean, if you have 'all heavenly bodies revolve around the earth' on its own, without 'the sun is a heavenly body', then the conclusion wouldn't logically follow, would it? No. So it doesn't advance knowledge, but it links premise to premise? It links them, that's certainly true: that belief is a logical consequence of that belief and that belief taken together, so it links them, because you need both of those beliefs to get the conclusion. Does that satisfy you? Yes? Good.

Okay. The second question we ask to distinguish deductive arguments from inductive arguments is this: is whether the conclusion follows from the premises an either/or matter, or a matter of degree? Does the conclusion follow from the premises either definitely or definitely not? Or could you say: well, you know, maybe it does, maybe it doesn't? So let's have a look. If it's an either/or matter, the argument is deductive. If it's either good or bad, full stop, then it's a deductive argument. If it's a matter of degree, if it's not just good or bad but, you know, better than the last one or something like that, then it's an inductive argument.

We know that the conclusion of a deductive argument follows from its premises if and only if the argument is truth preserving. So if a deductive argument is not truth preserving, its conclusion doesn't follow from its premises. Either it does or it doesn't: it's either truth preserving or it's not; it can't be anything in between. So the evaluation of a deductive argument is an either/or matter. Either the conclusion follows from the premises or it doesn't; there are no maybes, no degrees of following from at all, when we're talking about deduction.

So here's a good argument: all bankers are rich; Deepak is a banker; therefore Deepak is rich. Do you see that if those premises are true, the conclusion must be true? And here's a bad one: all bankers are rich; Deepak is rich; therefore Deepak is a banker. Do you see that those two premises could be true without the conclusion being true? And we can see that immediately.
Can't we? There's no maybe here: one is good, one is bad, and there's nothing in between. And that's always the case with deductive arguments.

A question over here about the therapeutic cloning one: it didn't say 'all'. Let me go back and see what it did say. 'Killing is wrong; therapeutic cloning involves killing; therefore therapeutic cloning is wrong.' You mean it doesn't say all killing is wrong? No, but it does imply it, doesn't it? It says, quite generally, killing is wrong. The only way you could make that conclusion false would be by making that premise untrue: in other words, you'd have to say some killing is wrong, or most killing is wrong, but it's not the case that all killing is wrong. Can I just say, I thought that was a good objection. And what if there's a question of degree around the therapeutic cloning, so that only sometimes does it involve killing? Well, then you would be falsifying the other premise. So with this argument: it's a truth-preserving argument, in that if that premise is true and that premise is true, then that conclusion must be true; and if you don't think the conclusion is true, you've got to question one premise or the other, and you could question them in the ways you've just discussed. Yeah? Any questions about that?

No? Let's go on. So: one is good, one is bad, and that's always the way it's going to be. When a deductive argument is good, i.e. when it's truth preserving, i.e. when its conclusion follows deductively from its premises, its premises are a conclusive reason for believing its conclusion. It's either good or it's bad; and if it's good, its premises are a conclusive reason for believing the conclusion. And that's because deduction is monotonic. Here's a new word for you, if it's not one you know already: deduction is monotonic, and what that means is that we can add anything we like to a good deductive argument without making it bad. If a good deductive argument is good, it's conclusively good, and if it's bad, it's conclusively bad. There's nothing we could learn that would change a good deductive argument into a bad one.
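Monotonicity can be watched happening in the truth-table sketch. The example below previews the very additions the lecturer tries next; the encoding (b for 'Deepak is a banker', r for 'Deepak is rich') is again an editorial flattening, not the lecture's own notation.

```python
from itertools import product

def entails(premises, conclusion, atoms):
    # Valid iff no assignment makes all premises true and the conclusion false.
    rows = [dict(zip(atoms, vs)) for vs in product([True, False], repeat=len(atoms))]
    return not any(all(p(r) for p in premises) and not conclusion(r) for r in rows)

implies = lambda a, b: (not a) or b
atoms = ["b", "r"]  # b: Deepak is a banker   r: Deepak is rich

good = [lambda v: implies(v["b"], v["r"]), lambda v: v["b"]]
print(entails(good, lambda v: v["r"], atoms))                           # True
# Add the contradictory premise "it's not the case that Deepak is a banker":
print(entails(good + [lambda v: not v["b"]], lambda v: v["r"], atoms))  # still True

bad = [lambda v: implies(v["b"], v["r"]), lambda v: v["r"]]
print(entails(bad, lambda v: v["b"], atoms))                            # False
# Contradict the conclusion; the argument stays invalid:
print(entails(bad + [lambda v: not v["b"]], lambda v: v["b"], atoms))   # still False
```

Adding 'not b' to the good argument leaves no row in which all the premises are true, so the validity test passes vacuously; as the lecturer says below, anything follows from a contradiction.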
So let's have a look; here's a good deductive argument. Would you like to have a go at giving me a premise that might change it from a good argument to a bad argument? Have a think about it. A suggestion? Well, that one, I think, is changing one of the premises, which is not playing the game, sorry. Okay, I'll tell you what: let me show you what I thought you might come up with, because I think that might be easier. What was I trying to put in there? I know: 'it's not the case that Deepak is a banker'. So there's the argument: first premise, second premise, conclusion; and I'm adding that in, in the hope that it might make it a bad argument. Okay: do you think it does make it a bad argument? It doesn't, does it? It certainly doesn't change the fact that if all these premises are true, the conclusion is true. Indeed, if all these premises are true... well, anything follows from a contradiction, and we've got a contradiction in here now, haven't we? Do you see? So anything follows from it.

And what about this one? What have I added here? 'No banker is rich': so this time I've contradicted one of the premises. But we've got exactly the same thing, haven't we? It hasn't stopped being the case that if all the premises are true, the conclusion must be true. So with deduction it's conclusive: it's either good or it's bad, and there's nothing you can add that changes that. I'm not sure I understand that. Okay, let me do the bad one and see if that helps; if not, we'll go back.

By the same token, we can add anything we like to a bad deductive argument without making it good. Once again, there's nothing we could learn that would change it from a bad argument into a good argument. So here's a bad deductive argument: all bankers are rich; Deepak is rich; therefore Deepak is a banker. And I've asked the same question: is there any premise we could add that would change it from a bad argument into a good argument? I've got a couple that I'm going to try. But let me just say: if I put something in there that's false, do you see that
I'm not changing whether it's truth preserving? If something in there is false, then it's not the case that all the premises are true, and so the conclusion doesn't have to be true either. And if I put something in there that's true, then if all the premises are true... well, let's just try it, since I'm not convincing myself, let alone you.

Okay. So I've put in 'it's not the case that Deepak is a banker': all bankers are rich; Deepak is rich; it's not the case that Deepak is a banker; therefore Deepak is a banker. So I've contradicted the conclusion there. Yeah? Does that change it from a bad argument into a good argument? Has it become truth preserving? No, it hasn't. And you'll find that nothing you can add will change it from being not truth preserving to being truth preserving, in exactly the same way that, with the good argument, nothing you could add would change it from being truth preserving to being not truth preserving.

Let me do one more thing... are you okay? Ah, that's where you were confused: you thought I was saying it's still a good argument. And I wasn't saying that; it's a bad one. The point is just that what you're not doing, by adding premises, is changing it from a good argument into a bad one, or indeed from a bad argument into a good one.

But let's have a look at induction. So this is what I'm claiming: if a deductive argument is good, its premises are a conclusive reason to believe its conclusion, and nothing we could learn is going to change that; and if a deductive argument is bad, its premises are no reason whatsoever to believe its conclusion, and again, nothing we could learn is going to change that. It's either good or it's bad, and that's the end of it. And note that this doesn't mean the conclusion of a bad deductive argument might not be true. Do you see that? It means only that its premises are no reason to believe the conclusion. So here's a bad deductive argument: if it's Monday, Marianne will be wearing jeans; Marianne is wearing jeans; therefore it's Monday. And it does happen to be Monday today. Yes, good, thank you. But is that conclusion false? No, it isn't, is it? No: it is Monday. But it's not our reason for believing it's Monday.
479 00:47:30,910 --> 00:47:34,870 Doesn't depend on that argument, does it? At least I hope so. 480 00:47:35,230 --> 00:47:42,700 Because if it depended on that argument, too. But. So you can have a bad descriptive argument with a true conclusion in exactly 481 00:47:42,700 --> 00:47:46,330 the same way you can have a good deductive argument with a false conclusion. 482 00:47:46,960 --> 00:47:50,680 So if you have a good deductive argument with a false conclusion, you know what? 483 00:47:52,120 --> 00:47:57,670 What do you know? If the conclusion of a good deductive argument is false. 484 00:47:59,020 --> 00:48:03,520 One of the premises is false. Good. Well done. That's the question. 485 00:48:03,790 --> 00:48:10,510 You still have extra pence after the first if if you claim one, your double if it's not, change history, please. 486 00:48:10,660 --> 00:48:15,729 Good argument. No it would. Then you say if and only if it's Monday. 487 00:48:15,730 --> 00:48:19,540 Marianne will be wearing jeans. Marianne is wearing jeans. 488 00:48:19,990 --> 00:48:25,390 Therefore it's Monday. Um, you would change it from a, um. 489 00:48:26,410 --> 00:48:30,700 I think that would make it a good argument. Um. 490 00:48:31,810 --> 00:48:35,800 And you're changing. Yes. You know, but you'd be changing the premises. 491 00:48:36,160 --> 00:48:38,740 Just curious to know that if you just change quite a tiny bit. Bit? 492 00:48:39,130 --> 00:48:45,010 Well, yes, but you'd be actually making an if into an if and only if it's not making a tiny change. 493 00:48:45,010 --> 00:48:48,130 It's making a huge change logically. Yeah, yeah. 494 00:48:49,030 --> 00:48:53,439 Okay. So either department is good premises, 495 00:48:53,440 --> 00:48:57,999 a conclusive reason to believe the conclusion it's truth preserving or deductive 496 00:48:58,000 --> 00:49:02,710 argument is bad because it's premises and no reason at all to believe its conclusion. 497 00:49:02,950 --> 00:49:06,580 So with deduction is always an either or matter. 498 00:49:07,210 --> 00:49:10,540 Um, but no inductive argument preserves the truth. 499 00:49:11,050 --> 00:49:16,780 Um, so we can't evaluate inductive arguments by appeal to whether or not they preserve the truth. 500 00:49:17,170 --> 00:49:21,880 Um, the very strongest inductive argument still doesn't preserve the truth. 501 00:49:22,240 --> 00:49:25,750 So we evaluate them according to how strong they are. 502 00:49:26,320 --> 00:49:29,770 And of course, strength is a matter of degree, isn't it? 503 00:49:30,250 --> 00:49:39,850 So here an inductive argument is strong if and only if the truth of its premises make its conclusions significantly more likely to be true. 504 00:49:40,390 --> 00:49:48,580 And an inductive argument is weak if and only if the truth of its premises makes its conclusion only slightly more likely to be true. 505 00:49:48,790 --> 00:49:57,640 And here are some examples. Well, actually, I've asked you a question, but I think you who who think so on the right hand side is the strong one. 506 00:49:58,150 --> 00:50:03,340 I know right hand side. No, that's that's this one. Uh, but that one. 507 00:50:03,820 --> 00:50:07,660 Yes. Uh. Yes, ma'am. No you're right. 508 00:50:08,020 --> 00:50:17,980 Mhm. Yes. Okay. Who thinks the one on the left hand side is strongest. Is that some of the stuff? 509 00:50:18,100 --> 00:50:22,480 Yes. I think that's a very strong inductive argument, isn't it? 
510 00:50:23,320 --> 00:50:31,360 Something that's risen every day in the history of the universe — it's highly likely it's going to rise again tomorrow, isn't it? 511 00:50:31,870 --> 00:50:35,490 Um, whereas: every time I've seen Marianne, she's been wearing earrings; 512 00:50:35,500 --> 00:50:41,680 next time I see Marianne, she'll be wearing earrings. Okay, it's probably true, isn't it, that every time you've seen me, I've been wearing earrings. 513 00:50:42,040 --> 00:50:45,879 Um, but you've only seen me three times, some of you. 514 00:50:45,880 --> 00:50:49,720 It's not really enough. Maybe next time you come around to my house at 4:00 in the morning and 515 00:50:49,900 --> 00:50:52,780 get me out of bed — I promise you, I don't wear earrings in bed. 516 00:50:53,360 --> 00:50:59,620 Uh, so that's a fairly weak inductive argument. But do you see, they both have the same form, if you like. 517 00:51:00,130 --> 00:51:07,210 Um, so in each case, you're saying: all the things that I've seen like this have been like that; 518 00:51:07,660 --> 00:51:12,010 therefore, the next one — one I haven't seen — will be like that. Um, 519 00:51:12,610 --> 00:51:16,839 inductive strength is a matter of degree, because whether the premises of an 520 00:51:16,840 --> 00:51:22,540 argument make the conclusion more or less likely is itself a matter of degree. 521 00:51:23,200 --> 00:51:26,770 Um, and inductive strength is not monotonic. 522 00:51:27,190 --> 00:51:31,990 Okay, so I taught you that new word: deduction is monotonic. 523 00:51:32,170 --> 00:51:35,680 What that means is you can add anything you like to a good argument: 524 00:51:35,680 --> 00:51:41,469 you can't stop it being good. You can add anything you like to a bad deductive argument: 525 00:51:41,470 --> 00:51:45,820 you can't stop it being bad. But inductive strength is not monotonic. 526 00:51:46,150 --> 00:51:52,860 The addition of a new premise to an inductively strong argument might make it inductively weak — 527 00:51:52,870 --> 00:52:02,589 and I've got some examples coming up. Um, and also, the addition of a new premise to an inductively weak argument might make it inductively strong. 528 00:52:02,590 --> 00:52:08,620 So let's have a look at how this works. Um, so take the relatively strong inductive argument: 529 00:52:08,620 --> 00:52:11,800 Jones confessed; therefore Jones is guilty. 530 00:52:12,040 --> 00:52:15,310 Oh, you agree? Probably — that's a pretty strong inductive argument. 531 00:52:15,610 --> 00:52:23,410 But then add: ten independent witnesses testify to Jones being 100 miles away from the scene of the crime at the time it was committed. 532 00:52:23,950 --> 00:52:27,370 Suddenly it looks very weak, doesn't it? Um — 533 00:52:27,370 --> 00:52:28,350 um, do you see that? 534 00:52:28,600 --> 00:52:36,640 So what we've done is we've taken a strong inductive argument, we've added another piece of information, and we've weakened the argument completely. 535 00:52:37,180 --> 00:52:40,420 Um — wait, what if Jones is a twin? 536 00:52:41,260 --> 00:52:45,940 Well, uh — do you remember what I said about ambiguity: 537 00:52:45,940 --> 00:52:53,380 we've got to assume that 'Jones' is referring to the same person, um, in the same context. 538 00:52:53,890 --> 00:52:57,760 Okay. Or take the relatively weak inductive argument: 539 00:52:57,760 --> 00:53:02,290 Jones's presence at the scene of the crime has been proved; 540 00:53:03,880 --> 00:53:11,530 therefore, Jones is guilty of the crime.
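(What "strengthening" and "weakening" amount to here can be given a rough probabilistic gloss. The sketch below is an illustration with invented numbers, not the lecture's own apparatus: it updates a guilt hypothesis by Bayes' theorem as each new premise about Jones arrives.)

```python
def posterior(prior, p_e_if_guilty, p_e_if_innocent):
    """Bayes' theorem: P(guilty | evidence) from a prior and the
    likelihood of the evidence on each hypothesis."""
    num = p_e_if_guilty * prior
    return num / (num + p_e_if_innocent * (1 - prior))

prior = 0.5  # invented starting point

# "Jones confessed": far likelier if guilty, so the argument is strong.
after_confession = posterior(prior, 0.60, 0.05)
print(round(after_confession, 2))  # ~0.92

# Add "ten independent witnesses place Jones 100 miles away": far
# likelier if innocent, so the enlarged premise set supports the
# conclusion much less.
after_alibi = posterior(after_confession, 0.02, 0.70)
print(round(after_alibi, 2))       # ~0.26

# Unlike deduction, induction is not monotonic: adding a premise
# dragged the conclusion's probability back down.
```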
Okay — which I think you'll agree is a weak inductive argument. 541 00:53:11,530 --> 00:53:19,300 Yes. Okay. And add the extra premise: Smith, the policeman who tried to stop Jones killing the man, saw Jones plunge in the dagger. 542 00:53:20,680 --> 00:53:25,960 I was quite pleased with that one. Um, suddenly a weak argument looks very much stronger, doesn't it? 543 00:53:26,650 --> 00:53:30,969 Um, so induction is not monotonic: 544 00:53:30,970 --> 00:53:37,090 you can strengthen and weaken inductive arguments by discovering new information. 545 00:53:37,210 --> 00:53:42,550 You can't do that with deduction. With deduction, if it's good, it's good; if it's bad, it's bad; 546 00:53:42,610 --> 00:53:45,150 and that's the end of it. Aren't you in danger 547 00:53:45,160 --> 00:53:50,350 there of turning the inductive argument into a deductive argument by the addition of the additional statement? 548 00:53:51,340 --> 00:53:59,080 Um, it's very easy to turn inductive arguments into deductive arguments by adding the right premises. 549 00:53:59,440 --> 00:54:04,870 Uh, have I done that here? I would have thought you're in danger of doing that with the second statement. 550 00:54:05,980 --> 00:54:10,570 Well, no — not if the man had already died of electrocution. Okay? 551 00:54:11,290 --> 00:54:21,550 And he might have done — that's perfectly consistent with what I've said. I mean, I haven't actually said what the crime is here. 552 00:54:23,230 --> 00:54:28,630 Okay. Um. Yes — 553 00:54:28,710 --> 00:54:33,820 I may not be right; you can come back to me on that one later on if you want. 554 00:54:34,210 --> 00:54:40,690 Okay. So let's have a look at this. This is exercise three, which is very similar to exercise two. 555 00:54:41,050 --> 00:54:49,810 Um, what we want to know now is whether an argument's being good or bad is a matter of degree or a matter of either/or. 556 00:54:50,050 --> 00:54:55,870 Um, whereas last time you identified the good deductive arguments, 557 00:54:56,050 --> 00:55:02,140 now you should also identify the bad deductive arguments and distinguish them from inductive arguments. 558 00:55:02,410 --> 00:55:05,950 So: Tom is a banker; all bankers are rich; therefore Tom is rich. 559 00:55:05,950 --> 00:55:10,899 What's that? A good deductive argument, as it was before. 560 00:55:10,900 --> 00:55:15,100 Yeah, it doesn't change. Um: Sue and Tom lead similar lives, but Sue smokes 561 00:55:15,350 --> 00:55:19,910 and Tom doesn't; therefore, Sue is more likely to die from heart disease than Tom. 562 00:55:20,060 --> 00:55:23,690 Stick your hand up when you think you've got the answer to that. 563 00:55:24,230 --> 00:55:29,450 And — so, what we want to know: 564 00:55:29,810 --> 00:55:33,440 we know that it's not a good deductive argument, from last time. 565 00:55:33,710 --> 00:55:37,520 But is it a bad deductive argument, 566 00:55:37,760 --> 00:55:43,840 or is it an inductive argument? It's inductive. 567 00:55:43,850 --> 00:55:50,630 Exactly. So what could we add to that to make the conclusion much less likely? 568 00:55:53,070 --> 00:55:57,580 Why would you want to make it less likely? Well, just because, if we can, 569 00:55:57,600 --> 00:56:01,200 we know it's inductive. Oh, yeah. 570 00:56:01,200 --> 00:56:04,560 If we can't, we know it's deductive. So what could we add to that? 571 00:56:04,650 --> 00:56:12,180 You could add that all of Tom's family —
— all of Tom's family died young of heart disease. 572 00:56:12,180 --> 00:56:18,180 Yes — that would immediately change the strength of that argument, wouldn't it? 573 00:56:18,630 --> 00:56:27,300 So as it is, it looks pretty strong. Um, but we can weaken it immediately by giving it a bit of Tom's family history. 574 00:56:27,930 --> 00:56:32,820 So it's a question about where it says that Sue and Tom lead similar lives — 575 00:56:33,120 --> 00:56:37,380 probably if you left the first statement out, 576 00:56:37,800 --> 00:56:43,200 it might be stronger, because Sue smokes and Tom doesn't, 577 00:56:43,200 --> 00:56:50,100 so obviously Sue is more likely to die. Where it says that Sue and Tom lead similar lives, 578 00:56:50,400 --> 00:56:53,400 that's not true, because she smokes and he doesn't, 579 00:56:53,400 --> 00:56:57,880 isn't it? Well, they could lead similar lives apart from that. 580 00:56:57,970 --> 00:57:04,650 I mean, they could both exercise, both eat lots of fruit and vegetables, and not eat much animal fat. 581 00:57:05,430 --> 00:57:09,180 Um, okay. What about: all dogs are mortal; 582 00:57:09,180 --> 00:57:17,430 Lucy is mortal; therefore Lucy is a dog. Put your hand up when you know whether that's a bad deductive argument or an inductive argument. 583 00:57:19,680 --> 00:57:23,370 Deductive — it's a bad deductive one. How do you know that? 584 00:57:23,850 --> 00:57:28,490 Because the premises — uh, tell me more. Well, 585 00:57:29,100 --> 00:57:32,730 it's not truth-preserving. 586 00:57:32,880 --> 00:57:39,360 That's it: it's not truth-preserving. And in fact, the premises are no reason whatsoever for believing the conclusion. 587 00:57:39,780 --> 00:57:44,880 So whereas with the first one, the premises are a conclusive reason for believing the conclusion, 588 00:57:45,120 --> 00:57:50,490 with the third one, the premises are no reason whatsoever for believing the conclusion. 589 00:57:51,000 --> 00:57:54,330 Okay, that's a bad deductive argument. What about number four? 590 00:57:54,340 --> 00:57:57,870 We know that's a good deductive argument, don't we? 591 00:57:58,110 --> 00:58:09,450 Okay. What about number five? Is that a bad deductive argument or an inductive argument? It's inductive. 592 00:58:09,480 --> 00:58:15,630 That's right. Okay. So: every person with Huntington's disease who's been examined has had the HD gene on chromosome four; 593 00:58:15,930 --> 00:58:20,130 therefore, everyone with HD has the HD gene on chromosome four. 594 00:58:20,640 --> 00:58:28,210 Um — could you add anything to that to make it a weak inductive argument? 595 00:58:28,230 --> 00:58:34,170 I mean, as it stands, it looks pretty strong. What might we do to change it into a weak one? 596 00:58:35,340 --> 00:58:42,660 Only five people have been examined. Yeah — exactly. If we've only examined five people, then it suddenly looks a lot less strong. 597 00:58:43,170 --> 00:58:46,830 Good. Okay. And the last one was a good deductive argument. 598 00:58:47,070 --> 00:58:56,130 Good. Okay, so we now know that if an argument preserves the truth, it's a good deductive argument. 599 00:58:56,610 --> 00:59:02,550 If an argument doesn't preserve the truth, it's either a bad deductive argument or it's an inductive argument.
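(The "bad deductive" verdict on the Lucy argument can also be reached by hunting for a counter-model — an interpretation that makes the premises true and the conclusion false. Here is a minimal sketch, not from the lecture, brute-forcing a one-individual domain in Python.)

```python
from itertools import product

# Search for a counter-model to: all dogs are mortal; Lucy is mortal;
# therefore Lucy is a dog.  With one individual, "all dogs are mortal"
# reduces to "if Lucy is a dog, Lucy is mortal".
for is_dog, is_mortal in product([True, False], repeat=2):
    all_dogs_mortal = (not is_dog) or is_mortal
    if all_dogs_mortal and is_mortal and not is_dog:
        print("counter-model found: Lucy is mortal but not a dog")

# The same search finds nothing for: Tom is a banker; all bankers are
# rich; therefore Tom is rich -- that form is truth-preserving.
hits = [1 for is_banker, is_rich in product([True, False], repeat=2)
        if ((not is_banker) or is_rich) and is_banker and not is_rich]
print("counter-models to the Tom argument:", len(hits))  # 0
```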
600 00:59:03,240 --> 00:59:12,990 And we know that if an argument's being good or bad is not a matter of degree — no maybes — um, then it's a deductive argument. 601 00:59:13,320 --> 00:59:18,540 And if its being good or bad is a matter of degree, then it's an inductive argument. 602 00:59:18,750 --> 00:59:22,650 Okay. So we've looked at our first question and our second question. 603 00:59:22,920 --> 00:59:30,300 Now let's look at the third. So the third question was: can we evaluate the argument a priori? 604 00:59:30,840 --> 00:59:39,540 Now, to be able to evaluate something a priori is to be able to say whether it's good or bad, or true or false, or whatever — something normative — 605 00:59:40,050 --> 00:59:45,840 without experience, without needing to go and look at the world. 606 00:59:45,840 --> 00:59:50,940 If you like, you can do it simply from, um, logic on its own. 607 00:59:51,540 --> 00:59:54,570 Um, the opposite is a posteriori, 608 00:59:54,600 --> 00:59:57,810 which is when you have to actually go and look at the world. 609 00:59:58,140 --> 01:00:06,450 So let's have a look at this. To be able to evaluate, uh, an argument a priori is to be able to tell whether the argument is good or bad by 610 01:00:06,450 --> 01:00:12,600 appeal only to the structure of the argument and to the logical words used in it, 611 01:00:13,050 --> 01:00:16,530 um, without need of any information about the world. 612 01:00:16,770 --> 01:00:20,620 Now, you might well say you need information about language, and that indeed is true — 613 01:00:20,640 --> 01:00:27,870 so you do need some information about the world. Um, but you don't need more than information about language and meanings. 614 01:00:28,380 --> 01:00:30,900 Um, so these are the logical words: 615 01:00:30,900 --> 01:00:40,320 'if–then', 'not', 'or', 'and', 'if and only if'. Um, there are other logical words, but we won't worry about those at the moment. 616 01:00:40,830 --> 01:00:45,590 Um, and this gives you an indication of what's meant by the structure of an argument. 617 01:00:45,600 --> 01:00:50,580 So here's an argument we used last week: if it's snowing, the mail will be late; 618 01:00:50,760 --> 01:00:54,729 it is snowing; therefore, the mail will be late. Can you see that 619 01:00:54,730 --> 01:00:58,030 if you, um, take 'it is snowing' to be 620 01:00:58,360 --> 01:01:07,089 P, then you've got 'if P'; and 'the mail will be late' 621 01:01:07,090 --> 01:01:11,140 becomes Q. So: if P then Q; P; therefore Q. 622 01:01:12,490 --> 01:01:20,350 Do you see? The structure of the argument can be brought out just by taking away the sentences in the argument, 623 01:01:20,350 --> 01:01:24,010 leaving the logical words. Um, okay. 624 01:01:24,020 --> 01:01:29,170 Is this a good argument? Put your hands up when you've decided, rather than yelling out. 625 01:01:50,370 --> 01:01:53,180 Put your hand up when you've decided — up, so I can see it. 626 01:01:53,190 --> 01:01:58,740 Because I'll ask when there's a sort of critical mass of people who think they know. 627 01:02:03,050 --> 01:02:07,070 Okay. What do you think? Is it a good argument? Yes. Yes, it is, 628 01:02:07,100 --> 01:02:10,730 isn't it? Good. Does anyone know what it means — 629 01:02:11,840 --> 01:02:15,320 what a widget is, or what it is for something to be any of these things? 630 01:02:16,310 --> 01:02:19,220 No. Okay, good. I'm not surprised, because I made them up.
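(The form just extracted is the one standardly called modus ponens. Set out schematically:

\[
\frac{P \to Q \qquad P}{\therefore\; Q}
\]

Any argument of this shape is good, whatever sentences — snow and mail, or made-up words — are put in for \(P\) and \(Q\); that is what evaluating it a priori comes to.)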
631 01:02:19,730 --> 01:02:26,660 Um, but you see that actually the content is completely irrelevant to your ability to evaluate the argument. 632 01:02:27,050 --> 01:02:33,320 Um, you really don't need to know anything about the subject matter in order to be able to say that that argument is good. 633 01:02:33,800 --> 01:02:40,700 And the reason for that is that what makes that argument good is the way the logical words themselves combine. 634 01:02:40,970 --> 01:02:45,800 It's the structure of the argument; it's nothing to do with the content of the argument. 635 01:02:45,980 --> 01:02:49,220 And that's true of all deductive arguments. 636 01:02:49,490 --> 01:02:52,760 All deductive arguments can be evaluated a priori. 637 01:02:53,270 --> 01:02:56,419 Um, so — oh, here's another one. Okay. 638 01:02:56,420 --> 01:03:02,930 Is this a good argument? No, no — no, it isn't. 639 01:03:03,440 --> 01:03:07,339 And you can tell that, because this is actually a bad deductive argument, 640 01:03:07,340 --> 01:03:15,440 whereas the last one was a good deductive argument. Um, and it's interesting to note that the fact that deduction is a priori 641 01:03:15,680 --> 01:03:24,440 actually means that it's topic-neutral. Um, this is what makes logic — deductive logic, anyway — such a useful transferable skill. 642 01:03:24,710 --> 01:03:28,580 Because it doesn't matter what you're talking about, what you're thinking about: 643 01:03:28,940 --> 01:03:34,550 um, if you can recognise a deductive argument, you will be able to evaluate it, 644 01:03:34,760 --> 01:03:39,169 even if you know nothing whatsoever about the subject matter. 645 01:03:39,170 --> 01:03:44,810 The subject matter is completely irrelevant to evaluating a deductive argument, 646 01:03:45,140 --> 01:03:52,490 because what matters is the logical words and the way they're combined. Excuse me — 647 01:03:52,970 --> 01:04:06,320 who's speaking? Um, okay. In the first premise of the last one, with the nonsense words, you've got an 'if' joining a positive and a negative — 648 01:04:09,140 --> 01:04:13,969 so the first premise, in those terms, doesn't stand up. 649 01:04:13,970 --> 01:04:19,430 It's nonsense. Well, um — 650 01:04:21,710 --> 01:04:27,230 what you're doing is you're trying to put content in that's going to make it nonsense. 651 01:04:27,230 --> 01:04:32,210 But it's already nonsense, isn't it? I mean, you don't need to do anything to make it nonsense — 652 01:04:32,270 --> 01:04:35,930 it's nonsense already. But it's still a good argument or a bad argument, whichever it is. 653 01:04:38,330 --> 01:04:42,680 But 'if widgets are positive, then 654 01:04:42,680 --> 01:04:47,360 widgets are negative' — that is a premise that cannot stand up in logic. 655 01:04:50,390 --> 01:04:54,190 Well — it's the content: 656 01:04:54,530 --> 01:04:57,920 what you're saying is that it's false. 657 01:04:58,340 --> 01:05:03,410 Yes. Yeah, but — well, actually, it's not even false, because it's meaningless, isn't it? 658 01:05:05,180 --> 01:05:08,360 But 'if widgets are positive, then widgets are negative' — 659 01:05:09,020 --> 01:05:12,800 mm hmm — I mean, that's nonsensical. It doesn't have meaning. 660 01:05:13,160 --> 01:05:17,270 Yes, but you only know that from the content that you're putting in there. 661 01:05:17,630 --> 01:05:24,440 And my aim in using this is to show that you can decide whether something is a good argument or not
662 01:05:24,560 --> 01:05:28,640 without knowing what the content is — what it's about. 663 01:05:30,440 --> 01:05:36,590 So wanting to put in some content that doesn't make sense is not changing my argument at all. 664 01:05:38,930 --> 01:05:47,479 So you could use 'positive' and 'negative' if you said what 'positive' and 'negative' meant? 665 01:05:47,480 --> 01:05:50,880 Well, yes — I mean, why would you? Okay. 666 01:05:51,200 --> 01:05:55,810 If my aim is to show you that you can tell whether a deductive argument is good or bad 667 01:05:55,820 --> 01:05:59,570 without knowing what the content means, 668 01:05:59,780 --> 01:06:06,740 then because you were able to see that the first argument was good, and that the second argument was bad, 669 01:06:06,890 --> 01:06:10,340 I take it that I've proved my point. Yes. Okay. 670 01:06:10,520 --> 01:06:15,799 Now, I could go back and discuss it, but I'm inclined not to, 671 01:06:15,800 --> 01:06:23,330 because I don't see why changing one nonsensical argument into another nonsensical argument is going to gain us anything. 672 01:06:23,970 --> 01:06:29,480 Okay, may I move on? Yes — I'm not sure I've dealt with your worry; 673 01:06:29,570 --> 01:06:33,799 perhaps we can talk afterwards. If you replaced it with P's and Q's, 674 01:06:33,800 --> 01:06:38,060 it would be clearer what's going on with all of these positives 675 01:06:38,060 --> 01:06:41,210 and negatives. Well, yeah — let me carry on. 676 01:06:41,880 --> 01:06:44,900 Now I'm going to, um, replace things with P's and Q's. 677 01:06:45,210 --> 01:06:52,040 Uh — do you see that these two arguments have exactly the same structure? 678 01:06:59,580 --> 01:07:05,490 Yeah? No? Does anyone want to deny that? 679 01:07:08,640 --> 01:07:11,790 Can you help me work out what the structure is, using P's and Q's? 680 01:07:11,790 --> 01:07:20,340 Let's use that one. Um: if P then Q; P; 681 01:07:21,420 --> 01:07:24,450 therefore Q. 682 01:07:24,930 --> 01:07:29,610 And so here P means 'it is snowing' and Q means 'the mail will be late'. 683 01:07:30,510 --> 01:07:34,310 What's P in the second argument? Yeah — 684 01:07:34,530 --> 01:07:40,649 'the act produces the greatest happiness of the greatest number'. 685 01:07:40,650 --> 01:07:46,709 Um, and what's Q? 'It's right' — or, to remove the cross-reference, 'the act is 686 01:07:46,710 --> 01:07:50,910 right'. Yes, exactly. So we've got exactly the same structure, 687 01:07:50,910 --> 01:07:59,070 whatever arguments we stuff into it. Why do you use P and Q, uh, instead of A and B or X or something? 688 01:07:59,100 --> 01:08:04,770 Uh, it's just convention. Is it to do with minding your P's and Q's? 689 01:08:05,130 --> 01:08:10,140 No, no — I don't think so. 690 01:08:10,760 --> 01:08:20,310 And actually some people would use, um, S and L here, just because 'snowing' begins with S and 'late' begins with L, um, 691 01:08:20,760 --> 01:08:24,030 but it would then be less relevant to the other one. 692 01:08:24,330 --> 01:08:28,379 Um, so it's convention that we use P and Q. Okay. 693 01:08:28,380 --> 01:08:35,070 So hang on — here's another: all the widgets I've ever seen have been so-and-so; therefore, all widgets are so-and-so. 694 01:08:35,580 --> 01:08:42,209 Okay. Can we tell whether that's a good argument or not? We can't, can we?
695 01:08:42,210 --> 01:08:49,230 Why not? You don't know how many widgets I've seen, 696 01:08:49,230 --> 01:08:52,350 you don't know what widgets are, and you don't know what so-and-so is. 697 01:08:52,350 --> 01:08:57,690 Exactly — you do need to know what the content is; 698 01:08:58,050 --> 01:09:03,540 you need to know something about the content in order to be able to evaluate that argument. 699 01:09:03,870 --> 01:09:13,469 So whereas a deductive argument can be evaluated a priori, i.e. on the basis only of its structure and the logical words, um, 700 01:09:13,470 --> 01:09:24,240 if an argument can only be evaluated by appeal to some background information — some, um, information that you have — then it's an inductive argument. 701 01:09:26,070 --> 01:09:33,149 Okay. Um, so, um, inductive arguments can be evaluated only a posteriori, 702 01:09:33,150 --> 01:09:41,400 in the light of an understanding of the content of the argument, uh, and by bringing to bear background information about the world. 703 01:09:41,730 --> 01:09:48,690 So whereas you can strip the content out of a deductive argument and still determine whether it's a good or bad argument, 704 01:09:48,930 --> 01:09:55,140 there's no way you can strip the content out of an inductive argument and still hope to evaluate it. 705 01:09:56,670 --> 01:10:01,380 So: can we evaluate these arguments a priori or not? 706 01:10:02,520 --> 01:10:06,059 Um: Jennifer is tall; Jennifer is the bank manager; 707 01:10:06,060 --> 01:10:14,370 therefore, the bank manager is tall. Put your hand up when you've decided whether it can be evaluated a priori. 708 01:10:20,460 --> 01:10:23,460 Okay. What do you think? Yes, it can. Yes, it can. 709 01:10:23,910 --> 01:10:31,080 Okay. So it's the structure of that argument that tells us whether it's, um, a good argument or not. 710 01:10:31,080 --> 01:10:35,160 So is it good? Yes. Yes, it's a good argument, isn't it? 711 01:10:35,400 --> 01:10:38,700 What about: crocodiles are dangerous; James's pet is dangerous; 712 01:10:38,700 --> 01:10:43,440 therefore, James's pet is a crocodile. I think I was getting tired when I wrote this one. 713 01:10:45,810 --> 01:10:50,130 Uh, can we evaluate that argument a priori? We can, can't we? 714 01:10:50,430 --> 01:10:54,839 Yes — and it's a bad one, 715 01:10:54,840 --> 01:10:58,230 isn't it? We can see this is bad; we don't need to do anything more. 716 01:10:58,890 --> 01:11:03,959 Um: it's wrong to tell a lie; Jane's telling her mum her hair looked good 717 01:11:03,960 --> 01:11:07,560 was a lie; therefore, Jane's telling her mum her hair looked good was wrong. 718 01:11:07,950 --> 01:11:12,150 Um, can we evaluate that argument a priori or not? 719 01:11:12,390 --> 01:11:15,420 Yes, I think we can. And it's a good one. 720 01:11:16,080 --> 01:11:20,280 Yeah, it's another good deductive argument. 721 01:11:20,700 --> 01:11:26,190 Tomato plants that are fed well, kept warm and watered frequently usually thrive; 722 01:11:26,670 --> 01:11:33,540 this tomato plant is dead; therefore, this tomato plant hasn't been fed well, kept warm and watered frequently. 723 01:11:34,920 --> 01:11:38,850 That's inductive. Yes — um, that's inductive. 724 01:11:39,270 --> 01:11:42,510 Um, but we might need to be careful, because we want to know why it's dead. 725 01:11:42,780 --> 01:11:46,770 Yeah — why is it dead? I mean, is it just that it got knocked out of its pot? 726 01:11:47,550 --> 01:11:52,230 Um, okay. Yes —
we'd need to know a little more about the situation, wouldn't we? 727 01:11:52,230 --> 01:11:56,070 We'd need to have some background information. Um, what about this one? 728 01:11:56,070 --> 01:11:59,760 This looks familiar: if this liquid is acidic, it will turn 729 01:11:59,760 --> 01:12:03,510 litmus paper blue; this liquid turns litmus paper blue; 730 01:12:03,510 --> 01:12:11,069 therefore, this liquid is acidic. It's good. — Hold that thought. For just about the last two springs — 731 01:12:11,070 --> 01:12:15,060 well, the last two springs were hot and sunny, but the summers were awful; this spring was hot and sunny; 732 01:12:15,330 --> 01:12:18,420 therefore, this summer will be awful. Yeah? 733 01:12:20,130 --> 01:12:26,340 Why? Because we need to bring to bear background information. 734 01:12:26,760 --> 01:12:30,780 What sort of background information would we need to bring to bear? 735 01:12:31,260 --> 01:12:34,290 More data — about what we don't know. 736 01:12:34,740 --> 01:12:38,280 We need more information — more years to go on. Yes: 737 01:12:38,640 --> 01:12:44,310 we would want a lot more than two years, wouldn't we, in order to ground that generalisation? 738 01:12:44,970 --> 01:12:48,720 Yes? Go back to five. Yes. 739 01:12:49,050 --> 01:12:53,640 If the liquid is something other than acidic, it might turn litmus paper blue too. 740 01:12:54,780 --> 01:12:59,730 Um, what are you bringing to bear there? Hmm — 741 01:12:59,940 --> 01:13:06,000 let me have a look at that. Um — so, actually, it's quite difficult to know. 742 01:13:06,630 --> 01:13:11,940 So: if P then Q; Q; therefore P. Oh — that's a bad deductive argument. 743 01:13:12,120 --> 01:13:15,240 Yeah. Why did we say it was a good one? 744 01:13:15,270 --> 01:13:18,630 That's because of the last one that talked about litmus paper. 745 01:13:18,750 --> 01:13:31,710 No — that's a bad argument. Can you see that? It's 'if P then Q; Q; therefore P', which is not 'if P then Q; P; therefore Q'. 746 01:13:31,890 --> 01:13:34,920 Yes. Do you see? We've got to be careful about that. 747 01:13:34,920 --> 01:13:40,380 So that's, um, the fallacy of affirming the consequent. 748 01:13:41,130 --> 01:13:50,610 Um — coming back to one from earlier: I'm not sure that one's good either. It's wrong to tell a lie; 749 01:13:50,640 --> 01:13:55,260 Jane's telling her mum her hair looked good was a lie; therefore, Jane — 750 01:13:55,590 --> 01:14:00,240 uh — well, why do you say that it's not good? 751 01:14:00,270 --> 01:14:03,630 I think it is. So let me just, um, see if it is. 752 01:14:06,610 --> 01:14:13,690 Uh — so P is 'it's wrong to tell a lie'... 753 01:14:14,890 --> 01:14:18,220 um — oh, actually, no. I can't do it that way. 754 01:14:18,310 --> 01:14:24,250 Um, I'm going to have to go into the predicate calculus. 755 01:14:24,260 --> 01:14:48,980 Excuse me a second. Um, sorry. Okay, here's my interpretation: 756 01:14:49,550 --> 01:14:54,800 Lx: x is a lie. Wx: x is wrong. 757 01:14:55,160 --> 01:14:58,400 And s, which is a, um, a thing — 758 01:14:58,520 --> 01:15:02,000 an action; in this case, Jane's telling her mum her hair looks good. 759 01:15:02,450 --> 01:15:09,229 So we've got, um: for all x, 760 01:15:09,230 --> 01:15:12,740 if x is a lie, then x is wrong. 761 01:15:13,490 --> 01:15:23,120 Okay, that's the first premise. Then: Ls — s is a lie. 762 01:15:24,170 --> 01:15:29,360 Do you see, we've got that. And therefore, Ws: s is wrong.
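(Set out in standard predicate-calculus notation, the formalisation being written up here is presumably

\[
\forall x\,(Lx \to Wx),\quad Ls \;\vdash\; Ws
\]

where \(Lx\) reads "x is a lie", \(Wx\) reads "x is wrong", and \(s\) names Jane's telling her mum her hair looks good. The conclusion follows by universal instantiation and modus ponens, which is why the argument counts as deductively good.)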
763 01:15:32,700 --> 01:15:37,470 Both of those are premises. So: everything is such that, if it's a lie, it's wrong — 764 01:15:38,130 --> 01:15:42,510 see, that's the first premise. Um, s is a lie: 765 01:15:42,840 --> 01:15:48,540 Jane's telling her mum her hair looks good is a lie. Therefore s is wrong: 766 01:15:48,960 --> 01:15:52,920 Jane's telling her mum her hair looks good is wrong. That's a perfectly straightforward, 767 01:15:53,040 --> 01:15:56,400 good deductive argument. Sorry — I see the logic. 768 01:15:56,640 --> 01:16:01,710 Oh, good. But in reality, the premises are open to question. 769 01:16:02,310 --> 01:16:08,370 Oh, well — but we're not evaluating the premises at the moment. 770 01:16:08,640 --> 01:16:11,790 Uh, I'm not saying anything about whether the premises are true or false. 771 01:16:11,820 --> 01:16:15,300 I mean, the widget examples — yeah, nonsense. 772 01:16:15,840 --> 01:16:26,430 Uh — oh, sorry. So I wonder, uh, would it make any difference to five if you put 'if and only if' at the beginning? 773 01:16:27,420 --> 01:16:31,020 It would always make a difference to change an 'if' to an 'if and 774 01:16:31,020 --> 01:16:37,739 only if' — always. I mean, I've found that students, when they first start formalising something, um, 775 01:16:37,740 --> 01:16:43,170 will immediately reach for a double arrow — which you read 'if and only if' — um, 776 01:16:43,350 --> 01:16:49,440 when actually all that's there is an 'if', or all that's there is an 'only if'. And actually 777 01:16:49,440 --> 01:16:55,710 'if and only if' is very different from 'if', and it's different also from 'only if'. 778 01:16:56,520 --> 01:17:02,400 Um, so yes, it would make a huge difference — it would make it deductively valid, a good argument. 779 01:17:02,970 --> 01:17:09,900 Um, it's exactly the same as one we had earlier, when somebody was asking about the Monday argument. 780 01:17:10,320 --> 01:17:16,260 So if you have: if and only if this liquid is acidic will it turn litmus paper blue — 781 01:17:16,770 --> 01:17:20,220 then yes, it would be good. Yes. Good. Yeah. 782 01:17:21,330 --> 01:17:24,420 Uh, okay. So, right — that answered the next question in advance. Right, 783 01:17:24,420 --> 01:17:27,570 that's good: two for the price of one. Yeah, okay. 784 01:17:27,600 --> 01:17:30,720 Ooh — exercises to do at home already. Right. 785 01:17:30,780 --> 01:17:35,400 Well, in that case, we can wait a bit for that, because I'm sure there must be more questions. 786 01:17:35,970 --> 01:17:39,930 Okay. I usually summarise — have I not done that this week? 787 01:17:39,990 --> 01:17:45,360 Yes, here we go. Um, so there are your exercises to do at home, 788 01:17:45,360 --> 01:17:50,280 um, and I will give you the answers on your answer sheet next week, 789 01:17:50,790 --> 01:17:59,069 so you can have a look at those at home. And so this week we've seen that critical reasoning is normative, not descriptive. 790 01:17:59,070 --> 01:18:06,030 Do you remember that something's normative if it involves standards — either moral standards or rational standards: 791 01:18:06,030 --> 01:18:11,490 right and wrong in some sense. It involves norms. 792 01:18:12,000 --> 01:18:20,850 Um, we've looked at the fact that there are two types of 'following from' — deductive following-from and inductive following-from. Um, 793 01:18:20,850 --> 01:18:26,130 we've seen that deductive arguments are truth-preserving, or at least they are when they're good.
794 01:18:26,640 --> 01:18:31,410 I hadn't made that clear before, but that should be clear now: they're truth-preserving if good. 795 01:18:32,310 --> 01:18:36,360 Um, they're also such that their being good is an either/or matter. 796 01:18:36,810 --> 01:18:39,810 Uh, do you remember that they're monotonic? 797 01:18:40,260 --> 01:18:44,070 Um, once it's good, it can't be changed into anything else. 798 01:18:44,640 --> 01:18:50,850 Um, deductive arguments are such that we can determine a priori whether they're good or not. 799 01:18:51,270 --> 01:18:55,679 Inductive arguments are not truth-preserving, no matter how good they are: 800 01:18:55,680 --> 01:19:01,650 they don't preserve the truth in that very definite definition of truth-preservation that I gave at the beginning. 801 01:19:02,130 --> 01:19:05,190 Um, they're also such that their being good is a matter of degree. 802 01:19:05,700 --> 01:19:09,570 Um, and they're such that we can only determine whether they're good by bringing to bear, 803 01:19:09,840 --> 01:19:15,420 uh, experience — an understanding of the contents and/or background information. 804 01:19:16,200 --> 01:19:19,529 So that's it, folks, for today. 805 01:19:19,530 --> 01:19:27,300 Uh, but we've got time for questions. Um, let's see if I can turn this — 806 01:19:29,290 --> 01:19:32,409 does that make it easier? It makes it easier. Sorry. 807 01:19:32,410 --> 01:19:36,430 Now, some questions. I've got a feeling that you mentioned earlier 808 01:19:36,460 --> 01:19:40,420 the argument about all swans being white — 809 01:19:41,820 --> 01:19:55,820 that it's not true, because of black swans. But then later you said that the premises don't have to be true, and you gave: all swans are white; 810 01:19:56,220 --> 01:19:58,470 this is a swan; therefore it is white. 811 01:20:01,580 --> 01:20:10,430 Even though 'all swans are white' is false? Okay — swans: are they animals or birds? Let's pretend they're animals — 812 01:20:11,360 --> 01:20:16,180 they're birds, of course, but either will do. 813 01:20:16,490 --> 01:20:25,850 Yeah. Okay, here's an argument. 814 01:20:32,220 --> 01:20:49,140 And here's another argument. Okay. 815 01:20:49,150 --> 01:20:52,520 All swans are white. The animal in the next room is a swan. 816 01:20:52,540 --> 01:20:58,450 Therefore, the animal in the next room is white. Is that an inductive argument or a deductive argument? 817 01:20:59,770 --> 01:21:05,590 It's deductive. Okay. So, um, what about: every swan I've ever seen has been white; 818 01:21:05,860 --> 01:21:09,790 therefore, all swans are white? That's inductive. 819 01:21:10,000 --> 01:21:16,600 Do you see that? Um, I mean, I could say 'all swans I've seen have been white' — 820 01:21:17,290 --> 01:21:23,460 um, it would still be inductive, wouldn't it? Um — and the first one, 821 01:21:23,470 --> 01:21:28,000 the top one, the deductive one, is a good argument even though its premise isn't true 822 01:21:29,200 --> 01:21:36,520 in real life. It is truth-preserving — it's a valid argument — even though one of its premises is false. 823 01:21:36,850 --> 01:21:41,739 We'll be doing more on this next week, and it's something that people find very difficult. 824 01:21:41,740 --> 01:21:47,140 Um, I think there's a sort of argument that goes on in some people's heads: validity is good; 825 01:21:47,710 --> 01:21:52,480 truth is good; therefore validity and truth are the same thing. But they're not.
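(The "very definite definition" in play can be put symbolically — this is the standard textbook rendering, not a new claim:

\[
P_1,\dots,P_n \vDash C
\quad\Longleftrightarrow\quad
\text{no interpretation makes } P_1,\dots,P_n \text{ all true and } C \text{ false.}
\]

Nothing on the right-hand side says the premises actually are true, which is why "all swans are white; this is a swan; therefore it is white" remains valid despite its false first premise.)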
826 01:21:53,500 --> 01:22:00,280 You can have a valid argument with a false conclusion, and you can have an invalid argument with a true conclusion. 827 01:22:00,850 --> 01:22:09,550 So the notion of truth-preservingness, um, is very different from truth simpliciter, or truth-generating. 828 01:22:10,540 --> 01:22:16,710 Can I take you back to the argument about the stabbing through the heart — the plunging of the dagger? 829 01:22:16,720 --> 01:22:25,480 Yes. Because I think I would argue you're turning that into a deductive argument with a hidden premise. 830 01:22:26,020 --> 01:22:33,250 Do you remember what it was? Uh, it was the one you were talking about — 831 01:22:33,920 --> 01:22:38,530 Smith and the dagger. Right. But, uh — 832 01:22:39,250 --> 01:22:44,430 okay. Um, I would say there's a hidden premise which turns it into one. 833 01:22:44,710 --> 01:22:49,960 No — for every inductive argument, you could find a hidden premise that turns it into a deductive argument. 834 01:22:50,920 --> 01:22:58,180 I mean, uh, taking that one: every swan that I've seen has been white; 835 01:22:58,810 --> 01:23:01,990 I have seen every swan — 836 01:23:04,240 --> 01:23:07,930 it immediately becomes deductive. I mean, there's no problem: 837 01:23:07,930 --> 01:23:11,739 we can take an inductive argument and add a premise to turn it into a deductive one. 838 01:23:11,740 --> 01:23:15,670 Um, we can always do that. I don't think I've done that here. 839 01:23:16,630 --> 01:23:21,880 Um — well, I would suggest the premise: plunging daggers into hearts is inevitably fatal. 840 01:23:24,250 --> 01:23:28,360 Uh — not if you've already killed the chap first. I mean, 841 01:23:28,690 --> 01:23:32,310 suppose you die of an electric shock, and then I plunge a dagger into your heart: 842 01:23:32,350 --> 01:23:37,929 I'm not guilty of murder. I may have had the intention to murder you, Christopher, but I didn't do it — 843 01:23:37,930 --> 01:23:42,579 you were already dead anyway. Whose heart was it, anyway? 844 01:23:42,580 --> 01:23:47,040 Because you don't know who 'his' referred to. Um — 845 01:23:47,250 --> 01:23:51,210 um — Jones? Yes, you would have to assume that there's somebody. 846 01:23:51,400 --> 01:23:56,080 I mean, we assume somebody's dead here — that's who the 'him' was, exactly. 847 01:23:58,060 --> 01:24:04,690 But he's not dead, because your policeman is trying to stop Jones killing him. 848 01:24:04,690 --> 01:24:07,750 Of course. Yeah, yeah, that's true. 849 01:24:08,410 --> 01:24:17,170 But we do sometimes just have to take it for granted that there is a person here who's been introduced — I think in talking about the cats 850 01:24:17,170 --> 01:24:22,090 we talked about 'her', didn't we, without making it absolutely clear who we meant. 851 01:24:22,090 --> 01:24:25,149 But you've got three people: you've got Smith, 852 01:24:25,150 --> 01:24:28,570 you've got Jones, and you've got 'him'. Yes — 853 01:24:28,660 --> 01:24:35,350 that's right. You've got three people, and it doesn't say whose heart it was. 854 01:24:36,010 --> 01:24:39,820 No, it's true. So when you've got 'the man' — 855 01:24:39,970 --> 01:24:43,250 yes — who is 'him'? Clearly — 856 01:24:43,270 --> 01:24:46,690 but whose is 'his'? 857 01:24:47,860 --> 01:24:52,540 Yes: 'Smith, the policeman who tried to stop 858 01:24:53,140 --> 01:24:58,510 Jones killing the man, saw Jones plunge the dagger into his heart' — so it could be Jones's heart, 859 01:24:59,380 --> 01:25:07,870 or Smith's.
Uh — I think, for any English speaker, the 'his' is an anaphoric cross-reference 860 01:25:07,990 --> 01:25:11,220 back to 'the man'. Yes, I really do. Yes, yes. 861 01:25:11,740 --> 01:25:14,950 And I know you're all English — I think you're just saying it for fun. 862 01:25:16,120 --> 01:25:22,760 Okay. Uh, well, look — supposing, as you said, he'd been electrocuted: 863 01:25:22,780 --> 01:25:29,410 you said he could have been electrocuted, but he can't have been, because Smith is dealing with someone who's still alive — 864 01:25:30,130 --> 01:25:35,880 someone Jones is going to kill. Well, Smith tried to stop Jones killing the man and failed, he might have said. 865 01:25:36,100 --> 01:25:40,690 And sometimes you get daggers plunged into your heart without it killing you. 866 01:25:40,720 --> 01:25:44,520 Come on — what are we talking about? 867 01:25:44,540 --> 01:25:49,350 This doesn't seem to be the slightest bit interesting. 868 01:25:51,780 --> 01:26:03,480 When we as students learn logic, deductive reasoning basically takes the subject out of the equation — 869 01:26:03,870 --> 01:26:12,180 but not inductive argument, because you've got to bring to bear your background knowledge on the argument. With deduction — 870 01:26:12,210 --> 01:26:17,160 yes — um, your background knowledge can be taken completely out of the equation. 871 01:26:17,670 --> 01:26:21,300 Yep. So that's why computers use deduction? 872 01:26:21,900 --> 01:26:27,630 They do. And doesn't that make induction subjective, in a way deduction isn't — 873 01:26:28,110 --> 01:26:33,870 doesn't the difference make induction problematic? 874 01:26:34,320 --> 01:26:39,570 Yes — there's a big difference between deduction and induction, in that with deduction 875 01:26:39,990 --> 01:26:45,490 we have a very simple mechanical system that can be given to computers. 876 01:26:45,510 --> 01:26:53,970 I mean, computers embody the, uh, mechanistic testing and evaluation system for deductive arguments. 877 01:26:54,370 --> 01:27:09,060 Um, we can't do all deductive arguments that way, though. For example, if I say, um: lying is wrong; therefore, you shouldn't lie — 878 01:27:11,000 --> 01:27:16,940 now, that looks like a deductive argument: it couldn't be the case that the premise is true without the conclusion being true. 879 01:27:17,690 --> 01:27:25,640 Um, but we can't mechanistically test that, because, um, it seems to depend on the meaning of the word 'wrong'. 880 01:27:26,570 --> 01:27:32,809 Um, so for deduction we've got the propositional calculus and the predicate calculus — 881 01:27:32,810 --> 01:27:40,490 all very powerful systems, uh, that we can use to test deduction. But they don't test all deduction: 882 01:27:40,700 --> 01:27:45,290 we still — deontic logic, which is the logic of morality, 883 01:27:45,620 --> 01:27:55,430 is still very much in its infancy. And there's also modal logic, which is less in its infancy: if it is necessary, 884 01:27:56,090 --> 01:28:01,130 then it is possible. 885 01:28:02,430 --> 01:28:06,520 Um — yeah: 886 01:28:06,530 --> 01:28:09,440 you can see I've tried to think of something more interesting, but, um — 887 01:28:09,980 --> 01:28:16,670 yeah, I mean, you can see that's a deductive argument, but actually, again, that seems to depend on the meaning of the word 'possible' there.
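(For reference, the modal inference gestured at here is standardly written with the necessity and possibility operators, and in the weak modal system D — and in anything stronger, such as T, S4 or S5 — it comes out valid:

\[
\Box P \;\to\; \Diamond P
\]

Whether that is a matter of pure structure depends on treating the box and the diamond as logical words — which is exactly the point being made about "possible" and "wrong".)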
888 01:28:17,360 --> 01:28:22,700 Um, so we might think that 'wrong' and 'possible' are both themselves logical words, 889 01:28:22,700 --> 01:28:28,790 like 'and', 'or' and 'if' — but if so, they act very differently, and we don't yet know quite how they act. 890 01:28:29,090 --> 01:28:33,080 But what we do have is a very good system for the vast majority of deductive arguments. 891 01:28:33,680 --> 01:28:42,020 Um — and if we can mechanise probability at all, it's only a very small part of it. 892 01:28:42,530 --> 01:28:50,090 Um, so yes, there is a problem for mechanistic systems, because you need to bring to bear all sorts of background knowledge. 893 01:28:50,390 --> 01:28:53,390 Yeah. Okay, one final question. 894 01:28:53,540 --> 01:28:59,090 Is it correct — you mentioned that, um, it could be subjective: 895 01:29:00,080 --> 01:29:03,440 that inductive arguments could be subjective. Is that correct? 896 01:29:04,580 --> 01:29:09,620 I don't like the word 'subjective' — I ban undergraduates from using it until their third year. 897 01:29:10,320 --> 01:29:19,250 Um, it's true that you need to bring to bear some background information, and to that extent 898 01:29:19,250 --> 01:29:21,590 you're bringing your beliefs to bear on it. 899 01:29:22,220 --> 01:29:30,590 Um, strictly, a subjective state is a state of a subject — say, one accessible to conscious awareness. 900 01:29:31,070 --> 01:29:41,840 Um, so 'subjective' is not quite the right word, but it's certainly true that different people can be more or less inductively bold. 901 01:29:42,560 --> 01:29:46,549 So some of you here might say: every time I've seen Marianne, she's been wearing earrings; 902 01:29:46,550 --> 01:29:51,650 therefore, next time I see her, she'll be wearing earrings — even though you've met me only three times. 903 01:29:51,980 --> 01:29:55,670 But some of you might wait to meet me ten times before you were prepared to say that, 904 01:29:56,270 --> 01:30:07,190 or whatever. Do you see? Um, so some people are very inductively bold: they'll draw a generalised conclusion from very little evidence. 905 01:30:07,370 --> 01:30:12,259 And other people want quite a lot of evidence before they'll draw an inductive conclusion. 906 01:30:12,260 --> 01:30:18,739 So there's certainly an element of subjectivity in how inductively bold you are. 907 01:30:18,740 --> 01:30:22,520 All right, we'd better stop there. 908 01:30:22,760 --> 01:30:26,380 Um, and that's the, uh, presentation for today. 909 01:30:26,390 --> 01:30:30,460 So here are some of the arguments we set at the end of last week and today.