OK. Here we are, week four. Well done for staying with us so far. This week we're going to be looking at how to evaluate arguments: how to tell whether an argument is a good one or a bad one. And we'll start with inductive arguments.

Right, let's get started today. Last week we learnt how to analyse arguments, and what I meant by that was how to identify them and how to set them out logic-book style. I gave you six steps to analysing an argument, and these are the only steps. Another thing that's become clear to me from emails and questions I've had this week is that a lot of people are trying to evaluate an argument, to say whether it's a good argument or a bad argument, as they try and analyse it. Well, don't, because you'll always be led astray if you try to do that, especially with complicated arguments like the one we looked at last week. So follow just these steps; don't do anything else to the argument. Don't say "I think the conclusion shouldn't have a 'not' in it here" and take the 'not' out. 'Not's are really very important, and they shouldn't be added in either if they're not there. So just follow these steps; that's all you need to do. I'm not suggesting it's easy. In fact, this is the hardest thing you'll ever do in logic. Computers can't do this; only we can do this. A computer can evaluate arguments very easily by appeal to just a very simple algorithm, but what it can't do is translate an argument in English into a formal language. Computers are hopeless at that, or at least they can't do it unless the argument is a very, very simple one. So those are the steps that you must take to analyse arguments, and don't try and evaluate them at the same time.

Okay. We did see that we needed to paraphrase arguments in order to complete these steps. In other words, we had to put things in: instead of 'it' we put 'she', or something like that. Or, no, that wasn't a good example; instead of 'it' it was 'tried to tickle him', or more, do you remember? So we had to paraphrase arguments to complete these steps. By paraphrase I just mean put what's there in different words, not change the meaning of anything, and certainly don't add in any meanings or take away any meanings. Paraphrasing is just changing the words so that the argument's structure becomes clearer. Do you see the difference? And again, you'll probably need a bit of practise before that comes easily, because it really is a temptation to evaluate the argument and to change its meaning.
You might think it would be clearer if so-and-so had said this rather than that, but try to avoid that, because what you're trying to do is identify the argument that somebody else is making, not the argument that you would make if you were in his position. OK? The point of analysing arguments is the hope that you might learn something, and you won't do that if you're imposing your own grid of understanding onto someone else's argument. OK, so paraphrase, but don't change the meaning.

We also saw that it's necessary to bring to bear our understanding of the argument. For example, do you remember the suppressed premises that we added last week? I mean, we had quite a tussle with some of them, didn't we? Some of the things that we thought might be suppressed premises turned out actually to be a matter of inconsistent terms or something like that. So we have to bring to bear our understanding of the argument and what follows from it, but don't read into the argument anything that isn't actually there. If a suppressed premise is there, it's usually pretty clear that it's a suppressed premise of the argument, that it's a premise that ought to be there but isn't. So all you're doing is making explicit something that's already there implicitly. OK, I think we're OK. And I've just said it's extremely important in analysing an argument not to evaluate it. First you identify it, then you evaluate it.

Okay, any questions about all that before I move on to today? No? OK, let's move on to today. What we're going to do today is start learning how to evaluate arguments. Now, today I had down that we'd start with validity and truth, looking at the distinction between them, but I've decided instead to start with induction, then go on to validity and truth next week, and then look at deductive arguments and the evaluation of them in the final week. So we're going to deal with induction this week.

Oh, OK, I've done it now. I was going to ask you to tell me what an inductive argument was, but no. Well, OK, you knew this anyway, didn't you? Yes, good. Okay: an inductive argument is one in which the truth of the premises makes the truth of the conclusion more or less likely. And if you remember, we looked at two examples. First we looked at the sun's rising: the sun has risen every day in the history of the world, therefore the sun will rise tomorrow. And: every time you've seen Marianne she's been wearing earrings, so next time you see her she'll be wearing earrings. I'm going to leave them off next week.
If you remember, all inductive arguments rely on the principle of the uniformity of nature, as Hume called it, David Hume. And the only arguments for the principle of the uniformity of nature are themselves inductive, so it looks as if any argument you offer for induction is going to be circular, because it's based on induction itself. This is a real problem. People would love to be able to justify the principle of the uniformity of nature, that is, to say why we should believe that the future will be like the past, but no one has conclusively succeeded. There are reams and reams of books and papers written on this problem, and there are lots of theories about it, but there's no theory on which everyone would converge. Yet.

OK. Different types of inductive argument: inductive generalisations, causal generalisations, arguments from analogy and arguments from authority. We're going to have a look at each of these separately and look at how to evaluate them, and by "how to evaluate" I mean how to tell whether they're good arguments or bad arguments. Please remember, with inductive arguments it's not a matter of either/or: they're stronger or weaker. There's a gradation; it's a matter of degree how good an inductive argument is.

Okay, let's start with inductive generalisations. What I mean by this is that the premise identifies a characteristic of a sample of a population, and the conclusion extrapolates that characteristic to the rest of the population. All inductive arguments are actually a form of this, of inductive generalisation, so in learning how to evaluate inductive generalisations you can apply everything you learn to the other types of inductive argument. But let's have a look at them generally. OK, here are two examples. Looking first at this one: what's the population that we're looking at here? Do you remember I said the premise points to a sample of a population and the conclusion extrapolates to the rest of the population? So what do I mean by the population in this case? Voters. Exactly, that's right. So we're saying here that some of the voters have been sampled, and that 60 per cent of them said they'd vote for Mr Manypromises, and we're extrapolating from that to... actually, there's a suppressed premise here, isn't there? Or there's something we could add in here. Well, no, we'll move on to that in a minute. We're sort of assuming, aren't we, that 60 per cent of the population as a whole would be enough for him to win?
Do you see what I mean? Because that's implied by this, isn't it, rather than actually stated. OK. And then the other one we've got here: what's the sample? Oh, sorry, what's the population here? Calls to BT. Yes, calls to BT. So the premise says: whenever I've tried to make calls to BT, it's taken me hours, and I'm extrapolating from that. And it's calls by me, actually, isn't it, rather than calls generally? So I'm extrapolating from my past experience to my future experience. Right.

OK, so what I want you to do is have a look at each of these arguments, or you can choose just one of them if you want to do it more slowly, and write down the questions to which you would need answers in order to decide whether these are good arguments. And then we'll go through them together. So have a look yourself, think about these questions, and think about what you would ask in order to satisfy yourself that these were good arguments.

Okay, anyone want to give me examples of the sort of questions you would ask of the first? OK, if the electorate was just ten people... why would that help you evaluate the argument? Is it really the number in the population you want, or what else might it be? You might want to know the size of the sample. Yes, I thought you might, because if only ten people were in the sample and yet there are a million people in the population, then the sample just isn't big enough, is it?

I've got another question, because I know from experience that just because you're a voter who says you'll vote for someone doesn't mean you'll actually vote. So do you have to know what percentage of the people sampled are likely to vote? Is that implicit in the word 'voters'? Because... no, it isn't implicit in the word 'voters'. Yes, one of the things you would certainly want to consider here is that the voters sampled said that they would vote for Mr, whatever his name is, but might actually not vote for him, or might not vote at all. Either way, it wouldn't make much difference. So yes, I do think that's a... yes, that's a bit of background information that you would bring to bear on this particular argument. It's something you know about voters which shows that you really have to know a bit more. Well, presumably, I expect there's a number by which they determine how many are likely actually to vote.
I don't know. Yes, you would: you'd certainly need to know whether they were telling the truth. Yes, okay. And it's certainly the case that Mr Manypromises is not likely to win if he's not going to stand, even if 60 per cent of the voters say they'd vote for him. Actually, that's quite a good counterexample, isn't it? A case where the premise would be true but the conclusion would have to be false. If you've got a situation where the voters really did want to vote for whoever it was, but he wasn't going to stand... yeah, I like that one.

Another one here? Good: you'd want to know whether the sample is representative, wouldn't you? Because if the only people they asked were males, then who knows what women are going to do; or if they're all under 24, or if they're all black, or if they're all whatever. You need to know that the sample chosen is representative of the population as a whole. Yes, okay. So if you have something like the radio... there was a radio programme, wasn't there, that was taking votes for something or other, and a lot of people... So actually, what you want to do is ask whether the premise here is true at all. Yes, definitely. Because it may be that 60 per cent of the voters said that they'll vote for Mr Brown, but then something dreadful happens, and it's certainly not the case that if you sampled them again just before the election they would still say it. Good, you're coming up with all sorts of things I haven't got myself here; this is brilliant. Who did the sampling? Yeah, that would be a very good one. And again, that's another example of "is the premise true?", because if the person is saying that 60 per cent of the voters said they would vote for him... if they're all apparatchiks of Mr Manypromises who want to make him feel good before the election, you might question the premise itself, mightn't you?

OK, what about this one? Is there anything that would be added to this one that we haven't already considered? The time when I ring? I think that's a perfectly good one, because if I've been trying to ring BT at two o'clock in the morning, then yes, it may have taken hours, but were I to ring at ten o'clock in the morning it might be different. I'm assuming that they don't answer the phone at two o'clock in the morning. OK. It's certainly reasonable to ask whether it's just me. I mean, there might be something about my particular telephone number such that whenever I ring BT there's something that says "don't answer this one", or something like that.
But the conclusion is about when I ring BT, do you see what I mean? Again, the way I've set this up, the population here is calls that I make to BT rather than calls that anyone makes to BT. Yes, I might have only made one or two; again, that's structurally the same as when we asked here how many people were sampled and what percentage of the whole population that is. And you're suggesting, quite properly, exactly the same thing here: if I've only tried to ring once or twice, is that really a big enough sample? Good. And again, you're questioning whether the premise is true. I mean, maybe I'm just very bad at calculating time. Maybe I'm one of these people who's very keen to have somebody answer my phone call immediately, and if it takes thirty seconds I get very irritated and think it's taken hours.

OK, you would have to assume, wouldn't you, that it was the same part of BT again, because otherwise you'd get an equivocation, wouldn't you? 'BT' here wouldn't mean the same as 'BT' here. An equivocation, by the way, is an argument in which you use the same word with two different meanings. So if you think of the word 'bank', it could mean a financial institution, it could mean an action of an aeroplane, or it could mean the side of a river. And if in an argument you used it in all three of those meanings, you could imagine an argument that would look good but as a matter of fact wouldn't work at all, and that's as a result of equivocation: you're equivocating on the word 'bank'. So if I were equivocating here on the word, or the letters, 'BT', my conclusion might not follow from my premises.

OK, very good. That really is good; I think it's very impressive. You'll see as I go through the things that I'm going to list that you've said just about all of them. OK, firstly, and I think you got this one: is the premise true? We've got "60 per cent of the sample said that they would vote for Mr Manypromises". Well, can we really believe that? Might they be bad at record keeping, so it actually wasn't 60 per cent, it was only 50 per cent? And you know, last year when you used those people they were completely hopeless. Might they be engaged in wishful thinking? Might they be just bad at maths, so they can't work out percentages? Am I telling the truth? Am I in the pay of one of BT's rivals? Am I prone to exaggeration? Am I just very bad at estimating time?
So, lots of reasons why the premise itself might not be true. And if you remember, whenever we're evaluating an argument there are two things we've got to look at. Can you remember what they are? Just two basic things we look at whenever we're evaluating an argument of any kind at all. One is: does the conclusion follow from the premises? That's right. And the other is: are the premises true? That's right. If even one premise is false, then that doesn't guarantee the truth of the conclusion, does it, or even make the truth of the conclusion more likely. So the first thing you look at when you look at any argument is: are the premises true?

Okay, how large is the sample? Again, you got this. How many of those who would vote in the election were sampled? Ten out of one million? Well, that doesn't look very good, does it. A thousand out of one million? That looks better. How many is enough, though, do you think? And that's a really difficult question, isn't it: how many is enough? I'm just specifying here that one million is the population, and then we're asking how many of those would count as enough, and I'm saying there actually isn't any answer to that. We can certainly say that ten is probably not enough, and we might be able to say that nine hundred and ninety-nine thousand... well, a million is a thousand thousand, isn't it? Okay, nine hundred thousand would be enough. But in between those two numbers, what counts as enough? Statistics? Well, that's coming later; that's coming when we look at the representativeness of the sample. At the moment the only thing we're talking about is the size of the sample.

If I say all swans are white, and you say, well, what's your reason for saying that, and I say, well, I saw a swan just now and it was white, you'd say: what, just one? I see. And you can be more or less inductively bold. Actually, if we were to do a headcount of people in this room, we'd find that some of us are very bold. I shouldn't say 'us', because I'm not inductively bold, but some of us would be prepared to extrapolate from a very small number, and others of us would be very sceptical about extrapolating even from quite a large number. So to the question "how many is enough?" the answer would be: it depends on who you are and on how inductively bold you are or not. Well, statisticians have to come up with something that they would count as enough, wouldn't they? Is that right? Yep.
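As a rough illustration of the kind of thing statisticians do here, here is a minimal sketch, not from the lecture, using the standard normal-approximation formula for the margin of error of a sample proportion (the 1.96 is the usual 95 per cent figure):

```python
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a sample proportion p_hat from n responses."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# 60% support measured in samples of different sizes
for n in (10, 100, 1000):
    moe = margin_of_error(0.6, n)
    print(f"n = {n:4d}: 60% plus or minus {moe * 100:.1f} percentage points")
```

Roughly, a sample of 10 gives about plus or minus 30 percentage points, 100 gives about 10, and 1,000 gives about 3, which is the sense in which a larger sample licenses a bolder extrapolation.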
And the larger the sample, the smaller the confidence range, so they can be more confident? When you say "the larger the sample", do you mean... it's certainly true that if a thousand have been sampled, that's much more confidence-boosting than ten. Yes, that's what you mean, yes, okay. Are they saying, as a result, that 80 per cent in the election will vote one way, plus or minus two per cent, or saying 55 per cent, plus or minus ten per cent, and the range of their prediction depends upon the size of the sample? Yes... no, I think, you know, I'm getting out of my depth here; I don't understand what you're saying, I'm afraid. And there's quite a good example from history, which was an American election. Yes, we're coming to that; that's representativeness. Let's leave representativeness aside; at the moment I'm just talking about size. All we need to look at is how many are in the sample and how many are in the population: do we think that enough have been sampled to make us more confident about the extrapolation?

If I've only rung BT once, then my claim about the next time I'm going to ring is really pretty weak, isn't it? It's a very weak argument, whereas if I've rung BT fifty times and not got through, then that's more reason to think it. So remember that in inductive arguments the premises make the conclusion more or less likely. Well, if my premise is "I've rung BT once in the past and it took them hours to answer, so it'll take them hours to answer again", my argument is much less strong than if I say "I've rung BT fifty-odd times in the past and it's taken them hours to answer, therefore it'll take them hours to answer next time." See what I mean? And again here, if I say that 60 per cent of ten, in other words six voters out of a million, said that they'd be voting for Mr Manypromises, therefore Mr Manypromises will win the election, that's a weaker argument than if I say 60 per cent of a thousand voters say that they'll vote for Mr Manypromises, therefore 60 per cent of people will vote for him in the election and he'll win. See what I mean?

We haven't actually looked at representativeness yet, but we're about to. I know you're all dying to get on to representativeness, so let's do so. Here we go. OK, the second thing that we ask is: how representative is the sample?
What I'm giving you, you might say, is another algorithm, another list of steps. Try and keep them separate in your mind, and tick off each one. OK: you've asked yourself how many there are in the sample and how many there are in the population, and made a judgement about whether there are enough in the sample to be able to extrapolate. The second question you ask is whether the sample is representative. See what I mean? Descartes, a very famous philosopher, a brilliant philosopher, had a list of rules for thinking, and one of the things he said was that you should take any problem you have, break it up into its parts, deal with each part separately, and then make sure that, having looked at each of the parts, you can put them together into a solution to the whole. And what I'm suggesting is that you ask each of these questions separately, so that you make sure you ask all of them. It just makes your thoughts clearer. Again, it's as with "first you identify the argument and analyse it, then you evaluate it": you don't try and do both at once.

OK. So here, again, you got all of these. Were the voters sampled all female? Well, there are a lot of medical experiments, or medical surveys, that look only at men and then extrapolate the results to women. I don't know if you've seen that recently they've decided that for women the symptoms of a heart attack are quite different from the heart attack symptoms of a man, and therefore all the extrapolation that they've done in the past from male experience of heart attacks to female experience of heart attacks has been faulty. There was quite a big thing about that a couple of weeks ago. Are they all over 40? Are they all white? Are they all middle class? Are they all known to the person conducting the survey? The famous example that you were mentioning a minute ago, and in fact that you've just mentioned as well: an election between who? Roosevelt and Landon, that's right. They thought that 60 per cent of the population was going to vote that way; that's what their sample said. But how did they find the sample? They looked in the telephone book. How many people had telephones then? Actually, very few. So although 60 per cent of the sample said that they would vote for Roosevelt, actually the sample was horrendously unrepresentative, because it was middle-class people with a fair amount of money who had telephones, and therefore it didn't represent the population as a whole. OK.
And the same thing here again: have I only rung BT on a Sunday, after 10 pm, when I'm in a hurry, et cetera, et cetera? Okay. So, firstly, is the premise true? Secondly, what was it? How large is the sample as a percentage of the population? Thirdly, how representative is the sample? Three questions to ask.

Here's another one. This is one you haven't thought of, and it's perfectly reasonable that you shouldn't have. If you were asked, "Here are two hands of cards; which one is most likely to come up?", who thinks this one is most likely to come up? No. OK, who thinks this one is most likely to come up? No. You're all very clever, aren't you? You're absolutely right: they're actually equally likely to come up, because of course the cards are dealt at random. But actually, if you ask the students at the university where this experiment was done which hand is more likely to come up, they come out overwhelmingly against this one and for this one: this one is much more likely to come up than this one. Now, you can see why they think this, can't you? Can you? Well, yes: this is the one they'd love to have come up, and this is the sort of one that they think comes up all the time. But of course it doesn't quite work like that, because they're using an informal heuristic to say "in my experience, this never comes up and this always comes up", and actually you just can't use that here, can you? Because what comes up is something like that, but certainly not that. A heuristic just means a way of making a decision, a rule of thumb if you like. Thank you for asking; I should have explained it before. OK. So if an inductive generalisation is based on an informal claim like this, "in my experience hands like this never come up, therefore this one is much less likely than that one", then you should be very wary of the generalisation.

And here's another one, and I expect you're all going to be clever enough to get this too. OK: in four pages of a novel, how many words would you expect to find ending in 'ng'? And in four pages of a novel, how many words would you expect to find that include the letter 'n'? Would you expect the first number to be larger than the second, or vice versa? So, would you expect more 'ng' words, or more 'n' words? Put up your hand if you think there are more 'n' words. OK. Put up your hand if you think more 'ng' words. OK, that's interesting.
This time some of you have fallen for the trick, because of course every 'ng' word is going to be an 'n' word... that's right, they always are. Oh, okay, I'm sorry, you're absolutely right: there are going to be many more 'n' words than there are 'ng' words, because there are going to be at least as many 'n' words as there are 'ng' words. Yes, okay, sorry, you did get that. What happens, again, when you ask the psychology students at the university where this experiment was done is that they expect many more 'ng' words, because they can think of many more 'ng' words than they can of 'n' words, and therefore they inductively generalise again: "I can think of many more of those, therefore there probably are more of those." Again, a bad argument. If I ask you how many footballers from a particular team score... well, that's a very bad example. But anything you think you know a bit about, you're probably tempted to rely on your own experience to make an inductive generalisation. That can work if you really do know what you're talking about, but it doesn't work if you're using that way of doing it in another context where actually your knowledge is not so secure.

Okay. So, four things there, I think it was, that you're looking for when you are evaluating any inductive generalisation. Firstly, is the premise true? Secondly, the sample size: is it large enough compared to the population as a whole? Thirdly, is the sample representative, or is there a bias in it due to whatever; there are all sorts of reasons for different biases. And finally, is it based on an informal heuristic, an informal rule of thumb, that just won't stand up to proper scrutiny? OK. And as I said, all inductive arguments are based on inductive generalisations, and so that little way of testing things can be used for all of them.
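Both classroom examples above, the card hands and the 'ng' words, can be checked directly. Here is a minimal sketch, not part of the lecture, using an illustrative sentence of my own:

```python
from math import comb

# Any *specific* 13-card hand, whether it looks random or is all one suit,
# has the same probability: one over the number of possible hands.
total_hands = comb(52, 13)
print(f"possible 13-card hands: {total_hands}")
print(f"probability of any particular hand: {1 / total_hands:.2e}")

# Every word ending in 'ng' also contains an 'n', so the 'ng' words are a
# subset of the 'n' words: the second count can never be smaller than the first.
text = "the king was singing a long song while running in the morning rain"
words = text.split()
ending_ng = [w for w in words if w.endswith("ng")]
containing_n = [w for w in words if "n" in w]
print(f"{len(ending_ng)} words end in 'ng'; {len(containing_n)} words contain 'n'")
```

The point is not the particular numbers but the structure: the intuitive judgements ("hands like this never come up", "I can think of lots of 'ng' words") are informal heuristics, and a simple count can contradict them.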
Let's look at causal generalisations. Okay, a causal generalisation is a type of inductive generalisation. The premise identifies a correlation between two types of events, and the conclusion states that events of the first type cause events of the second type. So the idea is that if you see A and B, A and B, A and B, so that As and Bs are always correlated, you extrapolate to the claim that As and Bs will always be correlated, and you imply that the reason for this is that there is a causal relation between them. "Where there's correlation, there's causation": that's the sort of thing a causal generalisation is.

So let's have a look at a couple. Okay: married men live longer than single men, therefore being married causes you to live longer. And, I apologise for this one: when air is allowed into a wound, maggots form; therefore maggots in wounds are caused by air being allowed into the wound. OK, I'll tell you what, let's do it openly. What do we need to know to decide whether these arguments are good arguments? Okay, let's have a look. Again, we ask: is the premise true? Who says married men live longer? A married man? A woman who wants to get married? Fred, whose parents split up when he was five? I mean, who's saying this? Where are we actually getting this information from? Who says maggots form when air gets into the wound? Just as you said at the back there: was it a newly qualified nurse who's observed this once? Was it an elderly doctor who's seen it a lot, but only in his own experience and in his own study, perhaps? Or was it a scientific study, one that you would expect to have looked at it more carefully?

Causation is actually... I mean, to give you a little bit of background on this: David Hume, the person I've mentioned already in connection with the principle of the uniformity of nature, believed that we cannot actually determine causation. If we find that A causes B, and we try and find out why A causes B, what this causal relation is, what it is that relates the two things, the cause and the effect, we'll just find another correlation, C and D. OK, so why do we think C and D are correlated? We look further down and we see yet another correlation. So all we ever see is correlation; we never actually see the causal relation itself. We can never get to the causal relation itself. And arguably, and this is a very popular reading of Hume, although lots of people deny it these days, he actually thought causation didn't exist at all: that our beliefs about causation are just a habit of mind. We see A correlated with B and we start to say that A causes B, and all we mean by that is that A is correlated with B; there's just a constant conjunction between A and B; there's nothing that makes A cause B. I have to say that there is another account in Hume, where he says that A causes B when, had it not been the case that A, it would not have been the case that B. And that suggests that there is a power of some kind,
doesn't it, something that makes A cause B? But we don't ever see that power, do we? We don't. We just see the cause and the effect and the correlation between them. And so causation is a really interesting philosophical issue. The question of what causation is, is endlessly interesting, I think, but it remains the case that our evidence for causation is always a correlation, and a correlation simply isn't sufficient as evidence for causation, is it? Because it could be evidence for identity, for example. So at night the evening star goes down, the morning star rises, and so on and so forth. Do they cause each other to do it? No, actually they're the same thing; that's why they're correlated, that's why the pattern is uniform. Do husbands cause wives? But they're correlated. Well, what we're saying is that correlation isn't sufficient for causation, but it's the only evidence we're ever likely to have. So when you see a causal generalisation it will be based on correlations, but what we're alerting you to here is that a correlation isn't sufficient for causation; you need to ask lots of other questions.

So: is the premise true? How strong is the correlation? How many married men were observed? This is, again, exactly the same as "how many are in the sample?" from the last question. How long were they observed? Were unmarried men observed? How many cases of maggots forming were observed? John Stuart Mill, the famous English philosopher, came up with what he called the method of agreement and the method of difference for scientific experiments. If you're trying to work out what causes what, you need to see, firstly, that they do correlate: that the cause correlates with the effect. The next thing you need to do is to try and bring about the cause without the effect, because if you're saying that A causes B because As are always correlated with Bs, then what you do is try and bring about an A without a B. If you can do that, you've disproved your claim about causation. See what I mean? And that shows us that we tend to think that a cause is sufficient for its effect: that if A causes B, the occurrence of an A must be followed by the occurrence of a B, because A is sufficient for B. So that's the method of agreement and the method of difference, which tells you whether something is a cause or not.
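Here is a minimal sketch, not from the lecture, of the kind of check the method of difference suggests; the trial data below are entirely hypothetical, just to illustrate the logic of looking for the cause occurring without the effect:

```python
# Hypothetical observations of the putative cause and the effect (made-up data).
trials = [
    {"air_reached_wound": True,  "maggots_formed": True},
    {"air_reached_wound": True,  "maggots_formed": True},
    {"air_reached_wound": False, "maggots_formed": False},
    {"air_reached_wound": True,  "maggots_formed": False},  # the cause without the effect
]

# If A really is sufficient for B, there should be no trial in which A occurred but B did not.
counterexamples = [t for t in trials if t["air_reached_wound"] and not t["maggots_formed"]]

if counterexamples:
    print(f"{len(counterexamples)} trial(s) show the cause without the effect: sufficiency is disproved.")
else:
    print("No counterexample found; the causal claim survives, so far.")
```

The questions in the lecture about how many cases were observed, for how long, and whether the uncovered wounds and unmarried men were looked at too, are all ways of asking whether enough such trials were examined.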
Also, you want to ask: does the causal relation make sense, or could it be accidental? Let's say we discovered that in the whole history of the universe, every time a match has been struck a pineapple has fallen. OK, we have a correlation, and we've done our very best to try and strike matches without a pineapple falling, and we keep on doing it, and we can't break the correlation in any way. Do we think that match-striking causes pineapples to fall? Well, some people are quite inductively bold here; they think, yes, if you've got a correlation as strong as that, it must be causal. Apparently there's also a correlation between the length of skirts and the Dow Jones index: as one goes up the other goes up, and as one goes down the other goes down. It might be the other way round, but anyway, there's a correlation there. Do we think that the length of skirts causes the rise and fall of the Dow Jones index, or vice versa? That's more likely; you can sort of see something that makes sense in that, can't you? Because maybe when the Dow Jones index is really high, people are really excited and pleased, and therefore they take risks, so they put on their miniskirts. Okay.

So, the claim that being married makes you live longer if you're a man: why would being married cause men to live longer? I think this is where your question about whether we're including civil partnerships is quite interesting. Why would being married cause men to live longer? Possibly it causes women to die earlier; just a warning to women, then. Okay: they're happier, the stress reduces. Another explanation? Well, that's one. It might also be because women tend to look after diets and things like that: men are cooked for more often than women are, perhaps, and when women do the cooking they're concerned about nutrition and so on, so when a married man eats he tends to eat more healthily. I mean, we can think of reasons why that would be the case, can't we? So it's not a complete mystery.

What about this one? Why would air getting into a wound cause maggots to form? So, the experiment we've got here: some nurse has seen that when a wound was covered up, by accident or something like that, maggots didn't form, and she thinks, well, could it be so? She covers up a few and she leaves a few open, and she sees that the ones she's covered up don't get maggots, whereas the ones left open do get maggots. So she's formed a hypothesis: could it be that air getting into the wound causes maggots?
But why would that be the case? Perhaps because there's something carried in the air that causes maggots to form. And actually, we know now that that is the case. So, okay: does the causal relation make sense? Incidentally, if it doesn't make sense, does that mean it's not causal? No, it doesn't, does it, because you can imagine that there may be something that is a complete mystery to us for a while, and I wish I could think of an example, but which turns out to be true and turns out to have an explanation. But even so, if the two things seem to be just totally disparate, that would be a mark against the argument being a good one.

And we also might ask, and we've done this a bit: which way round is the causation? Could it be that being long-lived causes marriage? So it might be that having genes for longevity causes men to get married. You said socio-economic factors, but I'm suggesting it could be genetic factors: there's one set of genes such that if a man has them, he's both more likely to get married and more likely to live longer. So there's one common cause for the two things, rather than one thing causing the other. And for the other one, could maggots forming cause air to get into the wound? No, I couldn't think of anything there. Okay.

Right, so that's looking at causal generalisations, and you'll see that many of the questions you would ask about causal generalisations are also questions you've already asked about inductive generalisations. That's not surprising, because causal generalisations are a type of inductive generalisation, and the ones you ask separately are the ones that say: why should we think that this correlation has a causal relation under it?

So, just moving on quickly to analogy here, another type of inductive argument. An argument from analogy takes just one example of something and then extrapolates from a characteristic of that example to the character of something similar to that thing. And there's a famous argument from analogy: the universe is like a pocket watch; pocket watches have designers; therefore the universe must have a designer. I think we're probably all familiar with that argument. Okay, how would we go about questioning this argument? In what way is the universe like a pocket watch? Yes, OK: what aspects are we picking out here and saying are similar in the two cases? So why is the universe like a pocket watch? I mean, using this famous example, what did the person who gave it believe? Was it...? That's right. It was... oh, who was it?
It's gone completely... Paley, that's right, thank you. I'm sorry, I've got a head full of cotton wool; it's very strange. Yes, Paley believed that pocket watches move regularly, are very complex, and must have been very difficult to put together, and he believed that the universe is also very regular and very complex and must have been difficult to put together; therefore, if one has a designer, the other has a designer. What else might you ask? OK: there is a similarity, we might say, between pocket watches and the universe, but there are many, many dissimilarities. Why should we consider that this similarity is more important than all these differences? Yes, okay. But wouldn't you say that if the universe is like a pocket watch in this particular respect, and the explanation of pocket watches having a designer is that particular respect, in other words their being very complex, so if we agree that everything that's complex and regular must have a designer... OK, but we're only saying the universe is like a pocket watch in being very complex and regular, and pocket watches have a designer. Oh, I see. Okay, so you're absolutely right; I'm sorry, no, you are right: I was changing that second premise to "everything that's complicated has a designer", and that's not what it says, is it? So I've rightly been pulled up on that. Okay, it isn't what it says.

I suppose that's why we think this is going to work at all, though, isn't it? In order to give an argument, we do have to say a lot of things in support of the various premises, and in support of our belief that the conclusion follows from the premises, and so on. So you wouldn't expect almost anything said to be an argument. And actually, as you learn... yes, no, I'm not surprised. I suppose what I'm doing is being a bit unfair to newspapers, because actually you need to read a whole article in order to see what claim is being made, and then you need to go back and identify what reasons are being given for the claim.

OK. Are the two things similar, and is the respect in which they're similar relevant to the argument being made? And also, can we find a disanalogy, which is the thing you mentioned: are there differences between them, and do the differences pertain to this argument? But the thing to remember about arguments from analogy is that they are extrapolating from just one example.
Therefore the one example, and the extrapolation, have to be really pretty strong before you should go along with them. Arguments from analogy are actually much more common than you might think, probably for the reason you're saying: because they often take us along with them emotionally.

Let's finally look at arguments from authority, which take one person, or a group of persons, who are or are assumed to be right about some things, and extrapolate to the claim that they're right about other things. So: human rights monitoring organisations are experts on whether human rights have been violated; they say that some prisoners are mistreated in Mexico; therefore some prisoners are mistreated in Mexico. What do we need to ask about this? Where do they get their information from? Is it just that they've become jaded and cynical and they think that everyone mistreats everyone, or do they actually have reasons for saying what they say? Yep. I mean, all that's needed for this argument is that some prisoners are mistreated, not that they're mistreated by anyone in particular, I think. Okay. You might say, here's the first premise: they may be experts on whether human rights have been violated, but are they experts on whether somebody has been mistreated? Or are they perhaps seeing trivial forms of mistreatment as violations of human rights, or something? Is that what you mean? Yep, okay.

Okay, well, let's have a look. Who exactly is the source of the information? It was implying, at least, that all the human rights organisations were saying it, but it might be just one. And again, there you'd want to make a judgement about whether the source of information really is an expert, whether they're qualified in the appropriate area, because it's very easy, going back to how inductively bold you are, if you have a tendency to think this person is an expert in one area, to inductively generalise to his or her being an expert in another area. So your tutor, for example, whom you think is... you know, "if Marianne says P, then P", which is of course a very good argument. But if what she's talking about is politics or mathematics or something like that, that's complete nonsense, isn't it? Okay. So not only do you need to know who they are, you need to know whether they're qualified in the right area, and you need to know whether they're impartial with respect to this particular claim. So Amnesty International, let's say, are impartial.
They go out and they get the evidence and they're very careful not to be biased. I don't know whether that's true, certainly, but let's say it could be. But then there might be another human rights organisation that's not careful to make sure that its information isn't biased, so you'd need to make a distinction between the fact that Amnesty is a reputable organisation and this other one isn't. Well, you get that quite often. I mean, if you want to belittle the results that come out of a particular survey, one way of doing it is to say that the people who are putting forward the survey are biased. I've been doing some work on GM food, and actually it's very, very difficult to get a source that hasn't been funded by a pharmaceutical company, or by a company... or that isn't the Soil Association or somebody that's very anti GM foods. So finding something that really is an impartial source is really very difficult, and it's very, very important to try and find one if you're really going to evaluate these arguments.

Finally, the point you made a minute ago: it's very rarely the case that you have just one expert in an area, and it's very rarely the case that all the experts in an area will agree on something. And if you have different experts making different claims, you need to make a judgement as to which of them you think is correct, and what you can't rely on there is an argument from authority, can you? Because they're both authorities. So if you were an undergraduate writing an essay, or indeed if you were writing an essay on philosophy for me, I would have given you lots of reading, you would have done the reading, and I would have expected you to come away and to think: okay, so-and-so says this and thingummybob says that; he says P and he says not-P. Well, which of them is the case? Now you need to look at the arguments that so-and-so gives and the arguments that thingummybob gives, and work out which ones you think are the best ones, and why. OK, so there's no substitute for thinking for yourself. An appeal to authority is okay for various things; I mean, we have to rely on authorities for all sorts of things. But if you were trying to write a philosophy essay, "Marianne says P, therefore P" will not do. And that's true of every philosopher you ever come across, because there are very, very few things in philosophy that aren't questioned. Okay, that's as far as I was going today.
Next week, we'll look at validity and truth and then we'll turn to the evaluation of deductive arguments.