Well, I'd like to thank them for the introduction, and for inviting me to this wonderful lecture theatre. It's brilliant. I want to talk about mathematical models of visual illusions.

Now, if you've been watching the young lady spinning, you may have noticed that there is a visual illusion there. It's called the spinning dancer illusion. If you keep watching her, after a while she seems to change direction. She's going around one way, and it might take ten, 15, 20 seconds, and suddenly, hang on: she's going the other way. It's not a different movie. It's exactly the same loop being played over and over and over again. But our visual system interprets it in two different ways. And because it's a silhouette, we don't have all the clues as to which leg is in front of which, and things like that. So as you keep watching, for most people, the direction will rather randomly flip, probably every five, ten, 15 seconds or so.

So what I want to talk about is some work that's been going on over the last ten years or so. Some of it is by my research collaborator Marty Golubitsky in Columbus, Ohio; some of it he and I are working on at the moment. So this is up-to-date mathematics, and I think it's a very interesting subject. If you want to know how to spell Golubitsky, there it is. When he books restaurants he uses a different name, because nobody knows how to spell that one.

Okay, so there are some very famous illusions. The spinning dancer: I've frozen her for the moment. There's the duck-rabbit illusion. Is this a duck? Is it a rabbit? If you think that the stuff sticking out to one side there is a beak, and the head is facing in that direction, it's a not very good duck. And if, on the other hand, you think that the thing sticking out at the side is a pair of ears, it's an equally not very good rabbit. But it's an image that can be interpreted in two different ways.

Probably the most famous, and almost the simplest, of these illusions is this. It's called the Necker cube, after Necker, who was the first person to publish it. Certainly in our culture, you look at this and it's clearly a wireframe cube. But if you look at it and ask which of the squares that you're seeing is at the front, it flips. Sometimes it seems to be pointing out of the screen towards you; sometimes it seems to be pointing the other way. And again, this is because there are two different objects which give exactly the same projection.
So I want to draw a distinction between three kinds of visual phenomena: one is called illusions, one is impossible figures, and the third one is called rivalry.

So, illusions. Both eyes see the same information, basically, although from slightly different angles. But that information is ambiguous: there's more than one possible interpretation, and the brain can't decide between them. The brain is very busy. It doesn't just settle on something, as in "oh, that's what it is, fine, done." It keeps revisiting the question: it might not be that, it might be something else. So you get fairly random changes, but with some sort of average periodicity to them.

So Necker published a paper (I've put the reference up there if you want it) in 1832, about the cube. And if I shade one of the squares in, each of those versions is unambiguous as far as the visual system is concerned. It is also possible to say the whole thing is just a flat diagram on a flat sheet, but we tend to see this three-dimensionality.

So that's one example. Another one, slightly sexist, but nonetheless: 1915, you can see why. It's his wife, but it's also his mother-in-law. Now, sometimes it's hard to see the wife. Am I going to get her? Yes, you can just about see the red there. Okay, that's her nose sticking out. She's looking away, like this. Whereas for the mother-in-law, this is not the chin; for the mother-in-law, this is the nose. The mouth is here. That's her eye, and she's looking this way. She has a rather large, ugly nose. So stare at it. It could be his wife; it could be his mother-in-law. He doesn't say which one is which, of course.
And the duck-rabbit: that goes back to 1899. People have been interested in these things for a long time, and in fact quite a lot longer than anything we've seen so far.

What about this one? Is it a cat? Is it a mouse? No, it's both. Is that a man playing a saxophone, or is it a woman's face, very heavily contrasted? It depends on which bits of the image you pick out and how you interpret them.

This one sometimes works on a screen; it doesn't always. If you stare at it, the circles seem to rotate slightly. If it's not working for you, and it doesn't always when it's up like that, look it up on the internet. It's not hard to find; Google Images will do it for you, and at a smaller scale. Something about how the brain is interpreting the image is giving the illusion of movement. So that's a rather different kind.

And then there's this lovely one, the Fraser spiral. We have this beautiful series of spirals all converging to the centre, except they're not spirals, they're circles; each one of them is a circle. But because of the way the circles are shaded, the eye is sort of drawn in towards the centre and gets this impression of a spiral. It's so nice, I'll do it again. I mean, I know it's not a spiral. I've sat there and traced my finger around it, and it comes back to where it started, and I still don't believe it. Even when I put up the pink circle, I don't believe it. And you can put up as many circles as you like; if you get enough circles, it kind of hides all the spiral bits, so then you start to believe it.

And what about this one, the moving plaid illusion? What are we seeing here? Well, what I'm seeing at the moment is a series of white diamonds moving vertically. What I'm seeing now, and some of you probably were seeing it and have now changed your mind, is a series of stripes sliding through each other, like this: one lot going this way, one lot going that way. And as you watch, suddenly it turns into diamonds going up again, and then it goes back to these moving stripes.

So all of this is telling us something about the visual system, but it's not at all clear what. But there's quite a good scientific principle: a useful way to get extra understanding of a complicated system is to see what happens when it kind of goes wrong. You either try to break it, or you try to find places where it's perhaps giving clues as to the internal processing that must be going on behind the perception. I mean, we just look at things: oh, that's a cat, that's a dog, whatever. But the brain is doing an awful lot before we recognise a cat or a dog.

Second category: impossible figures. This is the Penrose triangle. Each corner of this looks like a perfectly reasonable right-angled bend in three dimensions, but if you follow it round, you realise you couldn't make one. You could make an object that looks like this from one viewpoint, but if you move it very slightly you realise that it's something completely different. This is not really a possible object in ordinary three-dimensional space. Locally, it makes sense; globally, the pieces don't fit together. So this isn't an illusion. It's worse: it's something that can't exist at all. But hang on, I'm just looking at a picture of it.
So for these, both eyes get the same information again, but it's impossible: there is no consistent interpretation. What happens is your attention focuses on one corner, or another corner, or another corner, and you check, and it makes sense at each corner. And every so often you kind of stand back from it and think: yes, but the whole thing doesn't make sense.

There are lots of these; this is a sample. There's the rectangle version of the triangle. There's a thing that at one end looks like three cylinders and at the other end looks like some sort of square-shaped, U-shaped thing, and it doesn't fit together properly. The staircase in the middle at the bottom goes up and up and up, round and round, forever; or if you go the other way, it goes down, down, down, forever. And how many feet has the elephant got? I like that one.

Now, Maurits Escher, an artist well known to mathematicians, used some of these impossible objects in his artistic work, and I'll just show you a couple. The monks on the roof of the monastery there are going round and round one of these never-ending staircases: one lot of them going up, the other lot going down, and they're going past each other. So he found artistic inspiration in these things.

And the third kind of phenomenon I want to talk about, and I'll say a lot more about this one, is called rivalry. This goes back to at least 1593. I suspect Aristotle knew about it, but in 1593 Porta put two different books, one in front of each eye, and discovered he could either persuade himself that he was seeing one of them, or that he was seeing the other one, rather than seeing some sort of superposition of the two. And he mentioned this.

So I want to talk about some rivalry experiments which are slightly puzzling. Rivalry is different from illusions and impossible objects in the sense that each eye gets different information, and that information is conflicting. It's not like when we normally look at an object and both eyes see slightly different versions of the same thing and put together a three-dimensional interpretation; here, the two things we're looking at just can't possibly coexist in the same place.

So there's an experiment where the two images are vertical pink and grey stripes and horizontal green and grey stripes, one presented to the left eye, one to the right eye. And there's another one, generally known as the monkey-text experiment, which I'll explain a little bit more.
The images are like a jigsaw puzzle. There's some sort of, I think, not really a monkey, and some sort of blue text on a green background. They've been cut up like a little jigsaw puzzle, and then pieces have been swapped between them. So you've got these two mixed images, and you show them, one to each eye, and get people to tell you, in various ways, what they're seeing.

Okay. So with this one, the surprise was this. You only show them two images, and what a lot of people do is say: oh, it's the vertical stripes; no, it's the horizontal green ones; no, it's the vertical pink ones; and so on. But some of them say it's horizontal green and pink stripes, or it's vertical green and pink stripes: the two on the right. So some people see the images that were actually shown to the eyes, but alternating in some way. Some people see images that were not shown to either eye, but were kind of assembled from bits of the information: the brain put them together to create an image that the eyes never actually saw.

This is called colour misbinding. The colours are bound to the stripes somewhere in the brain, but some of the colours have been assigned to the wrong bit of the image. So there's this general impression of stripes, vertical, horizontal, green, pink, but you can put those together in several different ways. So the observed images may actually be things that were not presented at all.

And this happens with the monkey-text too; in a sense, I think it's set up to do this. A lot of people will just see the two actual images, these mixed, jumbled-up images, alternating: they'll see one, then see the other. But sometimes they will see the complete monkey and the complete text, and those tend to alternate as well, which is what's interesting. So the brain is putting the pieces together, but there are several different ways to do that.

So what we want to do is try to get some insight into what the visual system does when this happens, and compare that insight with the observations. But in science, what you'd also like to do is predict new phenomena. You want to predict things that have not yet been seen, compare them with observations, and hope that they will, in fact, be seen. And if not, you go back to the drawing board and start again.

So it's this sort of seven-step process. Model what's going on; I'm going to do this using a very simple little network, which is not really meant to be an exact representation of things that exist in the brain. It's more a kind of schematic representation of what
collections of nerve cells in the brain might be doing. Then use some mathematical techniques to analyse those networks and see, in general terms, what you would expect them to do. Then plug in some rather more specific equations so that the models match the neuroscience better. Ask about the various things that are important to people in dynamical systems, and indeed in the real world, such as: is it stable? You can have mathematical solutions to things which won't actually be seen, because they are too easily destroyed. A pencil, in theory, will balance on its point; you try it. It balances on its side very nicely: it's stable on its side, it's unstable on its point. But there is a mathematical balance point there. Then compare with observations, invent new experiments, make predictions, and get someone to do the experiments. I'm a mathematician, and so is my friend Marty; we can't do the experiments. We don't have access to the equipment, and we'd need to sign all the forms so that we could use graduate students or somebody as experimental subjects. But fortunately there are other people who do that kind of thing. And then you go round and round this process.

Okay. The starting point for the mathematics is what we call a Wilson network. Hugh Wilson, who works in mathematical neuroscience, was interested in modelling how the brain makes decisions, and he wanted some sort of schematic, high-level model, not down at the level of individual nerve cells, of how the brain, given some alternatives, decides what to do. If there's an election and you want to vote, you've got a choice of candidates and you have to decide who to vote for. You feed in all sorts of information: these are their policies; yes, well, they say those are their policies, but I don't believe this person anyway; or whatever. You put all this together, and eventually you go and mark a cross on your ballot paper. Somewhere in the brain, all of this stuff is being put together and a decision is popping out.

And Wilson said: well, let's take a very simple example. I look at a dot, and it might be red, it might be green, it might be blue, and somewhere or other there is a circuit that recognises red, green and blue. So as well as these dots representing what you're looking at, they kind of stand for bunches of nerve cells, some of which have been trained to recognise red, some to recognise green, some to recognise blue.
A very important feature of such a network should be that it comes to one decision. These black lines with dots on them represent what are called inhibitory connections, in the neuroscience jargon. You've got bunches of nerve cells; when nerve cells are active, they fire, they send electrical signals, and they may be not very active or they may be much more active; perhaps they send more signals per second. An inhibitory connection means that if this one is active, and it sends a signal along an inhibitory connection to this one, it shuts this one down. Basically it's telling the next cell along: stop.

These connections go both ways; each of these things is telling the other one to stop. But the dynamics is set up so that if I input something that kind of looks red, that increases the activity of the red one so much that it shuts the other two down. And when they're shut down, they can't affect the red one, so we decide on red. If something comes in that makes the green one excited, it shuts the other two down, and so on. So it's a kind of winner-takes-all structure, and it comes to one decision.

Okay, now that's the easy bit. By convention, I'm going to draw these things as vertical columns, so called. Each column represents an attribute of the image, some feature of the image that we're going to focus on, and the different possibilities we'll call levels, just because that's what they look like; you've got to have some word for this. So: winner-takes-all links between levels in one attribute column.

And then you do the same kind of thing for different attributes. I've drawn them all looking the same, just to emphasise the structure. But, for example, in the colour misbinding experiment there are two kinds of attribute: are the stripes vertical or horizontal, and is the colour pink or green? So you'd have one column with pink and green, and one column with vertical and horizontal. Each column is trained to recognise which of those levels is occurring.

And then you show each eye an image, and in each column you pick out the level corresponding to that image, and you link those cells by excitatory connections. Those do the opposite: if this cell starts up and it excites this cell, then both of them are going. So as soon as one of these purple cells lights up, it makes all the others active as well. And you bind them all together with a different kind of coupling.
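Going back to the single column for a moment: here is a minimal sketch in code of that winner-takes-all dynamics. This is not the model from the lecture, just an illustrative rate equation with a sigmoid gain and all-to-all inhibition, and every parameter value here is an assumption chosen purely to make the effect visible.

```python
import numpy as np

def gain(u, steepness=8.0, threshold=0.5):
    # Sigmoid gain function: firing rate goes from 0 (off) to 1 (on).
    return 1.0 / (1.0 + np.exp(-steepness * (u - threshold)))

def winner_takes_all(inputs, w_inhib=2.0, dt=0.01, steps=5000):
    """One attribute column: three cells (say red, green, blue) that all
    inhibit one another. The most strongly driven cell shuts the others down."""
    x = np.zeros(3)
    for _ in range(steps):
        # Each cell feels its own input minus inhibition from the other two.
        drive = inputs - w_inhib * (x.sum() - x)
        x += dt * (-x + gain(drive))
    return x

# A stimulus that "kind of looks red": the red cell wins decisively.
print(winner_takes_all(np.array([0.9, 0.6, 0.5])))  # roughly [1, 0, 0]
```

Change the input to favour green or blue and the same structure decides for that colour instead, which is the one-decision property the column needs.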
Neuroscience is very strongly focussed on this use of inhibition and excitation between cells; it sets up kind of computer-like switching circuits and all sorts of things. When you train your Wilson network to recognise a particular type of image, what you're telling it is, for the component attributes, what the levels should be, and then you put that in as excitatory connections between them.

Okay. Now then, the mathematical question is: having set up a network (intuitively you've kind of got the idea, it can choose levels, and it's been trained on certain combinations), what does it really do? And there are two philosophies on how to answer this, which are called model-independent and model-dependent. Model-independent says the structure of the thing means certain things are going to happen pretty much independently of the precise mathematical equations that you write down for the dynamics. There are lots of things in dynamics that don't actually depend much on the equations. My arm swings to and fro like a pendulum; that works with a real pendulum, it works with my arm, it works with the other arm. The equations are actually slightly different in each case, even for the two arms they're not quite the same, but nonetheless, swinging to and fro in a periodic fashion is a very common sort of model-independent behaviour. Model-dependent says: I'm going to write down detailed models, analyse the heck out of them, and tell you what they do.

So let's go for the simplest possible set-up. This is going to be my model of the brain looking at the rabbit-duck or the Necker cube, particularly the Necker cube; it also covers some simple examples of rivalry. Even this little two-cell network, one column, two levels: you can't get much simpler than that, because with one cell, one level, there's nothing to choose. With the Necker cube, we could say one of these cells represents perceiving the cube with the blue face poking out in front, and the other represents the alternative version of the cube, where we perceive it with the other square face poking out in front. The eye isn't shown the blue faces; it's just shown the Necker cube. But there are two interpretations.

So what would this sort of network do? Well, it's real mathematics; there are some formulas. Here is a model-dependent analysis of this thing.
You write down equations; the ones that people like in this context are equations for the rate at which a neurone is firing. It sends out a series of pulses, and if it sends out pulses very fast, then it's firing at a higher rate, and that actually means it's got very excited: it thinks it's seen something.

And what do these equations do? There's a thing called the gain function; it looks like this. It goes from zero up to one: zero means off, one means on, and it transitions between the two. So what the equations say is basically that one of these cells receives input signals from the other cell, and the level at which that input signal occurs affects the behaviour of the red cell. So the red cell gets an input from the green cell and responds to that, and the green cell gets mathematically the same formula with some of the ones and twos interchanged. Don't worry about it, but if you want to know, there is a published paper on this; this is the kind of stuff you write down. So there are two variables for each cell, and there's a very standard form for the equations. When you get used to it, you can write these things down. I'm going to show you some much more complicated networks a little bit later. And I'll just remind you again: the gain function is this sigmoid thing, as they call it.

If you shove this on the computer and solve the equations, this is the kind of picture you get. This one was done by John Rinzel, and the blue curve would actually be my green cell, because he used different colours from me. What you see is: the red cell fires at a high rate while the blue is at a much lower level of activity; then suddenly the blue has a high level of activity and the red is low; then the red is high and the blue is low again; and so on. The way you interpret these signals is that the cell firing at the highest rate wins. So when the red is at the highest rate, you think that the Necker cube looks like the top picture; when the other one is at the highest rate, you think it looks like the bottom picture. So what the mathematics predicts is that it will move from the cube pointing one way to the cube pointing the other way, and back again, and back again, flipping between them.

In this model it's a perfectly regular flip, which is not quite what's observed. What people usually do is then add some random terms to the equations, to make the flips not completely regular.
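As a sketch of what such a two-cell computation might look like, assuming the standard two-variables-per-cell form (an activity plus a slow "fatigue" variable for each cell), with illustrative constants rather than the values from the published paper, something like this already produces the alternation:

```python
import numpy as np

def gain(u):
    # The sigmoid gain function: 0 means off, 1 means on.
    return 1.0 / (1.0 + np.exp(-10.0 * (u - 0.2)))

# Two variables per cell: activity x[i] and a slow fatigue a[i].
# Each cell inhibits its rival (weight w); fatigue (weight g, timescale tau)
# eventually lets the suppressed cell take over, so the winner alternates.
I, w, g, tau, dt = 0.6, 1.5, 0.5, 50.0, 0.1
x, a = np.array([0.1, 0.0]), np.zeros(2)

winners = []
for step in range(20000):
    drive = I - w * x[::-1] - g * a   # input minus rival's inhibition and own fatigue
    x += dt * (-x + gain(drive))
    a += dt * (x - a) / tau
    if step % 500 == 0:
        winners.append(np.argmax(x))  # which cell is firing at the higher rate

print(winners)  # blocks of 0s and 1s: the perceived cube flips regularly
```

Each cell here carries exactly the two variables mentioned above, its activity and its fatigue; mutual inhibition plus slow fatigue is what drives the alternation.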
But I'm going to be happy with regular flips, bearing in mind that we're not getting too worried about the details; let's get the big, broad picture.

Now, these things switch on by what mathematicians call a Hopf bifurcation, which is where a stable state, as you turn up some input to it, suddenly starts to wobble. Here's a little animation of one that occurs and then disappears again: the blue thing is the steady state, which expands to something periodic and then shuts down again.

And you can ask what sort of Hopf bifurcations occur in this two-cell model. If we do a model-independent analysis, the important thing is not really the detailed dynamics, it's the symmetry. If I flip the two cells, all of the equations flip as well, and the equations for the two bits look the same; it's just that the variables are swapped. And there are two kinds of Hopf bifurcation that occur in this kind of system, and I can demonstrate. Here are the two cells: either they both do the same thing at the same time, or they alternate, like this. This is a child hopping; this is somebody walking.

What you don't see, except in very special circumstances where you have to fiddle the equations to make it work, would be this one doing something and then this one doing the same thing but just a little bit behind it. I can just about manage to do that, but my arms want to go back to this much more stable alternation. So the maths tells you that this is what you should expect. And of course, the out-of-phase case is the one that John Rinzel's computer program was picking up. In fact, in these inhibitory networks, inhibition basically says they shouldn't both be firing at the same time, because they're trying to shut each other down. If one of them wins, the other one is suppressed; but then the winner can die down, and the second one starts up. So the model-independent analysis tells you to expect this half-period out-of-phase alternation in that particular network. And what we want to do is mimic that in more complicated cases.
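In symbols (this is the standard statement from equivariant bifurcation theory for two interchangeable cells, not a formula from the slides), a Hopf bifurcation with period $T$ in such a network produces exactly these two kinds of oscillation:

```latex
% Two cells with the swap symmetry (x_1, x_2) -> (x_2, x_1), period T:
\begin{align*}
  \text{in phase (hopping):} \quad & x_1(t) = x_2(t),\\
  \text{out of phase (walking):} \quad & x_1(t) = x_2\!\left(t + \tfrac{T}{2}\right),
\end{align*}
% and mutual inhibition selects the out-of-phase branch.
```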
What really kicked this whole thing off, from our point of view (people like John Rinzel have been working on this for a lot longer), was this. My friend Marty was, until recently, director of a thing called the Mathematical Biosciences Institute in Ohio. And as director, he got the job that Alan had here, which was introducing the speaker and sometimes sitting and listening to the lecture. And somebody was giving a lecture on rivalry and mentioned the monkey-text experiment. And Marty thought: I wonder what the Wilson network looks like.

And the answer, the simplest way to set it up, is this. There's the picture of the jigsaw puzzle. Each of those pictures is essentially divided into six pieces; you take the white pieces and the blue pieces, and then you swap the corresponding sets of three between the two images. So if you take the left-hand picture, it's got text in the blue region and monkey in the white region, whereas the other picture has text in the white region and monkey in the blue region.

So your attributes, your columns in the network, are: what's going on in the white region, and what's going on in the blue region? And in each region it can be either monkey or text; those are the two levels. So if you ignore the cross in the middle, you've got just the two columns with these inhibitory connections. But you're training your brain by showing the two eyes pictures in which monkey in one region goes with text in the other region. So you get this excitatory connection swapping levels, and then text in one region goes with monkey in the other, so you get a crossing that goes the other way. So we get this excitation across, inhibition up and down.

Well, you can look at that and ask: what are the patterns of oscillation? What are the analogues of in phase and out of phase? The answer is there are four of them. Everything can be in phase. Or the stuff on the left side can be out of phase with the stuff on the right: the two things on the left oscillate, the two things on the right oscillate, but they're out of phase with each other. Or you can do that diagonally, or you can do it top-bottom.

Okay. These are the four possible Hopf bifurcations in that network, but the inhibitory connections rule two of them out. Inhibition means you shouldn't have the same colour top and bottom: so that one was no good, and that one was no good. The other two pair the same colours left and right, or diagonally, but never within a column, so they survive.

What are those two patterns of oscillation? One of them is the original pair of images, the learned patterns: we're getting rivalry between those two images. In the other one, you see the whole monkey and the whole text. And if you go back and look at the diagram, you can actually check: that's really what you would be seeing.
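Written out explicitly (my own labelling, not the lecture's: $u_{WM}$ is the cell for "monkey in the white region", $u_{BX}$ for "text in the blue region", and so on, with $T$ the period), the four candidate patterns are:

```latex
% Four cells u_{WM}, u_{WX}, u_{BM}, u_{BX}; period T.
\begin{align*}
  &\text{(1) all in phase:} && u_{WM}=u_{WX}=u_{BM}=u_{BX} && \text{ruled out,}\\
  &\text{(2) left vs right:} && u_{WM}=u_{WX},\; u_{BM}=u_{BX}=u_{WM}(t+\tfrac{T}{2}) && \text{ruled out,}\\
  &\text{(3) diagonal:} && u_{WM}=u_{BX},\; u_{WX}=u_{BM}=u_{WM}(t+\tfrac{T}{2}) && \text{the two jumbled images alternate,}\\
  &\text{(4) rows:} && u_{WM}=u_{BM},\; u_{WX}=u_{BX}=u_{WM}(t+\tfrac{T}{2}) && \text{whole monkey and whole text alternate.}
\end{align*}
% (1) and (2) put both levels of a column in phase, which inhibition forbids.
```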
So this is rivalry between two patterns that the eye never saw, but the brain put together. It's not greatly surprising that the brain puts it together; we're all quite good at jigsaw puzzles and things.

But the same thing is going on in the Shevell colour misbinding experiment. Instead of monkey and text, what you've got is essentially horizontal and vertical going with the colours. You can use a similar network, although later I'll show you a slightly better network for it.

Okay. And here's another pair of rivalry experiments that do the same thing; these are rather beautiful. If you show the left eye concentric circles and the right eye horizontal stripes, people sometimes see something which is circles in the left half of the visual field and stripes in the other half, or the other way round, and it will alternate between the two. Or if you show them this hourglass-shaped thing and the hexagon, again the brain may put together half of one with half of the other and then flip between the two. Exactly the same network will deal with these.

So now we can make some predictions. Unfortunately, and I'll give this away before you get too excited, no one has yet done the experiment, because it's not entirely straightforward. But suppose we showed three red dots to one eye and three green dots to the other eye. Well, you could get alternation between those patterns. But you can draw up your Wilson network, and if I rearrange it a bit (it's the same network, I've just moved things around) it's got the symmetries of the hexagon. And of course the dynamical systems people know all about bifurcation with the symmetries of the hexagon. There are actually quite a lot of possibilities, but a rather nice one is what's called a discrete rotating wave; there's a formula for it below.

So A, B, C, D, E, F are a sequence of images that would be consistent with the dynamics of that network. And what's happening is essentially that the red dots are going round the triangle two at a time, except at the transitions, where momentarily there's one red dot and two green dots. So the green and red dots chase each other round and round the triangle. Now, it would be lovely to do an experiment that shows that happening, but as I say, unfortunately, no one has actually done that yet.

Okay. Now, there is actually an algorithm for constructing these things, and the network that actually makes most sense for the colour misbinding experiment is a slightly more complicated one. I won't take you through the details; I'll just show you that it's there.
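The discrete rotating wave mentioned above has a standard explicit form (again in my notation, not the lecture's: $u_k$ is the activity of the $k$-th cell going round the hexagonal network, $T$ the period):

```latex
% Six cells u_0, ..., u_5 around the hexagon; each repeats the same
% waveform one sixth of a period behind its neighbour:
u_k(t) = u_0\!\left(t + \tfrac{kT}{6}\right), \qquad k = 0, 1, \dots, 5,
% so the pattern of activity steps round the network once per period,
% which is what the red and green dots chasing each other would look like.
```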
Now, a problem with the two-cell model of the Necker cube is that you've kind of put into it what you expect to get out. You've put in "cube facing one way" and "cube facing the other way". Well, that's kind of cheating. It would be nice to have something that doesn't make that assumption, that doesn't even know for certain it's a cube. So here's what we thought.

It's reasonable that the brain should dissect this image into a series of lines; there are eight of them. I haven't worried about the vertical edges, because I'm not going to do anything with those. It's where those lines cross that the ambiguity arises.

Now, suppose I set up this network. Okay, this is getting more complicated, so let me explain where it comes from. The idea is that this is a visual illusion, not rivalry, so there isn't the idea of a learned set of patterns. Instead of that, we have the idea that the brain has already learned to recognise certain shapes and images. This seems pretty clear, actually: if I look at something and see that it's a cat, I know it's a cat because I've been told over and over again that those things are cats, and the brain has figured out that anything cat-like is a cat.

So when we see one of these pictures, we try to match it to an object that we already know about. Now, objects we know about must actually be possible in three dimensions. We're seeing a two-dimensional projection of a three-dimensional object; the question is, which object is it? So there should be some sort of geometric consistency: if I see a line pointing out that way and a line pointing out that way, the stuff on the ends of them should not be at the same place, and things like that.

So that's what allows us to draw up this particular, very complicated network. And we make only three assumptions. Vertical lines in the image represent vertical lines in reality; vertical is special in visual perception. Lines that seem to be parallel in the image should be parallel, or nearly so, in the object; that's a reasonable assumption. It's not necessarily the case, but let's assume it. And similarly, lines that are not parallel in the image should not be parallel in the real thing. That is, the eye is not being given something designed to fool it.

You can then draw up a model. What I was calling columns are now arranged horizontally, one for each line.
For each line there are two directions it can point: basically forwards or backwards, relative to the plane of the image. And if you draw up the network that corresponds to the way the cube goes, we get this quite complicated thing. The inhibitory connections go across in each of the eight pairs; everything else is excitatory or inhibitory connections which essentially encode this geometric consistency condition.

So we've got what mathematically is quite a pretty object. It's actually got quite a few symmetries: there are obvious ones, flip left and right, or flip top and bottom, but you can also take the two sets of pairs and flip them simultaneously. So you've got eight symmetries, and, don't worry about the details, there are eight possible patterns of oscillation.

Okay. Now, I'm not going to spend time explaining what this picture encodes, but you can take it as a list of the eight oscillation patterns. And again, we can get rid of quite a lot of them because of the extra structure of inhibition and excitation and other things. We can get rid of everything in the top row; we can get rid of that one, that one and that one. Those are all inconsistent with the extra structure. And we're left with just one.

What does that look like? Well, you can sit and stare at this, and if we had half an hour I could take you through it and we could figure it out. But let's look, for example, at one and five here. The shading is meant to say: if we think line one is pointing forwards, then we also think line five is pointing forwards. If line one is pointing out of the page, so is line five; and while we're at it, line four is also pointing out of the page, and so is line eight. And similarly for the pairs two, three, six and seven: they're all pointing in the same direction as each other, but they will flip between forwards and backwards.

So if I solve the equations, here's a picture of what's going on. We can see some of the colours; some of the colours are missing because the curves are hidden behind other curves. They synchronise in pairs. But let's look at green and brown. Green is "three forwards"; brown is "seven forwards". And what we're seeing is alternation between the colours, or, here, the purple and the magenta. And if you actually look at that, what's happening is that the whole image seems to flip from the cube pointing one way to the cube pointing the other way.
402 00:45:35,800 --> 00:45:41,760 But if you look very closely at where it changes. Things don't quite cross. 403 00:45:42,270 --> 00:45:45,300 You get on the left hand picture at the bottom. 404 00:45:45,720 --> 00:45:51,690 The curves are not all crossing through at the same point. So the flips occur very slightly different times. 405 00:45:53,660 --> 00:45:59,270 And what the brain is perceiving during that very short flip is actually an impossible figure. 406 00:45:59,930 --> 00:46:04,820 Bits of the cube are fitting together like that impossible rectangle. 407 00:46:06,560 --> 00:46:12,290 So the mathematical model was the time where, okay, for the moment, 408 00:46:12,930 --> 00:46:20,240 mathematical model is has this little period of confusion where the brain thinks part of the image is flipped. 409 00:46:22,080 --> 00:46:28,290 And the other part hasn't, but that doesn't make sense. And that somehow triggers the other one flipping and then the whole thing's consistent again. 410 00:46:29,460 --> 00:46:31,350 And the same thing happens in the rabbit duck. 411 00:46:32,100 --> 00:46:42,900 If I set up a network for rabbit and duck, two columns, one is it is or beak is the head facing right or left. 412 00:46:43,620 --> 00:46:47,460 Draw out my network, putting what I know about the images. 413 00:46:48,090 --> 00:47:00,870 And again I get this alternation between. Beak is right left, but the beacon is flip to slightly different time from the head. 414 00:47:01,470 --> 00:47:05,160 So I'm looking at the image and I'm seeing a duck. 415 00:47:06,390 --> 00:47:09,030 And then I think, no, those aren't beaks, those are ears. 416 00:47:09,870 --> 00:47:15,930 And momentarily the heads facing the wrong way and then the head must be facing the other way. 417 00:47:16,230 --> 00:47:20,640 Then it looks like a rabbit. So that's kind of prediction there. 418 00:47:22,910 --> 00:47:26,210 And I think I will. 419 00:47:29,050 --> 00:47:33,370 Finish at this point to allow time for questions. There is more. 420 00:47:34,180 --> 00:47:39,010 There is a cube with three possible interpretations. 421 00:47:39,640 --> 00:47:46,870 There's this just look at the spinning dancer again, and I'll finish with a little more analysis of the spinning dancer. 422 00:47:46,990 --> 00:47:56,160 Okay. The kind of. Attribute that we should focus on things like is the head facing forwards or backwards? 423 00:47:57,150 --> 00:48:00,990 Is that the left or right arm? Is the other one the left or right arm? 424 00:48:01,410 --> 00:48:04,980 Which leg is in front? Is that the left foot? Is that the right foot? 425 00:48:05,850 --> 00:48:11,100 And if I were to draw up a network for that, I'd have a whole pile of columns, 426 00:48:11,640 --> 00:48:17,820 possible positions of the head, which arm is which which leg is in front, which foot is which. 427 00:48:18,690 --> 00:48:25,110 And I have a whole pile of excitatory connections. So the things that make sense, we know it's a dance, so we know this is a human body. 428 00:48:27,090 --> 00:48:30,510 We don't have the top half spinning one way and the bottom half spinning the other way. 429 00:48:31,560 --> 00:48:38,050 You could do that with a model, but you can't actually do it with the doll or something if it had a pivot in the middle of it. 430 00:48:38,400 --> 00:48:41,970 You can't do that with a real person. It wouldn't it wouldn't last very long. 
And if you simulate that, what you see is essentially the same effect as with the Necker cube: all of the levels that go together with clockwise spinning flip, at almost the same time, to everything consistent with anticlockwise spinning, and then it flips back again. And again, if you look very, very closely at the transition, the different columns flip at slightly different times; but it's so close that in the mathematical pictures everything seems to flip together. Each one of these pictures is actually five curves sitting on top of each other; they're so close together, it looks like one curve.

So that's the spinning dancer. There's a similar way of dealing with the moving plaid. And so that is a little introduction to the mathematics of visual illusions.