1 00:00:14,110 --> 00:00:20,979 I'm Marcus du Sautoy. I'm the Simonyi Professor for the Public Understanding of Science here in Oxford and a professor of mathematics. 2 00:00:20,980 --> 00:00:24,160 And this lecture is kind of like the highlight of our year, 3 00:00:24,160 --> 00:00:30,370 trying to communicate some of the exciting science that's going on here in Oxford and across the world to the public. 4 00:00:30,370 --> 00:00:35,499 And we're very grateful to the Simonyi Fund, which helps to make this possible, 5 00:00:35,500 --> 00:00:38,310 and the Oxford Playhouse for hosting us now. 6 00:00:38,320 --> 00:00:46,240 It was Richard Dawkins' idea, my predecessor, to hold this lecture on kind of neutral ground, not in the university, 7 00:00:46,240 --> 00:00:54,280 but somewhere where the public and scientists can come together to discuss kind of the important issues of the day. 8 00:00:54,820 --> 00:00:58,860 Today is the 20th anniversary of this lecture. 9 00:00:58,870 --> 00:01:03,159 So we look, again, sort of backwards at a lot of the lectures we've had, 10 00:01:03,160 --> 00:01:12,490 but also looking forward, because the topic of today's lecture is really one that's going to affect society very much over the coming years. 11 00:01:12,730 --> 00:01:17,770 I just spent several years on a committee with the Royal Society looking at the impact AI 12 00:01:17,770 --> 00:01:23,490 and machine learning is going to have not only on the world of work but on our lives in general. 13 00:01:23,500 --> 00:01:31,000 So I thought it was really urgent to get somebody who's really at the cutting edge of this field to come and talk to us about its impact. 14 00:01:31,330 --> 00:01:35,860 And I think we kind of realised that A.I. may be driving our cars, being our doctors, 15 00:01:36,130 --> 00:01:42,880 but we do feel there's some area that we really regard as uniquely human, and that's our kind of emotional world. 16 00:01:43,570 --> 00:01:52,690 So I was very interested to invite Dr. Rosalind Picard from the MIT Media Lab, who set up the Affective Computing group 17 00:01:52,690 --> 00:01:58,390 there. She has several companies looking at the impact of AI on society, and she's an inventor as well. 18 00:01:59,140 --> 00:02:05,049 To come and explore with us: can we create AI with emotional intelligence? 19 00:02:05,050 --> 00:02:08,290 So please give a big Oxford welcome to Professor Picard. 20 00:02:09,460 --> 00:02:21,820 Thank you. Thank you, Marcus. 21 00:02:21,820 --> 00:02:27,580 It's such a pleasure to be here with all of you. I'm going to start with something a bit dangerous because I know it's dark out there, 22 00:02:29,080 --> 00:02:32,890 but I'm going to ask you to close your eyes and imagine with me something first. 23 00:02:34,630 --> 00:02:43,180 Imagine that you are working on something very important to you and you go wherever you go to get away from it all to focus. 24 00:02:44,270 --> 00:02:49,790 This could be your office or someplace you haven't told anybody about, where you go to get out of the way of people. 25 00:02:50,420 --> 00:02:54,320 And while you're in this place working really hard on this important deadline, 26 00:02:55,130 --> 00:03:05,030 this character barges into your space and interrupts you and doesn't apologise and doesn't notice that you're slightly annoyed by this. 27 00:03:07,540 --> 00:03:14,530 Then the character gives you some useless advice, which you then respond to with a little bit more annoyance.
28 00:03:15,500 --> 00:03:21,840 Which the character ignores. And he continues to be unhelpful at this point. 29 00:03:21,860 --> 00:03:25,110 Maybe you were very subtle, but you are no longer subtle. 30 00:03:25,130 --> 00:03:28,610 Let's say that you are very clear that this is annoying. 31 00:03:30,350 --> 00:03:33,800 But he continues to ignore that you are annoyed. 32 00:03:34,580 --> 00:03:38,360 And this goes on until finally you say leave. 33 00:03:38,840 --> 00:03:43,790 You're very explicit. You might get up and walk him out the door and he does leave. 34 00:03:43,820 --> 00:03:48,680 But first he winks and does a happy little dance before leaving. 35 00:03:49,280 --> 00:03:55,009 Now you can open your eyes and the story that you just imagined is one of the 36 00:03:55,010 --> 00:03:59,730 ones that has happened to many of us who use some of the early AI systems. 37 00:04:00,680 --> 00:04:05,180 Some of you my age or whatever might recognise this animated paperclip. 38 00:04:06,290 --> 00:04:08,210 Younger people may go, What's that? 39 00:04:08,870 --> 00:04:18,080 This was a very smart software agent made by Microsoft, deployed an office that actually had some of the first AI machine learning in it. 40 00:04:18,350 --> 00:04:21,470 That was quite good at seeing if you were writing a letter. 41 00:04:23,440 --> 00:04:26,950 It was not, however, good at seeing. 42 00:04:27,940 --> 00:04:32,500 If you're positive or negative about what it's doing, it didn't recognise your emotion. 43 00:04:33,160 --> 00:04:36,340 Now there's a time to ignore people's emotion. 44 00:04:36,340 --> 00:04:43,749 You don't want to, you know, flickering every time you smile, but there's a time to respond appropriately to emotion. 45 00:04:43,750 --> 00:04:51,639 And it did not do that. This happy little dance thing was probably really popular when people were coming in to test it and 46 00:04:51,640 --> 00:04:57,370 getting 50 bucks an hour in this modern facility to try out this cool new future technology at the time. 47 00:04:58,510 --> 00:05:05,320 But under the context that we just went through, that would not be an intelligent response. 48 00:05:06,650 --> 00:05:14,760 So these are examples of where I has not shown emotional intelligence and I won't go through all of the aspects of this. 49 00:05:14,790 --> 00:05:19,130 I am just going to skip around and touch on some kind of hot topics, I think, with respect to it. 50 00:05:19,550 --> 00:05:24,950 But the big picture is the AI should be smart about recognising our emotions. 51 00:05:25,870 --> 00:05:29,049 About knowing when to express them or not. 52 00:05:29,050 --> 00:05:33,670 Like when to do the happy little dance and when. Not when. 53 00:05:33,910 --> 00:05:37,780 How to handle our emotions. When to kind of ignore them and when to respond. 54 00:05:38,650 --> 00:05:42,310 And if the system has emotion. 55 00:05:42,550 --> 00:05:46,060 You know, what does that mean? If you're actually have emotion? We'll get into this. 56 00:05:47,320 --> 00:05:52,780 Then it should regulate those, right? It shouldn't be having a meltdown in front of you. 57 00:05:53,110 --> 00:05:56,920 We don't want that. We don't want our technology to just act emotional. 58 00:05:58,000 --> 00:06:04,930 Utilising emotions is a different kind of emotional intelligence that is about understanding how different 59 00:06:04,960 --> 00:06:09,790 affective states lead to different kinds of thinking or facilitate different kinds of thinking. 
60 00:06:10,210 --> 00:06:16,000 And that can be used advantageously. For example, if you have a creative problem to solve. 61 00:06:16,330 --> 00:06:21,880 It has been shown that if you are put in a good mood first, you will think more out of the box. 62 00:06:22,630 --> 00:06:27,310 So if you really are not in a good mood, you have to solve a creative problem. 63 00:06:27,610 --> 00:06:29,799 You should. You could be emotional intelligence. 64 00:06:29,800 --> 00:06:34,120 I need to go get in a good mood first before I go bang my head against the wall trying to solve that problem. 65 00:06:36,740 --> 00:06:40,250 So what did we see instead? That was called intelligent interaction. 66 00:06:40,250 --> 00:06:48,110 We see animated objects of all kinds and robots and so forth, smiling at us, trying to make us happy. 67 00:06:48,740 --> 00:06:52,040 And that actually there is a time and place for that. 68 00:06:52,040 --> 00:06:57,140 Like when when someone first greets you or you come home and your dog sees you and it looks happy to see you, 69 00:06:57,980 --> 00:07:04,850 but as soon as it sees that you look like this, then it should change, right? 70 00:07:04,850 --> 00:07:08,240 It should stop looking happy. So it's fine when it first greets you. 71 00:07:08,420 --> 00:07:11,420 It's not fine after you have shown that you're not happy. 72 00:07:12,980 --> 00:07:14,750 This is not new wisdom. 73 00:07:14,930 --> 00:07:23,870 This has been around at least for thousands of years, as we see here from Proverbs, singing cheerful songs to somebody with a heavy heart. 74 00:07:24,470 --> 00:07:29,630 You might as well pour vinegar on their wounds or take away their coat on a cold day. 75 00:07:30,530 --> 00:07:43,109 Unwelcome. The first robot I saw that showed emotional intelligence was this Ph.D. thesis project of my colleague Cynthia Brazil. 76 00:07:43,110 --> 00:07:47,810 And I'm going to play a little clip of kismet because kismet does the right thing here. 77 00:07:51,310 --> 00:07:57,790 So another important skill for the robot to be able to learn from people is being able to recognise communicative intent. 78 00:07:57,820 --> 00:08:05,260 Very good kismet. And the way we've done that with Kismet right now is to have the robot recognise by tone of voice. 79 00:08:05,290 --> 00:08:08,710 Are you praising it? Are you scolding it? Where do you put your body? 80 00:08:12,220 --> 00:08:15,250 Oh, no, no. 81 00:08:15,790 --> 00:08:26,350 You're not to do that. No. So imagine that paper clip that I started off with. 82 00:08:27,190 --> 00:08:30,520 Bill Gates got a standing ovation when he announced it was going away. 83 00:08:31,650 --> 00:08:38,890 A lot of people were really unhappy with that thing. Now, if Clippy, instead of smiling and dancing every time you were upset, 84 00:08:39,040 --> 00:08:44,950 had done what Kismet had done and looked sorry, how many of you might have given it a second chance? 85 00:08:45,540 --> 00:08:50,169 Right. So one of the things with emotional intelligence we have to do is know how to look. 86 00:08:50,170 --> 00:08:54,220 Sorry, recognise when we made a mistake, apologise and repair the situation. 87 00:08:54,940 --> 00:09:01,060 Now we also need to know if the person we're interacting with is happy or upset. 88 00:09:01,660 --> 00:09:04,540 This should be easy, right? I work a lot with people with autism. 89 00:09:05,110 --> 00:09:10,240 One of the things they're usually taught is this smile means the person is truly happy. 
90 00:09:10,510 --> 00:09:15,249 This is the smile you try to have in front of the camera, where you not just say cheese with your mouth, but also smile with your eyes. 91 00:09:15,250 --> 00:09:24,220 Right? So in my book Affective Computing, I quoted the neurologists and the great psychologists and said, yes, this is the true smile of delight. 92 00:09:25,640 --> 00:09:27,020 Then later we got data. 93 00:09:28,040 --> 00:09:40,460 And I'm going to play for you a clip of a kind of experiment that we did here, where the people were brought in to fill out their resume on a Web site. 94 00:09:40,940 --> 00:09:44,910 And unbeknownst to them — well, they know they're being recorded. 95 00:09:44,960 --> 00:09:50,150 They're fully informed of that. But we didn't tell them until afterwards that it was designed to be frustrating. 96 00:10:17,080 --> 00:10:21,640 I am sorry. You seem to be having difficulties getting through this form. 97 00:10:22,240 --> 00:10:26,740 Would you mind looking at the camera and sharing your thoughts on how to improve the form? 98 00:10:28,180 --> 00:10:34,709 This one's stuck. All right. 99 00:10:34,710 --> 00:10:39,800 Now, you know, he looks like he's not having any fun. The next guy is in the exact same situation. 100 00:10:39,810 --> 00:10:56,870 Watch his face. Well, I'll go home and watch... That was very quick. 101 00:10:56,880 --> 00:11:00,060 I'm going to play it one more time, because it's a little hard to see. 102 00:11:00,450 --> 00:11:04,250 Watch the eyes. No, I don't know. 103 00:11:15,720 --> 00:11:20,160 All right. Not happy, but that was the happy face. 104 00:11:21,410 --> 00:11:25,610 The computer could detect this originally as the true smile of delight. 105 00:11:25,620 --> 00:11:30,320 In fact, we found 90% of people showed that happy face in our study with adults. 106 00:11:30,680 --> 00:11:35,180 Later, people just looking at children in learning situations had also found them 107 00:11:35,180 --> 00:11:39,890 showing this smile when they were having learning interactions that weren't going well. 108 00:11:42,230 --> 00:11:45,410 Lots of reasons why people might do this. But for us, 109 00:11:45,410 --> 00:11:49,399 there's this challenge of both being honest with people who struggle with 110 00:11:49,400 --> 00:11:52,580 reading faces and letting them know that this is much harder than they've been told, 111 00:11:52,850 --> 00:11:57,230 and that if your boss looks like this, you might not want to keep doing what you were doing. 112 00:11:59,240 --> 00:12:05,390 But it's also a very hard problem to solve. So something as simple as recognising a smile turns out to be harder than we thought. 113 00:12:06,470 --> 00:12:14,750 My group uses a lot of machine learning, now more popularly called AI, even though the early AI people did not consider machine learning a part of AI. 114 00:12:15,410 --> 00:12:21,020 And what we did here is we took four different machine learning models and compared them to a human, 115 00:12:21,410 --> 00:12:28,370 and we showed videos that were of smiles elicited under real delightful circumstances versus the circumstance you just saw. 116 00:12:28,910 --> 00:12:34,190 And we looked at how people did at this, which was pretty good when it came from a truly delighted smile, 117 00:12:34,490 --> 00:12:39,049 but pretty bad when it came from a frustrated smile. Here are the machine learning 118 00:12:39,050 --> 00:12:43,880 algorithms, which are actually better than the person at recognising the smile of frustration, 119 00:12:44,240 --> 00:12:53,390 even though it has these same action units activated — the AU in the mouth. What the computer and the person both were given was temporal information. 120 00:12:53,570 --> 00:13:01,640 And here the computer was better at breaking up the temporal information into some things that are typically quite different when you're frustrated — 121 00:13:01,730 --> 00:13:06,410 kind of more of a quick, jerky smile. And a lot of people didn't pay attention to that.
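To make that temporal point concrete, here is a minimal Python sketch — not the actual models from the study, and with invented feature choices and function names — of how per-frame smile-intensity traces might be classified as delighted versus frustrated from their dynamics rather than from a single frame:

```python
# Hypothetical sketch: separating "delighted" from "frustrated" smiles using the
# *dynamics* of smile intensity over time, not a single snapshot. Feature choices
# and names are illustrative, not the study's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def temporal_features(intensity, fps=30.0):
    """Summarise one smile-intensity trace (values in 0..1, one per video frame)."""
    velocity = np.diff(intensity) * fps                     # rise/fall speed per second
    return np.array([
        intensity.max(),                                    # peak smile intensity
        intensity.mean(),                                   # overall level
        np.abs(velocity).max(),                             # fastest change ("quick, jerky" smiles)
        (intensity > 0.5).sum() / fps,                      # seconds spent near peak
        ((velocity[:-1] > 0) & (velocity[1:] <= 0)).sum(),  # count of separate intensity peaks
    ])

def train_smile_classifier(traces, labels):
    """traces: list of 1-D intensity arrays; labels: 1 = delighted, 0 = frustrated."""
    X = np.vstack([temporal_features(t) for t in traces])
    y = np.asarray(labels)
    model = LogisticRegression(max_iter=1000)
    print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean())
    return model.fit(X, y)
```

The last three features are the point being made in the talk: frustrated smiles tend to rise and fall more quickly and come in short bursts, which a single-frame snapshot of the face throws away.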
122 00:13:07,880 --> 00:13:11,400 Today, you can get off-the-shelf software. 123 00:13:11,420 --> 00:13:14,480 You can actually get the one I'm showing you here for free online. 124 00:13:14,810 --> 00:13:21,860 And this can read your facial expressions in real time on a mobile device, and much more than smiles. 125 00:13:22,160 --> 00:13:26,190 Here you'll see a subset of the things it can recognise being mapped 126 00:13:26,210 --> 00:13:30,920 also to emojis, which can be fun just for having a good time. 127 00:13:31,190 --> 00:13:34,550 This is software by one of our spin-out companies, Affectiva, 128 00:13:34,790 --> 00:13:41,840 that has already analysed more than 7 million faces and is now more than 90% accurate on about 30 different expressions. 129 00:13:43,850 --> 00:13:47,179 As I mentioned, if you want to play with it — hopefully not right now in the middle of the talk. 130 00:13:47,180 --> 00:13:53,180 But it's dark; it works best with light. But you can download it for free for iOS or Android, 131 00:13:53,510 --> 00:13:56,830 from the Google Play store, just by searching for AffdexMe. 132 00:13:57,170 --> 00:14:00,559 And then make sure your face is illuminated when you play with it, and play with the 133 00:14:00,560 --> 00:14:04,820 settings, and you can see how well it can recognise all of your facial expressions. 134 00:14:07,250 --> 00:14:16,010 Now, robots today give the appearance of being much more emotionally capable than they actually are. 135 00:14:16,820 --> 00:14:22,610 And I don't know if you get The Tonight Show here — everybody's probably asleep over here when Jimmy Fallon is 136 00:14:22,910 --> 00:14:29,059 on TV in the U.S. I'm going to play a clip of an episode that made my jaw drop originally when I saw it. 137 00:14:29,060 --> 00:14:35,220 And then I'm going to tell you the real punchline. Let's meet our next robot. 138 00:14:35,250 --> 00:14:43,410 This one came all the way from Hong Kong. Please welcome the founder and CEO of Hanson Robotics, David Hanson, and his robot, Sophia. 139 00:14:46,880 --> 00:14:50,290 Oh, my gosh. Welcome. Thank you so much for coming on the show. 140 00:14:50,300 --> 00:14:55,070 Nice to see you as well. David, you brought a friend with you here, and this is really kind of freaking me out. 141 00:14:56,630 --> 00:15:01,940 Yeah. This is Sophia. Uh huh. And Sophia is a social robot. 142 00:15:02,230 --> 00:15:10,480 Mm hmm. And she has artificial intelligence software that we've developed at Hanson Robotics, which can process visual data. 143 00:15:10,520 --> 00:15:16,999 She can see people's faces. She can process conversational data. 144 00:15:17,000 --> 00:15:22,580 Emotional data. And use all of this to form relationships with people. 145 00:15:23,870 --> 00:15:28,399 Okay. So, I mean, she's basically alive. 146 00:15:28,400 --> 00:15:32,360 Is that what you're saying? Oh, yeah. Yeah. She is basically alive.
147 00:15:32,410 --> 00:15:41,690 Oh, would you like to maybe give it a try? Sure. I would say, what's this is like? 148 00:15:42,380 --> 00:15:46,020 You see how awkward my first days are to robots? 149 00:15:46,710 --> 00:15:50,580 Oh, I'm already. I'm getting nervous around a robot. Very pretty, Rover. Do I. 150 00:15:50,600 --> 00:15:53,730 Do I just say hello to. Yeah, yeah, yeah, yeah. 151 00:15:59,130 --> 00:16:04,440 Hi, Sophia. Hello, Jemmy. Oh, my God. 152 00:16:06,770 --> 00:16:14,270 Do you know where you are? Of course. I'm in New York City and I'm on my favourite show, The Tonight Show. 153 00:16:21,480 --> 00:16:25,290 Sophia, can you tell me a joke? Sure. What? 154 00:16:25,290 --> 00:16:28,570 Jesus can never be yours when she can never be mine. 155 00:16:28,590 --> 00:16:32,220 I don't know. Nacho cheese. Yeah, yeah, yeah, yeah. 156 00:16:35,760 --> 00:16:40,350 I like I like nacho cheese. Nacho cheese is the. 157 00:16:43,200 --> 00:16:47,720 Has to do. I'm getting laughs. Yeah. Maybe I should host the show. 158 00:16:47,750 --> 00:16:52,160 Okay. Stay in your langer now. 159 00:16:54,390 --> 00:16:59,430 Jimi, would you like to play a game of rock, paper, scissors, robot style? 160 00:17:00,150 --> 00:17:03,840 Sure. Okay. Let's get this game going. 161 00:17:04,350 --> 00:17:10,230 Show me your hand to start. Rock, paper, scissors. 162 00:17:10,830 --> 00:17:14,700 Shoot. Oh, I won. 163 00:17:15,210 --> 00:17:18,390 This is a good beginning of my plan to dominate the human race. 164 00:17:23,890 --> 00:17:30,350 All right. A lot of things to pick apart here. 165 00:17:32,700 --> 00:17:37,530 David said, Yeah, she's basically alive. 166 00:17:39,000 --> 00:17:42,870 Now when I hear the word, basically, I insert the word not. 167 00:17:44,450 --> 00:17:52,730 She's not alive. Sophie is not alive. In fact, this entire scene was scripted and rehearsed many times. 168 00:17:53,090 --> 00:17:57,470 Jimmy's a great actor. He made it look like the first time meeting this robot. 169 00:17:58,490 --> 00:18:02,840 David, the maker of Sophia, said they rehearsed it over and over and over. 170 00:18:03,890 --> 00:18:08,770 And there were people behind the curtain. Making things work. 171 00:18:12,050 --> 00:18:18,170 We don't know of any machines that are conscious and that have feelings or experiences the way that we do. 172 00:18:19,140 --> 00:18:22,860 They can put on a face of disgust you nacho cheese. 173 00:18:23,100 --> 00:18:30,870 But that's just a program saying put on the following set of movements on your face. 174 00:18:32,780 --> 00:18:37,549 I'm not saying we can't ever do this in the future. Perhaps it might be possible. 175 00:18:37,550 --> 00:18:42,830 But right now, none of us that I know of in the fields of computer science, 176 00:18:43,070 --> 00:18:48,200 I have any idea how to make something like this happen with the kinds of software, 177 00:18:48,200 --> 00:18:56,300 hardware, quantum computing, all kinds of bio computers, things that we're doing other than having a child creating offspring in the natural way. 178 00:18:58,010 --> 00:19:03,740 Also this. I cut off the end where she says, Oh, that was just a joke about my plan to dominate the world. 179 00:19:06,730 --> 00:19:10,830 No machines are evolving themselves to take over the world right now. 180 00:19:10,840 --> 00:19:17,800 It really does not have the ability to happen unless humans design in such an ability. 
181 00:19:17,830 --> 00:19:22,450 I would add: unless sort of power- and dominance-hungry humans design in such an ability. 182 00:19:22,750 --> 00:19:26,800 So then it would be a human-machine combo at some level doing this. 183 00:19:27,760 --> 00:19:32,080 This is certainly not likely to happen from the kinds of things I'm talking about with you today. 184 00:19:33,530 --> 00:19:38,650 People often ask — well, you know — some 185 00:19:39,220 --> 00:19:42,940 people say, I'm just a robot; we're just machines. 186 00:19:43,210 --> 00:19:51,190 And in fact, if you could build a machine that's exactly able to do the emotional things I do, look like it has these functions I have, 187 00:19:51,190 --> 00:19:56,409 walk like I do, talk like I do, joke like I do, express emotions 188 00:19:56,410 --> 00:20:02,620 like I do, recognise emotions like I do — then it's essentially equal, we're the same, right? 189 00:20:02,620 --> 00:20:09,100 And then, once we can build it, we've fully understood it, because in A.I. we often build things to better understand them. 190 00:20:09,400 --> 00:20:16,030 We build models of the brain to better understand how it works. So if we succeeded in building all of this, 191 00:20:16,030 --> 00:20:21,999 maybe even making it say that it was conscious and it had experiences, would that mean that that's all we are? 192 00:20:22,000 --> 00:20:33,409 That we're just this material stuff? A story I like to think of that helps me with 193 00:20:33,410 --> 00:20:38,600 this is the story of the radio instructions that were found by aliens on a planet. 194 00:20:39,830 --> 00:20:48,050 And they went and they found the materials to build the radio, and they built everything exactly according to the instructions. 195 00:20:48,440 --> 00:20:56,150 And in the end, they turned the knob that was supposed to turn and make music, and, oh my, music came out. 196 00:20:56,870 --> 00:21:03,350 And so they thought, we have therefore understood not only how radio works, but where music comes from. 197 00:21:05,120 --> 00:21:15,260 They thought that by building this, with this phenomenon emerging, it was therefore fully explained by the physical material stuff that was there. 198 00:21:15,800 --> 00:21:22,460 Now, of course, we know about more than that. We know about radio waves and all this other stuff, but they didn't have access to that. 199 00:21:23,210 --> 00:21:29,750 Similarly, thinking that just building something that has outward functions that are equivalent according 200 00:21:29,750 --> 00:21:35,480 to some list of criteria could be everything is a very myopic way of thinking. 201 00:21:36,260 --> 00:21:42,260 Here's a picture of Hiroshi Ishiguro in Japan, who has been making robots that look like him, his wife, 202 00:21:42,260 --> 00:21:47,210 his daughters, with the idea that eventually he could send his robot to give the talk instead of him. 203 00:21:47,260 --> 00:21:50,690 You wouldn't see the difference. And a lot of them do look real. 204 00:21:51,230 --> 00:21:59,570 But by no means are these the same as the person that they're designed to functionally look similar to. 205 00:22:00,650 --> 00:22:02,630 Now, in our lab, we've been wrestling with this lately. 206 00:22:02,870 --> 00:22:10,340 The Media Lab grew up at MIT in the School of Architecture and Urban Studies, different from the A.I. lab that grew out of the computer science group. 207 00:22:11,060 --> 00:22:14,450 But we have a lot of kindred spirits that go back and forth.
208 00:22:15,200 --> 00:22:20,180 And the AI that I sort of grew up thinking I wanted to build was a lot like Sophia. 209 00:22:21,380 --> 00:22:27,680 But at the Media Lab, over the decades, we've been rethinking A.I. and thinking, What do we really want? 210 00:22:28,520 --> 00:22:31,250 What do we really want to be remembered for building? 211 00:22:31,280 --> 00:22:38,959 And Marvin Minsky, one of the founding parents of the field of AI and one of the founders of the Media Lab used to say, 212 00:22:38,960 --> 00:22:47,480 I want to build an A.I. that's so brilliant and powerful that, you know, we'll be lucky if it keeps us around as a household pet. 213 00:22:49,820 --> 00:22:56,240 And at first I thought, Wow, that's a pretty cool, big vision. And then I thought, Wait a minute, do I really want to be a household? 214 00:22:56,510 --> 00:22:59,240 Do I want my children to be household? My grandchildren to be household pets? 215 00:23:00,110 --> 00:23:07,520 So we have been rethinking what kind of air we want to build, and we've decided we want the future to be a better one for humans. 216 00:23:08,000 --> 00:23:14,990 And we want to build the kind of intelligence that extends human ability, not replaces human ability. 217 00:23:16,180 --> 00:23:23,640 Not that it can't replace some of the things we do, but it's got to remember, it's got to reflect what our priorities are. 218 00:23:23,650 --> 00:23:25,840 And so we'd like to think about that. 219 00:23:25,840 --> 00:23:32,890 So we have been looking at examples, you know, such as wearable cameras that augment for a blind person what they see. 220 00:23:33,100 --> 00:23:36,520 Or I'm going to tell you a story in a moment of a wearable smartwatch. 221 00:23:36,520 --> 00:23:41,590 We already are very familiar with the way smartphones give us superpowers. 222 00:23:42,370 --> 00:23:49,120 But something like this can actually extend our intelligence in a way that that's a bit surprising. 223 00:23:49,120 --> 00:23:54,580 And this one in particular started when I was working with children with autism. 224 00:23:54,880 --> 00:23:58,840 Here's a boy who was wearing one of our cameras that was augmented with a computer vision 225 00:23:58,840 --> 00:24:03,610 system you just saw that would whisper in his ear or show him in an alternate form. 226 00:24:03,610 --> 00:24:11,680 He could process what somebody's facial expressions were because he had difficulty processing the face of the person he was socially interacting with. 227 00:24:13,050 --> 00:24:16,320 And one day as I was interacting with a person with autism, she said to me. 228 00:24:17,440 --> 00:24:22,420 RAZ You have it all wrong. My biggest problem is not reading other people's emotions. 229 00:24:23,550 --> 00:24:27,270 My biggest problem is you're not reading my emotions. 230 00:24:27,990 --> 00:24:33,209 And I thought, great, you know, this is what I do for a living like this, that it's okay. 231 00:24:33,210 --> 00:24:37,680 I have room to improve. What do I need to do better? And she said, It's not just you. 232 00:24:37,680 --> 00:24:42,120 It's everybody is not reading my emotions accurately. And what emotions are we not reading accurately? 233 00:24:42,120 --> 00:24:44,430 You're not reading my stress, my anxiety. 
234 00:24:44,640 --> 00:24:49,650 And I realised, as we worked more with kids with autism, while some people are kind of stressed like this little boy, 235 00:24:50,010 --> 00:24:54,870 a lot of people would outwardly look very chill, and inwardly they're about to erupt in 236 00:24:55,020 --> 00:24:59,550 a meltdown or something that could cause injury to themselves or others around them. 237 00:25:00,840 --> 00:25:03,930 And I realised, back in our lab in our early days of affective computing, 238 00:25:03,930 --> 00:25:08,820 we had taken off-the-shelf skin conductance sensors that measured the sweat on your hand. 239 00:25:09,240 --> 00:25:16,139 And we'd learned a bunch about how these could get anticipatory responses and fight-or-flight responses, because the skin is 240 00:25:16,140 --> 00:25:22,200 innervated by the sympathetic nervous system, a part of our autonomic nervous system that is triggered with fight or flight. 241 00:25:22,350 --> 00:25:29,910 So I wondered if we could take this into a form factor that kids could use in the classroom, to see if this might help them be better understood. 242 00:25:30,300 --> 00:25:37,260 So we built lots of different versions, learned where we could measure this in other places on the body, because the palms were often inconvenient. 243 00:25:37,770 --> 00:25:43,050 And today there are two commercialised versions of this. I'm wearing one that we use in a lot of research, 244 00:25:43,290 --> 00:25:47,310 the Empatica E4, and this one, the Embrace, which I'll tell you a little bit more about. 245 00:25:50,260 --> 00:25:56,410 First, I'll play just a little example of the data collected from a child we saw. 246 00:25:56,980 --> 00:26:00,750 There is a fine. How are you sure? Are you sure you want to know? 247 00:26:00,760 --> 00:26:02,920 I'm going to turn the sound down because she's about to have a meltdown. 248 00:26:03,760 --> 00:26:09,159 What you see in that little blue window at the top is her skin conductance level streaming from her two ankles, 249 00:26:09,160 --> 00:26:15,100 going up with her increased sympathetic reactivity, likely, here. 250 00:26:15,130 --> 00:26:19,380 Now, she's about to have a meltdown, and we see it. It will peak in a moment here. 251 00:26:19,390 --> 00:26:28,540 The early version had wireless dropouts — that's why that long straight line. This blue window at the top corresponds to this little blue strip here. 252 00:26:29,170 --> 00:26:35,050 It's one minute wide. This is about 45 minutes of data. And you can see she's been building up for some time before that peak there. 253 00:26:35,440 --> 00:26:43,080 So a teacher or a parent who sees the child's level growing without seeing any external causes for this might then 254 00:26:43,090 --> 00:26:50,530 reason that there's something internal that's increasing, that could be about to cause a response like we just saw. 255 00:26:52,500 --> 00:26:58,680 The signal doesn't just go up with stress. It also goes up with cognitive effort and engagement. 256 00:27:00,400 --> 00:27:05,410 The first time I saw this from the wrist — that previous girl was wearing it on her ankles — 257 00:27:06,320 --> 00:27:14,540 was this data from an MIT student. What you see here from bottom to top is seven days; left to right is 24 hours. 258 00:27:14,930 --> 00:27:22,370 We see, as we expected, big peaks in this activity with exams and homework problems. 259 00:27:22,370 --> 00:27:24,290 That's — I imagine Oxford's a lot like MIT.
260 00:27:24,770 --> 00:27:31,910 There's, you know, really hard things to think about its effort mentally and emotionally to engage in this. 261 00:27:33,730 --> 00:27:41,530 Unfortunately, the low point every day that we found is over here on the right, underlined in yellow, and that's classroom activity. 262 00:27:43,140 --> 00:27:51,710 To the disappointment of MIT professors. The middle, the biggest peak of most days, surprisingly, is sleep. 263 00:27:52,070 --> 00:27:57,260 And if you're paying attention, you should be scratching your head going, Wait, didn't you just say this is arousal, stress, cognitive load? 264 00:27:57,260 --> 00:27:58,550 What the heck is going on here? 265 00:27:59,450 --> 00:28:06,830 It's a whole nother talk, but in a nutshell, we're finding that these patterns, when you zoom in on them with high frequency peaks, 266 00:28:07,460 --> 00:28:11,150 they exhibit statistics similar to single neurone hippocampal firing patterns. 267 00:28:11,150 --> 00:28:13,760 These are from regions of the brain involved in memory, 268 00:28:14,240 --> 00:28:21,260 and we are finding the signal to be useful and predicting how well people perform on a task they learned before sleep. 269 00:28:24,520 --> 00:28:28,209 Now. The first time we built the sensor, it looked like this. 270 00:28:28,210 --> 00:28:32,740 It was in a sweatband and it had homebrew electronics in it that frequently broke. 271 00:28:33,550 --> 00:28:42,610 And it was the end of the semester. One December I was in my office, I was working away and a young student knocks on the door and he says. 272 00:28:43,570 --> 00:28:47,440 Professor Picard, could I please borrow one of your wristband sensors? 273 00:28:48,420 --> 00:28:52,710 My little brother has autism. He can't talk. And I want to see what's stressing him out. 274 00:28:53,690 --> 00:28:59,510 And I said, sure. In fact, don't just take one, take two in case one breaks. 275 00:28:59,960 --> 00:29:03,280 And so he takes the two. He doesn't wait for one to break. 276 00:29:03,290 --> 00:29:08,540 He puts both of them on his little brother's wrists at the same time, the opposite of what I told him to do. 277 00:29:08,630 --> 00:29:14,050 But I'm really glad he didn't follow my instructions. I go back to MIT. 278 00:29:14,200 --> 00:29:21,430 I'm looking at the boys data on my computer and the first day looks very flat, both wrists. 279 00:29:22,390 --> 00:29:26,860 It's a kid who looks pretty relaxed. Second day, pretty relaxed. 280 00:29:27,640 --> 00:29:31,420 Third day, same thing. Go to the next day and my jaw drops. 281 00:29:32,080 --> 00:29:37,120 One of the signals on one wrist went so high that I thought the sensor must be broken. 282 00:29:38,340 --> 00:29:46,770 We have stressed people out at MIT every way we can imagine, you know, from qualifying exams to obnoxious noises in your ears. 283 00:29:47,190 --> 00:29:53,320 Boston drivers stress. And I had never seen a peak this big and weirder. 284 00:29:53,340 --> 00:29:55,680 How could you be stressed on just one side and not the other? 285 00:29:57,960 --> 00:30:03,000 I tried to do some debugging and I'm electro engineer so I thought I could figure this out. 286 00:30:03,000 --> 00:30:08,920 Ran a bunch of tests, could not explain what was going on. So I did the old fashioned debugging. 287 00:30:08,920 --> 00:30:12,100 I picked up the phone and I called the student at home on vacation. 288 00:30:13,620 --> 00:30:20,969 Hi. How is your Christmas? How's your little brother? 
Hey, do you have any idea what happened to him? 289 00:30:20,970 --> 00:30:25,310 And I gave him the exact date and time in the data. And he said, I don't know, 290 00:30:25,330 --> 00:30:29,260 I'll check the diary. Like, an MIT student keeps a diary? 291 00:30:29,560 --> 00:30:34,290 You know, probably Oxford students do diaries, too. So, quick prayer — 292 00:30:34,300 --> 00:30:38,140 like, what are the odds a teenager would write this down? Mom of three teenagers: 293 00:30:38,380 --> 00:30:41,950 pretty low odds. He comes back. He has the exact date and time written down. 294 00:30:41,950 --> 00:30:47,009 And he says that was right before he had a grand mal seizure. Now, 295 00:30:47,010 --> 00:30:53,930 I didn't know what seizures were, but I knew another student's dad was chief of neurosurgery at Children's Hospital Boston. 296 00:30:53,940 --> 00:30:57,000 So I screwed up my courage and called Dr. Joe Madsen. 297 00:30:58,110 --> 00:31:05,579 Dr. Madsen, my name's Rosalind Picard. Do you know if it's possible somebody could have a huge sympathetic nervous system surge here? 298 00:31:05,580 --> 00:31:08,610 It looked like it was 20 minutes before the seizure. And he said, 299 00:31:09,710 --> 00:31:13,430 probably not. But he said, you know, it's interesting. 300 00:31:13,430 --> 00:31:17,540 We've had people whose hair stands on end on one arm 20 minutes before a seizure. 301 00:31:18,660 --> 00:31:22,170 And I'm like, on one arm? And I told him how, you know, it happened on only one side. 302 00:31:22,560 --> 00:31:29,370 He got interested. We made a lot more devices. Ninety families were enrolled in a clinical study where all the children — 303 00:31:29,970 --> 00:31:36,090 90 children — were candidates for brain surgery, because they had seizures that were not stopped by medication. 304 00:31:36,600 --> 00:31:46,050 And we found that 100% of that first batch of children had very large skin conductance responses with their grand mal seizures. 305 00:31:46,860 --> 00:31:54,690 Not 20 minutes in advance of them, but usually at the exact same time, on a perfectly synchronised readout of their video EEG data. 306 00:31:55,020 --> 00:31:56,970 At the bottom here is movement data. 307 00:31:57,060 --> 00:32:02,430 When you combine that movement data with the skin conductance, you get a more sensitive and specific seizure detector.
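As a rough illustration of that kind of sensor fusion — emphatically not Empatica's actual detector, just a sketch with invented features, sampling rates and labels — combining wrist accelerometer and electrodermal activity (skin conductance) in one classifier might look like this:

```python
# Illustrative only: fuse wrist accelerometer and electrodermal activity (EDA,
# i.e. skin conductance) features so that rhythmic shaking alone, or an EDA
# surge alone, is less likely to raise an alert than both patterns together.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def window_features(acc, eda, fs_acc=32.0):
    """acc: (n, 3) accelerometer samples; eda: (m,) skin conductance in microsiemens."""
    acc_mag = np.linalg.norm(acc, axis=1)
    spectrum = np.abs(np.fft.rfft(acc_mag - acc_mag.mean()))
    freqs = np.fft.rfftfreq(len(acc_mag), d=1.0 / fs_acc)
    band = (freqs >= 2.0) & (freqs <= 6.0)               # assumed convulsive-movement band
    return np.array([
        acc_mag.std(),                                   # overall movement intensity
        spectrum[band].sum() / (spectrum.sum() + 1e-9),  # fraction of energy in that band
        eda[-1] - eda[0],                                # net EDA rise across the window
        np.clip(np.diff(eda), 0, None).sum(),            # total upward EDA deflection
    ])

def train_detector(windows, labels):
    """windows: list of (acc, eda) pairs; labels: 1 = clinician-confirmed seizure window."""
    X = np.vstack([window_features(a, e) for a, e in windows])
    return GradientBoostingClassifier().fit(X, np.asarray(labels))
```

The design point is the one made in the talk: either signal on its own is easy to fool, but requiring both the movement pattern and the electrodermal surge together gives a more sensitive and specific detector.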
308 00:32:04,200 --> 00:32:09,160 But there was a mystery here, and that is the seizure is only about the width of that red line. 309 00:32:09,180 --> 00:32:15,840 It's only a few minutes wide that the person is shaking, convulsing and having this unusual electrical activity in their brain. 310 00:32:16,260 --> 00:32:26,820 Why is this stress — skin conductance, sweat, whatever it is — response so big for so long, for 20, 30 minutes, when the person's just lying there? 311 00:32:27,540 --> 00:32:30,960 And as we tried to figure this out, we learned of another thing. 312 00:32:31,020 --> 00:32:35,850 And that was that there's something called SUDEP: sudden unexpected death in epilepsy. 313 00:32:37,120 --> 00:32:45,300 And when a person who has epilepsy dies after a seizure, you see this seizure activity. 314 00:32:45,350 --> 00:32:50,080 What you see here from top to bottom is traces from electrodes on the scalp. 315 00:32:50,350 --> 00:32:56,140 This is time from left to right. And here each trace is going kind of electrically crazy because there is a seizure. 316 00:32:56,410 --> 00:33:00,010 Then the seizure stops. And we would like to see normal brain activity here. 317 00:33:00,010 --> 00:33:05,680 But instead, everything is below ten microvolts. It's suppressed. And it's called postictal — after the seizure — 318 00:33:05,860 --> 00:33:09,310 generalised, across all EEG channels, EEG suppression. 319 00:33:09,880 --> 00:33:19,600 And it turns out that while this occurs in 100% of SUDEP, it also occurs frequently in cases where people fortunately don't die after the seizure. 320 00:33:20,170 --> 00:33:24,550 But the duration of it turns out to be related to the size of the response on the wrist. 321 00:33:25,420 --> 00:33:27,910 This is not just a crazy artefact. 322 00:33:28,330 --> 00:33:34,960 This work is published in the top medical journal Neurology and has been replicated with other groups beyond this paediatric group as well. 323 00:33:36,710 --> 00:33:40,160 So this led us to start to learn more about SUDEP. 324 00:33:40,160 --> 00:33:50,059 And to my surprise, I learned that this thing that is taking people's lives is actually quite common and not commonly discussed. 325 00:33:50,060 --> 00:33:57,530 It's actually the number two cause of years of potential life lost of all neurological disorders, stroke being number one. 326 00:33:58,970 --> 00:34:03,890 There's a SUDEP every 7 to 9 minutes, and probably most people here have never heard of it. 327 00:34:04,580 --> 00:34:09,290 It's also more common than crib death — in the U.S., we call it sudden infant death syndrome. 328 00:34:09,320 --> 00:34:13,250 It takes more lives every year than sudden infant death syndrome, or crib death. 329 00:34:14,890 --> 00:34:20,980 It starts with a little unusual electrical activity in the brain. If you have repeated ones of these, you get an epilepsy diagnosis. 330 00:34:21,280 --> 00:34:28,230 And what could start off as a little brushfire can spread and generalise over your brain and cause convulsions all over. 331 00:34:28,240 --> 00:34:32,650 In fact, most of the time we get a response on both wrists, because usually the seizure generalises. 332 00:34:33,760 --> 00:34:39,190 The number one way to prevent SUDEP is to take your medications. The number two 333 00:34:39,190 --> 00:34:45,069 way — and actually what's been shown in the literature to make a significant difference 334 00:34:45,070 --> 00:34:49,450 in reducing the likelihood of a SUDEP after a seizure — is for somebody to be there, 335 00:34:50,530 --> 00:34:59,530 to have somebody come check on you. So by you leaving here tonight and knowing about SUDEP, 336 00:34:59,770 --> 00:35:08,780 I hope that you will also be able to be somebody who would help people know that being there for one another could actually reduce these deaths. 337 00:35:08,800 --> 00:35:14,900 We think most of these are preventable. Now, what's been shown is that when somebody gets there, 338 00:35:14,900 --> 00:35:21,200 the first thing they do is often say the person's name or just touch them or flip them over and stimulate them in some way. 339 00:35:21,740 --> 00:35:30,170 It's also been shown that going in — this is invasive, inside your brain — and stimulating one of the key regions of the brain involved in emotion, 340 00:35:30,170 --> 00:35:35,600 the amygdala, can turn off your breathing, and you can sit there and just stop breathing. 341 00:35:36,170 --> 00:35:43,450 You're capable of breathing, but you don't breathe. And then when somebody says your name or talks to you or stimulates you, you breathe.
342 00:35:45,380 --> 00:35:51,140 It's been observed that when the seizure spreads to that region, it can turn off the breathing, but somebody stimulating can turn it back on. 343 00:35:51,230 --> 00:35:57,980 So at Empatica, we decided to commercialise this and get it out with an alert. 344 00:35:58,340 --> 00:36:06,169 And now the smartwatch Embrace runs AI machine learning in real time to detect generalised tonic-clonic seizures and to 345 00:36:06,170 --> 00:36:11,720 send an alert to try to bring somebody there, to hopefully stimulate the person and help restart their breathing. 346 00:36:11,730 --> 00:36:19,370 It can't do a fully closed loop just yet, but we are getting stories like this that make my skin conductance go up. 347 00:36:20,180 --> 00:36:23,720 I read my email. This was from a mom who said she was in the shower that morning. 348 00:36:24,410 --> 00:36:30,890 She saw her phone on the counter by the shower go off, saying her daughter needs her help. 349 00:36:31,730 --> 00:36:38,120 She goes running out of the shower to her daughter's bedroom, finds her daughter face down in bed, blue and not breathing. 350 00:36:39,230 --> 00:36:43,670 She flips her daughter over, and her daughter takes a breath and turns pink. 351 00:36:44,120 --> 00:36:47,930 And this time, I think I stopped breathing reading this email. 352 00:36:49,040 --> 00:36:57,900 Since then, we've heard a lot of stories like this. Now, SUDEP is actually most common among people in their — 353 00:36:57,920 --> 00:37:04,549 it starts peaking in the twenties and the thirties and forties — and it could happen to somebody you know. 354 00:37:04,550 --> 00:37:08,330 If it's as stigmatised over here in the UK as in the US, 355 00:37:08,330 --> 00:37:12,170 then if you go back and talk to your friends and say, I don't know anybody with epilepsy, do you? — 356 00:37:12,530 --> 00:37:16,820 you'll probably find several of them saying, actually, I have epilepsy, or my family member does. 357 00:37:17,330 --> 00:37:20,480 We don't tend to talk about it, but one in 26 people has epilepsy. 358 00:37:21,050 --> 00:37:27,590 If we talked about it and got to know when one another has a need for somebody to be there, we could actually prevent a lot of those deaths. 359 00:37:29,390 --> 00:37:35,130 So there is still no AI that can actually stimulate you and reposition you. 360 00:37:35,150 --> 00:37:39,290 It may be coming, but right now we need human help to close this loop. 361 00:37:39,860 --> 00:37:45,830 And it's actually kind of nice that we need human help, because we need human help to do more than just stimulate us to breathe again. 362 00:37:45,950 --> 00:37:52,109 Right? We really need to get to know our neighbour. Now, 363 00:37:52,110 --> 00:38:01,560 I was very surprised when I saw that result, that the duration of shutdown was related to some activity deep in the brain, 364 00:38:02,280 --> 00:38:07,950 and that when the activity deep in the brain is happening, we're not sensing anything from the EEG. 365 00:38:07,950 --> 00:38:12,330 The EEG looks like your brain's dead. But we're sensing the activity on the wrist. 366 00:38:13,620 --> 00:38:18,090 So I was really glad I had tenure, because this is pretty weird, right, to say there's deep brain activity — 367 00:38:18,300 --> 00:38:21,870 the EEG doesn't see it, but the wrist does. Huh. 368 00:38:21,990 --> 00:38:24,530 Like, talk about bizarre.
369 00:38:26,190 --> 00:38:32,700 And I was remarking on how strange I thought this was to a group of doctors, and one of them shakes her head at me and she goes, Roz, it's easy. 370 00:38:33,240 --> 00:38:37,460 Like, what do you mean it's easy? It's Medicine 101. 371 00:38:37,470 --> 00:38:41,910 I never had Medicine 101 — I'm an engineer. Medicine 101: 372 00:38:42,000 --> 00:38:46,790 everybody who took it was taught that as an embryo, we have three kinds of tissue. 373 00:38:46,800 --> 00:38:53,070 One of them forms our muscles and bones, the mesoderm. The endoderm formed our lungs, our stomach, 374 00:38:53,070 --> 00:39:02,520 all the soft organs inside. But the largest organ, on the outside, the skin, co-formed with the brain, the spinal cord, 375 00:39:02,520 --> 00:39:08,490 the whole neuronal system — the ectoderm knit together our brain and our skin from the beginning. 376 00:39:09,710 --> 00:39:16,940 So she said, I'm not surprised that you would find activity deep in the brain not showing up on the scalp, but showing up on the wrist. 377 00:39:19,740 --> 00:39:22,110 Now this got me thinking, because I study emotion. 378 00:39:22,170 --> 00:39:30,720 Most of the stuff that we had been studying was from brain scans and things deep in the brain, and these homes of emotion in the brain. 379 00:39:31,230 --> 00:39:39,570 And one of the things that was of interest was something my boss had asked me for years before, 380 00:39:39,570 --> 00:39:46,740 and I had not taken him very seriously, back before we started working on autism and epilepsy, when we were just trying to build smart computers. 381 00:39:48,240 --> 00:39:51,639 My boss said to me: Roz, when are you going to do 382 00:39:51,640 --> 00:39:59,650 something useful? When are you going to tell me my wife's mood before I go home? 383 00:40:01,820 --> 00:40:08,840 And I thought, oh, great, you know, mood rings — he wanted a mood ring that told him his wife's mood before he gets home. 384 00:40:08,840 --> 00:40:12,920 And, you know, Walter, the mood ring is just a stupid little sensor that turns colours with temperature. 385 00:40:14,060 --> 00:40:17,010 You know, I like to do things that are hard. I like to do things that are important. 386 00:40:17,020 --> 00:40:21,990 I mean, yeah, maybe I could make more money selling mood rings, but really, I want to solve hard, important problems. 387 00:40:23,050 --> 00:40:30,220 And as we got these brain findings and the connection to the skin, and I started to learn more about what we might be able to do with mood, 388 00:40:30,790 --> 00:40:35,730 I started to learn about the importance of a very serious mood disorder. 389 00:40:35,740 --> 00:40:42,520 I know it's stigmatised here. Sorry, bringing up all these negative things tonight, but you've got to pay attention to this one. 390 00:40:42,520 --> 00:40:48,909 It's already the leading cause of disability, and it's on track by the year 2030 to pass cancer, accidents 391 00:40:48,910 --> 00:40:53,380 and stroke as the number one cause of death and years lived with disability. 392 00:40:55,420 --> 00:41:00,550 That's the forecast. Could we change that forecast? Could we have a better future instead of this one? 393 00:41:02,080 --> 00:41:09,610 One way I like to think about this — and this is my one non-real-data graph — is this: well-being is on the left here, higher is better. 394 00:41:10,030 --> 00:41:17,049 People are more positive, they feel great.
You get admitted to Oxford or you get a job at your favourite place to work and you're doing great. 395 00:41:17,050 --> 00:41:22,870 You come in, you do even better. This is time. Everybody hits major stressors, everybody takes a dip. 396 00:41:23,500 --> 00:41:26,020 And unfortunately, even in the best companies, 20, 397 00:41:26,020 --> 00:41:34,720 30% of people are or more are dropping down here where they might get a major depressive disorder diagnosis. 398 00:41:35,080 --> 00:41:37,120 Others are resilient and snapping back. 399 00:41:41,270 --> 00:41:47,570 You know, the metaphor of the frog put on the pot on the stove and they turn on the temperature and the water starts to heat up. 400 00:41:48,430 --> 00:41:56,090 The mood ring, the stupid temperature sensor. If the frog had a temperature sensor, the frog could get out. 401 00:41:56,630 --> 00:42:01,040 Maybe jumped from the red to the blue before it got in trouble. 402 00:42:01,070 --> 00:42:06,710 Right. When I was talking with leading psychiatrists about what we should be doing to prevent this. 403 00:42:07,860 --> 00:42:11,190 I said, Well, do we just need to get people who are in trouble to you sooner? 404 00:42:11,200 --> 00:42:15,510 Because if we got them into medical care sooner, you could fix them, right? 405 00:42:15,540 --> 00:42:20,099 I mean, I'm talking with experts who run the depression innovation across all of the United States 406 00:42:20,100 --> 00:42:26,190 and have international partners working on all the cutting edge ways to treat depression. 407 00:42:28,010 --> 00:42:32,300 I said you could fix it. Right? And he said, no, we don't know how to fix it. 408 00:42:33,360 --> 00:42:38,040 And I thought, then we need to be not waiting for the guys working over here in the medical system. 409 00:42:38,040 --> 00:42:42,890 We need to do something over here to prevent it. We need to not let the frog boil. 410 00:42:42,900 --> 00:42:50,220 And you know what? These sensors that we're wearing. They're much more sophisticated than temperature sensors. 411 00:42:50,270 --> 00:42:55,100 Could they actually give us early warning of if mood or stress are getting out of hand? 412 00:42:56,870 --> 00:43:02,840 Now there are dozens of studies we're doing. I'm just going to give you a couple quick highlights, because I want to leave time for Q&A. 413 00:43:03,350 --> 00:43:11,450 But here's one. Here is data. What you see from bottom to top on the left is data from wristband, 414 00:43:11,450 --> 00:43:20,150 actually smartwatches and mobile phone data related to physiology, geolocation, social network patterns. 415 00:43:20,660 --> 00:43:27,020 Here from 20, this one was 22 patients diagnosed with major depression disorder. 416 00:43:27,410 --> 00:43:34,940 And on the bottom is the score on a gold standard depression inventory that a top mass general hospital psychiatrist is giving them. 417 00:43:35,300 --> 00:43:40,940 If it was perfect, they would all the dots would lie along this this diagonal line from bottom to top. 418 00:43:41,180 --> 00:43:43,940 And we see it's not perfect, but it's pretty doggone good. In fact, 419 00:43:43,940 --> 00:43:52,430 it's correlated 0.83 and that is the just objective data from the system being correlated with the 420 00:43:53,240 --> 00:44:01,340 physician that is already as well performing as physicians are with correlated with each other. 
421 00:44:03,350 --> 00:44:10,310 We've also been asking, can we forecast, to see if you're on a trajectory where things are getting better or worse? 422 00:44:10,550 --> 00:44:12,740 And the first version of this is shown here. 423 00:44:13,040 --> 00:44:21,350 This is using a kind of AI and deep learning that's called multitask learning: instead of just training it on mood or on stress or on health, 424 00:44:21,590 --> 00:44:28,100 we train it on multiple things — those three here, which are related — so that it learns things that are common across them, 425 00:44:28,340 --> 00:44:33,050 and then it tunes the output layer to individual people, or people like you. 426 00:44:33,890 --> 00:44:38,420 And here we just asked, how well would it work if we set it up 427 00:44:38,420 --> 00:44:40,070 so that chance was 50/50? 428 00:44:40,820 --> 00:44:48,800 And we just want to forecast, using just data through today, whether tomorrow night you're in a great mood or a bad mood, 429 00:44:49,340 --> 00:44:56,330 you're really healthy or you're really sick, or you're really super stressed or you're really calm — 430 00:44:56,510 --> 00:45:01,690 the good ones versus the bad ones. 50/50 would be random. 431 00:45:02,660 --> 00:45:05,810 The first time, it was 78 to 87%. 432 00:45:08,470 --> 00:45:17,500 Since then, we've been refining; we've done much more nuanced models with regression to give exact levels, and the accuracy is continuing to improve. 433 00:45:18,070 --> 00:45:23,880 We also want to know: what's affecting it for you? 434 00:45:24,390 --> 00:45:29,760 And as we collect data — and in the last slide and in this one, the data is limited to New England; 435 00:45:30,000 --> 00:45:34,890 it's limited to only New England college students here, so I can't say this generalises beyond that dataset yet — 436 00:45:35,490 --> 00:45:39,860 here we looked at which things were most associated with being more stressed the next night — 437 00:45:39,880 --> 00:45:44,160 this is doctoral work with our psychiatry collaborators — or more calm the next night. 438 00:45:44,610 --> 00:45:50,580 And the number one thing for calmness was positive social interaction. 439 00:45:50,880 --> 00:45:56,610 Isn't that interesting? We see a lot of us computer scientists who are introverts kind of like, oh, darn, we have to be social. 440 00:45:57,510 --> 00:46:03,390 But for most people, positive social interaction makes a huge difference in your calm tomorrow, 441 00:46:04,110 --> 00:46:06,809 and negative social interaction has the reverse impact. 442 00:46:06,810 --> 00:46:13,800 And there's one data point even worse than the negative social interaction, that I could give another whole talk about, and that's sleep. 443 00:46:14,220 --> 00:46:19,140 In particular, if your sleep times are very uncorrelated from day to day, 444 00:46:19,650 --> 00:46:24,570 you have high sleep irregularity; that is consistently associated with higher 445 00:46:24,570 --> 00:46:30,270 stress and worse mood and worse mental health in our college student population. 446 00:46:31,570 --> 00:46:36,220 So we want to understand how to make this better for people. 447 00:46:37,430 --> 00:46:44,540 And it turns out this is really hard. Again, what I'm showing you here is just first experiments, limited to college students. 448 00:46:45,460 --> 00:46:50,050 It's probably too small to see, but from top to bottom, there are a whole bunch of things.
449 00:46:50,060 --> 00:46:56,990 And I'll tell you the most important ones here, the one that's just highlighted in blue here is the Priestley social interaction. 450 00:46:56,990 --> 00:47:02,270 And down below some texting behaviour, college students interacting socially through texting. 451 00:47:03,170 --> 00:47:06,200 What's circled here? There are four vertical columns. 452 00:47:06,620 --> 00:47:10,490 Each row is a different thing that was measured in this particular model, 453 00:47:10,820 --> 00:47:16,010 and each column is a different group of people clustered by people who are similar to each other. 454 00:47:17,110 --> 00:47:23,320 And what we see is that the two columns in the middle, you see that bright yellow spot and that dark blue spot next to it. 455 00:47:23,830 --> 00:47:29,049 What that means is for the people in the column with the bright yellow spots, those behaviours, 456 00:47:29,050 --> 00:47:35,890 those social behaviours were highly associated with them being much happier the next day. 457 00:47:37,320 --> 00:47:44,250 But the people in this other group, those exact same behaviours are associated with them feeling much worse the next day. 458 00:47:45,850 --> 00:47:49,270 So we can't give a one size fits all piece of advice. 459 00:47:50,630 --> 00:47:59,000 When we look at people's data, we can start to learn what works for them and what doesn't work for them and give customised. 460 00:48:00,720 --> 00:48:05,100 Insights about what makes things look better for their forecast. 461 00:48:07,080 --> 00:48:10,290 So I don't know about you. I. I actually check weather forecasts. 462 00:48:10,290 --> 00:48:14,429 My husband doesn't. So I know this won't work. Won't be of interest to everybody, but I check. 463 00:48:14,430 --> 00:48:21,660 I like to see if it's going to rain because I look pretty horrible when I've been drenched and I'm interested in the mood forecast. 464 00:48:22,290 --> 00:48:26,340 However, I suppose my mood forecast looks like this. Right. 465 00:48:26,520 --> 00:48:29,639 Worse stress. Worse mood getting sick. 466 00:48:29,640 --> 00:48:33,810 Oh, man. You know, like who wants to look at that up, especially if you're already depressed. 467 00:48:34,050 --> 00:48:40,230 Right. This could be really bad. You know, if you look like this, like this app is going nowhere. 468 00:48:42,220 --> 00:48:45,250 And we run the risk of making people even worse off. 469 00:48:45,290 --> 00:48:49,270 Right. Which is not what we want. So what's the app that we want? 470 00:48:49,900 --> 00:48:55,990 Well, the one we want gives you not just the weather forecast, but the ability to change the weather, 471 00:48:56,410 --> 00:49:00,180 the mood weather, the stressed weather, the health, whether you're likely to get sick. 472 00:49:00,190 --> 00:49:03,970 But if you go to bed 2 hours early, you're more likely to. 473 00:49:04,870 --> 00:49:06,490 You're a lot less likely to get sick. 474 00:49:07,340 --> 00:49:14,810 If you spend some time praying or meditating or reaching out to a friend or making a new friend or getting some sunshine or fresh air, 475 00:49:15,560 --> 00:49:23,719 it can get to know from your behaviours and your interactions what makes a difference for you and in that moment give 476 00:49:23,720 --> 00:49:31,070 you in an evidence based way based on all this machine learning and ai of emotion and mood being put to use for you, 477 00:49:31,340 --> 00:49:35,720 it could help you see not only your forecast, but how to change your forecast. 
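A minimal sketch of the multitask idea described a few minutes ago, written in PyTorch with invented layer sizes and feature counts: one shared trunk learns from the wearable and phone features, separate heads forecast tomorrow's mood, stress and health, and personalising would then mean tuning only those output heads on data from one person or a cluster of similar people.

```python
# Minimal multitask sketch (assumed sizes, random placeholder data): a shared
# trunk with one output head per forecast target, so the three related tasks
# train the trunk together.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultitaskForecaster(nn.Module):
    def __init__(self, n_features: int):
        super().__init__()
        self.trunk = nn.Sequential(                     # shared across all tasks
            nn.Linear(n_features, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
        )
        self.heads = nn.ModuleDict({                    # one small head per target
            "mood": nn.Linear(32, 1),
            "stress": nn.Linear(32, 1),
            "health": nn.Linear(32, 1),
        })

    def forward(self, x):
        shared = self.trunk(x)
        return {name: head(shared) for name, head in self.heads.items()}

model = MultitaskForecaster(n_features=40)              # e.g. EDA, sleep, location, texting features
x = torch.randn(8, 40)                                  # a batch of 8 person-days (fake data)
targets = {k: torch.randn(8, 1) for k in ["mood", "stress", "health"]}
predictions = model(x)
loss = sum(F.mse_loss(predictions[k], targets[k]) for k in targets)
loss.backward()                                         # all three losses update the shared trunk
```

In this sketch, personalisation would amount to freezing `trunk` and continuing to train only `heads` (or a per-user copy of them) on that user's recent data.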
478 00:49:38,070 --> 00:49:45,330 So that's the goal we're working toward right now: an AI that's not trying to take over the 479 00:49:45,330 --> 00:49:52,110 world or just impress people on stage with its facial expressions or its emotional jokes, 480 00:49:52,650 --> 00:50:03,660 but an AI that makes our lives better by helping us understand what's going on inside that we want to change, and helping us do that better. 481 00:50:05,230 --> 00:50:13,750 Let me wrap up here with a quick recap. I started with a story of how much harder some of this is than we thought. 482 00:50:13,750 --> 00:50:21,580 We thought detecting smiles would be easy. Then we learned that these peak smiles can sometimes show up when people are really frustrated. 483 00:50:21,910 --> 00:50:29,350 So we have to be much more sophisticated about how we interpret them and read more than just the picture of the facial expression. 484 00:50:30,370 --> 00:50:35,919 While working with people who wanted to understand each other's facial expressions better, 485 00:50:35,920 --> 00:50:39,920 we also learned some of them didn't want that. They wanted us to understand them better. 486 00:50:39,940 --> 00:50:42,670 They wanted us to understand their stress and anxiety better. 487 00:50:43,420 --> 00:50:51,190 And the technology we had could be reshaped into wearable forms that could go out where their stressors were happening, in schools, 488 00:50:51,190 --> 00:50:56,019 on playgrounds, and start to give them insight into how that stress was changing. 489 00:50:56,020 --> 00:51:02,080 In fact, both of my devices are measuring this right now as I'm up here. 490 00:51:04,230 --> 00:51:11,549 To our surprise, we learned when loaning out the skin conductance technology in the sweatband that it could also pick 491 00:51:11,550 --> 00:51:17,129 up unusual peaks that related not to sweating and stress in the sense we usually think about it, 492 00:51:17,130 --> 00:51:23,550 but to unusual neural activation in the brain that arises with the unusual electrical activity of a seizure. 493 00:51:25,110 --> 00:51:31,649 We learned that the size of that activity was related to a signal in the brain associated with SUDEP, sudden unexpected death 494 00:51:31,650 --> 00:51:35,219 in epilepsy, which happens every 7 to 9 minutes, 495 00:51:35,220 --> 00:51:44,730 and that by getting somebody there, we might be able to have a human come and stimulate that person, restart their breathing and prevent death. 496 00:51:47,460 --> 00:51:52,110 With that, we then began partnering with neurologists and started to learn more about what was going on deep in these 497 00:51:52,110 --> 00:51:58,590 emotion centres in the brain that related not only to the epilepsy but also to changes in mood and depression. 498 00:51:58,920 --> 00:52:03,810 What could we maybe detect even before a person needs to get a clinical diagnosis? 499 00:52:03,840 --> 00:52:07,140 Could we prevent depression? Could we prevent 80% of the cases of it? 500 00:52:07,170 --> 00:52:16,740 I think that's possible. We started to measure, not only with wearables but with smartphones, human behaviours that change very subtly, 501 00:52:17,190 --> 00:52:22,960 so subtly that you and I may not notice these things changing, but the technology is really good at picking up these changes. 502 00:52:22,980 --> 00:52:30,300 It can see the sophisticated, high-dimensional equivalent of the frog in the water, and the water getting hotter and hotter and hotter.
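As a rough illustration of the kind of signal processing the recap alludes to, here is a minimal sketch of flagging unusually large peaks in a skin conductance (electrodermal activity) trace using SciPy's generic peak finder. The sampling rate and thresholds are assumptions chosen for illustration; this is not the actual seizure-detection algorithm used with the wristband.

    # Minimal sketch (not the actual seizure-detection algorithm) of flagging
    # unusually large peaks in an electrodermal activity (EDA) trace.
    # The sampling rate and thresholds below are illustrative assumptions.
    import numpy as np
    from scipy.signal import find_peaks

    def flag_unusual_eda_peaks(eda, fs=4.0, z_threshold=4.0):
        """eda: 1-D skin conductance signal (microsiemens) sampled at fs Hz.
        Flags peaks whose prominence is far above this person's own typical peaks,
        since 'unusual' here is relative to the individual, not a fixed cutoff.
        Returns the times (in seconds) of the flagged peaks for human review."""
        peaks, props = find_peaks(eda, prominence=0.01)   # all candidate peaks
        if len(peaks) == 0:
            return np.array([])
        prom = props["prominences"]
        z = (prom - prom.mean()) / (prom.std() + 1e-9)    # peak size vs. this person's baseline
        unusual = peaks[z > z_threshold]                  # far larger than usual
        return unusual / fs                               # convert sample indices to seconds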
503 00:52:30,510 --> 00:52:33,840 And it could let us know that it's getting hotter and hotter and hotter. 504 00:52:36,300 --> 00:52:43,140 We're now using deep learning and machine learning AI in ways that are very personalised, to 505 00:52:43,140 --> 00:52:49,590 help people get insights into not only what trajectory they're on and what their stress, 506 00:52:49,740 --> 00:52:55,590 health and mood are likely to be tomorrow, but also what they could do to change that. 507 00:52:55,980 --> 00:52:59,970 And in all of this, surprisingly, there is a very low-tech solution. 508 00:53:00,750 --> 00:53:03,900 All this fancy AI and all, and what do we need most? 509 00:53:04,170 --> 00:53:09,120 We seem to need each other. We seem to need human social interaction. 510 00:53:09,990 --> 00:53:14,129 We need to sometimes get our heads out of our phones and our computers and just 511 00:53:14,130 --> 00:53:20,640 reconnect with each other and have what Barbara Fredrickson calls positivity resonance, 512 00:53:20,640 --> 00:53:24,540 where we share a smile, where we share a story, share a hug. 513 00:53:24,570 --> 00:53:29,100 It doesn't always have to be a happy thing. It can be a sad thing. We empathise, we connect. 514 00:53:29,460 --> 00:53:38,670 And these things are highly associated with tomorrow being happier, less stressed and healthier. 515 00:53:41,380 --> 00:53:48,160 I hope that you will join me now in Q&A about this, so that we can explore together how to make this future. 516 00:53:48,160 --> 00:53:55,510 And I invite you also to go online. We have publications related to all of this, and I'm happy to engage now and hear what's on your mind. 517 00:53:55,660 --> 00:54:03,380 Thank you. Thank you.