Thank you, and thank you for inviting us; it's lovely to be part of this audience. I'll tell you a little bit about what I and some of my colleagues do around AI in finance. When I do this with my MBA students, I always begin by asking them: what is finance about, in one word? What do you think finance is about, if you had to describe it in one word? Everyone who has not worked in finance says it's about money, but everyone who is working or has worked in finance says something else. And guess what: finance is about risk. The everyday life of finance practitioners is really all about risk: measuring risk, quantifying risk, and then pricing risk. For example, if I ask the bank for a loan, the bank uses whatever information they have about me and comes up with a probability, a number that describes the likelihood that I will pay back. Based on that number they can attach a decision, whether to give me the loan or not, or in some cases also a rate that goes with that loan, depending on regulation, time and place. And they can do that because, of course, I'm one person and I will pay back, yes or no; but if there are lots and lots of people like me, that individual risk translates into some kind of distribution, and, like a casino, they can come up with a pricing that compensates them for the risk they take.
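The pricing step in the loan example can be sketched in a couple of lines. Purely for illustration, assume the bank wants its expected repayment to match a risk-free return and that a defaulted loan repays nothing; the break-even rate then solves (1 - p)(1 + r) = 1 + r_f:

```python
def break_even_rate(p_default: float, risk_free: float = 0.02) -> float:
    """Loan rate at which expected repayment matches the risk-free return.

    Solves (1 - p_default) * (1 + r) = 1 + risk_free for r,
    assuming full loss on default (no recovery).
    """
    return (1 + risk_free) / (1 - p_default) - 1

# A borrower with an estimated 5% default probability:
rate = break_even_rate(0.05)
print(f"{rate:.2%}")  # prints 7.37%
```

A real lender would add recovery rates, funding costs and a margin, but the casino logic is the same: across many borrowers, charging each one their break-even rate compensates for the ones who default.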
So this is really what happens in most aspects of finance: something like what I've just described. But this is where technology and AI can have a big impact, because the bank in this case is trying to estimate the likelihood that I will pay back, and the bank doesn't know as much as I do. If they could go inside my head, they would have a better idea of the likelihood that I will pay back. But if they could go into this device, they would have a much better idea, because this knows the likelihood that I will pay back a loan better than I do. And I know some of you are excited by that and some of you are scared by it, but this is what we've seen. If you remember the scandal with Facebook and the election, it came from a project that started at Facebook, although they don't like to talk about it, called myPersonality. It started as a Facebook project with the Cambridge people, before they split off, where they looked at the correlation between Facebook likes, which at that time were publicly available information, and what we call personality, as measured by the Big Five, the standard way of measuring personality. And if you remember the result, they found that if you have enough likes, the algorithm can predict your personality better than your best friends can.
In fact, they came up with a number, which I think was 200 but I'm not sure, such that with 200 likes the algorithm can predict your personality with more accuracy than your partner can. Think about that: the person who lives with you, and the algorithm can predict better. So clearly the information is phenomenal, and this is quite revealing. We could stop here, because you can already see both the opportunity and the challenges around algorithms making decisions for us and using this technology. Now, using technology, using information via the internet and other devices, is not new. It's new in finance, but it's been used in other areas. For example, in the travel industry: until about eight years ago, if you travelled somewhere, you could have a bad hotel or you could have a good hotel. There was lots of variability. There is still variability, but a lot less. It's very unlikely now that you'll go to a place and it will be completely horrible, because of TripAdvisor, because people share the information and the places know that. They know that nobody will come if they get really bad reviews on TripAdvisor. So that is a way of aggregating information and reducing risk. And maybe an even better example is in the insurance industry.
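Going back to the likes result for a moment: the prediction machinery behind it is, at its core, a linear model over which pages a user has liked. A minimal sketch; the page names, weights and trait are all invented for illustration and are not from the actual study, where the weights were fitted by regression on data from millions of users:

```python
# Hypothetical per-page weights for one Big Five trait (extraversion).
# These values are made up for illustration; in practice they come
# from a regression fitted on a large labelled sample of users.
PAGE_WEIGHTS = {
    "salsa_dancing": 0.8,
    "house_parties": 0.6,
    "chess_club": -0.3,
    "quiet_reading": -0.7,
}

def predict_trait(likes: set, baseline: float = 0.0) -> float:
    """Score a user on one trait as a baseline plus the sum of the
    weights of the pages they liked (unknown pages contribute nothing)."""
    return baseline + sum(PAGE_WEIGHTS.get(page, 0.0) for page in likes)

print(predict_trait({"salsa_dancing", "house_parties"}))  # high score
print(predict_trait({"quiet_reading", "chess_club"}))     # low score
```

The striking finding was simply that, with enough likes, a model this shape beats the people who know you best.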
If you are a young driver, or if you have children who are young drivers, you're all familiar with this thing called insurethebox, or "the box"; there are different names for it. It's essentially a device that you put in the car, and it transmits a lot of information about the driver all the time. If you think of the loan example I started with, it's pretty clear that this is a way of reducing the information asymmetry. The insurance company is saying: I really don't know; I think young people are statistically bad drivers, so I have to give you a really bad rate to compensate for the fact that I don't know anything about you and you are young. But you can say: look, let me convince you that I'm a good driver, and this is a credible way for me to use technology and transmit the information live, all the time, to do exactly that. And if you're familiar with fintech, you know there are lots of apps like that which help people build credit history and so on, in much the same way as what I've just described. So a lot of this new stuff is known as fintech, or financial technology, and it's relatively new. The reason it's new is that until about 2011-12, the regulators stopped any kind of use of technology in finance, because the regulators' view was: we are here to protect the public, and anything that involves people lending money to each other or doing all kinds of risky stuff is bad.
And so we would say no. That changed after the financial crisis, after the government instructed the FCA (it used to be the FSA), the regulators, to work with the start-ups and help them come up with all these new things. And so we have this explosion of new digital banks and all these applications that you may have heard of, which are doing exactly that: using technology in finance and coming up with all kinds of new things. Let me give you a couple of examples. One is the much-loved company Wonga, if you've heard of them. They're no longer in business, but the idea was to give what are now called payday loans: very small amounts of money, for a short amount of time, charging quite high rates. The idea is that these are people who don't have credit cards and don't have any other way of borrowing money; the banks wouldn't touch them. So Wonga was offering them something that they would use. But the key to it was using this technology, because they typically need the money now, and the decision has to be made very, very quickly, normally on a smartphone, using the information available. Another example, a bit bigger in scale but similar, is Funding Circle, a company started by two of our undergraduates from the business school, from Keble College, and now a publicly traded company; you may have seen their adverts on television.
They're quite big. Funding Circle is peer-to-peer lending: they take money from us, the public, and lend it to small and medium enterprises, small shops, small businesses that are starting out. Again, you have this idea that there is a gap in the market, because after the financial crisis the banks stopped lending to these people. They need money, and some of them are, you know, not bad companies. The public, on the other hand, is getting close to zero interest on their accounts. So why wouldn't the public lend to them and share some of the rewards? And they created a very, very successful business based on that. There are many more like them, but hopefully you've heard of at least one of these two. Both of them, as I said, are offering a service where there is a gap; they're offering something that didn't exist before. That's a good thing, and both sides are happy, roughly speaking. But both rely heavily on algorithms to make decisions. With Wonga, you can't have a person; it's not worth your while having a person make a decision on a loan of £10. But with Funding Circle, even though the sums are much bigger, it's still highly dependent on the algorithms. And with Funding Circle, as an example, there are algorithms on both sides of the market.
When a small or medium company asks for a loan, it is the algorithm that essentially makes the decision whether to give it the loan or not. I think there's nothing particularly sophisticated about this algorithm, though I haven't seen it; it uses the same information as a bank would use to make the decision. But there are also algorithms on the other side, the public side. We can't invest the money directly in the businesses that we like; what happens is they take our money and split it between a minimum of 100 companies. So if you put in a thousand pounds, it goes to a minimum of 100 different companies. The idea is that you get diversification, and if one or two or even five businesses fail, you will still get a decent return for your money. So it all makes sense, but you can immediately see that this is a different kind of finance, where all the decisions are made by the algorithms. These algorithms in finance are very similar to the ones you've just heard about, and they have big advantages, obviously, because they allow us to do things we couldn't do before. And as I said, both sides are happy with it. But it also comes, as you've just heard, with lots of challenges. The challenge in this case is: how do we make sure that the algorithms do what they're supposed to do? And it's not because the algorithms are bad; it's because we don't really have control over that black box.
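The diversification claim about splitting £1,000 across at least 100 loans is easy to check with arithmetic. A sketch under made-up assumptions (one-year loans, a hypothetical 6% rate, failed loans repaying nothing):

```python
def portfolio_return(stake: float, n_loans: int, n_failures: int,
                     rate: float = 0.06) -> float:
    """Value of a stake split equally across n_loans one-year loans,
    where failed loans repay nothing and the rest repay 1 + rate."""
    per_loan = stake / n_loans
    return per_loan * (n_loans - n_failures) * (1 + rate)

# £1,000 across 100 loans at the hypothetical 6% rate:
for failures in (0, 2, 5):
    # even 5 failures out of 100 still returns more than the £1,000 staked
    print(failures, round(portfolio_return(1000, 100, failures), 2))
```

With five failures the investor still ends up ahead of a zero-interest bank account, which is exactly the pitch.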
And so that creates all kinds of issues for the regulators that they wouldn't have been familiar with before, and they have to work out how to deal with them. For example, and I'm not suggesting any of these companies are doing this, you can see that there is pressure on them to match demand and supply. It's a two-sided market: you have to give loans, and you have to have money from the public invested in those loans. What happens if more people sign up to lend money than there are businesses asking for money? Does that create pressure on them to maybe slightly change the parameters of the algorithm and say yes to more loans? This is the kind of thing you really need to think about. And in the case of Wonga I can say, because it's public information, that this is exactly what happened, and the regulator got very, very upset with them, because they felt that the parameters were changed and they weren't fully informed. There's no suggestion of that with Funding Circle or with any of the other companies, but it gives you a taste of what it's like.
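The worry about quietly adjusting an algorithm to approve more loans can be made concrete: an approval rule is often just a cutoff on a predicted default probability, and relaxing the cutoff changes both volume and risk. A toy illustration with invented applicant scores:

```python
# Hypothetical predicted default probabilities for ten applicants.
applicants = [0.02, 0.04, 0.05, 0.08, 0.11, 0.15, 0.20, 0.28, 0.35, 0.50]

def approve(scores, cutoff):
    """Approve every applicant whose predicted default prob is below cutoff."""
    return [p for p in scores if p < cutoff]

for cutoff in (0.10, 0.25):
    book = approve(applicants, cutoff)
    avg_risk = sum(book) / len(book)
    print(f"cutoff {cutoff}: {len(book)} approved, "
          f"mean default prob {avg_risk:.3f}")
```

Raising the cutoff from 0.10 to 0.25 nearly doubles the approvals here, but the average default risk of the approved book roughly doubles too; that is the trade-off a regulator wants to be told about.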
Now, there is one part of finance where AI and algorithms are not new, and that is algorithmic trading. This is something I work on, and some of my colleagues in the business school have looked into it as well. Algorithmic trading started in the 80s. For example, the hedge fund Man AHL: AHL is the initials of three statisticians who came from Cambridge (apparently there's a university there), and they came up with one of the first black boxes that were used to trade. The reason the regulator didn't really worry about hedge funds is that hedge funds take money from very rich individuals and from very sophisticated investors, so they don't have to worry about them. And so we have a case study of AI and algorithms going back to the 80s. It's a really interesting one, because in algorithmic trading there have been a lot of start-ups, a lot of funds, some of them super successful, like AHL, like Millennium, like Renaissance, some of the big ones, but also some that have failed. So we can learn from these cases and see how and when algorithms are used properly, and when they are used inappropriately. Let me take two more minutes to finish on this. In two minutes, I want to say that algorithmic trading, as I said, dates from the 80s, but it's still growing and developing, because there are new sources of data.
For example, nowadays you can buy satellite data: with satellite images you can look at the car parks of companies and estimate demand from that. Or, when the CEOs of publicly traded companies speak to analysts, there are now lots of bots that translate that into numbers and do some kind of sentiment analysis, deducing whether there is optimism or pessimism, and then trading based on that. All these things exist, and actually one colleague of ours in the business school is working specifically on that. So it's very dynamic; there is a lot going on. And as I said, there are a lot of examples to learn from. Let me just give you a couple of sentences on how I think about it. It sounds a bit like The Matrix, where there are good programs and bad programs, but I think there are good algorithms and bad algorithms. A bad algorithm is like the trader from Hounslow who caused the flash crash, the one the Americans, I think, have just decided not to put in prison. He used algorithmic trading to trade really, really quickly, and he basically moved the whole American stock market. He was on trial, and I think they decided not to put him in prison for various reasons. But that's clearly an algorithm used badly, because it increases the risks.
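The earnings-call bots mentioned a moment ago can be caricatured in a few lines: count positive and negative words in the transcript and turn the balance into a signal. The word lists below are invented for illustration; real systems use far richer language models:

```python
# Illustrative signal-word lists; real lexicons are far larger.
POSITIVE = {"growth", "strong", "record", "exceeded", "confident"}
NEGATIVE = {"decline", "weak", "headwinds", "missed", "uncertain"}

def sentiment(transcript: str) -> float:
    """Net positive-word share in [-1, 1]; 0.0 if no signal words appear."""
    words = transcript.lower().split()
    pos = sum(w.strip(".,") in POSITIVE for w in words)
    neg = sum(w.strip(".,") in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

call = "We delivered record growth this quarter despite some headwinds."
print(sentiment(call))  # positive on balance: 2 positive words vs 1 negative
```

A desk might map such a score to a long or short tilt, but the hard part in practice is exactly the black-box problem from earlier: knowing why the bot thinks a call was optimistic.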
Good algorithms, by contrast, are when you use the information to reduce risks: if, for example, you trade lots and lots of different markets, which would be very difficult for a person to do, that in a way reduces the risk, because you are placing your bets in many, many more baskets than you otherwise would. And as I said, this is a fascinating area, and I'm looking forward to hearing from you at the end, and also to collaborating with other people around the university, because I think it looks different, but the questions we are facing are very, very similar. Thank you very much.