1
00:00:06,960 --> 00:00:14,880
OK, so today we are pleased to welcome Lionel Riou-Durand from the University of Warwick, where he's a postdoc.
2
00:00:14,880 --> 00:00:21,750
So he got his PhD in 2019 at ENSAE in Paris.
3
00:00:21,750 --> 00:00:29,400
So his main research interests lie in numerical approximations for statistical models and in particular,
4
00:00:29,400 --> 00:00:33,060
sampling algorithms and the Monte Carlo method.
5
00:00:33,060 --> 00:00:38,750
And today he will talk about Metropolis-adjusted Langevin trajectories.
6
00:00:38,750 --> 00:00:43,200
So, thank you very much, and over to you.
7
00:00:43,200 --> 00:00:51,150
Thank you very much for the invitation, and I'm very happy to talk today in the seminar; thanks for attending.
8
00:00:51,150 --> 00:01:01,500
So my talk today will be about Hamiltonian Monte Carlo and some
9
00:01:01,500 --> 00:01:14,520
tuning properties of this sampling algorithm, the tuning problems that we can face when using this sampler. And I'm going to
10
00:01:14,520 --> 00:01:22,230
start straight away. This is joint work with another postdoc from Warwick, whose name is Jure Vogrinc.
11
00:01:22,230 --> 00:01:27,560
And I'm going to split my presentation into three parts.
12
00:01:27,560 --> 00:01:33,680
First, I'm going to explain a bit what Hamiltonian Monte Carlo is:
13
00:01:33,680 --> 00:01:43,970
what the method is, how it works, and also introduce some tuning problems for its parameters.
14
00:01:43,970 --> 00:01:51,760
We are going to look at some suggestions that have been proposed in the literature to tackle these problems.
15
00:01:51,760 --> 00:01:54,530
Then we are going to motivate our take on the question,
16
00:01:54,530 --> 00:01:59,520
the problem we are going to study, and in particular we are going to introduce a diffusion process:
17
00:01:59,520 --> 00:02:06,470
the Langevin diffusion. Here the name "Langevin diffusion" should be understood as one of several names used in the literature.
18
00:02:06,470 --> 00:02:11,330
The one we are going to look at is also named the underdamped Langevin diffusion.
19
00:02:11,330 --> 00:02:18,410
It can take several names; we will just use this one name for all of these,
20
00:02:18,410 --> 00:02:28,850
and we are going to connect it with a version of HMC that relies on randomising the integration time, known as randomised HMC.
21
00:02:28,850 --> 00:02:33,770
And we are going to compare some exponential mixing rates between the two.
22
00:02:33,770 --> 00:02:40,760
This will motivate our study and our construction of an algorithm, called Metropolis-adjusted Langevin
23
00:02:40,760 --> 00:02:49,430
trajectories, that we will present as a robust alternative to HMC, and I will try to convince you in what sense it is
24
00:02:49,430 --> 00:02:57,260
robust. What does robustness mean here? Robustness refers to the robustness of the tuning of its parameters:
25
00:02:57,260 --> 00:03:07,220
how easy it is to calibrate the parameters, in particular the integration time.
26
00:03:07,220 --> 00:03:12,230
And while we call it a robust alternative to HMC,
27
00:03:12,230 --> 00:03:19,850
it can also be seen simply as a robust extension, in the sense that you can recover the HMC sampler as a particular case of it.
28
00:03:19,850 --> 00:03:27,500
OK. So if you have any questions, I will make some pauses between the sections.
29
00:03:27,500 --> 00:03:31,790
But if you want to interrupt me, that's also possible, online as well.
30
00:03:31,790 --> 00:03:39,840
Alright. So let's go: I start with a few notations, just to define the notation I'm going to use in this talk.
31
00:03:39,840 --> 00:03:43,490
So here they are. All right.
32
00:03:43,490 --> 00:03:51,890
And here is the goal: in this talk, we are going to focus on the following objective,
33
00:03:51,890 --> 00:03:58,080
which is to sample approximately from a target distribution that admits a density with respect to the Lebesgue measure.
34
00:03:58,080 --> 00:04:02,150
So in the first line here, we only assume that the target pi is a distribution that admits
35
00:04:02,150 --> 00:04:16,680
a density, positive everywhere on R^d, and we denote by Phi the minus log-density, up to an additive constant, which we call the potential function.
36
00:04:16,680 --> 00:04:28,360
So this Phi is a function that we assume to be sufficiently smooth.
37
00:04:28,360 --> 00:04:38,200
And we assume that its gradient can be computed, as gradient-based MCMC sampling requires.
38
00:04:38,200 --> 00:04:42,760
And somehow, we are not going to refer to any particular statistical model here.
39
00:04:42,760 --> 00:04:52,630
Just think of this target distribution pi as the posterior distribution of a high-dimensional Bayesian statistical model, for instance.
40
00:04:52,630 --> 00:05:02,440
And we will study the properties of several MCMC algorithms, including the HMC
41
00:05:02,440 --> 00:05:09,970
sampler, which is built upon the Hamiltonian dynamics that are defined here. And so what do these dynamics mean?
42
00:05:09,970 --> 00:05:18,880
Well, you see, they are defined here as a system of differential equations which describes the motion of both a position
43
00:05:18,880 --> 00:05:24,040
x and a velocity v, so that if you start from any location (x0, v0) in the state space,
44
00:05:24,040 --> 00:05:37,570
well, you will follow some contours here of an extended target, defined via its density as a product of pi and a standard Gaussian.
45
00:05:37,570 --> 00:05:41,370
And that's it. OK. So these are the dynamics.
46
00:05:41,370 --> 00:05:47,760
They have one property: they leave this extended distribution invariant, in the sense that if you start
47
00:05:47,760 --> 00:05:52,830
from (x0, v0) distributed according to this distribution, then for any time t,
48
00:05:52,830 --> 00:05:59,490
(xt, vt) will also be distributed according to it. So in that sense, they leave the distribution invariant.
49
00:05:59,490 --> 00:06:08,050
Of course, these dynamics are exactly deterministic, which means that they cannot yield an ergodic process.
50
00:06:08,050 --> 00:06:11,410
Simply because, if you start from a location (x0, v0),
51
00:06:11,410 --> 00:06:17,160
the behaviour will be completely deterministic, and you cannot hope for convergence in distribution towards the target.
52
00:06:17,160 --> 00:06:26,260
So to be able to achieve that, we are going to need to inject some randomness into the algorithm, which we will do in two slides.
53
00:06:26,260 --> 00:06:34,150
But before that, we are going to present one method to approximate these dynamics, because in most general situations,
54
00:06:34,150 --> 00:06:42,880
we do not have closed-form solutions for these dynamics. So what we do instead is approximate these dynamics with a time discretisation.
55
00:06:42,880 --> 00:06:55,120
Here, I present probably the most famous discretisation scheme for Hamiltonian dynamics, known as the leapfrog integrator,
56
00:06:55,120 --> 00:06:58,400
also known as the Stormer-Verlet update, which is defined as follows.
57
00:06:58,400 --> 00:07:04,150
So what is its principle? Well, it's built upon a splitting of the Hamiltonian dynamics into two parts.
58
00:07:04,150 --> 00:07:08,890
First, we start with the motion of the velocity for half a step, h over two,
59
00:07:08,890 --> 00:07:14,810
and then we follow the dynamics of the position
60
00:07:14,810 --> 00:07:22,030
x for a full step h, and we recompose again with the equations of motion for the velocity for another half step.
61
00:07:22,030 --> 00:07:29,080
And so together they yield this leapfrog update, or Stormer-Verlet update.
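[As an aside from the transcript: a minimal Python sketch of the leapfrog update just described. This is my illustration, not the speaker's code, and it assumes a one-dimensional standard Gaussian potential Phi(x) = x^2/2, so grad Phi(x) = x.]

```python
def leapfrog(x, v, grad_phi, h, n_steps):
    """Approximate a Hamiltonian trajectory with n_steps leapfrog
    (Stormer-Verlet) updates of time step h."""
    v = v - 0.5 * h * grad_phi(x)       # half step on the velocity
    for i in range(n_steps):
        x = x + h * v                   # full step on the position
        if i < n_steps - 1:
            v = v - h * grad_phi(x)     # two consecutive half steps merged
    v = v - 0.5 * h * grad_phi(x)       # final half step on the velocity
    return x, v

# Standard Gaussian potential Phi(x) = x^2 / 2, so grad Phi(x) = x.
grad = lambda x: x
x1, v1 = leapfrog(1.0, 0.0, grad, h=0.1, n_steps=50)
```

[A quick sanity check: the energy Phi(x) + v^2/2 stays nearly constant along the discretised trajectory, and integrating back with a flipped velocity returns to the start, which previews the reversibility property discussed next.]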
62
00:07:29,080 --> 00:07:38,710
And this discretisation has some properties that make it particularly suitable for building a sampler out of it.
63
00:07:38,710 --> 00:07:43,330
And amongst these properties, two are particularly useful:
64
00:07:43,330 --> 00:07:48,190
first, the fact that it is volume-preserving, and second, that it is time-reversible.
65
00:07:48,190 --> 00:07:54,650
The second notion simply means that somehow the forward move here
66
00:07:54,650 --> 00:08:05,900
can be related to the backward move up to a flip of the velocity: if you reverse the sign of the velocity, the forward move is somehow related to the backward one.
67
00:08:05,900 --> 00:08:14,670
And we will denote the resulting leapfrog map; here we iterate it as a composition of a number L of steps.
68
00:08:14,670 --> 00:08:20,330
Right. So in particular, if we choose L to be proportional to T over h,
69
00:08:20,330 --> 00:08:25,460
then we will approximate, using this discretisation,
70
00:08:25,460 --> 00:08:30,770
a Hamiltonian trajectory of length T.
71
00:08:30,770 --> 00:08:40,730
So, built upon this time discretisation, the standard Hamiltonian Monte Carlo algorithm was introduced in 1987.
72
00:08:40,730 --> 00:08:43,220
And its principle is the following.
73
00:08:43,220 --> 00:08:53,120
So it requires choosing a time step h and an integration time T, and the number of leapfrog steps L will be proportional to their ratio.
74
00:08:53,120 --> 00:08:59,120
And at the start of each trajectory, we will simply draw a fresh Gaussian noise,
75
00:08:59,120 --> 00:09:06,480
v prime, drawn from N(0, I), and from (x, v prime) we will propose
76
00:09:06,480 --> 00:09:13,300
a new state by running a Hamiltonian trajectory of L leapfrog steps, that we are going to accept with a probability
77
00:09:13,300 --> 00:09:22,850
that corresponds to the ratio of densities between the output of the trajectory and the initial position-velocity pair (x0, v0).
78
00:09:22,850 --> 00:09:29,780
And, I forgot to say, to be able to yield
79
00:09:29,780 --> 00:09:35,090
a Markov chain that leaves pi invariant with such an HMC transition, if you just look at the last three lines: if you want to have a Markov chain
80
00:09:35,090 --> 00:09:43,130
that leaves the distribution invariant, you need a technical, just technical, step that consists of flipping the momentum
81
00:09:43,130 --> 00:09:51,050
whenever there is a rejection: you just flip the sign of the velocity whenever a rejection occurs.
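[As an aside from the transcript: a minimal sketch of the HMC transition just described, assuming a one-dimensional standard Gaussian target. This is my illustration, not the speaker's code; note that, as the talk points out next, the momentum flip upon rejection is immediately erased here by the full refreshment at the start of the next transition, so the code simply discards the velocity.]

```python
import math, random

def hmc_step(x, phi, grad_phi, h, n_steps, rng):
    """One HMC transition: fresh Gaussian velocity, leapfrog proposal,
    Metropolis accept/reject on the extended density exp(-Phi(x) - v^2/2)."""
    v = rng.gauss(0.0, 1.0)               # full momentum refreshment
    x_new, v_new = x, v
    v_new -= 0.5 * h * grad_phi(x_new)    # leapfrog trajectory
    for i in range(n_steps):
        x_new += h * v_new
        if i < n_steps - 1:
            v_new -= h * grad_phi(x_new)
    v_new -= 0.5 * h * grad_phi(x_new)
    log_ratio = (phi(x) + 0.5 * v * v) - (phi(x_new) + 0.5 * v_new * v_new)
    if math.log(rng.random()) < log_ratio:
        return x_new, True
    return x, False    # reject: the flipped momentum -v would be erased anyway

phi = lambda x: 0.5 * x * x               # standard Gaussian target
grad = lambda x: x
rng = random.Random(0)
x, accepts, samples = 0.0, 0, []
for _ in range(5000):
    x, ok = hmc_step(x, phi, grad, h=0.2, n_steps=8, rng=rng)
    accepts += ok
    samples.append(x)
```

[On a Gaussian target the leapfrog energy error is tiny at this step size, so nearly every proposal is accepted and the samples roughly match the N(0, 1) moments.]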
82
00:09:51,050 --> 00:09:56,480
Of course, this is of very little importance here, because at every step of the algorithm we just fully refresh the momentum,
83
00:09:56,480 --> 00:10:02,690
which means that we could just erase this last step, and it would not have changed anything in the algorithm.
84
00:10:02,690 --> 00:10:12,530
So, if the momentum flip keeps being erased by full momentum refreshments, why talk about it for two minutes? Here is the answer:
85
00:10:12,530 --> 00:10:25,310
because a few years later, an extension of this algorithm was introduced, known as generalized HMC, and it's built on the following principle.
86
00:10:25,310 --> 00:10:32,420
It's the same algorithm, except it involves now an additional parameter alpha, known as the persistence parameter.
87
00:10:32,420 --> 00:10:39,810
So when alpha is equal to zero, we recover exactly the HMC algorithm that I just presented.
88
00:10:39,810 --> 00:10:45,600
Now, as soon as alpha is strictly positive, the momentum refreshment becomes partial,
89
00:10:45,600 --> 00:10:50,710
right, and the closer alpha is to one, the more partial the refreshment becomes.
90
00:10:50,710 --> 00:11:04,010
OK, so one motivation for inducing persistence between successive trajectories is that, somehow, to yield efficient sampling,
91
00:11:04,010 --> 00:11:09,360
we want to be able to explore the state space efficiently, and to reuse the velocity information as much as possible.
92
00:11:09,360 --> 00:11:13,920
So in that sense, it's a good motivation for it.
93
00:11:13,920 --> 00:11:22,220
Now there is a drawback here, which is that the last step now involves a momentum flip that will be only partially erased as soon as alpha is
94
00:11:22,220 --> 00:11:29,980
strictly positive, and when alpha goes to one, there will be less and less refreshment.
95
00:11:29,980 --> 00:11:37,450
So in particular, this version, because of these momentum flips, received,
96
00:11:37,450 --> 00:11:44,500
I think, limited interest compared to the standard HMC sampler,
97
00:11:44,500 --> 00:11:56,290
although quite a lot of work has been devoted to trying to reduce the number of momentum flips by methodological innovations,
98
00:11:56,290 --> 00:12:03,910
including, for instance, methods built upon delayed rejection, where we increase a bit the computation time to be able to avoid a rejection.
99
00:12:03,910 --> 00:12:06,910
Because when these momentum flips occur,
100
00:12:06,910 --> 00:12:13,990
somehow they do really the opposite of what we would like, because the chain will backtrack upon the rejection.
101
00:12:13,990 --> 00:12:20,600
When a flip occurs, somehow, the dynamics of x are guided by the velocity,
102
00:12:20,600 --> 00:12:35,690
so if you just flip the velocity v, then you will just go backwards, OK?
103
00:12:35,690 --> 00:12:39,920
Right, so this is actually quite frustrating for this choice of alpha,
104
00:12:39,920 --> 00:12:46,850
especially because one of the original goals of the 1991 paper by Horowitz was to suggest using alpha close
105
00:12:46,850 --> 00:12:55,570
to one, to be able to mimic a diffusion process known as the Langevin diffusion, which I will introduce in a few slides.
106
00:12:55,570 --> 00:12:59,560
And the point is that this choice actually leads to a lot of momentum
107
00:12:59,560 --> 00:13:07,210
flips, which have to be controlled somehow by choosing a small time step; we're going to talk a bit more about this in a few slides.
108
00:13:07,210 --> 00:13:15,070
But before that, I'm going to talk a bit more about the tuning of the two parameters: the choice of the time step h and the integration time T.
109
00:13:15,070 --> 00:13:18,460
So one question about these algorithms is how to choose their parameters,
110
00:13:18,460 --> 00:13:24,670
and we are going to talk a bit about solutions that have been suggested in the literature
111
00:13:24,670 --> 00:13:28,980
To tackle this point.
112
00:13:28,980 --> 00:13:44,120
So a first solution concerns the choice of the time step h, and is built upon a study of the scaling limits of the HMC algorithm.
113
00:13:44,120 --> 00:13:52,460
And there are these assumptions here, which assume that the potential function writes as a sum of i.i.d. potentials.
114
00:13:52,460 --> 00:14:02,730
So this assumption is actually very strong and not very realistic, because it simply boils down to assuming that the distribution has a product form,
115
00:14:02,730 --> 00:14:08,200
which means that all the components are independent here.
116
00:14:08,200 --> 00:14:16,480
And if all the components are independent, then we can simply solve each of the sampling problems individually.
117
00:14:16,480 --> 00:14:21,190
So this is not really about how realistic this assumption is here,
118
00:14:21,190 --> 00:14:28,000
but it's more a convenient mathematical framework to be able to study the scaling limit, and especially the behaviour
119
00:14:28,000 --> 00:14:35,710
of the acceptance rate as the dimension goes to infinity, when choosing a time step that decays with the dimension.
120
00:14:35,710 --> 00:14:40,630
So the goal here is really to understand how the accept-reject mechanism works in high
121
00:14:40,630 --> 00:14:49,660
dimension, on a simple mathematical model that keeps the analysis tractable.
122
00:14:49,660 --> 00:14:55,270
And so these works are known as optimal scaling.
123
00:14:55,270 --> 00:15:04,540
They were actually first introduced for a much simpler algorithm than HMC, the random-walk Metropolis algorithm.
124
00:15:04,540 --> 00:15:08,260
And then in 2013, they were derived for the HMC
125
00:15:08,260 --> 00:15:18,280
sampler by Beskos and co-authors, and the result that they obtain is the following: if you choose the time step to be
126
00:15:18,280 --> 00:15:23,440
a function that goes to zero, as the dimension goes to infinity, like d to the minus one quarter,
127
00:15:23,440 --> 00:15:29,140
then in particular the acceptance rate converges to a non-trivial limit between zero and one.
128
00:15:29,140 --> 00:15:36,220
The first result is that this rate manages to get an acceptance rate that is strictly between zero and one, asymptotically.
129
00:15:36,220 --> 00:15:44,500
And then the mathematical framework is actually rich enough to enable
130
00:15:44,500 --> 00:15:50,110
the optimisation of several measures of efficiency,
131
00:15:50,110 --> 00:15:57,880
including the expected squared jumping distance, for instance, which yields a very simple tuning rule.
132
00:15:57,880 --> 00:16:01,000
And for the best choice of time step
133
00:16:01,000 --> 00:16:09,770
that you can get within this setting, the acceptance rate will converge to sixty-five percent.
134
00:16:09,770 --> 00:16:17,420
So this gives us some information about the number of steps, and therefore the number of gradient evaluations that we are going
135
00:16:17,420 --> 00:16:26,930
to have to compute to be able to reach a certain amount of integration time: the number of steps should be inversely proportional to the time step.
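[As an aside from the transcript: the d^(-1/4) scaling just mentioned can be turned into a tiny budget calculation. This is my illustrative sketch, not from the talk; the constant c = 0.5 and trajectory length T = 1.0 are arbitrary placeholders.]

```python
def leapfrog_budget(d, T=1.0, c=0.5):
    """Optimal-scaling rule of thumb: time step h = c * d**(-1/4), so the
    number of leapfrog steps (gradient evaluations) for a trajectory of
    length T grows like d**(1/4)."""
    h = c * d ** -0.25
    n_steps = max(1, round(T / h))
    return h, n_steps

h1, L1 = leapfrog_budget(d=100)
h2, L2 = leapfrog_budget(d=1600)   # 16x the dimension
```

[Multiplying the dimension by 16 halves the step size and therefore roughly doubles the per-trajectory gradient budget.]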
136
00:16:26,930 --> 00:16:32,250
And this also gives a very simple tuning rule for choosing h,
137
00:16:32,250 --> 00:16:37,590
that is simply built on monitoring, out of the MCMC outputs,
138
00:16:37,590 --> 00:16:45,830
the global acceptance rate of the MCMC chain, and tuning the time step accordingly.
139
00:16:45,830 --> 00:16:55,570
OK. Let's say that it's quite a convenient rule for tuning;
140
00:16:55,570 --> 00:17:04,680
we simply highlight that it's also only a partial answer to the question we just asked, because it is a tuning of the time step h
141
00:17:04,680 --> 00:17:15,790
conditionally on a fixed integration time, and for a given value of alpha; here one can reasonably assume that we have full refreshments.
142
00:17:15,790 --> 00:17:20,320
Now we are going to talk a bit more about the choice of the integration time.
143
00:17:20,320 --> 00:17:32,350
In particular, we are going to briefly talk about one solution that has been proposed, known as the No-U-Turn Sampler.
144
00:17:32,350 --> 00:17:45,430
And somehow, I think it's fair to say that the choice of the integration time T is really one big question for the tuning of HMC.
145
00:17:45,430 --> 00:17:52,360
And there has been a lot of work on tuning this parameter; I just present one idea here.
146
00:17:52,360 --> 00:18:00,190
But there are many others, and I think these works received quite a lot of interest because they are useful to provide
147
00:18:00,190 --> 00:18:05,840
black-box samplers that require really minimal tuning from users,
148
00:18:05,840 --> 00:18:09,610
since you can calibrate automatically, that is, the value of T.
149
00:18:09,610 --> 00:18:15,910
And in the case of the No-U-Turn Sampler, without really detailing the algorithm,
150
00:18:15,910 --> 00:18:25,450
the idea is built upon the principle that one should try to follow the dynamics until a U-turn is met,
151
00:18:25,450 --> 00:18:29,470
as illustrated by the drawing on the right of the slide.
152
00:18:29,470 --> 00:18:31,270
And why? Why this?
153
00:18:31,270 --> 00:18:45,700
Well, because when we look at the squared distance between x_t and x_0, and we differentiate it along these dynamics through time,
154
00:18:45,700 --> 00:18:53,740
we get that whenever this inner product is positive, this distance will be increasing, and when it becomes negative,
155
00:18:53,740 --> 00:19:02,050
the distance decreases. Somehow, if we want to be able to explore the state space efficiently, we want this distance to be large.
156
00:19:02,050 --> 00:19:11,650
And therefore the No-U-Turn Sampler provides an automatic tuning algorithm whose aim is to maximise this distance.
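[As an aside from the transcript: the U-turn criterion just described, d/dt |x_t - x_0|^2 proportional to the inner product of (x_t - x_0) with v_t, can be sketched in a few lines. This is my illustration, not the actual (recursive, reversibility-preserving) NUTS algorithm; it just follows a leapfrog trajectory on a standard Gaussian and stops when the inner product first turns negative.]

```python
import math

def uturn_time(x0, v0, h=0.01, max_steps=10000):
    """Follow leapfrog dynamics for a standard Gaussian (grad Phi(x) = x)
    and return the first time at which <x_t - x_0, v_t> becomes negative,
    i.e. the squared distance to the start begins to shrink."""
    x, v = x0, v0
    for k in range(1, max_steps + 1):
        v -= 0.5 * h * x          # one velocity-Verlet step
        x += h * v
        v -= 0.5 * h * x
        if (x - x0) * v < 0.0:    # U-turn criterion triggers
            return k * h
    return float("inf")

t_stop = uturn_time(x0=1.0, v0=1.0)
```

[For this start, the exact flow is x(t) = cos t + sin t, v(t) = cos t - sin t, and the criterion first triggers near t = pi/4, where the velocity factor changes sign.]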
157
00:19:11,650 --> 00:19:13,160
Having said that,
158
00:19:13,160 --> 00:19:19,280
if you are happy with this measure of efficiency and you are happy with this goal, well, that's definitely fine, right?
159
00:19:19,280 --> 00:19:23,000
I mean, that's up to you; you may choose this measure of efficiency.
160
00:19:23,000 --> 00:19:30,520
Somehow, it gives you guarantees in terms of a global metric, but it cannot give you any mixing guarantees for a fixed component.
161
00:19:30,520 --> 00:19:41,660
So in other words, it might be that if you maximise the distance, you have good mixing especially for the components
162
00:19:41,660 --> 00:19:51,800
that have large scales, but you really cannot say anything uniformly, especially for components that might have small scales.
163
00:19:51,800 --> 00:19:59,020
This is what we are going to highlight in the following. It's not really a criticism of the No-U-Turn Sampler per se;
164
00:19:59,020 --> 00:20:04,060
I'm really highlighting the fact that if, for instance, you want to choose one fixed value of T,
165
00:20:04,060 --> 00:20:09,070
Actually, you can find very simple examples where this can become very problematic.
166
00:20:09,070 --> 00:20:14,920
And actually, you don't have to go very far: already on a Gaussian target with heterogeneous scales.
167
00:20:14,920 --> 00:20:20,150
So suppose we have different sigma_i's here, that is, different variances.
168
00:20:20,150 --> 00:20:31,450
Suppose the sigma_i's are quite different, and we look at the problem of choosing one value of the integration time T that
169
00:20:31,450 --> 00:20:36,190
will be able to control all the autocorrelations at once, over all the components.
170
00:20:36,190 --> 00:20:40,990
OK, so we define the autocorrelation function rho_i of T for component
171
00:20:40,990 --> 00:20:45,430
i to be just the correlation of x_i(T) and x_i(0).
172
00:20:45,430 --> 00:20:49,480
And here, in the Gaussian case, virtually everything is explicit.
173
00:20:49,480 --> 00:20:53,750
The calculations are very straightforward, and these autocorrelations
174
00:20:53,750 --> 00:21:00,150
simply boil down to cosine functions with different frequencies. OK.
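[As an aside from the transcript: for a N(0, sigma^2) component, the exact Hamiltonian flow is x(T) = x0 * cos(T/sigma) + sigma * v0 * sin(T/sigma), which gives corr(x(T), x0) = cos(T/sigma). A small seeded Monte Carlo check of this identity (my illustration, not the speaker's code):]

```python
import math, random

def hmc_autocorr_gaussian(sigma, T, n=50000, seed=2):
    """Monte Carlo estimate of corr(x(T), x(0)) under the exact
    Hamiltonian flow for a N(0, sigma^2) target; should match cos(T/sigma)."""
    rng = random.Random(seed)
    num = 0.0
    for _ in range(n):
        x0 = rng.gauss(0.0, sigma)
        v0 = rng.gauss(0.0, 1.0)
        xT = x0 * math.cos(T / sigma) + sigma * v0 * math.sin(T / sigma)
        num += x0 * xT
    # divide by n * Var(x) = n * sigma^2 to get the correlation
    return num / (n * sigma * sigma)

rho = hmc_autocorr_gaussian(sigma=2.0, T=1.0)
```

[The point of the slide follows immediately: each component contributes a cosine cos(T/sigma_i) with its own frequency, so no single T makes them all small at once.]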
175
00:21:00,150 --> 00:21:06,060
So we can see that on the graph here: if you superpose all these autocorrelation functions,
176
00:21:06,060 --> 00:21:13,260
these cosine functions with given frequencies, and you have a bunch of them with different scales,
177
00:21:13,260 --> 00:21:23,730
then the task of choosing one value of T that will control all the correlations at once is quite hard, right?
178
00:21:23,730 --> 00:21:26,610
And sometimes it's just intractable:
179
00:21:26,610 --> 00:21:36,100
if you look at the maximum value of these autocorrelation functions as a function of T, this maximum can actually remain arbitrarily
180
00:21:36,100 --> 00:21:42,800
close to one. So one main solution,
181
00:21:42,800 --> 00:21:54,900
introduced quite early in the literature to tackle this problem, is to rely on choosing at random the amount of integration time at each iteration. So before each trajectory starts,
182
00:21:54,900 --> 00:21:57,260
we will draw the amount of integration time
183
00:21:57,260 --> 00:22:05,990
from a certain distribution; here, for instance, they consider the choice of the exponential distribution with rate lambda.
184
00:22:05,990 --> 00:22:11,900
And apart from this, the rest is the same as before.
185
00:22:11,900 --> 00:22:15,740
And the purpose of this is to be able somehow to induce a smoothing effect on
186
00:22:15,740 --> 00:22:24,360
the correlations by drawing these integration times at random at each iteration.
187
00:22:24,360 --> 00:22:33,240
The goal is to somehow average the autocorrelations of HMC, and in the Gaussian case the computations are tractable, as we can see here:
188
00:22:33,240 --> 00:22:41,490
the autocorrelation of this randomised approach actually boils down to an average of the autocorrelations of HMC for
189
00:22:41,490 --> 00:22:48,450
each component, and we obtain something here that can be uniformly controlled by the autocorrelation of the largest scale.
190
00:22:48,450 --> 00:22:59,050
There is a uniform bound, monotone with respect to lambda to the minus one, that will control
191
00:22:59,050 --> 00:23:03,330
all your autocorrelations at once; and recall that this lambda here is the rate of the exponential
192
00:23:03,330 --> 00:23:08,220
distribution, which means that the inverse of lambda is the expected integration time.
193
00:23:08,220 --> 00:23:11,550
So when the expected integration time goes to infinity,
194
00:23:11,550 --> 00:23:26,890
somehow this bound vanishes, and we are able to control all these smoothed correlations at once by choosing lambda to the minus one sufficiently large.
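[As an aside from the transcript: for an exponentially distributed integration time T ~ Exp(lambda), the averaged cosine autocorrelation has the closed form E[cos(T/sigma)] = lambda^2 / (lambda^2 + sigma^(-2)) = 1 / (1 + 1/(lambda*sigma)^2). The sketch below (my illustration, with arbitrary example values) checks this by seeded simulation and shows the uniform-control property: the bound is monotone in sigma, so the largest scale dominates, and it vanishes as the expected integration time 1/lambda grows.]

```python
import math, random

def rhmc_autocorr(sigma, lam):
    """Closed form of E[cos(T/sigma)] for T ~ Exp(lam): the averaged
    autocorrelation of randomised HMC on a N(0, sigma^2) component."""
    return 1.0 / (1.0 + 1.0 / (lam * sigma) ** 2)

# Seeded Monte Carlo check of the closed form (assumed example values).
rng = random.Random(3)
sigma, lam = 1.5, 0.4
est = sum(math.cos(rng.expovariate(lam) / sigma)
          for _ in range(200000)) / 200000

scales = [0.1, 0.5, 1.0, 1.5]
bounds = [rhmc_autocorr(s, lam) for s in scales]
```

[All components with smaller scales sit below the bound of the largest scale, and taking lambda smaller (longer expected trajectories) drives the whole family of bounds towards zero.]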
195
00:23:26,890 --> 00:23:32,680
So in the next section, I'm going to motivate a bit
196
00:23:32,680 --> 00:23:39,100
the problem that we are going to study; in particular, I'm going to introduce this Langevin diffusion process.
197
00:23:39,100 --> 00:23:49,300
So if you have any questions before, please don't hesitate. All right.
198
00:23:49,300 --> 00:23:58,600
All right, so here we are going to introduce a way, going back to the 1990s, to introduce some randomness in Hamiltonian trajectories.
199
00:23:58,600 --> 00:24:05,140
And the process that we are going to use to do this here is known as the Langevin diffusion process,
200
00:24:05,140 --> 00:24:11,920
and there are several other names for it, including the kinetic Langevin, the underdamped Langevin, or the second-order Langevin diffusion,
201
00:24:11,920 --> 00:24:22,570
depending on the community you belong to. It should not be confused with the overdamped Langevin diffusion.
202
00:24:22,570 --> 00:24:29,320
And so here, just note that it still involves both the position x and the velocity v,
203
00:24:29,320 --> 00:24:37,780
and you see that it's very related to the Hamiltonian dynamics, in the sense that when this parameter gamma is equal to zero,
204
00:24:37,780 --> 00:24:46,450
we just recover the Hamiltonian dynamics, and when gamma is strictly positive, it induces a stochastic differential equation here
205
00:24:46,450 --> 00:24:57,440
that somehow keeps the Hamiltonian part, but adds some momentum refreshment that is continuously induced by the Brownian motion.
206
00:24:57,440 --> 00:25:05,080
OK, so this gamma is known in the literature either as the damping parameter or the friction coefficient.
207
00:25:05,080 --> 00:25:13,800
And you can see here that one property of this process is that it leaves the stationary distribution invariant:
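[As an aside from the transcript: a crude simulation sketch of the underdamped Langevin diffusion dX = V dt, dV = -grad Phi(X) dt - gamma V dt + sqrt(2 gamma) dB. This is my illustration, not the speaker's code: it uses a plain Euler-Maruyama step (a more careful splitting scheme would be less biased), on a standard Gaussian target.]

```python
import math, random

def langevin_step(x, v, grad_phi, gamma, h, rng):
    """One Euler-Maruyama step of the underdamped (kinetic) Langevin
    diffusion; crude but enough to see the invariant distribution."""
    x_new = x + h * v
    v_new = (v + h * (-grad_phi(x) - gamma * v)
             + math.sqrt(2.0 * gamma * h) * rng.gauss(0.0, 1.0))
    return x_new, v_new

rng = random.Random(4)
grad = lambda x: x                  # standard Gaussian target
x, v, xs = 0.0, 0.0, []
for _ in range(100000):
    x, v = langevin_step(x, v, grad, gamma=2.0, h=0.02, rng=rng)
    xs.append(x)
```

[With a small step size, the empirical moments of x stay close to the N(0, 1) marginal of the invariant product distribution mentioned in the talk.]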
208
00:25:13,800 --> 00:25:20,770
it leaves the product of pi and a standard Gaussian invariant, just as the Hamiltonian dynamics do.
209
00:25:20,770 --> 00:25:25,150
However, one difference with the Hamiltonian dynamics is that, as soon as gamma is strictly positive,
210
00:25:25,150 --> 00:25:29,020
well, now the randomness induced can yield ergodicity.
211
00:25:29,020 --> 00:25:44,110
That is, you will be able somehow to converge in distribution towards pi, starting from any (x0, v0), under suitable assumptions.
212
00:25:44,110 --> 00:25:48,010
So what is nice is that this diffusion process gives a natural way to randomise
213
00:25:48,010 --> 00:25:54,130
the trajectories that is not built upon randomising the integration time.
214
00:25:54,130 --> 00:26:03,070
And so there is a natural question that we are investigating here: can we achieve some robustness of the tuning, in the
215
00:26:03,070 --> 00:26:12,460
sense that can we achieve some uniform control over the autocorrelations using this diffusion process, in a similar way as we had for randomised HMC?
216
00:26:12,460 --> 00:26:17,290
OK. So we consider the same Gaussian example here,
217
00:26:17,290 --> 00:26:22,990
and I'm going to ask you to believe me that we can compute explicitly these autocorrelation
218
00:26:22,990 --> 00:26:27,750
functions for the diffusion; in the Gaussian case they boil down to these equations.
219
00:26:27,750 --> 00:26:31,990
Rather than dwelling on these known equations, I think it's better simply to plot what happens to these functions.
220
00:26:31,990 --> 00:26:40,650
OK. So here's the first plot: in this first part, we will consider that there is only one fixed scale, sigma equal to one.
221
00:26:40,650 --> 00:26:49,580
And we will let the friction parameter vary; that is, we plot several values of the friction gamma.
222
00:26:49,580 --> 00:26:53,390
And we can see here that for the choice gamma equal to zero,
223
00:26:53,390 --> 00:27:03,620
we recover this sort of grey line, which simply boils down to the cosine autocorrelation function, the autocorrelation of HMC.
224
00:27:03,620 --> 00:27:08,070
And as soon as gamma is positive, all the autocorrelations will converge to zero
225
00:27:08,070 --> 00:27:16,600
as t goes to infinity. Now, if you compare these choices of gamma, well,
226
00:27:16,600 --> 00:27:24,970
the choice gamma equal to zero is the one that will make these autocorrelations reach zero faster than any other choice of gamma;
227
00:27:24,970 --> 00:27:38,350
there is no way around this. Somehow, if you are able to mimic exactly the Hamiltonian dynamics and you integrate up to T equal to pi over two,
228
00:27:38,350 --> 00:27:43,210
then you just get independent exact samples.
229
00:27:43,210 --> 00:27:50,440
And there's no hope, by somehow inducing some momentum refreshment, that you will go faster than this.
230
00:27:50,440 --> 00:27:56,300
So in that case, this means that if you want to sample from a standard Gaussian,
231
00:27:56,300 --> 00:28:03,670
or indeed from any isotropic distribution, well, amongst the choices of gamma, gamma equal to zero is the best, for sure.
232
00:28:03,670 --> 00:28:08,890
Now we are going to introduce heterogeneous scales and see their effect on the next slide.
233
00:28:08,890 --> 00:28:16,730
But just before that, I just want to mention that you can see the behaviour when gamma is one and two here, and
234
00:28:16,730 --> 00:28:26,770
three: above a certain threshold, equal to two, the curves decay monotonically here,
235
00:28:26,770 --> 00:28:32,230
and somehow there's a phase transition at gamma equal to two, which also coincides with the choice
236
00:28:32,230 --> 00:28:38,160
of gamma that will optimise the exponential rate of convergence to zero.
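[As an aside from the transcript: the phase transition at gamma = 2 can be seen from the eigenvalues of the linearised drift. This is my illustrative sketch for a standard Gaussian target (unit scale), where the drift matrix of the Langevin diffusion is [[0, 1], [-1, -gamma]]; the decay rate is the negated real part of the slowest eigenvalue.]

```python
import math

def slowest_decay_rate(gamma):
    """Exponential decay rate towards equilibrium for a unit-scale Gaussian:
    spectral abscissa of the drift matrix [[0, 1], [-1, -gamma]]."""
    disc = gamma * gamma - 4.0
    if disc < 0.0:                           # underdamped: complex pair
        return gamma / 2.0
    return (gamma - math.sqrt(disc)) / 2.0   # overdamped: slow real mode

rates = {g: slowest_decay_rate(g) for g in (0.5, 1.0, 2.0, 3.0, 4.0)}
```

[The rate grows like gamma/2 below the threshold, peaks at the critical damping gamma = 2 (rate 1), and degrades again in the overdamped regime, matching the phase transition on the slide.]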
237
00:28:38,160 --> 00:28:43,870
OK. So for any value of gamma below this threshold,
238
00:28:43,870 --> 00:28:51,220
we will refer to these values as the underdamped regime, and whenever gamma is higher than this threshold,
239
00:28:51,220 --> 00:28:54,820
we will call them overdamped.
240
00:28:54,820 --> 00:29:02,350
So now let's plot the same autocorrelations, but in the second framework: we are now going to fix some values of gamma and let the scales vary.
241
00:29:02,350 --> 00:29:05,800
Here they are. All right. So we chose two different values of gamma: gamma equal to zero
242
00:29:05,800 --> 00:29:23,520
and gamma equal to two here.
243
00:29:23,520 --> 00:29:31,410
Well, the only thing I wanted to say here is that there are two to three different behaviours, as above.
244
00:29:31,410 --> 00:29:36,750
And also, maybe I should use another word; we can discuss that.
245
00:29:36,750 --> 00:29:48,370
OK, thanks. So here we are going to plot, for two different values of gamma,
246
00:29:48,370 --> 00:30:01,310
different scales. So on the left you can see the same graph as a few slides earlier for HMC, that is, when gamma is equal to zero.
247
00:30:01,310 --> 00:30:08,570
So these oscillations are here. Well, we can see that there are different behaviours,
248
00:30:08,570 --> 00:30:13,000
depending on whether sigma is higher or lower than the reference scale.
249
00:30:13,000 --> 00:30:20,630
Sorry, the blue line corresponds to the reference scale, and you can see that for every scale lower than this reference scale,
250
00:30:20,630 --> 00:30:25,430
the autocorrelations will decay; they will be oscillatory,
251
00:30:25,430 --> 00:30:30,470
but they will all decay to zero and they will all be uniformly dominated
252
00:30:30,470 --> 00:30:33,310
by the autocorrelation corresponding to sigma max.
253
00:30:33,310 --> 00:30:42,230
OK, so actually this is quite interesting, and there is quite a simple rule here if you want to control all the autocorrelations at once.
254
00:30:42,230 --> 00:30:50,720
It is the following: if you choose the friction parameter to be equal to two divided by the largest scale, sigma max,
255
00:30:50,720 --> 00:30:54,140
well, you can control all the autocorrelations
256
00:30:54,140 --> 00:31:03,380
here by the autocorrelation corresponding to this largest scale, which decays exponentially fast, as we wished. Somehow,
257
00:31:03,380 --> 00:31:09,170
this is a similar property to what we would observe in the isotropic setting.
258
00:31:09,170 --> 00:31:24,560
And in that sense, we argue that a positive choice of friction allows for better robustness.
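To illustrate the rule just stated, here is a minimal numerical sketch (ours, not the speaker's): assuming a product of one-dimensional Gaussians with scales sigma, the stationary position autocorrelation of the underdamped Langevin diffusion solves a damped-oscillator equation, and gamma = 2/sigma_max is critical damping for the largest scale.

```python
import math

def position_autocorr(t, gamma, sigma):
    """Stationary position autocorrelation rho(t) of the underdamped Langevin
    diffusion targeting N(0, sigma^2): solves rho'' + gamma*rho' + rho/sigma^2 = 0
    with rho(0) = 1, rho'(0) = 0."""
    half = gamma / 2.0
    disc = 1.0 / sigma ** 2 - half ** 2
    if disc > 1e-12:        # underdamped: oscillatory decay
        w = math.sqrt(disc)
        return math.exp(-half * t) * (math.cos(w * t) + (half / w) * math.sin(w * t))
    if disc < -1e-12:       # overdamped: slower monotone decay
        w = math.sqrt(-disc)
        return math.exp(-half * t) * (math.cosh(w * t) + (half / w) * math.sinh(w * t))
    return math.exp(-half * t) * (1.0 + half * t)  # critical: gamma = 2/sigma

sigmas, sigma_max = [0.5, 1.0, 2.0], 3.0
gamma = 2.0 / sigma_max     # the rule from the talk: critical damping for sigma_max
grid = [0.1 * i for i in range(101)]
dominated = all(
    max(abs(position_autocorr(t, gamma, s)) for s in sigmas)
    <= position_autocorr(t, gamma, sigma_max) + 1e-9
    for t in grid
)
```

On this grid, every smaller-scale autocorrelation oscillates but stays below the critically damped curve of the largest scale, which decays exponentially fast.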
259
00:31:24,560 --> 00:31:29,780
So now I'm going to connect this Langevin diffusion to the randomized HMC process.
260
00:31:29,780 --> 00:31:34,850
And here you don't have to bother too much about the first half if you don't want. Somehow this
261
00:31:34,850 --> 00:31:40,520
first half simply expresses that the stochastic processes we are going to study here, the
262
00:31:40,520 --> 00:31:44,330
Langevin diffusion and the randomized HMC process, can be characterised through their
263
00:31:44,330 --> 00:31:52,220
infinitesimal generators. So this quantity here simply corresponds to the time derivative of the expected value of a test function
264
00:31:52,220 --> 00:31:57,780
of the process started from a point, and we are going to define the generators corresponding to them,
265
00:31:57,780 --> 00:32:08,850
this one and that one, and we are going to define two different types of refreshment, one built upon a Poisson process and the other one built upon a Wiener process.
266
00:32:08,850 --> 00:32:15,170
And so the randomized HMC generator and the Langevin generator can be expressed as the sum of two generators.
267
00:32:15,170 --> 00:32:22,450
They have one component in common, which is this one: the Hamiltonian generator.
268
00:32:22,450 --> 00:32:30,770
The other component is the refreshment part, and they differ through the choice of momentum refreshment.
269
00:32:30,770 --> 00:32:41,800
So for randomized HMC, the refreshments arrive in a discrete way, driven by a Poisson process with an intensity lambda.
270
00:32:41,800 --> 00:32:47,410
So the higher lambda is, the more often the refreshments occur, OK?
271
00:32:47,410 --> 00:32:53,320
OK. And the higher the parameter alpha here is,
272
00:32:53,320 --> 00:33:00,490
the more partial the refreshment becomes; alpha somehow measures how partial the refreshments are, right?
273
00:33:00,490 --> 00:33:09,490
And for the second process, there is this parameter gamma, the friction, that controls how fast we are going to refresh the momentum,
274
00:33:09,490 --> 00:33:12,400
continuously this time, through Brownian motion.
275
00:33:12,400 --> 00:33:23,330
And so the first result that we present here is that when you choose alpha going to one while the intensity lambda goes to infinity,
276
00:33:23,330 --> 00:33:27,040
with lambda scaled at this particular rate for this value of gamma,
277
00:33:27,040 --> 00:33:32,970
then what happens is that the generator of randomized HMC converges to the generator of the Langevin diffusion.
278
00:33:32,970 --> 00:33:41,080
OK, so what does it really mean? It means that the more partial and frequent the refreshments become,
279
00:33:41,080 --> 00:33:52,300
well, the closer we get to the continuous refreshment induced by Brownian motion, and therefore the closer randomized HMC becomes to the Langevin diffusion.
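To see why ever more frequent, ever more partial refreshments interpolate toward a continuous refreshment, note a small identity, sketched here under the usual AR(1) form of the partial refreshment, v <- alpha*v + sqrt(1 - alpha^2)*xi (an illustrative assumption of ours, not a quote from the talk): composing n such refreshments is again a partial refreshment, with coefficient alpha^n.

```python
def compose_refreshments(alpha, n):
    """Track (a, s2): after n partial refreshments v <- alpha*v + sqrt(1-alpha^2)*xi
    with independent standard Gaussians xi, the state equals a*v0 plus Gaussian
    noise of variance s2."""
    a, s2 = 1.0, 0.0
    for _ in range(n):
        a, s2 = alpha * a, alpha ** 2 * s2 + (1.0 - alpha ** 2)
    return a, s2

# composing n partial refreshments is one partial refreshment with alpha^n
a, s2 = compose_refreshments(0.9, 5)
```

So sending alpha to one while the Poisson clock fires faster and faster leaves the per-unit-time effect finite, consistent with the continuous Ornstein-Uhlenbeck refreshment of the Langevin diffusion.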
280
00:33:52,300 --> 00:34:03,530
So we will highlight this result, together with another result that we have found, which studies quantitative mixing rates for randomized HMC.
281
00:34:03,530 --> 00:34:12,780
OK. So here we no longer appeal to this limiting regime: we consider randomized HMC
282
00:34:12,780 --> 00:34:20,250
with arbitrary parameters: intensity lambda and partial refreshment alpha.
283
00:34:20,250 --> 00:34:28,590
And we are going to try to state some convergence results about the randomized HMC process under this assumption, A3 here.
284
00:34:28,590 --> 00:34:35,670
So this assumption A3 boils down to assuming that the distribution is strongly log-concave,
285
00:34:35,670 --> 00:34:44,610
which is the same as assuming that the Hessian of the potential is lower bounded by a positive constant.
286
00:34:44,610 --> 00:34:50,850
OK, so the right-hand side of the assumption was actually already assumed from the first slide of the presentation,
287
00:34:50,850 --> 00:34:57,780
because it just boils down to assuming that the gradient is Lipschitz in this context. And here is the theorem.
288
00:34:57,780 --> 00:35:06,480
If you choose the intensity lambda to be the scaling function that you see on the last line, which tends to infinity
289
00:35:06,480 --> 00:35:14,310
as alpha goes to one at this particular rate, where the constant here involves the square roots of m and M,
290
00:35:14,310 --> 00:35:18,610
then for any choice of alpha, we get some exponential convergence,
291
00:35:18,610 --> 00:35:25,670
with respect to the Wasserstein distance, and the rate rho here is as follows.
292
00:35:25,670 --> 00:35:29,550
Right. So let me unpack this statement, starting with the first part, right:
293
00:35:29,550 --> 00:35:36,480
the Wasserstein distance here is the distance between the law of the process after time t, started from a distribution nu, and the target,
294
00:35:36,480 --> 00:35:44,880
and the distribution after time t will be closer and closer to the target: the distance will converge to zero exponentially fast, with the rate rho.
295
00:35:44,880 --> 00:35:49,470
And then of course there are constants, C here, by which this can be controlled.
296
00:35:49,470 --> 00:35:55,350
But what is important here to highlight is the rate of convergence rho, and so.
297
00:35:55,350 --> 00:36:02,520
So these are upper bounds, it's true, but somehow this mathematical framework allows us to get quite
298
00:36:02,520 --> 00:36:06,570
explicit rates of convergence with respect to lambda, alpha and gamma,
299
00:36:06,570 --> 00:36:11,490
and also with respect to the constants, lower-case m and capital M, of the assumption.
300
00:36:11,490 --> 00:36:18,060
OK. And so for the rate of convergence here, you can see that it is an increasing function of alpha,
301
00:36:18,060 --> 00:36:22,670
and therefore it is optimised when alpha goes to one.
302
00:36:22,670 --> 00:36:26,480
The limit is the rate of the Langevin diffusion, when alpha equals one.
303
00:36:26,480 --> 00:36:36,750
And so when you combine both results together, and when you remark that this rate matches,
304
00:36:36,750 --> 00:36:43,110
that the limit of this rate of convergence, as alpha goes to one, is the rate that was obtained for the Langevin diffusion in the previous paper,
305
00:36:43,110 --> 00:36:48,870
well, actually you can see this result as an interpolation between previous results
306
00:36:48,870 --> 00:36:55,200
established by other authors and what we have done earlier for the Langevin diffusion.
307
00:36:55,200 --> 00:37:02,760
And we can interpret this result; the interpretation we present here is the following: we might argue
308
00:37:02,760 --> 00:37:16,090
that the Langevin diffusion can be seen as a limit of randomized HMC that achieves the fastest exponential rate amongst this family of samplers.
309
00:37:16,090 --> 00:37:26,830
All right, so now I'm going to introduce the algorithm that we are studying. If you have any questions before I jump into the algorithm we propose,
310
00:37:26,830 --> 00:37:36,340
please don't hesitate. [Question about the proof.]
311
00:37:36,340 --> 00:37:43,380
The proof here is built upon a synchronous coupling argument, just as in the paper that I referred to.
312
00:37:43,380 --> 00:37:49,810
And so the choice of coupling is not that complicated: it is the synchronous choice of coupling.
313
00:37:49,810 --> 00:37:59,890
Now, here the technical part relies on choosing a twist of the metric that is
314
00:37:59,890 --> 00:38:04,840
amongst the best twists of the metric, to be able to obtain the best rates possible.
315
00:38:04,840 --> 00:38:12,930
We can go into more detail on the parameters afterwards if you want.
316
00:38:12,930 --> 00:38:18,620
Good. OK, so let's go on.
317
00:38:18,620 --> 00:38:23,870
So here is the time discretisation that we consider for the Langevin diffusion.
318
00:38:23,870 --> 00:38:30,470
So this is somewhat motivated by the construction of a sampler directly built on Langevin
319
00:38:30,470 --> 00:38:37,010
trajectories, as an alternative to using randomized HMC trajectories.
320
00:38:37,010 --> 00:38:46,810
And we are going here to define a time discretisation that is built upon this leapfrog integrator that you see in the middle.
321
00:38:46,810 --> 00:38:54,010
So it's very similar to this leapfrog integrator. The difference is that we add some partial momentum refreshment before and after.
322
00:38:54,010 --> 00:38:59,860
So here alpha is no longer a free parameter: it is set by the value of the friction gamma and the time step h.
323
00:38:59,860 --> 00:39:05,140
OK. So when h goes to zero, we can interpret
324
00:39:05,140 --> 00:39:14,950
this as a time discretisation of the Langevin dynamics, and you can see that when gamma is equal to zero, we just recover the leapfrog scheme.
325
00:39:14,950 --> 00:39:19,690
So we use this particular time discretisation because it enables us to preserve some,
326
00:39:19,690 --> 00:39:27,550
some useful properties of the diffusion: it keeps time reversibility, and that will be useful in what follows.
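As a concrete sketch of this discretisation (often written OBABO: half refreshment, leapfrog, half refreshment), assuming unit mass and the standard coefficient eta = exp(-gamma*h/2); with gamma = 0 the refreshments vanish and a plain leapfrog step of HMC is recovered.

```python
import math, random

def obabo_step(x, v, h, gamma, grad_U):
    """One OBABO step for the kinetic Langevin dynamics with unit mass:
    partial momentum refreshment (O), leapfrog (BAB), refreshment (O)."""
    eta = math.exp(-gamma * h / 2.0)          # refreshment coefficient set by gamma, h
    c = math.sqrt(1.0 - eta * eta)
    v = eta * v + c * random.gauss(0.0, 1.0)  # O: partial refreshment
    v = v - (h / 2.0) * grad_U(x)             # B: half kick
    x = x + h * v                             # A: drift
    v = v - (h / 2.0) * grad_U(x)             # B: half kick
    v = eta * v + c * random.gauss(0.0, 1.0)  # O: partial refreshment
    return x, v
```

For gamma = 0 we have eta = 1 and c = 0, so the two O steps are the identity and one leapfrog step of HMC remains.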
327
00:39:27,550 --> 00:39:36,430
When writing this correction, we are going to denote the update kernel as b indexed by h and gamma.
328
00:39:36,430 --> 00:39:45,350
That refers to the distribution of (X_h, V_h) given (X_0, V_0).
329
00:39:45,350 --> 00:39:52,060
So the star here denotes a partial momentum refreshment.
330
00:39:52,060 --> 00:40:02,650
It keeps this density invariant. And the proposal we are going to use, you might even write it like this:
331
00:40:02,650 --> 00:40:13,270
it's just the composition of L steps of this kernel, to define the proposal of the Markov chain.
332
00:40:13,270 --> 00:40:18,400
So here is the algorithm. We call it Metropolis Adjusted Langevin Trajectories, or MALT,
333
00:40:18,400 --> 00:40:24,490
and it requires choosing the amount of friction gamma, a time step h and an integration time T.
334
00:40:24,490 --> 00:40:32,800
So at the start of each trajectory, we completely draw a new Gaussian momentum refreshment, just as for standard HMC.
335
00:40:32,800 --> 00:40:38,140
And then, instead of an HMC trajectory, we now propose a Langevin trajectory
336
00:40:38,140 --> 00:40:44,200
with friction gamma. And so when gamma is equal to zero, well, you know what happens:
337
00:40:44,200 --> 00:40:47,710
when gamma goes to zero, you actually recover exactly the HMC algorithm.
338
00:40:47,710 --> 00:40:52,380
Actually, at gamma equal to zero the density does not exist, so somehow we should just consider the ratio as a limit:
339
00:40:52,380 --> 00:41:00,950
it is what we obtain for the HMC case. And when gamma is strictly positive, then the acceptance ratio becomes
340
00:41:00,950 --> 00:41:08,860
this ratio of densities, taken with respect to a common reference measure, and you can interpret this probability in several ways.
341
00:41:08,860 --> 00:41:13,930
I just chose this one to highlight the fact that, somehow, in the denominator here,
342
00:41:13,930 --> 00:41:21,760
you can interpret it as the density of the forward trajectory from (x_0, v_0) to (x_L, v_L), and
343
00:41:21,760 --> 00:41:28,030
in the numerator here, we can interpret it as the density of the backward trajectory,
344
00:41:28,030 --> 00:41:34,380
with flipped momenta.
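A schematic version of the sampler being described might be sketched as follows. This is our own one-dimensional illustration, not the speaker's code: it assumes the log acceptance ratio accumulates exactly the leapfrog energy errors along the trajectory (the partial refreshments leave the standard Gaussian momentum marginal invariant, so they do not contribute); consult the paper for the authoritative statement.

```python
import math, random

def malt_step(x, L, h, gamma, U, grad_U):
    """One iteration of a Metropolis adjusted Langevin trajectories sampler,
    1-d sketch: full Gaussian momentum refreshment, then L OBABO steps; the
    log acceptance ratio accumulates only the leapfrog (kick-drift-kick)
    energy errors."""
    eta = math.exp(-gamma * h / 2.0)
    c = math.sqrt(1.0 - eta * eta)
    v = random.gauss(0.0, 1.0)                    # full refreshment at the start
    y, w, log_ratio = x, v, 0.0
    for _ in range(L):
        w = eta * w + c * random.gauss(0.0, 1.0)  # O: partial refreshment
        e_before = U(y) + 0.5 * w * w
        w -= (h / 2.0) * grad_U(y)                # B: half kick
        y += h * w                                # A: drift
        w -= (h / 2.0) * grad_U(y)                # B: half kick
        log_ratio += e_before - (U(y) + 0.5 * w * w)
        w = eta * w + c * random.gauss(0.0, 1.0)  # O: partial refreshment
    if random.random() < math.exp(min(0.0, log_ratio)):
        return y    # accept the trajectory endpoint
    return x        # reject; the momentum flip is erased by the next full refreshment
```

With gamma equal to zero this reduces to standard HMC with L leapfrog steps.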
345
00:41:34,380 --> 00:41:42,510
So as before, if you want these steps to yield a Markov chain that leaves the distribution invariant, satisfying detailed balance,
346
00:41:42,510 --> 00:41:50,190
you need to flip the momentum upon rejection; but since there is a full refreshment of the velocities here,
347
00:41:50,190 --> 00:41:58,310
well, actually, this flip is completely erased at each new trajectory you propose.
348
00:41:58,310 --> 00:42:06,440
So this is a difference with the GHMC approach, for which, for any choice of the persistence alpha,
349
00:42:06,440 --> 00:42:13,160
we could not erase those momentum flips, and we are forced into a trade-off between
350
00:42:13,160 --> 00:42:21,250
not having too many momentum flips while keeping a big enough time step to reduce the computational cost.
351
00:42:21,250 --> 00:42:28,270
All right, so we argued that this correction yields a neat way to stabilise the diffusion, in
352
00:42:28,270 --> 00:42:32,650
particular compared to previous approaches aiming to mimic this Langevin diffusion.
353
00:42:32,650 --> 00:42:39,100
Our approach is different in the sense that we aim to metropolise whole trajectories of length T,
354
00:42:39,100 --> 00:42:43,750
whereas previous approaches were studying a one-step correction.
355
00:42:43,750 --> 00:42:53,920
And for instance, I was talking about GHMC. So if you take GHMC, I mean, you take only one step at a time,
356
00:42:53,920 --> 00:42:58,990
and you have this partial refreshment that is very persistent, with alpha very close to one.
357
00:42:58,990 --> 00:43:06,370
You can also recover the diffusion when the time step goes to zero. However, here there is one difference for our approach:
358
00:43:06,370 --> 00:43:12,250
we have one degree of freedom, which is the length of the trajectory, chosen by the user,
359
00:43:12,250 --> 00:43:21,100
and that can help to reduce the correlations, by choosing it in a smart way.
360
00:43:21,100 --> 00:43:27,460
So the second property here is that momentum flips can be erased by full refreshments, which is not the case for the GHMC approach.
361
00:43:27,460 --> 00:43:34,000
And we argue that this, somehow, gives a robust extension of HMC.
362
00:43:34,000 --> 00:43:43,800
That's going to be the last point here, because we've seen that somehow using a positive damping can enable control of the worst-case autocorrelation function
363
00:43:43,800 --> 00:43:51,750
and achieve similar goals as what we would get when drawing the integration time at random.
364
00:43:51,750 --> 00:43:58,710
So in particular, also here we highlight the fact that as soon as gamma is strictly positive, trajectories will be ergodic,
365
00:43:58,710 --> 00:44:05,010
which means that you cannot really suffer from a problem that you have in HMC when gamma is equal to zero,
366
00:44:05,010 --> 00:44:11,160
which is that if you let your integration time T be large,
367
00:44:11,160 --> 00:44:29,590
you will just come back at some point to your original point, and waste computational resources while having a high correlation.
368
00:44:29,590 --> 00:44:39,750
Right, so when gamma is strictly positive and you look at the trajectory over a time T, somehow the trajectory has a distribution converging
369
00:44:39,750 --> 00:44:45,630
to its stationary distribution, and if T is large enough, somehow the correlations will decay to zero.
370
00:44:45,630 --> 00:44:54,000
And somehow the output of the trajectory will be closer and closer to an independent sample from,
371
00:44:54,000 --> 00:44:58,050
well, not exactly from pi, because it will be a biased version due to the time discretisation,
372
00:44:58,050 --> 00:45:15,740
but it will become less and less dependent from the start of the trajectory. [Inaudible question from the audience.]
373
00:45:15,740 --> 00:45:28,100
[The question continues, inaudible.]
374
00:45:28,100 --> 00:45:37,030
I'm not sure what you mean. So, you are talking about the choice of the integration time, I see, right.
375
00:45:37,030 --> 00:45:53,990
Yeah. When gamma is very close to zero, you can get close, after a certain integration time, to your original point,
376
00:45:53,990 --> 00:45:56,770
but somehow when gamma is sufficiently large,
377
00:45:56,770 --> 00:46:05,150
there is no particular reason why you should come back in a deterministic fashion to your original point.
378
00:46:05,150 --> 00:46:13,490
That's it. But maybe we can discuss this afterwards, I suppose.
379
00:46:13,490 --> 00:46:19,170
So we finally support this argument with a theoretical
380
00:46:19,170 --> 00:46:30,030
justification that is essentially quite similar to the one that has been established for HMC in 2013
381
00:46:30,030 --> 00:46:37,170
by Beskos and some co-authors, which boils down to a study of the scaling limits under this assumption.
382
00:46:37,170 --> 00:46:43,140
So it's not a different assumption: it's the same assumption as what we have for HMC.
383
00:46:43,140 --> 00:46:48,610
So under that assumption, we can prove that, for any choice of friction
384
00:46:48,610 --> 00:46:57,210
gamma, this MALT algorithm will have a scaling limit that enables a very
385
00:46:57,210 --> 00:47:04,140
explicit optimisation of efficiency measures, and that yields the following tuning rule:
386
00:47:04,140 --> 00:47:12,390
if you choose your step size such that the expected acceptance rate converges to 65 percent, this maximises the measure of efficiency.
387
00:47:12,390 --> 00:47:19,080
And this is true for any choice of gamma; in particular, it is the same 65 percent or so as for HMC.
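The 65 percent rule suggests a simple stochastic-approximation tuning loop; here is a minimal sketch on a toy acceptance curve, where the function accept_prob and the gain sequence are our own illustrative choices, not from the talk.

```python
import math

def tune_step_size(accept_prob, h0=1.0, target=0.65, iters=2000):
    """Robbins-Monro adaptation of log(h) toward a target acceptance rate.
    accept_prob(h) returns the (expected) acceptance rate at step size h."""
    log_h = math.log(h0)
    for n in range(1, iters + 1):
        # increase h when accepting too often, decrease it when rejecting too often
        log_h += (accept_prob(math.exp(log_h)) - target) / n ** 0.6
    return math.exp(log_h)

# toy model: acceptance decays smoothly as the step size grows
h_star = tune_step_size(lambda h: math.exp(-h))
```

In practice accept_prob would be replaced by the empirical accept/reject indicator of each trajectory; for the toy curve, the loop settles where exp(-h) = 0.65.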
388
00:47:19,080 --> 00:47:26,640
This is a consequence of the fact that we use the leapfrog integrator inside this discretisation of the underdamped diffusion,
389
00:47:26,640 --> 00:47:33,750
and we wanted to highlight somehow that we can preserve several properties of HMC by building upon
390
00:47:33,750 --> 00:47:42,060
it, in particular the scaling of the acceptance probability.
391
00:47:42,060 --> 00:47:57,370
With respect to h, when you study it, you can see it is quite related to HMC, although here it was not obvious a priori.
392
00:47:57,370 --> 00:48:05,830
Right, so that's the tuning rule: the tuning rule for the step size, conditional on T and on the friction, right?
393
00:48:05,830 --> 00:48:13,030
And for the friction: we looked at the use of positive friction to be able to have a robust tuning strategy for T.
394
00:48:13,030 --> 00:48:22,000
And then we simply argue that the tuning of h can be done in a similar way as what we would do for HMC.
395
00:48:22,000 --> 00:48:33,180
In particular, we also recover the one-fourth scaling law with the dimension. Right, I'm going to just summarise the contributions here. So first,
396
00:48:33,180 --> 00:48:39,960
we presented the Langevin diffusion as a limit of randomized HMC that achieves the fastest exponential convergence rate for log-concave targets.
397
00:48:39,960 --> 00:48:45,630
Then we looked at the use of positive damping to enable control of the worst-case autocorrelations, therefore allowing
398
00:48:45,630 --> 00:48:52,020
for robust tuning with respect to the integration time, while controlling all the autocorrelations at once.
399
00:48:52,020 --> 00:48:59,610
Then we introduced this Metropolis Adjusted Langevin Trajectories algorithm, and we argue that this gives a
400
00:48:59,610 --> 00:49:06,420
neat correction of the discretisation, in particular compared to previous approaches aiming for this Langevin process.
401
00:49:06,420 --> 00:49:14,640
Here we metropolise whole trajectories, which enables choosing the length T as we wish,
402
00:49:14,640 --> 00:49:23,910
and we highlight the fact that the momentum flips are now erased fully by full momentum refreshments, which yields a nice mathematical framework.
403
00:49:23,910 --> 00:49:31,740
In particular, the backbone chain of the positions can be shown to be reversible,
404
00:49:31,740 --> 00:49:35,700
thanks to this property. And finally,
405
00:49:35,700 --> 00:49:39,450
we extend the optimal scaling results of Beskos and co-authors to any choice of
406
00:49:39,450 --> 00:49:44,970
friction, without additional assumptions, and therefore get this one-fourth scaling for MALT,
407
00:49:44,970 --> 00:49:53,000
just as we do for HMC. So, we are interested in several perspectives for future work.
408
00:49:53,000 --> 00:50:00,410
I did not present numerical comparisons yet, because this is almost finished work; it should be on arXiv within a few days.
409
00:50:00,410 --> 00:50:07,250
So we are still finishing these numerical comparisons, which is why I didn't present them here.
410
00:50:07,250 --> 00:50:14,300
And so there are several questions we are quite interested in working on in the future. Particularly,
411
00:50:14,300 --> 00:50:22,490
we are quite interested in adaptive tuning strategies for choosing T and gamma, to be able to yield competitive
412
00:50:22,490 --> 00:50:29,630
samplers compared to other adaptive tuning schemes that have been proposed for HMC, and therefore hopefully yield
413
00:50:29,630 --> 00:50:33,510
maybe some improvements there.
414
00:50:33,510 --> 00:50:41,040
And also, I think there is quite a lot to do in terms of comparisons with this randomized HMC approach,
415
00:50:41,040 --> 00:50:45,900
either from a theoretical viewpoint or a numerical viewpoint. OK, thank you very much for your attention.
416
00:50:45,900 --> 00:51:06,420
If you have any questions, please don't hesitate. Thank you.
417
00:51:06,420 --> 00:51:13,860
Thanks. [Inaudible.]
418
00:51:13,860 --> 00:51:24,210
[Inaudible.]
419
00:51:24,210 --> 00:51:33,250
[Inaudible.] That's excellent.
420
00:51:33,250 --> 00:51:38,830
Yeah, that's right. OK, thanks. [Inaudible.]
421
00:51:38,830 --> 00:51:48,290
[Inaudible.] And so it's promising.
422
00:51:48,290 --> 00:51:55,230
What kind of guidance do you have, for instance regarding the acceptance rate? [Partly inaudible.]
423
00:51:55,230 --> 00:52:02,790
I wish I had my working paper in mind here, but, so, yeah,
424
00:52:02,790 --> 00:52:10,050
so let me try to explain what we observe, for instance, in simulations.
425
00:52:10,050 --> 00:52:13,680
Let's come back to the Gaussian example, because it's tractable.
426
00:52:13,680 --> 00:52:21,120
It's simple, but you can already say quite a few things about it.
427
00:52:21,120 --> 00:52:33,570
So, for instance, we have an example in our working paper that somehow compares this MALT algorithm, together with randomized HMC, with HMC.
428
00:52:33,570 --> 00:52:36,940
OK. And so what do we have there?
429
00:52:36,940 --> 00:52:43,040
We take the same framework that I presented, and we take some different scales, sigma one up to sigma d.
430
00:52:43,040 --> 00:52:48,600
And in the working paper, I think we just drew them to be between one and 50, so the condition number
431
00:52:48,600 --> 00:52:52,230
is 50, which is OK, because it is not super high.
432
00:52:52,230 --> 00:52:58,290
And actually, when you do that, and you look not only at the distance, but you look at,
433
00:52:58,290 --> 00:53:01,810
say, the effective sample size, and not the ESS of one component, but the worst one,
434
00:53:01,810 --> 00:53:08,970
if you wish, amongst all the components that you would get, somehow you will see that actually, for HMC, the worst
435
00:53:08,970 --> 00:53:14,940
ESS, as a function of the integration time for this particular model, is quite an erratic function of T,
436
00:53:14,940 --> 00:53:24,300
which actually breaks down when the condition number increases, and when the dimension increases as well.
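The worst-case efficiency measure being discussed can be computed per component and minimised; here is a minimal sketch, truncating the autocorrelation sum at the first non-positive term (one common convention among several).

```python
def ess(chain):
    """Effective sample size of one chain: n / (1 + 2 * sum of positive
    empirical autocorrelations), truncated at the first non-positive lag."""
    n = len(chain)
    m = sum(chain) / n
    c0 = sum((x - m) ** 2 for x in chain) / n
    s, lag = 0.0, 1
    while lag < n:
        c = sum((chain[i] - m) * (chain[i + lag] - m) for i in range(n - lag)) / n
        if c <= 0:
            break
        s += c / c0
        lag += 1
    return n / (1.0 + 2.0 * s)

def worst_ess(chains):
    """The conservative efficiency measure: minimum ESS across components."""
    return min(ess(c) for c in chains)
```

A strongly autocorrelated component drags the minimum down, which is exactly why an erratic worst-ESS curve for one component hurts the whole sampler.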
437
00:53:24,300 --> 00:53:33,090
And so if you really want somehow to consider this measure of efficiency, because you think that all the components here
438
00:53:33,090 --> 00:53:43,890
are parameters of interest and you want a uniform guarantee over all the ESSs at once, then, well, somehow with these oscillations,
439
00:53:43,890 --> 00:53:50,880
the worst ESS of HMC can break down, whereas for this particular model we have some quite smooth
440
00:53:50,880 --> 00:54:00,750
worst-ESS curves, as functions of time, that we can draw for both randomized HMC and MALT, which achieve some good performance, even if there are some gains for HMC in places.
441
00:54:00,750 --> 00:54:06,910
So you might say, OK, maybe for HMC I can use instead a good preconditioner, to be able to reduce the condition number.
442
00:54:06,910 --> 00:54:15,390
So this is always doable, and we didn't talk about it too much. It's presented as a separate problem from this particular problem,
443
00:54:15,390 --> 00:54:24,330
because the preconditioning problem is something that you would do a priori. And especially for sampling,
444
00:54:24,330 --> 00:54:26,880
it's sometimes quite hard to find a good one.
445
00:54:26,880 --> 00:54:32,940
You have to find a global preconditioner, whereas finding a good preconditioner in optimisation is somehow,
446
00:54:32,940 --> 00:54:37,690
sometimes it's easy; but especially for sampling, it's not so easy, because you cannot use a local one
447
00:54:37,690 --> 00:54:45,420
and you have to use a global one. And there are a lot of situations where you can sometimes reduce the conditioning of the problem,
448
00:54:45,420 --> 00:54:52,620
but you still end up with a condition number of 20 or 50, and in that case, actually, the HMC sampler with fixed integration time suffers.
449
00:54:52,620 --> 00:54:55,920
So it's not really a particularity of HMC itself:
450
00:54:55,920 --> 00:55:01,040
it's the fixed choice of integration time that causes problems, at least in the Gaussian framework.
451
00:55:01,040 --> 00:55:12,360
But this can be extended to other examples where you would see similar behaviour across the board, and where you would also see the problems we can deal with using this approach.
452
00:55:12,360 --> 00:56:00,352
I think the message has come across. You know, we have run over a bit, so let's thank our speaker again.