OK, this is where we left off: question three of the third workshop. As you have probably seen, this question is set in the context of gambling, which is why I put it in this section; questions three and four are both about a gambling context. I could have put this question in the second tutorial, because, as you will see very shortly, it is very similar to the tutorial 2 question where we tried to infer an individual's subjective assessment of the probability of winning a gamble. But some of the elements required for this question were introduced in Chapter 4, so it seemed more proper to put it here.
But essentially you will find this very similar. So: a gambler with an exponential utility function, where b > 0. This tells you that the individual is risk averse, because an exponential utility function with positive b is a risk-averse utility function; the gambler is neither risk loving nor risk neutral. His total wealth is W0. Similar to the question before, we observe that this gambler has made a bet, and the size of the bet is one quarter of his total wealth, that is, 25% of W0. And what kind of gamble is it? The gamble returns M times the bet if he wins and returns nothing if he loses. So the payback ratio, a concept we introduced only in Chapter 4, is M. The probability of winning is p, and of losing is 1 - p. Assuming the gambler is rational, we want to be able to infer the value of b.
Now, why is this question useful? Because it shows the logic behind a certain kind of experiment that economists do. As explained a few times in the lecture, risk aversion is something that is very difficult to estimate, because individuals have different risk attitudes; even within the same risk-attitude category, say among risk-averse individuals, the level of risk aversion can be very different. It is very difficult to arbitrarily determine what kind of utility function we should use, or what value of b we should choose, in an economic analysis. One interesting question is whether we can roughly infer people's level of risk aversion through some kind of experiment. So assume this question describes a kind of experiment.
So basically, we ask a gambler (an individual, a subject in an experiment) to make a choice when presented with this opportunity: a gamble that returns M times whatever you bet, with probability p. How much are you willing to bet? Imagine this is a very simple experiment you might be asked to run. Assuming the gambler is rational, we may be able to estimate the value of b by observing his decision. That is the idea. This is a very simple example, a simple question, but it shows you, potentially, the foundation for designing experiments that recover information we do not observe directly.
Now, the question we discussed before was about recovering an individual's subjective assessment of the probability of something happening or not happening. Here we are able to infer something that is also completely subjective, and completely unobserved: the level of risk aversion. OK.
Now, the idea is very simple. Assuming the gambler is rational, he will maximize expected end-of-period utility: he writes down his expected utility function and finds the optimal betting size so that expected end-of-period utility is maximized. After a few weeks of this course, it is very easy to write down the expected utility of the gambler for this particular situation. The gambler chooses the value of x. If he bets x and wins, his end-of-period wealth is W0 + (M - 1)x, since the gamble pays back M times the stake; that utility is weighted by the probability of winning. If he does not win, his wealth is W0 - x, weighted by the probability of losing. The sum of the two is the expected end-of-period utility if he chooses to bet x. The gambler's problem is to choose x so that this quantity is maximized, so in the usual way he differentiates with respect to x, finds the first-order condition, and finds the optimal solution.
So his optimal solution is given by equation (1). For the given information, the payback ratio M, the probability p of winning each round, and, crucially, the risk aversion parameter b, x is the optimal betting size.
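The transcript refers to equation (1) on the slide without reproducing it. Assuming the standard exponential (CARA) utility U(w) = -exp(-b*w) and the setup above (bet x, receive M times the stake with probability p, nothing otherwise), the first-order condition gives x* = ln((M-1)p/q) / (bM), with q = 1 - p. The sketch below, with purely illustrative parameter values, checks that this closed form does maximize expected utility:

```python
import math

def expected_utility(x, w0, b, m, p):
    """Expected end-of-period utility of betting x under U(w) = -exp(-b*w).

    Win (probability p): wealth becomes w0 + (m - 1)*x, since the gamble
    pays back m times the stake; lose: wealth becomes w0 - x."""
    win = -math.exp(-b * (w0 + (m - 1) * x))
    lose = -math.exp(-b * (w0 - x))
    return p * win + (1 - p) * lose

def optimal_bet(b, m, p):
    """Closed form from the first-order condition: x* = ln((m-1)p/q) / (b*m)."""
    q = 1 - p
    return math.log((m - 1) * p / q) / (b * m)

# Illustrative parameters (not from the question): b = 0.01, M = 3, p = 0.4.
b, m, p, w0 = 0.01, 3.0, 0.4, 100.0
x_star = optimal_bet(b, m, p)

# x* should beat nearby bets if it really maximizes expected utility.
eu_star = expected_utility(x_star, w0, b, m, p)
assert eu_star > expected_utility(x_star - 1.0, w0, b, m, p)
assert eu_star > expected_utility(x_star + 1.0, w0, b, m, p)
```

Note that x* does not depend on w0, a standard property of CARA (exponential) utility, which is one reason observing the bet pins down b so cleanly.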
Now, this tells us that the relationship between b and x is one to one, because M and p are objective and completely exogenously determined. Different values of the risk aversion parameter determine different betting sizes, so there is a one-to-one relationship between x and b. So if we observe x, we can infer the value of b.
So that is the idea, and you can see it is very similar to the question we had before. I think the earlier question involved a one-to-one relationship between the subjective value of p and x. But here this is a casino game, so the probability of winning and losing is completely observable. What is not observable is the risk aversion parameter. This solution tells us there is a one-to-one relationship, so if we observe that x is, according to the question, one quarter of total wealth, we can infer the value of b. Given that p is observed with no error, M is known with no error, and x is observed with no error, we back out the value of b, which is the individual's coefficient of absolute risk aversion.
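To make the backing-out step concrete: inverting the assumed first-order condition x* = ln((M-1)p/q) / (bM) gives b = ln((M-1)p/q) / (Mx). The values of W0, M, and p below are illustrative placeholders, since the question's actual numbers are not quoted in the transcript; the guard clause matches the condition pM > 1 that reappears later in the discussion.

```python
import math

def infer_risk_aversion(x_observed, m, p):
    """Invert the (assumed) first-order condition x* = ln((m-1)p/q) / (b*m)
    to recover the risk aversion b from an observed bet x."""
    q = 1 - p
    if p * m <= 1:
        # A risk-averse gambler bets nothing unless the expected return is positive.
        raise ValueError("p*m must exceed 1 for a positive optimal bet")
    return math.log((m - 1) * p / q) / (m * x_observed)

# The question's observation: the bet equals one quarter of total wealth.
w0 = 100.0         # illustrative wealth level (not given in the transcript)
m, p = 3.0, 0.4    # illustrative payback ratio and win probability
b_hat = infer_risk_aversion(w0 / 4, m, p)
print(f"implied absolute risk aversion: b = {b_hat:.6f}")
```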
OK.
Now, going back to what I said: experimental economists arrange different kinds of experiments in order to observe how people behave in various situations. By observing how they behave, by observing their decisions, and under certain high-level assumptions (for example, that people are risk averse, that the utility function is exponential, and that the only unknown is b), we can roughly estimate the value of b. Imagine you ask 500 or 1,000 people to do the experiment. Each one of them will reveal a value of b, so if you average across those 500 or 1,000 individuals, you get a rough estimate of the value of b for the population. Of course this is quite a simplistic experiment, and the experiments economists actually run are more complicated, but the rationale is the same: under certain assumptions, assuming individuals are rational, there is a connection between some unknown, unobservable quantity and behavior that can be observed, so from what we observe we can infer what is unobservable. That is the rationale behind experimental economics.
So this is very much the summary. The thing to take away is that, under certain assumptions, if you are happy with those assumptions, there is a one-to-one relationship between an unobservable quantity and observable behavior, so from what we observe we can infer the quantity we want to know.
And of course, in order for the individual to bet something positive, that is, in order for someone to actually engage in a gamble like this, a certain condition has to be satisfied. That condition is that pM has to be greater than one. In other words, this condition means that the expected return should be positive for this risk-averse individual.
But casino games typically have a negative expected return: on average you lose money in a casino. So this tells you that a risk-averse individual would not bet in casino games if he is rational. Of course, if the individual is risk loving or risk neutral, the story is different. Another way to look at it: if pM is greater than one, then the actuarial value of the gamble must be positive. If the gamble is actuarially neutral, or has a negative actuarial value, a risk-averse individual will not engage in this kind of activity. These are the things to take away from this question.
Now let's move on to question four, part (a). Again this is a gambling problem, but what makes it slightly different from most of the questions we examined before is that the decision is not just a one-off decision; it is a sequence of decisions, because it is a strategy that we are interested in. The gambler has some kind of starting wealth and some kind of target, and he is basically allowed to engage in a sequence of gambles in order to reach the target. The setup is explained in the question, and the theory was explained in the lecture. Basically, the idea is that wealth is a random variable, a function of how many rounds you have bet: it increases or decreases depending on the outcome of each round of betting.
So the relationship that allows us to derive a solution to the problem is presented here. If P(W0) is the probability of reaching the target WG starting from wealth W0, then reaching the target can come about through only two pathways. One pathway: if you win the first round, your wealth becomes W0 + Delta, and from there you eventually reach the target with probability P(W0 + Delta). The other: if you lose the first round, your wealth becomes W0 - Delta, and you reach the target from there with probability P(W0 - Delta). In the first round you either win or lose, so the probability of reaching the target must equal the weighted average of these two: P(W0) = p P(W0 + Delta) + q P(W0 - Delta). That relationship must hold at every step.
So the idea is that, from this point, each term can in turn be rewritten as a weighted average of the outcomes of the second round. You can split P(W0 + Delta) in a similar way to how we split P(W0): you will have weights p and 1 - p on W0 + 2 Delta and W0, and similarly you can decompose P(W0 - Delta). You can see the whole thing becomes a tree with multiple layers, and in each layer of the tree the same relationship holds. This system can be solved step by step, and eventually you get the solution. The solution depends on whether the odds are fair or not. If the odds are fair, then the probability of success depends only on the initial wealth and the target wealth. If the odds are not fair, then the outcome depends on the betting size. So the idea we have seen from this example is that if the odds are fair, then no matter whether you bet one unit, two units, or even more per round, the probability of success is always the same: it does not depend on betting size, only on initial wealth and target wealth.
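This recurrence and its closed-form solution can be checked numerically. The sketch below uses the standard gambler's-ruin formula, P = (1 - r^n0) / (1 - r^nG) with r = q/p, n0 = W0/Delta, nG = WG/Delta for unfair odds, and P = W0/WG for fair odds, and verifies that it satisfies the weighted-average relationship at every interior wealth level:

```python
# Closed-form gambler's-ruin probability of reaching target wg from w0,
# betting a fixed stake delta per round (win +delta with probability p).
def success_prob(w0, wg, p, delta=1):
    q = 1 - p
    n0, ng = w0 // delta, wg // delta
    if abs(p - 0.5) < 1e-12:
        return n0 / ng              # fair odds: stake size drops out
    r = q / p
    return (1 - r ** n0) / (1 - r ** ng)

# The recurrence P(w) = p*P(w + delta) + q*P(w - delta) must hold at every
# interior wealth level; spot-check it for an unfair case (p = 18/38).
p, wg = 18 / 38, 32
for w in range(1, wg):
    lhs = success_prob(w, wg, p)
    rhs = p * success_prob(w + 1, wg, p) + (1 - p) * success_prob(w - 1, wg, p)
    assert abs(lhs - rhs) < 1e-9

# Fair odds: unit, double, and quadruple betting all give the same probability.
assert success_prob(28, 32, 0.5, 1) == success_prob(28, 32, 0.5, 2) == success_prob(28, 32, 0.5, 4)
```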
But if the odds are in your favor, that is, if p, the probability of winning, is greater than the probability of losing, then the probability of success is greater if you bet less. Here, if you bet one unit per round the probability is 87%; if you bet two units every round it is 84%. So if the odds are in your favor, it is actually better to bet small. If the odds are against you, q greater than p, it is actually better to bet large: if you bet small, the probability of success is actually lower. OK.
So, as I explained, what you can take away from this particular problem and its outcome is this. If you are expected to succeed, in other words, generally speaking, if you are on the right track towards your target, you want to be safe. You don't want to do something too risky or take on unnecessary risk. But if you are on the wrong track, if you are going down, then you need to take some risk, do something dramatic, in order to change the outcome. I think this implication is very general. If the odds are against you, if you are expected to lose, to be unsuccessful, then you need to do something dramatic. If the odds are in your favor and you are expected to win, you don't want to do anything dramatic, because you don't want to do something stupid and suddenly lose everything. So the implication is quite general, but the calculations here are very specific.
So the question asks you to do two things. Part (a) is basically just to do the calculation and verify our conclusions. Here the difference between the target and the starting value is four units. We consider three strategies: unit betting, double betting, and quadruple betting. Unit betting means every round you bet $1, double betting means every round you bet $2, and quadruple betting means every round you bet $4. And we assume three different scenarios. The first scenario is where the probability of winning is greater than the probability of losing, that is, where the odds are in your favor; then there are fair odds; then odds against you. This part is simple, just a numerical exercise: plug the numbers into the formula and you get the outcome. But because the formula needs q / p, the odds of losing, it is convenient to write down q / p for each scenario: odds in your favor, where the probability of losing is less than 50%; fair odds; and unfavorable odds.
So, plugging in these numbers, we get exactly what we expect. When the odds are in your favor, betting small is actually the better solution: if you bet more and more every round, your probability of success will fall. When the odds are fair, there is no difference: it doesn't matter whether you bet one unit or two, or even change your bet every round; the probability of success is always equal to the ratio between W0 and WG. Of course, if you bet more every round, the outcome arrives earlier: the whole game is shortened, and the duration of the exercise shrinks proportionally relative to unit betting. If you bet twice as much, everything happens 50% quicker. Now, if the odds are unfavorable, again we can see that if you bet small, your chances are reduced, and if you bet more, you actually increase your chance of success. That is just a repeat of what we observed in the lecture.
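Part (a) can be reproduced in a few lines. The start of 28 units and target of 32 come from the later discussion of part (b); the three win probabilities are assumptions in the roulette spirit of the question (18/38 for the unfavorable case, mirrored as 20/38 for the favorable one):

```python
# Part (a) reproduced numerically: start 28 units, target 32, stakes of
# 1, 2, and 4 units per round, under three assumed odds scenarios.
def success_prob(w0, wg, p, delta):
    """Gambler's-ruin probability of reaching wg from w0 at fixed stake delta."""
    q = 1 - p
    if abs(p - q) < 1e-12:
        return w0 / wg              # fair odds: the stake does not matter
    r = q / p
    return (1 - r ** (w0 // delta)) / (1 - r ** (wg // delta))

for label, p in [("favorable", 20 / 38), ("fair", 1 / 2), ("against", 18 / 38)]:
    row = [success_prob(28, 32, p, d) for d in (1, 2, 4)]
    print(label, " ".join(f"{x:.4f}" for x in row))
```

The fair row is 0.875 for every stake, while the unfavorable row rises with the stake and the favorable row falls, exactly as the lecture concluded.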
Nothing new here. But what is more interesting is the second part of the question.
Now, of course, this solution assumes you bet a constant amount every round. Constant betting is not the ultimate solution we want, but it gives us some idea about how much we should bet each round. This is useful because the implications shed light on what the optimal strategy should be if you are allowed to change your betting size every round. So, if you are allowed to vary your bet each round, and assuming the odds are against you, what is the optimal strategy? First of all, we know that if the odds are against you and you bet equally every round, you certainly should not bet one unit; you should bet as much as you can. But that assumes constant betting, that is, every round you bet exactly the same amount. So what if we are allowed to bet differently? We focus on unfavorable odds, because this is the most interesting case; the solution for favorable odds is easy, and I will come back to it.
Now first of all, we need to understand that.
In the first round, the situation is you have initial.
Situation like this initial starting initial situation. Initial situation. You have 28 units. Target is 32 units.
Now that here you need to consider how much I should bet in the first round.
That argument is following.
First of all.
Under the assumption.
According to the question that you only need to achieve 32.
So you don't need to achieve.
Anything more than that?
So anything more than that, if you achieve 30, three 3435.
There will be no difference.
In no different than 32.
So this means there's no need for you to be anything more than five.
More than four in the first round because if you win, you get 33. Isn't it the same as 32?
So, but if you lose, you lose 5 instead of losing 4. So what's the point to be anything more than four?
So this tells us because the target.
The difference between the target and the wealth level current wells level is 4. There is no need for you to be anything more than four.
OK.
In the meantime, we know that because the odds are against you, betting less is not doing yourself a favor: in this situation we know we should bet as much as possible. So we should not bet anything less than four, because we know from the calculation here that that is not the ideal strategy. There is no point betting anything more than four, and it is not optimal to bet anything less than four; therefore the only optimal bet in the first round must be 4 units. So let's see what happens if you bet four units in the first round. If you win, your wealth becomes 32: you reach your target, and the probability of that happening is p.
There is a 1 - p probability that you lose. If you lose, your wealth becomes 24, and the probability of that is 1 - p = q. So if you lose in the first round you reach a new situation: you start from 24 units and the target is 8 units away. In the new situation the argument is exactly the same as before. Because the distance between the target and current wealth is 8, there is no point betting anything more than 8, so you should bet at most 8 units.
In the meantime, the logic from our calculation here shows that betting anything less than 8 is not good. The table here was produced for a start of 28 and a target of 32, but you can produce the table again, doing the calculation all over, with 24 as the starting value and 32 as the target: what is the probability if I bet one unit each round, two units, four units, eight units? You will certainly get different numbers for unit betting, double betting, quadruple betting, and betting 8 units every round, but what is clear is that it will show that, when the odds are against you, betting one unit gives the smallest probability of success and betting 8 gives the greatest. So the same argument applies: when the odds are against you, you do not want to bet anything less than eight, you want to bet as much as possible, but at the same time you don't want to bet anything more than 8, because you don't need more than 8. Consequently, you should bet at most 8 and at least 8, so you should bet 8 units in the second round.
Now, of course, if you win, your wealth reaches 32: success. But if you are unlucky again, your wealth is reduced from 24 to 16, and you reach a new situation: 16 units with a target of 32. The same argument applies again, and I am not going to repeat it: you don't want to bet anything more than 16, because you don't need more than 16, but you don't want to bet one, two, four, or eight units, because similar calculations for going from 16 to 32 will tell you that betting higher is better. So bet as high as possible, but not more than 16: you should bet 16 in the last round. If you win, your wealth reaches 32 dollars and you succeed; if you lose, your wealth is reduced to 0. If your wealth is reduced to zero, you are ruined, bankrupt; that is the end of the game.
So the probability of that happening is again 1 - p in the last round. From this strategy you can see there is only one possible path, one possible scenario, in which you go bankrupt, and that scenario is losing three times in a row. As long as you don't lose three times in a row, you succeed, either here, or here, or there. So you are ruined if and only if you lose three times in a row. This strategy tells you that the probability of ruin is q times q times q, that is, q to the power of 3: the probability of losing three times in a row. The probability that this does not happen is 1 minus q to the power of 3, and as long as it does not happen, you succeed. So the probability of success is one minus the probability of ruin, and the probability of ruin is q cubed; therefore the probability of success based on this strategy is 1 - q^3.
Now, what is q here? The odds are against you, so q is 20/38. Therefore 1 - q^3 is 85.42%, and you can see that this probability of success is greater than betting four units every round: this strategy is better than any of the constant-betting strategies. OK.
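The bold-play arithmetic checks out numerically: with q = 20/38, ruin requires three straight losses, so the success probability is 1 - q^3, about 85.42%, versus roughly 82.4% for constant four-unit betting:

```python
# Bold play under unfavorable odds (q = 20/38): bet 4, then 8, then 16,
# so a single win always reaches the target. Ruin needs three straight
# losses, hence P(success) = 1 - q**3.
q = 20 / 38
p = 1 - q

bold = 1 - q ** 3

# Constant 4-unit betting from 28 to 32: gambler's-ruin closed form with
# n0 = 28/4 = 7 steps held and ng = 32/4 = 8 steps to the target.
r = q / p
constant4 = (1 - r ** 7) / (1 - r ** 8)

print(f"bold play:           {bold:.4f}")       # ~0.8542
print(f"constant four units: {constant4:.4f}")  # ~0.8244
assert bold > constant4
```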
Now, intuitively, the idea is that because the odds are against you, you want to make sure you can succeed in the smallest possible number of bets. How do you minimize the number of bets required? You bet as much as you can every round, hoping that as long as you win once, you reach the target. You don't want to rely on winning multiple rounds in order to reach the target; you only need to win once. Needing to win only once is the smallest possible requirement, and that maximizes the probability of success.
Now, what if the odds are in your favor? What is the optimal strategy then? We know that we want to bet as small as possible. If $1 is the smallest unit, then you should bet $1; but if you are allowed to bet 50 pence, or even one penny, that makes the probability of success even higher. If you are allowed to bet an infinitely small amount, higher still. In fact, you can try it: let the stake Delta in the formula approach 0. Of course, we cannot directly plug in 0, because we would get infinity, but this expression has a limit as Delta goes to 0; if you know how to take the limit of a function, try it. When q is less than p, that is, when the odds are in your favor, you will find that as the betting size approaches zero, the probability of success increases. Equal to what? 100%. So when the odds are in your favor, the answer is easy: just bet as small as possible, and the probability of success approaches 100%.
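The limiting claim can be illustrated numerically: with favorable odds, shrinking the stake Delta pushes the success probability toward 1. Here p = 20/38 is the assumed favorable win probability, with the question's start of 28 and target of 32:

```python
# Favorable odds: as the stake delta shrinks, the probability of reaching
# the target approaches 1 (the closed form is evaluated with a fractional
# number of steps purely to illustrate the limit).
def success_prob(w0, wg, p, delta):
    r = (1 - p) / p                 # r < 1 when the odds are favorable
    return (1 - r ** (w0 / delta)) / (1 - r ** (wg / delta))

p = 20 / 38
probs = [success_prob(28, 32, p, d) for d in (4, 2, 1, 0.5, 0.1)]
assert all(a < b for a, b in zip(probs, probs[1:]))   # strictly increasing
assert success_prob(28, 32, p, 0.01) > 0.999          # essentially certain
```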
But when the odds are not in your favor, then it matters what strategy you take, and the best strategy is to make sure you only need to win once to reach the target and leave the casino. The longer you need to stay in the casino, the more likely you are to lose money, because every round you are expected to lose money. You want a quick win and a quick exit. With this rationale, the best strategy is to bet exactly as much as you need, hoping that you can succeed after winning only once; if you lose, increase your bet again so that you still only need to win once to succeed. Of course, if you are very unlucky, you can lose every round, but this strategy minimizes the chance of ruin and maximizes your chance of success. And we have shown that this probability is in fact greater than even betting four units every round. OK, so those are questions three and four of this tutorial.