Kelly betting

EasyRhino

Well-Known Member
#2
It produces the optimal growth rate in bankroll.

The formula is, roughly:

advantage * bankroll / variance

The advantage would be 1% or maybe 2% in a high count. Bankroll is self-explanatory. Variance for blackjack is around 1.2 (the original formulation was for horse racing, where the "odds" were often 2:1, 9:1, etc.).

So, your "max bet" with a 1.5% advantage and a $10k might be

0.015 * $10,000 / 1.2 = $125
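
As a quick sanity check, here is the same arithmetic in a few lines of Python (a sketch; the 1.5% edge, $10k roll, and 1.2 variance are just the numbers from the example above):

# Kelly-style bet size: advantage * bankroll / variance
def kelly_bet(advantage, bankroll, variance=1.2):
    return advantage * bankroll / variance

print(round(kelly_bet(0.015, 10_000), 2))   # 125.0
print(round(kelly_bet(0.020, 10_000), 2))   # ~166.67 at a 2% edge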

You theoretically resize your bets every time your bankroll changes, up and down. So there's a 0% chance of losing everything, but there's a 50% chance of at some point losing 50% of your roll, and a 20% chance of losing 80%.

On the upside, if the bankroll grows, the growth rate will tend to increase as well.

It's often recommended to bet a smaller percentage of full Kelly to reduce fluctuations.
 

ThodorisK

Well-Known Member
#4
Not exactly. It produces the optimal growth supposing:
1) that you will have average luck.
2) that you will always bet exactly a particular fraction of your current bankroll.

Thus, the proof of the formula does not apply to realistic situations, as it assumes there is no minimum bet, zero chance of busting, and an infinitely divisible bet size.

The optimum growth in general is: the more you bet, the more you win. If you bet a fraction of your current bankroll which is MUCH more than Kelly (e.g. 10 times Kelly), then, taking into account all the possible combinations of luck and their corresponding profit, the EV is greater than betting the Kelly fraction. But if we suppose that you will have the luck of the average person, then betting Kelly has a greater EV than betting more than Kelly, and if you bet more than 2 times Kelly, the EV is NEGATIVE (!). This negative EV still amazes me and keeps me thinking about the subject, as it is derived under the presupposition that you cannot bust (!), since the proof of the Kelly formula presupposes that there is no minimum bet.
 
Last edited:

Bojack1

Well-Known Member
#5
ThodorisK said:
Not exactly. It produces the optimal growth supposing:
1) that you will have average luck.
2) that you will always bet exactly a particular fraction of your current bankroll.

Thus, the proof of the formula does not apply to realistic situations, as it assumes there is no minimum bet, zero chance of busting, and an infinitely divisible bet size.

The optimum growth in general is: the more you bet, the more you win. If you bet a fraction of your current bankroll which is MUCH more than Kelly (e.g. 10 times Kelly), then, taking into account all the possible combinations of luck and their corresponding profit, the EV is greater than betting the Kelly fraction. But if we suppose that you will have the luck of the average person, then betting Kelly has a greater EV than betting more than Kelly, and if you bet more than 2 times Kelly, the EV is NEGATIVE (!). This negative EV still amazes me and keeps me thinking about the subject, as it is derived under the presupposition that you cannot bust (!), since the proof of the Kelly formula presupposes that there is no minimum bet.
I disagree with you on this. With the Kelly criterion you can figure your unit size while factoring in expected variance. I offer you this:

The optimal number of bankroll units is equal to the variance-to-expected-value ratio divided by the advantage per true count.

The average variance-to-EV ratio is 1.33/1
The advantage per true is 1/2% (.005)
# of bankroll units = (variance/EV) / advantage per true
# of bankroll units = 1.33 / .005
# of bankroll units = 266, which is full Kelly. Furthermore, factoring in percentages of Kelly makes this exactly what's needed for optimal bankroll growth and real-world risk management.

There are really simple steps to figure out what % of Kelly you are betting. To bet full Kelly, divide your total bankroll by 266 and you will have a basic idea of what your unit should be. If your bankroll is not readily replenishable I would not consider betting with that much risk. A more risk-friendly approach could be .5 Kelly, which you can find by just dividing your total bankroll by 532.
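
A minimal sketch of that unit-size arithmetic in Python (the 1.33 ratio, .005 advantage per true, and the resulting 266/532 divisors are the figures quoted above; the function name is just for illustration):

# Unit sizing from the "266 bankroll units" rule of thumb
def unit_size(bankroll, kelly_fraction=1.0, var_to_ev=1.33, adv_per_true=0.005):
    full_kelly_units = var_to_ev / adv_per_true       # 1.33 / .005 = 266 units
    units_needed = full_kelly_units / kelly_fraction  # half Kelly -> 532 units
    return bankroll / units_needed

print(round(unit_size(10_000), 2))                      # full Kelly: about $37.59
print(round(unit_size(10_000, kelly_fraction=0.5), 2))  # half Kelly: about $18.80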

So, like in rhino's example, if you have a bankroll of $10,000 and you divide that by 532 you get almost $19 as a unit, which you could round up to $20. At this point, with a max bet of $125 using rhino's equation, your spread is roughly only 6:1. But if you can play lower than your unit during neutral and slightly negative counts, say a $10 minimum, then you double your spread to a more reasonable one if you are playing shoe games. The optimal choice is not to play negative hands at all and just backcount, wonging in when the count is positive and out when it is negative.

My rule of thumb also varies slightly from rhino's when finding the proper bet based on advantage. From what I have learned factoring in risk and volatility, the optimal bet size is equal to .75 of the player advantage, multiplied by the bankroll.

So again using an advantage of 1.5% with a $10,000 bankroll I would do something like this:

.75 * 0.015 * 10,000

That gives a bet of $112.50. As that is not a realistic-looking bet, you could round up or down slightly to a rounder number. Rounding up will give you pretty much something in line with rhino's figuring; rounding down is slightly more conservative.
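
For comparison, both rules of thumb on the same $10,000 bankroll and 1.5% edge (a sketch using only the formulas quoted in this thread):

bankroll, advantage, variance = 10_000, 0.015, 1.2
rhino_bet = advantage * bankroll / variance   # full-Kelly estimate: $125.00
bojack_bet = 0.75 * advantage * bankroll      # 3/4-of-advantage rule: $112.50
print(round(rhino_bet, 2), round(bojack_bet, 2))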
 

ThodorisK

Well-Known Member
#6
I can't be bothered to read your posts carefully, lads; these are things I already know or that do not tell me any new useful conclusion.

But here's another thought: Kelly just answers the question:
"IF you decide to keep betting a constant fraction of your current bankroll, what is the constant fraction among all constant fractions that will maximize the growth of your bankroll?"
The answer to this question (which is the Kelly formula and its proof) does not prove (at least not obviously) that, in the first place, betting a constant fraction of your current bankroll is the most ... maximizing solution for the growth of your bankroll. A more maximizing solution could (possibly) be to increase the fraction as your current bankroll increases, rather than betting a constant fraction. Prove this possibility wrong if you can. :rolleyes:
 
Last edited:

Bojack1

Well-Known Member
#7
ThodorisK said:
I can't be bothered to read your posts carefully, lads; these are things I already know or that do not tell me any new useful conclusion.

But here's another thought: Kelly just answers the question:
"IF you decide to keep betting a constant fraction of your current bankroll, what is the constant fraction among all constant fractions that will maximize the growth of your bankroll?"
The answer to this question (which is the Kelly formula and its proof) does not answer or prove that, in the first place, betting a constant fraction of your current bankroll is the most ... maximizing solution for the growth of your bankroll. A more maximizing solution could (possibly) be to increase the fraction as your current bankroll increases, rather than betting a constant fraction. Prove this possibility wrong if you can. :rolleyes:
As your bankroll increases there is no need to increase the Kelly fraction at which you bet, but it is entirely possible to increase the amount you bet using the same ratio originally used. As a matter of fact, in the case of my team, we had to use an even more conservative Kelly fraction due to the positive growth of our bankroll. Our unit was no longer feasible in real-world play at the previous Kelly fraction. So basically we bet more money, but at a more conservative rate, as our bankroll grew.

As for your questions, maybe you should run your own sims based on the numbers that you supposedly know vs. what you think may be more optimal. What you think can't be proven can become much clearer the more hands are played out. If you don't believe in sims, then you don't truly understand what the numbers mean or where they come from.
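
For anyone who wants to try such a sim, here is a minimal sketch (my own illustration, not anyone's team software): it bets a fixed multiple of the Kelly fraction on a biased even-money coin and compares the median final bankroll at half, full, and double Kelly.

import random, statistics

def median_final_roll(kelly_multiple, p_win=0.51, kelly_fraction=0.02,
                      rounds=5_000, trials=500, start=1.0):
    # Bet kelly_multiple * kelly_fraction of the current roll on every toss,
    # resizing continuously; return the median final roll over many trials.
    f = kelly_multiple * kelly_fraction
    finals = []
    for _ in range(trials):
        roll = start
        for _ in range(rounds):
            roll *= (1 + f) if random.random() < p_win else (1 - f)
        finals.append(roll)
    return statistics.median(finals)

for m in (0.5, 1.0, 2.0):
    print(m, round(median_final_roll(m), 3))
# Full Kelly (1.0) shows the highest median growth; 2x Kelly hovers around
# the starting roll despite the positive edge.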
 
#8
Into the Breach

Kelly = optimal
optimal = most favorable or desired, optimum

Anything other than Kelly constant resizing is less than optimal!

Betting .5 Kelly has 75% of the growth rate of full Kelly, with the same theoretical 0% ROR.
Betting double Kelly has a growth rate of 0, the same as not betting at all.
Betting above double Kelly has a negative growth rate, which leads to 100% ROR.

The growth rate of 1.2 Kelly, with more variance, is the same as that of .8 Kelly. It is better to underbet than to overbet.
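
Those ratios follow from the standard small-edge approximation for the growth rate when betting c times the Kelly fraction, g(c) ≈ c * (2 - c) * g(full Kelly). A quick check (my sketch, using no numbers beyond those already stated):

# relative growth rate of betting c times Kelly, using g(c) = c * (2 - c) * g_full
for c in (0.5, 0.8, 1.0, 1.2, 2.0, 2.5):
    print(c, c * (2 - c))
# 0.5 -> 0.75 of full-Kelly growth, 0.8 and 1.2 -> 0.96, 2.0 -> 0, above 2 -> negative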

Optimal means optimal

However, we do have to make allowances for the real world. The most common real world differences are betting smaller fractions of kelly and less frequent resizing.
 
Last edited:

Kasi

Well-Known Member
#9
ThodorisK said:
I can't be bothered to read your posts carefully, lads; these are things I already know or that do not tell me any new useful conclusion.
Man I wish I could know what posts tell me things I already know without bothering to carefully read them. And don't/won't tell me anything new that's useful at all, without actually really reading them. You must be the luckiest guy on earth blessed with such an ability.

ThodorisK said:
But here's another thought...
Maybe I must be the luckiest man too, except in some anti-universe, where I am condemned to actually carefully read the posts and get told things I know are not true but yet have to accept them as new and useful conclusions nonetheless.

Hopefully you won't bother to read this post carefully either since, with any luck, it's something you already know too :)

Just my way of initiating a discussion lol. Truthfully, I don't think I understood anything you were saying. ER, BA, BoJ, I could follow. But I could be living in the wrong universe.

No big deal, it's a complicated subject to me too lol.

But "then betting kelly has a greater EV than betting more than kelly, and if you bet more than 2 times kelly, the EV is NEGATIVE (!) This negative EV still amazes me and makes me keep thinking on the subject, as it is derived under the presupposition that you cannot bust (!), as the proof of the kelly formula presupposes that there is no minimum bet."

First of all, Kelly maximizes growth of roll, not EV.

Second, betting Kelly does not have a greater EV than betting more than Kelly. Eliminate "not" and substitute "lesser" for "greater", and you can see how close you were.

Thirdly, if you are still bothering to actually not really read this, betting more than 2 times Kelly is not neg EV. I give you credit, though, for being amazed by it, since it's not actually true. Your instincts were good. Brain, maybe not so much lol. It is 100% ROR though.

I'll stop there lol - tell me about the universe you are in. And where I can buy a ticket. Because mine sucks. Yours sounds happy lol.
 

ThodorisK

Well-Known Member
#10
http://www.bjmath.com/bjmath/kelly/kelly.pdf (Archive copy)
This is Kelly's original paper.
Go and see the formula:

(current bankroll)=((1+f)^W)((1-f)^L)(starting bankroll)

where f is the proportion of the current bankroll that is bet each time (Kelly uses the letter l instead of f, and uses the symbols V_N for the current bankroll and V_0 for the starting bankroll). W is the average number of winning tosses and L the average number of losing tosses.

Suppose the bet pays even money (the payoff is 2, i.e. double or nothing), the probability of winning a toss is 0.51, and the probability of losing a toss is 0.49. This means an edge of 2%.
So in 1000 tosses, on average you will win 510 of them and lose 490 of them.
So, if you bet kelly, you bet 0.02 of your bankroll on each toss, i.e. f=0.02. So we have:

(current bankroll)=
((1+0.02)^510)((1-0.02)^490)(starting bankroll)=1.22(starting bankroll)

So your current bankroll after 1000 tosses will on average be 1.22 times your starting bankroll. This figure of 1.22 is the highest possible: for any other value of f, the growth of the bankroll given by this formula is lower than 1.22.

Do the same with 2 times kelly, i.e. f=0.04. Then your current bankroll will be almost equal to your starting bankroll.

And if you bet f>0.04, your current bankroll after 1000 tosses will be LESS than your starting bankroll! This has nothing to do with going bust. You end up with a loss, without ever going broke, because this formula assumes there is no minimum bet and that it is impossible to lose your whole bankroll (bust).

And no matter how much less than Kelly the f you bet is, you don't end up with a current bankroll less than your starting bankroll.
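
Those figures are easy to check by plugging values of f into the formula above (a quick sketch; the 510/490 split is the average outcome assumed in this post):

# (current bankroll) / (starting bankroll) = (1 + f)^W * (1 - f)^L
W, L = 510, 490          # average wins and losses out of 1000 tosses
for f in (0.01, 0.02, 0.04, 0.06):
    print(f, round((1 + f) ** W * (1 - f) ** L, 3))
# f = 0.02 (Kelly) gives ~1.22, f = 0.04 (double Kelly) gives ~1.00,
# and anything above 0.04 ends up below the starting bankroll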

The above equation is a function where on the X axis we have the various values of f, and on the Y axis the corresponding growth of the bankroll. Its graph is a curve, and it has its highest Y value at f = Kelly.
Thus Kelly answers nothing more than this question: "SUPPOSING that you keep betting a CONSTANT fraction of your current bankroll, then what is the fraction among all possible fractions that maximizes the growth of your bankroll?" Therefore Kelly does not prove that betting Kelly produces more growth than any other possibly existing formula which suggests increasing the fraction (as the current bankroll increases) instead of keeping it constant.

Now read my previous posts again, and you will understand them better.
 
Last edited:
#11
My Favorite Posts!

If you have 3 players:

One bets Kelly
One bets more than Kelly
One bets less than Kelly

In the short run, if you have positive variance, the bigger bettor will win more and have the largest bankroll. However, when the negative variance comes, bets will have to be decreased so much that he will have a hard time competing with Kelly. If the bets are over double Kelly there will be a spiral down to ruin.

In the short run, if you have negative variance, the smaller bettor will lose less and have the largest bankroll. When the positive variance comes, bets will be too low to keep pace with Kelly. They both have a 0% theoretical ROR, but the growth rate will be less than that of Kelly.

Now over time variance will flatten out and Kelly will overtake all other forms of betting.

Kelly = optimal
Optimal means optimal.:joker::whip:
optimal = most preferred, optimum
 
#12
From Theory to the Street

Proportional overbetting and 100% ROR are not hard to see in reality. Set up a deck so every hand is positive: take out a couple of small cards, play a hand, and quickly reshuffle. If you bet 25% of your bank on each hand (constant resizing; use a calculator), you will go broke even though you are playing a positive-EV game.:joker::whip:

If you bet 1.5 Kelly you have the same growth rate as .5 Kelly.
If you bet over double Kelly you will go broke in the long run.
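
A tiny sketch of that overbetting demonstration, assuming for illustration an even-money hand won 55% of the time (my numbers, not the poster's): betting 25% of the bank on every hand shrinks the roll even though each bet has positive EV.

import random

bank, f, p_win = 1000.0, 0.25, 0.55   # 25% of the bank each hand, 10% edge
for hand in range(500):
    bet = f * bank
    bank += bet if random.random() < p_win else -bet
print(round(bank, 2))   # typically ends up at a tiny fraction of the starting $1,000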
 
Last edited:

ThodorisK

Well-Known Member
#13
Lol, again you did not understand it. Damn, someone must understand what I said.

Overbetting kelly is again betting A CONSTANT fraction of your current bankroll.

Now, to explain it even further, paradoxically I have to be even more difficult to understand:

An equation which suggests increasing the fraction of your current bankroll that you bet (as your current bankroll increases) does not imply that you bet more than Kelly, because they are two fundamentally different things. Such an equation could suggest that you never bet more than Kelly.
Or it could suggest that you bet Kelly only when your current bankroll is exactly X times your starting bankroll. (And in any case, it suggests that you bet a different fraction of your bankroll for each different value of the current bankroll.) And, hang on now: the average value of all these different fractions (which correspond to different values of the current bankroll) could be equal to Kelly !!! (Obviously that average value would be derived from a set of series which tend to infinity.) Therefore such an equation is neither betting more than Kelly nor betting less than Kelly.

The proof of Kelly itself cannot prove that betting the CONSTANT fraction of Kelly is more growth maximizing than ANY other possible formula which suggests increasing the fraction of your current bankroll that you bet as your current bankroll increases.

IF, I repeat IF, betting the CONSTANT fraction of Kelly maximizes the growth of the bankroll more than any other possible formula which does not suggest betting a CONSTANT fraction of your bankroll, this needs an additional mathematical proof which has not yet been proposed by anyone. Such a proof, IF it exists, is not at all self-evident from the proof of Kelly. The proof of the Kelly formula applies only when you always bet a CONSTANT fraction of your current bankroll, for ALL the various values that your current bankroll takes during the random walk.
 
Last edited:
#14
Resizing is not Fixed

All my statements concerned Kelly continuous resizing, which is what Kelly betting is. :joker::whip: If you consider fixed bets, that is something different.
 

EasyRhino

Well-Known Member
#15
We've all been talking about continuous resizing so far.

Oftentimes, a more simplistic "fixed Kelly" approximation is taken, where a bankroll is started, a bet size is determined, and that bet size remains the same until a threshold is reached (commonly the bankroll halving or doubling).
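
A rough sketch of that "fixed Kelly" approximation (the halve/double thresholds are the ones mentioned above; the edge and variance figures are from the earlier example):

def fixed_kelly_bet(bankroll, roll_at_last_resize, advantage=0.015, variance=1.2):
    # Keep the bet pegged to the roll at the last resize; only resize once the
    # current roll has halved or doubled since then.
    if bankroll >= 2 * roll_at_last_resize or bankroll <= 0.5 * roll_at_last_resize:
        roll_at_last_resize = bankroll
    bet = advantage * roll_at_last_resize / variance
    return bet, roll_at_last_resize

bet, anchor = fixed_kelly_bet(10_000, 10_000)   # $125 bet, anchor stays $10,000
bet, anchor = fixed_kelly_bet(9_000, anchor)    # still $125; threshold not hit
bet, anchor = fixed_kelly_bet(20_000, anchor)   # doubled: resize, bet becomes $250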

You keep capitalizing the word CONSTANT, but I am not sure to which, if either, method you are referring.
 

ThodorisK

Well-Known Member
#16
When Kelly is e.g. 2%, you always bet 2% of your current bankroll. Thus the fraction f of the current bankroll that you bet is CONSTANT: it is always 2%. Got it now?
 
#17
Not Sure if on the Same Page

ThodorisK said:
When Kelly is e.g. 2%, you always bet 2% of your current bankroll. Thus the fraction f of the current bankroll that you bet is CONSTANT: it is always 2%. Got it now?
If the advantage is 2%, one bets 2% of their bankroll. However, as your bankroll moves up and down, the size of your bets changes.

example:
2% advantage
1.3 variance
$10,000 bankroll

bet size
$10,000 * 0.02 / 1.3 ≈ $153.85

if you lose 20% of your bankroll your bet changes
$8,000 * 0.02 / 1.3 ≈ $123.08

if you win 20% of your bankroll your bet changes
$12,000 * 0.02 / 1.3 ≈ $184.62
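
The same resizing written out as a loop (a sketch; the 2% edge and 1.3 variance are the figures from the example above):

def resized_bet(bankroll, advantage=0.02, variance=1.3):
    # bet = advantage * bankroll / variance, recomputed from the current roll
    return advantage * bankroll / variance

for roll in (10_000, 8_000, 12_000):
    print(roll, round(resized_bet(roll), 2))   # 153.85, 123.08, 184.62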

Are we on the same page?
 

Bojack1

Well-Known Member
#18
ThodorisK said:
When Kelly is e.g. 2%, you always bet 2% of your current bankroll. Thus the fraction f of the current bankroll that you bet is CONSTANT: it is always 2%. Got it now?
Based on this post I'm not sure if you are not understanding what Kelly is, or if you just worded it strangely. It seems you are confusing what is advantage and what is Kelly. If you meant that you have a 2% advantage and thus bet 2% of your bankroll, then yes, that would be correct. But since the advantage will change and not always be 2%, the bets will not always be 2%. Bets will follow the percentage of the advantage.

I know you feel no need to read posts carefully since you already know this material, as you say, but allow me to repeat myself once again. Using Kelly, you bet a percentage of your bankroll proportional to your advantage. The amount you bet is determined by the true count, which is the measure of your advantage. In your example you use Kelly at 2%. That is not a measure of Kelly; that is a measure of advantage. Kelly in practical use is usually measured as the whole or a fraction of the whole, such as .5 or .4; 2% or .02 is not an option.


If you do have a better idea of optimal betting, then let's hear it. Since you do not seem to buy into the Kelly criterion, I would like your thoughts on a better method. The burden of proof is on you since you brought all of this up, not anyone else. If you have no math or workable theory to back up your thoughts, well, you know then what takes a walk.
 

Sonny

Well-Known Member
#19
ThodorisK said:
The proof of Kelly itself cannot prove that betting the CONSTANT fraction of Kelly is more growth maximizing than ANY other possible formula which suggests increasing the fraction of your current bankroll that you bet as your current bankroll increases.
So you’re saying that there is no mathematical proof that a mixed strategy might not be better than the pure strategy of the Kelly Criterion. I think I get it.

-Sonny-
 