I've been trying to figure out the optimal bet for blackjack, given a certain number of hands and my advantage. I think my math is wrong, because I'm getting some funky results. Does anyone know the correct math for this? Read on for my attempt.
I'm using the Kelly formula:
bet = advantage * bankroll / variance
Assume my bankroll is 1000, my advantage is 0.05, and the variance of 1 hand is 1.32.
bet = 0.05 * 1000 / 1.32 ≈ 38
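In code, the single-hand calculation looks like this (just a quick Python sketch; kelly_bet is a throwaway helper name, not anything standard):

def kelly_bet(advantage, bankroll, variance):
    # Kelly sizing: bet the fraction (advantage / variance) of the bankroll
    return advantage * bankroll / variance

print(kelly_bet(0.05, 1000, 1.32))  # ~37.88, which I'm rounding to 38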
Now, you would assume it would be better to spread those 38 units across 5 hands, rather than put them all on one hand.
Computing the bet again, this time for 5 hands, I get some weird results. The variance of 5 hands of blackjack is 16.20 (using 1.32*n + 0.48*n*(n-1), i.e. n times the per-hand variance plus covariance terms for hands played at the same table).
bet = 0.05 * 1000 / 16.20 ≈ 3.09
3.09 * 5 hands ≈ 15. Shouldn't this total be greater than 38?
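Here's the 5-hand version as a sketch too, with the 0.48 I'm assuming is the pairwise covariance between hands played at the same table:

def kelly_bet_per_hand(advantage, bankroll, n, var_per_hand=1.32, cov=0.48):
    # total variance of n simultaneous hands: n*var + n*(n-1)*cov
    total_variance = var_per_hand * n + cov * n * (n - 1)
    return advantage * bankroll / total_variance

per_hand = kelly_bet_per_hand(0.05, 1000, 5)
print(per_hand)      # ~3.09 per hand
print(per_hand * 5)  # ~15.4 total, which is less than the 38 for one hand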
Should I be dividing by standard deviation instead of variance?