assume_R said:
Okay, so that would be the variance per hand? I suppose that's my real question. If I wanted the variance per hour (100 hands), I'd multiply that number by 100?
First, great question.
Second, I probably have no idea, since I wonder about this too.
But, what is wrong with assuming a $1 bet per "round" (you say "hand", maybe)? To me, IBA would be $2/$3, or 66.66%, when rounds played = 3 but hands played = 4. I guess TBA would be $2/$4 = 50%?
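To put actual numbers on that (these are just my assumed definitions: IBA = net win divided by the initial bets, TBA = net win divided by everything wagered including the split hand), here's a quick sketch of the $1-per-round, one-split example:

```python
# Hypothetical example: 3 rounds at $1 initial bet each, one round split into
# 2 hands (so $4 total wagered), and an assumed net win of $2 overall.
initial_bets = [1.0, 1.0, 1.0]   # $ put up at the start of each round
extra_bets   = [0.0, 1.0, 0.0]   # $ added by the split in round 2
net_win      = 2.0               # assumed total $ won over the 3 rounds

total_initial = sum(initial_bets)                # $3
total_wagered = total_initial + sum(extra_bets)  # $4

iba = net_win / total_initial    # $2/$3 ~ 66.7% per initial dollar
tba = net_win / total_wagered    # $2/$4 = 50.0% per dollar wagered

print(f"IBA = {iba:.1%}, TBA = {tba:.1%}")
```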
I mean, would the fundamental advantage of the game change if one bet $1, $1, and $398 and won $796 on the $400 initially bet?
In your example, if one lost all 3 "rounds", betting dollars as you describe, one has lost $60/rd × 3, or $180, no? If one won all 3 rounds, one has won $60 × 3, no? I just can't see how variance could be $400+/rd, or how variance/rd could exceed the maximum possible loss/rd?
I realize you are playing BJ, and maybe I am only playing some undefined game as a starting point, wherein I win $1 33% of the time, lose $1 33% of the time, and win $2 33% of the time. My avg bet is $1/rd.
All this, maybe, assuming $1 SD/rd? Otherwise, why reinvent the wheel, since in BJ variance/rd changes slightly, I think, at every TC anyway?
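Just to check that assumption, here's a sketch of the per-round EV, variance, and SD for my made-up game (treating each 33% as exactly 1/3). One thing it reminds me of: variance comes out in squared dollars, which is probably why it can't be compared directly to a maximum dollar loss per round.

```python
import math

# My toy game from above: lose $1, win $1, or win $2, each with prob 1/3,
# on a $1 bet per round.
outcomes = [-1.0, 1.0, 2.0]
probs    = [1/3, 1/3, 1/3]

ev  = sum(p * x for p, x in zip(probs, outcomes))      # E[X]   = 2/3, ~ $0.667/rd
ex2 = sum(p * x * x for p, x in zip(probs, outcomes))  # E[X^2] = 2
var = ex2 - ev ** 2                                    # ~ 1.556 $^2/rd, not dollars
sd  = math.sqrt(var)                                   # ~ $1.25/rd, so not exactly $1

print(f"EV/rd = {ev:.3f}, Var/rd = {var:.3f} $^2, SD/rd = {sd:.3f}")
```

So my $1 SD/rd assumption is only roughly right for that game; the true SD comes out to about $1.25.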
I mean, in real BJ, one will have payoffs from -8 to +8 per "round" if one can split to 4 "hands" vs a single dealer upcard.
OTOH, you and Sagefrog could be completely correct, given how little I know about how to figure this out.
But, to answer your question here, I do think variance is additive, even if $variance/rd is different for each of the 100 rounds due to different betting each round. The $variance after 100 rds is the sum of the per-round $variances (assuming the rounds are independent of each other).
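And just to convince myself the additivity holds with different bets each round, here's a little simulation sketch. It reuses my toy game, scales each round by a made-up bet schedule, and compares the simulated $variance of the 100-round total to the sum of the per-round $variances (which, again, relies on the rounds being independent):

```python
import random
import statistics

# Per-unit toy game from above: -1, +1, or +2 units, each with prob 1/3.
UNIT_OUTCOMES = [-1.0, 1.0, 2.0]
UNIT_EV  = sum(UNIT_OUTCOMES) / 3                              # 2/3
UNIT_VAR = sum(x * x for x in UNIT_OUTCOMES) / 3 - UNIT_EV**2  # 14/9

# Made-up bet schedule: a different $ bet on each of the 100 rounds.
bets = [1.0 + (i % 5) for i in range(100)]   # $1..$5, repeating

# Analytic: per-round $variance is bet^2 * unit variance; the total is the sum.
analytic_total_var = sum(b * b * UNIT_VAR for b in bets)

# Simulation: play many 100-round sessions and measure the variance of the totals.
random.seed(1)
totals = []
for _ in range(20000):
    session = sum(b * random.choice(UNIT_OUTCOMES) for b in bets)
    totals.append(session)
simulated_total_var = statistics.pvariance(totals)

print(f"sum of per-round $variances       = {analytic_total_var:.0f}")
print(f"simulated $variance of the totals = {simulated_total_var:.0f}")
```

The two numbers should land within a percent or two of each other, which is all I mean by the $variances summing.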