LovinItAll
Well-Known Member
This thought occurred to me when I was down 30 units in 30 minutes recently. I'm not a mathematician, so my fundamental understanding of the following statements/comments may be incorrect. Some of the statements are just curiosities, so if they seem absurd to someone with a solid understanding of the subject, my apologies.
Let's say that a player is wagering on a game that has a 10 unit/hour SD. After 30 minutes, the player is down 30 units, or 3 standard deviations. Are the following questions/comments flawed?
- The player is really down 6 standard deviations, as 1 SD per hour = .5 SD's per 1/2 hour.
- The player hasn't played an hour, so the results re: SD are unknown as of yet.
- The player says, "It doesn't matter what just happened. I could easily lose another 11 units over the next 30 minutes (4+ SD's over 1 hour). I'm having a bad night. Bye."
- The player decides to increase his bet, as the chances of being 4+ standard deviations after an hour are slim.
- The player decides to keep his bet the same, as the chances of being 4+ standard deviations after an hour are slim and the player should have a decent shot at recovering some of his losses without increasing his bet.
- Based on the last statement, how does the Gambler's Fallacy apply to time and standard deviation? For example, we know that past results don't impact future odds. With that, after the half hour of play and the resulting -3 SD's in results, do we simply 'reset the clock' after every wager as it relates to standard deviation? If the player loses 11 units during the next 30 minutes (common), the player is now down 4.1 SD's/hour, a rare occurrence (.006%, I think). (There's a quick sanity-check simulation right after this list.)
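Here's a quick coin-flip simulation I threw together to sanity-check those numbers. It's my own simplification, not real blackjack math: I'm assuming 100 independent, even-money, zero-edge 1-unit wagers per hour, which works out to an hourly SD of sqrt(100) = 10 units.

```python
import math
import random

# Toy model (my assumption, not blackjack): 100 independent, even-money,
# zero-edge 1-unit wagers per hour -> hourly SD = sqrt(100) = 10 units.
HANDS_PER_HOUR = 100
TRIALS = 100_000

def half_hour_result():
    """Net units after a half hour (50 one-unit coin-flip wagers)."""
    return sum(random.choice((1, -1)) for _ in range(HANDS_PER_HOUR // 2))

results = [half_hour_result() for _ in range(TRIALS)]
mean = sum(results) / TRIALS
sd = (sum((x - mean) ** 2 for x in results) / TRIALS) ** 0.5
print(f"half-hour SD: {sd:.2f} units")  # comes out ~7.07 = 10/sqrt(2), not 5

def lower_tail(z):
    """P(finishing z or more SDs below expectation), normal approximation."""
    return 0.5 * math.erfc(z / math.sqrt(2))

print(f"P(3.0+ SDs down): {lower_tail(3.0):.6f}")   # ~0.00135 (0.135%)
print(f"P(4.1+ SDs down): {lower_tail(4.1):.8f}")   # ~0.0000207 (~0.002%)
```

If that's right, SD grows with the square root of time rather than linearly, so a half hour has an SD of 10/sqrt(2) ≈ 7.07 units, and being down 30 units in 30 minutes is about 4.2 half-hour SDs, neither the 3 nor the 6 from the bullets above.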
To me, the 'reset the clock' question is where the Gambler's Fallacy might bite someone.
The player is stuck 30 units in 30 minutes. He knows that being down 40+ units in an hour is supposed to be a fairly rare event, so he manipulates his wager accordingly to recoup his losses. He then loses and must withdraw his children from private school (he is a hopeless degen.....boo hoo).
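That "bet bigger to catch up" move is easy to simulate too. Same toy model, but now with an assumed house edge (48% chance to win each even-money hand):

```python
import random

# Hypothetical "chasing" sketch: 48% win chance per even-money hand (assumed
# house edge), 50 hands per half hour, same toy model as the script above.
WIN_P = 0.48
HANDS_PER_HALF = 50
TRIALS = 100_000

def half_hour(bet):
    """Net units over one half hour at a fixed bet size."""
    return sum(bet if random.random() < WIN_P else -bet
               for _ in range(HANDS_PER_HALF))

def full_hour(chase):
    """Double the bet for the second half after a losing first half if chasing."""
    first = half_hour(1)
    second = half_hour(2 if chase and first < 0 else 1)
    return first + second

flat = sum(full_hour(chase=False) for _ in range(TRIALS)) / TRIALS
chased = sum(full_hour(chase=True) for _ in range(TRIALS)) / TRIALS
print(f"average hourly result, flat betting:   {flat:+.2f} units")
print(f"average hourly result, chasing losses: {chased:+.2f} units")
# Chasing averages worse: past results don't change the edge, so bigger
# bets just push more money through the same negative-EV game.
```

In a zero-edge game both averages would come out near zero; raising the bet after losses changes the variance, never the expectation.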
I've never made the claim of being smart, so if my ignorance is profound, save the bullets.
Best ~ L.I.A.