hand dealt results


aslan

Well-Known Member
#21
JJR said:
There's just one problem with what you stated. The D'Alembert with a straight 50/50 game of chance would actually win. By your logic, the D'Alembert should just break even on a 50/50 game of chance, since every increased bet is just subject to the same 50% chance of winning. The end result, then, should be that the game breaks even. But, that's not the case. The D'Alembert would show a profit on a straight 50/50 game of chance, instead of just breaking even. So, obviously, there's a flaw in the logic.
The law of equilibrium is a large numbers thing. It does not show up in the short run. Wins and losses appear to break even only in the context of billions, and then it is only a percentage thing. In point of fact, the coin flips need never break even. In a billion coin flips, you may have lost a closet full of shirts, but it will appear that the wins and losses are approaching 50-50. Meanwhile, the house advantage has a much, much shorter horizon for working its dirty deed. They (casinos) gleefully welcome all D'Alembert players.
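
A small simulation makes the point concrete (the flip counts and seed below are arbitrary choices for illustration, not anything from the posts above): the percentage of wins drifts toward 50%, while the raw gap between wins and losses is free to wander and never has to come back to zero.

```python
import random

random.seed(1)  # arbitrary seed so the run is repeatable
wins = flips = 0
for checkpoint in (10**3, 10**4, 10**5, 10**6, 10**7):
    while flips < checkpoint:
        wins += random.random() < 0.5   # fair coin: True (a win) half the time
        flips += 1
    gap = wins - (flips - wins)          # wins minus losses so far
    print(f"{flips:>10,} flips: {wins / flips:.4%} wins, wins minus losses = {gap:+d}")
```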
 
ricky ricardo

#22
how you win when count is good

aslan said:
It's easy. Each and every bet that you make is against the house advantage. Therefore, you are favored to lose each and every bet. If the house wins 48% of its bets and the player wins 43% of his bets (the rest are pushes), then those times when you raise your bet you will only win 43% of the time, and those times when you lower your bet, you will still only win 43% of the time. So your progression strategies of lowering and raising your bet are subject to the same house edge at all times, whether you are raising or lowering, makes no difference. You might as well just flat bet all the time, because all is subject to the house advantage. By flat betting, you won't get caught losing when you raise your bet, so you have a better chance of surviving longer.

Enter counting. Counting works because it can use information that has gone before in determining the size of your bet. When lots of tens and aces remain among the undealt cards, you know that the advantage has temporarily switched to the player. Only counting lets you know when the player temporarily has the advantage.

With a progression, the information that went before, how many wins or losses, gives no meaningful data as to what the size of the bet should be, or who has the advantage, you or the house. That is because without counting, all you know is that the player has the exact same chance of winning each hand on average, no matter what preceded. The fact that ten wins preceded your current bet does not mean that a loss is imminent. No, the chances based on known information remain the same for each and every bet. For all you know, the count has remained at zero the entire time. That would mean you have only your average 43% chance of winning. The count may also have gone negative, giving you an even greater chance of losing, and it can remain that way the entire shoe. Just as in craps, the cards have no memory of what went before, that is, if you are not counting.

The concept is so simple that it is sometimes difficult to see. When you finally do see it, you wonder how you could have been so stupid. But don't feel bad; it's like an optical illusion. It naturally appears one way, when in reality it's the other.

Always remember. With dice, the chances are always the same each and every time you throw them (short of any manipulation on your part). That's why they say the dice have no memory. No matter how you bet your money at dice, the chances never change. Fifty sevens in a row does not mean a thing (except you're getting rich, lol), because each time you throw the dice you have the same chance of rolling a seven. With cards it's the same thing, except for card counting where you use what went before to shape your future strategy. Nothing else can work against the house advantage. 43% of the time you win, 48% of the time you lose, and 9% of the time you push. All of this is on average. It's knowing when you have the best chance to be in the 43% that pays off with card counting. Nothing else can do it.
Actually, you still win the same 43% of hands when the count is high. The reason a high count favors the player is that with a high count more BJs are dealt to both player and dealer (in equal amounts), but a player BJ pays 3:2 while a dealer BJ results in only an even-money loss for the player. Also, a player can take insurance while a dealer cannot; with a high count this is very valuable for the player. Finally, a high count is favorable for most double downs and non-defensive splits, and the dealer, who must hit all stiffs, is more likely to bust when there is an excess of tens.
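
To put a rough number on the 3:2 effect described above: the blackjack frequencies below are ballpark assumptions chosen only for illustration, and the sketch deliberately ignores the insurance, doubling, splitting, and dealer-bust gains also mentioned.

```python
def gain_from_extra_blackjacks(p_bj_neutral, p_bj_high):
    """Rough EV gain (in bets) from the 3:2 bonus alone: each extra player blackjack
    is worth about half a bet more than the ordinary win it replaces, while an extra
    dealer blackjack costs about the same as the ordinary loss it replaces."""
    extra = p_bj_high - p_bj_neutral      # rise in blackjack frequency, for each side
    return 0.5 * extra

p_neutral = 0.047   # assumed blackjack frequency off the top (roughly 4.7%)
p_high    = 0.065   # assumed frequency in a ten/ace-rich shoe (illustrative only)

print(f"EV gain from the extra blackjacks alone: "
      f"{gain_from_extra_blackjacks(p_neutral, p_high):.3%} of a bet")
```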
 

aslan

Well-Known Member
#23
ricky ricardo said:
Actually, you still win the same 43% of hands when the count is high. The reason a high count favors the player is that with a high count more BJs are dealt to both player and dealer (in equal amounts), but a player BJ pays 3:2 while a dealer BJ results in only an even-money loss for the player. Also, a player can take insurance while a dealer cannot; with a high count this is very valuable for the player. Finally, a high count is favorable for most double downs and non-defensive splits, and the dealer, who must hit all stiffs, is more likely to bust when there is an excess of tens.
You get a star for knowing that. :p BTW, how's Lucy? :grin::whip:

When I am spreading large in a high count, that simple truth bothers me to no end. I keep wanting to downsize my bet after a few wins, knowing that my chances of winning every hand are not good. I have to fight that natural impulse, reminding myself that I have the advantage and no less chance of winning the next hand than I did the previous one. The most fun is when you win them all spreading to max bet! :grin:
 

JJR

New Member
#24
Well, actually the D'alembert assumes that a 50/50 game such as a coin toss, will result in equal amounts of wins and losses, heads vs tails. 500 million heads vs 500 million tails. You're suggesting that a 50/50 game will result in a loss of 100 thousand at some point in a billion trials. I would say play another billion hands and you will eventually break even, having the same number of heads as there were tails thrown. And in that case the D'alembert would eventually win. I mean, you're correct in that your bankroll may get eaten up and make the game not a practical way to play, but theoretically (with an infinite bankroll) the game wins.

Just to add on a bit here. This would mean there's a flaw in the common logic you put forth in the earlier post, because the D'alembert by that logic would only break even and not win, because every increased bet is subject to the same 50% chance of winning. But, that's not the case, it doesn't break even, it wins. So, there must be another factor at work.
 

iCountNTrack

Well-Known Member
#25
JJR said:
Well, actually the D'alembert assumes that a 50/50 game such as a coin toss, will result in equal amounts of wins and losses, heads vs tails. 500 million heads vs 500 million tails. You're suggesting that a 50/50 game will result in a loss of 100 thousand at some point in a billion trials. I would say play another billion hands and you will eventually break even, having the same number of heads as there were tails thrown. And in that case the D'alembert would eventually win. I mean, you're correct in that your bankroll may get eaten up and make the game not a practical way to play, but theoretically (with an infinite bankroll) the game wins.

Just to add on a bit here. This would mean there's a flaw in the common logic you put forth in the earlier post, because the D'alembert by that logic would only break even and not win, because every increased bet is subject to the same 50% chance of winning. But, that's not the case, it doesn't break even, it wins. So, there must be another factor at work.
Again, your stats are flawed, sorry.

A) The number of heads and tails need not be exactly equal. You will never know with certainty that you will get 500 million tails and 500 million heads, but what you do know for sure is that the percentage of heads will approach 50% as the sample size (number of trials) increases. So in the event you get an extra 50K tails, the percentage of tails would be 50.005%.

P.S. If you had an infinite bankroll, why would you need to play? :)
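
A quick check of that arithmetic, and of how the same fixed surplus shrinks as the number of tosses grows (the toss counts are just examples):

```python
excess = 50_000   # "extra" tails on top of the expected half
for total in (1_000_000, 1_000_000_000):
    tails = total // 2 + excess
    print(f"{tails:,} tails out of {total:,} tosses = {tails / total:.4%}")
```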
 

QFIT

Well-Known Member
#26
JJR said:
Well, actually the D'alembert assumes that a 50/50 game such as a coin toss, will result in equal amounts of wins and losses, heads vs tails. 500 million heads vs 500 million tails. You're suggesting that a 50/50 game will result in a loss of 100 thousand at some point in a billion trials. I would say play another billion hands and you will eventually break even, having the same number of heads as there were tails thrown.
Actually, the exact opposite is true. The more hands you play, the less likely it is that you will win exactly half. In fact, it is nearly impossible at a billion tosses.

JJR said:
Just to add on a bit here. This would mean there's a flaw in the common logic you put forth in the earlier post, because the D'alembert by that logic would only break even and not win, because every increased bet is subject to the same 50% chance of winning. But, that's not the case, it doesn't break even, it wins. So, there must be another factor at work.
No, it does not win. No, there is not another factor.
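
For anyone who wants the actual number, the chance of landing on exactly half heads can be computed directly from the binomial distribution; the sketch below (using log-gamma only to avoid overflow) shows how that chance keeps shrinking as the number of tosses grows.

```python
from math import lgamma, exp, log

def prob_exactly_half(tosses):
    """P(exactly tosses/2 heads in `tosses` fair flips) = C(tosses, tosses/2) / 2**tosses."""
    n = tosses // 2
    return exp(lgamma(2 * n + 1) - 2 * lgamma(n + 1) - 2 * n * log(2))

for tosses in (100, 10_000, 1_000_000, 1_000_000_000):
    print(f"P(exactly half heads in {tosses:>13,} tosses) ~ {prob_exactly_half(tosses):.2e}")
# The probability tracks 1 / sqrt(pi * tosses / 2), so it only gets smaller as tosses grow.
```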
 

Sonny

Well-Known Member
#27
It sounds like more people need to read the sticky threads here. The reason that every single progression system fails (or underperforms) is called the Gambler's Fallacy. Any system built on that fallacy will not give you an edge. It's that simple. The stickies explain it all in painstaking detail.

-Sonny-
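
As a sketch of why bet sizing cannot manufacture an edge (the win probability, session length, and seed below are illustrative assumptions, not figures from any real game): on an even-money game with a built-in edge, a flat bettor and a D'Alembert player both give back roughly the same share of the money they push across the table; the progression just pushes more of it.

```python
import random

random.seed(3)
P_WIN = 0.475              # assumed even-money game with a 5% house edge (illustrative)
SESSIONS, HANDS = 2_000, 1_000

def total_result(progression):
    net = wagered = 0.0
    for _ in range(SESSIONS):
        bet = 1
        for _ in range(HANDS):
            wagered += bet
            if random.random() < P_WIN:
                net += bet
                bet = max(1, bet - 1) if progression else 1   # D'Alembert: lower after a win
            else:
                net -= bet
                bet = bet + 1 if progression else 1           # D'Alembert: raise after a loss
    return net, wagered

for name, prog in (("flat bet", False), ("D'Alembert", True)):
    net, handle = total_result(prog)
    print(f"{name:>10}: net {net:+,.0f} units on {handle:,.0f} wagered "
          f"({net / handle:+.2%} of the action)")
# Exact numbers vary run to run, but both hover near the game's edge per unit wagered.
```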
 

aslan

Well-Known Member
#28
JJR said:
Well, actually the D'alembert assumes that a 50/50 game such as a coin toss, will result in equal amounts of wins and losses, heads vs tails. 500 million heads vs 500 million tails. You're suggesting that a 50/50 game will result in a loss of 100 thousand at some point in a billion trials. I would say play another billion hands and you will eventually break even, having the same number of heads as there were tails thrown. And in that case the D'alembert would eventually win. I mean, you're correct in that your bankroll may get eaten up and make the game not a practical way to play, but theoretically (with an infinite bankroll) the game wins.

Just to add on a bit here. This would mean there's a flaw in the common logic you put forth in the earlier post, because the D'alembert by that logic would only break even and not win, because every increased bet is subject to the same 50% chance of winning. But, that's not the case, it doesn't break even, it wins. So, there must be another factor at work.
If the D'Alembert assumes a 50/50 game will result in an equal number of wins and losses, as you state, it is a false assumption. The ONLY assumption that can be made is that the next flip has an equal chance of being a win or a loss. The coins do not remember that the last flip was a win; the coins do not favor a loss in any way, shape, or form. Even if you flip 1,000 consecutive wins, a loss is no more likely on the 1,001st flip than it was on the 1st flip.

Even in an infinite universe there is no guarantee that the wins will ever equal the losses or vice versa. There is no active tension here that is aware that wins are outpacing losses and thereby seeks to true things up. The principle of equilibrium is based on the simple fact that a win or a loss is equally likely each and every time in a 50/50 proposition. In point of fact, let's say after a million flips the wins are 100 ahead of the losses. By your logic, which holds that the flips must result in an equal number of wins and losses, you could treat that point as a new starting point, in which case the coins should maintain the 100-win imbalance from there on; but that would invalidate the view that the coins were trying to seek an equal number of wins and losses from the "original" starting point. You can't have it both ways.

What really happens if you run billions of simulations is that the percentage difference tends to narrow. Obviously, a 100 unit difference after a million flips is a much greater disparity than a 100 unit difference after a billion flips.

I don't believe that the number of wins and losses must ever coincide. Maybe someone with a math background on here can instruct us what the probability is for wins to equal losses and in how many flips, if that would be useful.

On your last point, I concede that the D'Alembert will win if the flips break dead even, due to the raising of the bet, but I reject the notion that they must ever break even. Also, you would be going to an awful lot of trouble to obtain a one-unit win which has a 50/50 chance of being taken away on the next flip. Add the house advantage to your game, and you are achieving nothing, in my opinion, but a greater chance to lose more every time you raise your bet above the minimum bet.
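
A minimal sketch of that trade-off, under stated assumptions (a plain one-unit D'Alembert with a one-unit floor, a fair coin with no house edge at all, and arbitrary session length and seed): the progression typically turns the 50/50 game into many small winning sessions paid for by occasional deep losing ones, with a long-run average near zero. Add a house edge and the average turns negative.

```python
import random

random.seed(11)  # arbitrary seed so the run is repeatable
SESSIONS, FLIPS = 20_000, 100

results = []
for _ in range(SESSIONS):
    bet, bankroll = 1, 0
    for _ in range(FLIPS):
        if random.random() < 0.5:        # fair coin flip
            bankroll += bet
            bet = max(1, bet - 1)        # lower the bet after a win, never below 1
        else:
            bankroll -= bet
            bet += 1                     # raise the bet after a loss
    results.append(bankroll)

ahead = sum(r > 0 for r in results)
print(f"Sessions ending ahead: {ahead / SESSIONS:.1%}")
print(f"Average result per session: {sum(results) / SESSIONS:+.2f} units")
print(f"Worst session: {min(results):+d} units, best session: {max(results):+d} units")
```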
 

aslan

Well-Known Member
#29
By the time I wrote my response, several far more capable persons had already answered JJR far better than I. I hope my foregoing perspective is useful if only from a layman's point of view. I don't think you have to be a mathematician or expert in probability analysis to understand these concepts, but it sure helps.
 

iCountNTrack

Well-Known Member
#30
The key idea to understand is that the probability of getting tails or heads on one toss of an unbiased coin remains unchanged and is independent of the previous trials. So it doesn't matter if the last 100 tosses had 80 heads and 20 tails; the probability of getting tails on the very next toss is not increased because more tails are "due". It is still 50%.
What we know with absolute certainty is that the larger the sample size (number of tosses), the closer the percentage of heads (or tails) gets to 50%. The denominator (the number of coin tosses) will take care of (will dilute) any short-term "excesses" or "deficits".

Edit:Aslan beat me
 

aslan

Well-Known Member
#31
iCountNTrack said:
The key idea to understand is that the probability of getting tails or heads on one toss of an unbiased coin remains unchanged and is independent of the previous trials. So it doesn't matter if the last 100 tosses had 80 heads and 20 tails; the probability of getting tails on the very next toss is not increased because more tails are "due". It is still 50%.
What we know with absolute certainty is that the larger the sample size (number of tosses), the closer the percentage of heads (or tails) gets to 50%. The denominator (the number of coin tosses) will take care of (will dilute) any short-term "excesses" or "deficits".

Edit:Aslan beat me
But you said it with an economy of words. :1st:
 

JJR

New Member
#32
I don't understand why there's an argument that a 50/50 game will break even eventually. That's what it's supposed to do, unless it's a biased game. As for the point that you could be down 100 thousand: well, you could be up 100 thousand the entire time too; it's a 50/50 game. But at some time it will break even. To reject the notion that a 50/50 game will have equal amounts of wins and losses is to say that a 50/50 game is biased in one direction or the other. It may have some wild swings and stay below or above break-even for long periods of time, but eventually, in a fair game, a 50/50 game of chance will have equal amounts of wins and losses. That's not a fallacy; that's an inherent fact.
 

QFIT

Well-Known Member
#33
JJR said:
I don't understand why there's an argument that a 50/50 game will break even eventually. That's what it's supposed to do, unless it's a biased game. As for the point that you could be down 100 thousand: well, you could be up 100 thousand the entire time too; it's a 50/50 game. But at some time it will break even. To reject the notion that a 50/50 game will have equal amounts of wins and losses is to say that a 50/50 game is biased in one direction or the other. It may have some wild swings and stay below or above break-even for long periods of time, but eventually, in a fair game, a 50/50 game of chance will have equal amounts of wins and losses. That's not a fallacy; that's an inherent fact.
No, that's the Gambler's Fallacy. The longer you play, the closer to 50% (as a percent) you will average. BUT the farther you will get from an equal number of wins and losses. The more tosses, the less likely it is that the number of wins will exactly equal the number of losses. With trillions of hands, there is virtually no chance of an equal number of wins and losses.

See http://en.wikipedia.org/wiki/Gambler's_fallacy for a detailed explanation.
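
A numerical companion to that point: for n fair tosses, a standard asymptotic result puts the expected gap between wins and losses at roughly sqrt(2n/pi), so the gap itself grows even while it shrinks as a percentage of n.

```python
from math import pi, sqrt

for n in (10_000, 1_000_000, 1_000_000_000, 1_000_000_000_000):
    expected_gap = sqrt(2 * n / pi)      # approximate E|wins - losses| for n fair tosses
    print(f"n = {n:>17,}: expected gap ~ {expected_gap:>12,.0f} flips "
          f"({expected_gap / n:.6%} of all flips)")
```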
 

JJR

New Member
#35
Well, I do apologize. I did have a misunderstanding of the Gambler's Fallacy. I thought it only applied to the next trial in a series: that after losing 4 in a row, your chances of winning the next trial are still just 50/50. I didn't know they were saying a 50/50 game of chance won't have equal amounts of wins and losses. That's insane. My beef is apparently with the mathematicians. They're saying that a 50/50 game is actually a 49.998% game; that's crazy. Because the same still holds true: it's inherent in a 50/50 game that you will have equal amounts of wins and losses eventually. To say that the Gambler's Fallacy holds true after a trillion trials because a 50/50 game is actually a 49.998% game and won't have equal amounts of wins and losses is just insane. It's inherently not true.
 

aslan

Well-Known Member
#36
JJR said:
Well, I do apologize. I did have a misunderstanding of the Gambler's Fallacy. I thought it only applied to the next trial in a series: that after losing 4 in a row, your chances of winning the next trial are still just 50/50. I didn't know they were saying a 50/50 game of chance won't have equal amounts of wins and losses. That's insane. My beef is apparently with the mathematicians. They're saying that a 50/50 game is actually a 49.998% game; that's crazy. Because the same still holds true: it's inherent in a 50/50 game that you will have equal amounts of wins and losses eventually. To say that the Gambler's Fallacy holds true after a trillion trials because a 50/50 game is actually a 49.998% game and won't have equal amounts of wins and losses is just insane. It's inherently not true.
View attachment 6852

What we have...a failure to communicate.

From Wikipedia:


The term "expected value" can be misleading. It must not be confused with the "most probable value." The expected value is in general not a typical value that the random variable can take on. It is often helpful to interpret the expected value of a random variable as the long-run average value of the variable over many independent repetitions of an experiment.


The expected value may be intuitively understood by the law of large numbers: The expected value, when it exists, is almost surely the limit of the sample mean as sample size grows to infinity. The value may not be expected in the general sense — the "expected value" itself may be unlikely or even impossible (such as having 2.5 children), just like the sample mean.


I'm sorry, it is not "inherent in a 50/50 game that you will have equal amounts of wins and losses eventually."
 


JJR

New Member
#37
Well, it will obviously be uncommon to have exactly equal amounts of wins and losses in a million trials. You will most likely be above or below that line during most of the trials. But, "eventually" you will, of course, at some point have equal amounts of wins and losses. That's inherent because that's the "given" we are working with in this mathematical equation.
 

QFIT

Well-Known Member
#38
JJR said:
Well, it will obviously be uncommon to have exactly equal amounts of wins and losses in a million trials. You will most likely be above or below that line during most of the trials. But, "eventually" you will, of course, at some point have equal amounts of wins and losses. That's inherent because that's the "given" we are working with in this mathematical equation.
Why on Earth do you believe this? What "given?" There is no such mathematical law.
 

aslan

Well-Known Member
#39
JJR said:
But, "eventually" you will, of course, at some point have equal amounts of wins and losses.
How do you "know" that? Are you saying it is not possible for it not to eventually even up? As a practical matter, what difference does it make? What mathematical equation are you referring to--certainly not an algebraic one?
 

JJR

New Member
#40
The "given" is simply that the game is 50/50. That's the known. That the game will have equal amounts of wins and losses. The game is not a 49.999% game or a 50.0001% game it is known that the game is 50/50. That's a given. It's inherent that it will eventually have equal amounts of wins and losses. It's not arguable, it's the "given". You're trying to argue that a 50/50 game is a 49.9999% game. It's inherentyly wrong. The given is that the game is 50/50. If you're results are any other then 50/50, you're game is biased and the results are no good. That's the "given". It's inherent that the game breaks even. It has to.
 