Laurelindo
Member
I was thinking that maybe you can use probability to get an accurate estimate of how much money and how many tries you need to reach a 99% success rate if you keep doubling your bet after every loss, so that a single win recovers everything you have lost so far (although this seems too good to be true).
Anyway, you can use probability to calculate the success rate after a certain number of tries:
P = 1 - (x / [x+y])^t
P = probability of winning at least once within t tries
x = unfavorable outcomes
y = favorable outcomes
t = number of tries
If we assume that the chance of winning is 40% in each game, then this gives the following result:
t = log(1 - 0.99) ÷ log(6 ÷ [6+4]) ≈ 9.
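As a sanity check, the formula and the solved-for t can be sketched in Python (assuming the 40% win chance above, i.e. x = 6 unfavorable and y = 4 favorable outcomes):

```python
import math

x, y = 6, 4            # unfavorable vs. favorable outcomes (40% win chance)
p_lose = x / (x + y)   # probability of losing a single try: 0.6

def win_within(t):
    """P = 1 - (x / (x + y))^t: chance of at least one win in t tries."""
    return 1 - p_lose ** t

# Solve 1 - p_lose^t = 0.99 for t:  t = log(1 - 0.99) / log(p_lose)
t = math.log(1 - 0.99) / math.log(p_lose)
print(t)               # ≈ 9.01
print(win_within(9))   # ≈ 0.9899, i.e. just a hair under 99%
print(win_within(10))  # ≈ 0.9940
```

Note that the exact solution is about 9.01, so 9 tries land fractionally below the 99% target; rounding to 9 is an approximation.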
So if we lose one dollar and then bet two dollars etc until we win all lost money back we get this:
2^0 + 2^1 + ... + 2^8 = 511
...which means we would need a $511 bankroll (starting with a $1 bet) in order to have a 99% long-term theoretical success rate.
Is this correct?