Started questioning whether card counting really works

#1
Hi,


I became interested in Blackjack and card counting a little while ago. I learned from wizardofodds.com and blackjackapprenticeship.com, read books by Dr. Thorp, Lance Humble, and Peter Griffin, practiced basic strategy and card counting at home, and visited a casino in Louisiana a couple of times, but I ended up losing money every time. Since I was aware of the law of large numbers and work full time, I wanted to find out how many trials (hands) I would need to play to generate a meaningful gain, and whether that many casino visits would be feasible for me. So I wrote a Python script that includes basic strategy, playing deviations (the Illustrious 18), card counting, and a 1-to-8 bet spread. I am providing the Google Colab link below.

To my surprise, no matter how many trials I ran (my most common run was 1 million hands; assuming a casino deals an average of 40 hands/hour, 1 million hands is 25,000 hours of Blackjack play), the script ended up losing money overall. There are localized winning streaks, but the overall result is a loss. I wonder if anyone else has had a similar real or simulated advantage-play experience of loss after loss. My script sort of broke my heart, and I am now wondering whether I should even think about counting cards on a casino visit, and whether variance is in practice more dominant than we would like to think.
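For context, the counting and betting logic in the script boils down to something like this (a minimal sketch: the Hi-Lo tags are standard, but the ramp shown is illustrative rather than my exact code):
Code:
# Hi-Lo tags: low cards +1, middle cards 0, tens and aces -1
HI_LO = {'2': 1, '3': 1, '4': 1, '5': 1, '6': 1,
         '7': 0, '8': 0, '9': 0,
         '10': -1, 'J': -1, 'Q': -1, 'K': -1, 'A': -1}

def true_count(running_count, cards_remaining):
    """Running count divided by the number of unseen decks."""
    decks_remaining = max(cards_remaining / 52.0, 0.5)  # guard near the cut card
    return running_count / decks_remaining

def bet_units(tc, min_bet=10, max_units=8):
    """Minimum bet at true counts of +1 or below, ramping up to 8 units."""
    return min_bet * min(max(int(tc), 1), max_units)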


Link to code (written in Python) in Google Colab:
https://colab.research.google.com/dr...oM?usp=sharing

Here is a set of examples where I ran one set with playing deviations and another set without. Each set contains 100 streaks of 10,000 hands per streak, i.e., 1,000,000 hands total. The screenshots of the bankroll and the game data for each hand are saved under:
https://drive.google.com/drive/folde...2Z?usp=sharing

The game follows basic strategy. The user can select whether or not to include playing deviations, and can define the number of decks and the deck penetration; I usually run the program with 6 decks and 80% penetration. The blackjack pay of 3:2 is hard coded, as are basic strategy and the playing deviations. The player does not take insurance unless the true count is greater than +3, in which case insurance is automatically triggered. The dealer stands on hard 17 but hits soft 17, and no surrender is allowed (that is how they play at the casino I visited in Lake Charles, LA).

I understand that reading through the code might not be a pleasant experience, which is why I have saved some example text files of the game data. There are two output .txt files of 1,000 rounds of play each: one with playing deviations and one without. These files show all the hands along with the running/true count, the bet adjustments and bankroll updates, any deviations (based on the true count), and the number of cards remaining in the deck after each round, for error tracking and game transparency. I tested quite rigorously, found some flaws, corrected them, and tested again. I no longer find any issues with the game, i.e., it simulates exactly how an advantage player would play in the casino. But according to the program, the player still loses money the majority of the time. The game data can be accessed here: https://drive.google.com/drive/folde...uHOMwiEKinuy2Z
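Summarized as a configuration, the rule set I simulate looks like this (the names are illustrative, not the exact variables in my script):
Code:
# Rule set used in my runs (illustrative names)
RULES = {
    'decks': 6,
    'penetration': 0.80,              # reshuffle after 80% of the shoe is dealt
    'blackjack_pays': 1.5,            # 3:2
    'dealer_hits_soft_17': True,      # H17
    'surrender_allowed': False,
    'insurance_above_true_count': 3,  # insurance only when TC > +3
    'bet_spread': (1, 8),             # 1-to-8 units
}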

I posted this in another forum and got some feedback that helped improve the code, but the overall result is still a loss. I would very much appreciate any suggestions on how, in practice, an AP can lose money even though he/she is supposed to win probabilistically.
Thank you for your time.
 

Zero

Well-Known Member
#2
Others (like me) might be more willing to assist you if you placed the code and other files somewhere they can be accessed anonymously, instead of Google Colab/Drive, where a Google account is required and everything is tracked and logged.

 
#3
Zero said:
Others (like me) might be more willing to assist you if you placed the code and other files somewhere they can be accessed anonymously, instead of Google Colab/Drive, where a Google account is required and everything is tracked and logged.

Hello, thank you for your offer to help. Attached are the .py and .ipynb files (either one can be used), some .txt documents with game data, and some screenshots of runs with higher numbers of trials.
 

Attachments

Zero

Well-Known Member
#4
I'm still looking, so I don't know if this is the only problem, but I can see you're not handling soft hands correctly. This is from Deviation_no_1000 trials (1).txt, so it should just be using basic strategy:
Code:
################## Round 92 ######################
Player 1 is betting 10

Dealer's Card:
Q
XX
Score:  10

Player 1 Cards:
A
5
Score:  16
Enter for Player 1
S for Stand
H for Hit
D for Double Dowm
SP for Split
----> :

Strategy:  H

Player 1 Cards:
A
5
A
Score:  17
Enter for Player 1
S for Stand
H for Hit
D for Double Dowm
SP for Split
----> :

Strategy:  S

Dealer's Card:
Q
4
Score:  14

Dealer's Card:
Q
4
7
Score:  21
Dealer Won!
The player stood on a soft 17 (A,5,A) against a dealer 10.
It looks like your code only evaluates whether a hand is soft once the hand total goes over 21:
Code:
while ace_counter > 0 and score > 21:
    is_soft = True
    score -= 10
    ace_counter -= 1
You wrote the rules were H17 and your code is using the basic strategy for H17, but the game is effectively S17. This may also be related to incorrect handling of soft hands:
Code:
################## Round 2 ######################
Player 1 is betting 10

Dealer's Card:
4
XX
Score:  4

Player 1 Cards:
A
7
Score:  18
Enter for Player 1
S for Stand
H for Hit
D for Double Dowm
SP for Split
----> :

Strategy:  S

Dealer's Card:
4
2
Score:  6

Dealer's Card:
4
2
A
Score:  17
Player 1 Won!
Not only did the player stand on a soft 18 (A,7) vs a dealer 4 (which should have been a double), the dealer did not hit their soft 17 (4,2,A). Since this is supposed to be an H17 game, the dealer should have hit.

Fix the handling of soft hands and see if your results are more in line with expectations.
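One robust approach is to count every ace as 1 first and then promote a single ace to 11 whenever that does not bust the hand; the hand is soft exactly while an ace counts as 11. A minimal sketch (my names, not your code):
Code:
def card_value(c):
    """Face value with aces counted low; score_hand promotes one ace to 11."""
    if c == 'A':
        return 1
    if c in ('J', 'Q', 'K'):
        return 10
    return int(c)

def score_hand(cards):
    """Return (total, is_soft). Promoting one low ace to 11 can never
    help twice, so a single check is enough."""
    total = sum(card_value(c) for c in cards)
    if 'A' in cards and total + 10 <= 21:
        return total + 10, True   # an ace counts as 11 -> soft
    return total, False           # every ace counts as 1 -> hard
With this, A,5,A correctly scores as soft 17 (a hit vs a 10) and the dealer's 4,2,A as soft 17 (a hit under H17).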

 

Zero

Well-Known Member
#5
I've also found a bug with tracking bets/bankroll during splits (perhaps only with multiple splits, or only when a split hand is doubled, or only both together). If you look at round 196 in that same file, the player split to 4 hands of [3,A],[3,9],[3,A],[3,6] against a dealer 4. As mentioned above, the incorrect handling of soft totals caused the player to stand on both soft 14's. The player doubled the [3,6] hand, so the total player bet would have been 10+10+10+20 = $50. But this is what happened after the dealer drew to 21 and won all the hands:
Code:
Bankroll_value 1 9730.0
<...snip...>
################## Round 196 ######################
<...snip...>
Dealer's Card:
4
7
Q
Score:  21
Dealer Won!
Bankroll_value 1.2 9710.0
Dealer Won!
Bankroll_value 1.1.1 9720.0
Dealer Won!
Bankroll_value 1.1.2.1 9710.0
Dealer Won!
Bankroll_value 1.1.2.2 9700.0
For some reason player hand 1.1.1 added 10 to the bankroll even though the dealer won. I suspect it didn't actually treat that hand as a player win; rather, it subtracted 10 from the round's starting bankroll of 9730 instead of from the current 9710. So the bankroll ended at 9700 (-30) instead of 9680 (-50).
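Whatever the exact cause, each split hand has to be settled against the running bankroll, never against the bankroll at the start of the round. A sketch of what I mean (the hand structure and totals are illustrative):
Code:
from collections import namedtuple

# Each split hand carries its own (possibly doubled) bet
Hand = namedtuple('Hand', ['total', 'bet', 'busted'])

def settle_round(bankroll, hands, dealer_total, dealer_busted):
    """Settle every hand one at a time against the running bankroll so
    multi-split losses accumulate instead of overwriting each other."""
    for h in hands:
        if h.busted or (not dealer_busted and h.total < dealer_total):
            bankroll -= h.bet    # loss
        elif dealer_busted or h.total > dealer_total:
            bankroll += h.bet    # win
        # equal totals push; bankroll unchanged
    return bankroll

# Round 196: four hands betting 10+10+10+20 all lose to a dealer 21
hands = [Hand(14, 10, False), Hand(12, 10, False),
         Hand(14, 10, False), Hand(19, 20, False)]
print(settle_round(9730.0, hands, 21, False))   # -> 9680.0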

 
#6
Hello, my code is indeed not handling soft hands correctly. I also see the bet-handling error in the case of multiple splits (round #196), as well as the player's misjudgment in that same round caused by the soft-hand error. I will fix the code and share it with you soon. I very much appreciate you checking it in such great detail. It is indeed a great help! Thank you very much!!
I was watching a documentary on YouTube about Don Johnson the other day. One of the conditions he negotiated with casinos was for the dealer to stand on soft 17. Currently, my code has the dealer hit soft 17 (that is how the casinos I visit in Lake Charles, LA play). The edge difference between the dealer hitting and standing on soft 17 may be only about 0.2%, but over the long run it adds up in the overall result. I will add an input option for whether the dealer hits or stands on soft 17, roughly as sketched below.
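A rough sketch of the planned dealer loop (score_hand as sketched in post #4 above; draw_card stands in for my shoe-dealing routine):
Code:
def dealer_plays(hand, draw_card, hits_soft_17=True):
    """Dealer draws to 17; the flag selects H17 (hit soft 17) or S17 (stand)."""
    while True:
        total, soft = score_hand(hand)
        if total < 17 or (total == 17 and soft and hits_soft_17):
            hand.append(draw_card())   # deal the next card from the shoe
        else:
            return total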
I hope to keep in touch. If you find any further issues with the code, please share them. I hope this will help people like me, besides clarifying some of my own curiosities.
 
#7
Zero said:
I've also found a bug with tracking bets/bankroll during splits.
<...snip...>
Hello, sorry it took me a while to fix the code. I fixed the issues with soft hands and the bankroll computation during splits. I have attached the revised code as well as an updated dataset. The changes do not seem to impact the final outcome by much. I would appreciate it if you could take a look and let me know if you see any further issues with the game. Thanks!
 

Attachments

Zero

Well-Known Member
#8
sp002bj said:
I fixed the issues with soft hands and the bankroll computation during splits.
It would be good practice for you to go through the text file yourself and verify that the correct playing decisions are being made. I found two problems in just the first ten rounds. Round 3: the player stood on a hard 16 vs a dealer 7. Round 9: the player stood on a soft 16 vs a dealer 4. I don't think you have all the bugs worked out quite yet.
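You could even automate the verification: parse each decision out of the text file and compare it to a reference chart. A sketch (only the two rows from rounds 3 and 9 are filled in, per standard multi-deck basic strategy; the file parsing is left out):
Code:
# (player total, is_soft, dealer upcard) -> correct action
REFERENCE = {
    (16, False, 7): 'H',   # hard 16 vs 7: hit
    (16, True, 4): 'D',    # soft 16 (A,5) vs 4: double, else hit
}

def check_decision(total, is_soft, upcard, action_taken):
    expected = REFERENCE.get((total, is_soft, upcard))
    if expected and action_taken != expected:
        print(f"Wrong play: {'soft ' if is_soft else 'hard '}{total} vs {upcard}: "
              f"took {action_taken}, expected {expected}")

check_decision(16, False, 7, 'S')   # round 3 -> flagged
check_decision(16, True, 4, 'S')    # round 9 -> flagged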

 